Run the cell that contains `pip install nzpy`, which is the only prerequisite for this notebook. `nzpy` lets us connect to the server and run DDL and DML SQL statements (a minimal connection sketch follows these steps).
* Open the notebook in edit mode, and in the cell titled `Connecting to the database`, provide the name of the connection that you created earlier in step 2.
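Putting the two steps above together, a minimal connection-and-query sketch could look like the following. This is an illustration only, not the notebook's exact cell; the host, credentials, and database name are placeholders:

```python
import nzpy

# Placeholder connection details -- replace with the values from your own
# Netezza Performance Server instance / platform connection.
con = nzpy.connect(
    user="admin",
    password="password",
    host="nps-host.example.com",
    port=5480,                 # default NPS port
    database="db1",
    securityLevel=1,
)

# Any DDL or DML statement can be run through a cursor in the same way.
with con.cursor() as cursor:
    cursor.execute("select version()")
    print(cursor.fetchone())
```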
### 7. Loading or Unloading data from external source
> NOTE: Before loading or unloading data, upload `orders.tbl` from the `doc/source/data` folder of the cloned repository. In the project home page, on the `Assets` tab, click the data icon and browse to upload the file. You will have to unzip the data locally before you upload it.
Data sources such as external servers or GitHub can be used by streaming data from the source through a Python data pipeline. The Python code uses `nzpy`, which reads the data from the external source and feeds it to the database through a named pipe, as in the sketch below.
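The notebook's own pipeline code is longer; the following is only a minimal sketch of the pattern described above. It assumes `con` is the connection created earlier, that an `orders` table already exists, and that the downloaded file matches its column layout; the URL and pipe path are placeholders:

```python
import os
import threading
import urllib.request

PIPE = "/tmp/orders_pipe"
URL = "https://example.com/orders.tbl"   # placeholder source URL

def stream_source(pipe_path, url):
    """Download the source file and write it into the named pipe."""
    with urllib.request.urlopen(url) as src, open(pipe_path, "wb") as pipe:
        while True:
            chunk = src.read(8192)
            if not chunk:
                break
            pipe.write(chunk)

os.mkfifo(PIPE)
streamer = threading.Thread(target=stream_source, args=(PIPE, URL))
streamer.start()

# nzpy reads the named pipe as a transient external table and loads it into ORDERS.
with con.cursor() as cursor:
    cursor.execute(f"""
        insert into orders
        select * from external '{PIPE}'
        using (delimiter '|' remotesource 'python')
    """)
streamer.join()
os.unlink(PIPE)
```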
### 9. Load data from Object Store
NPS can load and unload data from object stores such as Amazon S3 and IBM Cloud Object Storage. This works by using Netezza external tables to read from and write to the object store.
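As a rough sketch only (the exact external-table options depend on your NPS release, and the bucket, region, credentials, and target table below are placeholders rather than values from this repository), loading a delimited file from S3 through an external table could look like this:

```python
# Hypothetical bucket, credentials, and file name -- replace with your own.
# Option names follow the S3 external-table syntax documented for NPS; verify
# them against the documentation for your release.
load_from_s3 = """
    insert into orders
    select * from external 'orders.tbl'
    using (
        delimiter '|'
        remotesource 'S3'
        uniqueid 'nps-demo'
        accesskeyid '<aws-access-key-id>'
        secretaccesskey '<aws-secret-access-key>'
        defaultregion 'us-east-1'
        bucketurl '<bucket-name>'
    )
"""

with con.cursor() as cursor:
    cursor.execute(load_from_s3)
```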
### 10. Loading and analyzing Australian weather station data
In this section, we will use Python and Netezza Performance Server to load and analyze the publicly available `Australian temperatures and rainfall` dataset.
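As an illustration of the kind of analysis this section performs (the `weather` table and its columns here are assumptions for the sketch, not the notebook's actual schema), the loaded data can be pulled into pandas for aggregation:

```python
import pandas as pd

# Assumed schema: weather(city_name, date, temperature, temp_type, rainfall)
query = """
    select city_name,
           extract(year from date) as year,
           avg(temperature) as avg_temp
    from weather
    where temp_type = 'max'
    group by city_name, extract(year from date)
    order by city_name, year
"""

# pandas can consume any DB-API connection, including the nzpy one from above.
df = pd.read_sql(query, con)
print(df.head())
```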