This package allows you to connect to a remote Databricks cluster from a locally running JupyterLab.
## >>> New minor release V2.2 (Apr 2021) <<<
### 1 New features
- **JupyterLab 3 support**
*Jupyterlab Integration* now works with JupyterLab 3, which drastically simplifies the installation. The bootstrap step (`dj -b`) is gone. Simply run:
```bash
(base)$ conda create -n dj python=3.8  # you might need to add "pywin32" if you are on Windows
```
The following packages get installed:
- databrickslabs-jupyterlab
- databrickslabs-jupyterlab-status (providing the lab extension)
- ssh-ipykernel
- ssh-ipykernel-interrupt (providing the lab extension)
- Support of Databricks Runtimes 6.4, 6.4 (ESR), 7.3, 7.5, 8.0 and 8.1 (standard and ML)
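As a quick sanity check after installing, a sketch like the following asks pip whether each of the packages listed above is visible in the currently active conda environment. The `check_pkg` helper is hypothetical, not part of *Jupyterlab Integration* itself:

```bash
# Sketch: report the install status of each package from the list above.
# check_pkg is a hypothetical helper name, not part of the tool.
check_pkg() {
    if python3 -m pip show "$1" >/dev/null 2>&1; then
        echo "$1: installed"
    else
        echo "$1: missing"
    fi
}

for pkg in databrickslabs-jupyterlab databrickslabs-jupyterlab-status \
           ssh-ipykernel ssh-ipykernel-interrupt; do
    check_pkg "$pkg"
done
```

If any line reports `missing`, re-run the installation inside the activated environment.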
## 2 Overview
- *'6.4'* and *'6.4 ML'*
- *'7.3'* and *'7.3 ML'*
- *'7.5'* and *'7.5 ML'*
- *'8.0'* and *'8.0 ML'*
- *'8.1'* and *'8.1 ML'*
Newer runtimes might work, but they are subject to your own tests.
## 4 Running with docker
A Docker image ready for working with *Jupyterlab Integration* is available from Docker Hub. It is recommended to prepare your environment by pulling the image: `docker pull bwalter42/databrickslabs_jupyterlab:2.2.0-rc4`
There are two scripts in the folder `docker`:
The prefix `(db-jlab)$` for all command examples in this document assumes that the conda environment `db-jlab` is activated.
It comes with a batch file `dj.bat` for Windows. On macOS or Linux, both `dj` and `databrickslabs-jupyterlab` exist.
## 6 Getting started with local installation or docker
Ensure that SSH access is correctly configured; see [Configure SSH access](docs/ssh-configurations.md).
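For reference, a minimal `~/.ssh/config` entry for an AWS Databricks cluster could look like the sketch below. The host alias, user and key path are illustrative assumptions; the hostname and port follow the standard AWS Databricks pattern (SSH served on port 2200):

```
Host my-cluster                        # hypothetical alias
    HostName ec2-11-22-33-44.eu-central-1.compute.amazonaws.com
    Port 2200
    User ubuntu                        # assumption: SSH user on the driver node
    IdentityFile ~/.ssh/id_my-cluster  # hypothetical key path
```

With such an entry in place, `ssh my-cluster` should reach the cluster's driver node, which is what *Jupyterlab Integration* relies on.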