This repository was archived by the owner on Jun 28, 2022. It is now read-only.

Commit 1abf394

Latest release: 2.2.1
1 parent e7767ce commit 1abf394

17 files changed (+1132 −369 lines)

.bumpversion.cfg

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 2.2.0
+current_version = 2.2.1
 commit = False
 tag = False
 parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)\-{0,1}(?P<release>\D*)(?P<build>\d*)
```

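The `parse` pattern above is how bumpversion recognises the current version string before rewriting it. A small runnable sketch of what the named groups capture (plain `re` usage for illustration; bumpversion applies the pattern internally):

```python
import re

# The parse pattern from .bumpversion.cfg above.
PARSE = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)\-{0,1}(?P<release>\D*)(?P<build>\d*)"

print(re.match(PARSE, "2.2.1").groupdict())
# -> {'major': '2', 'minor': '2', 'patch': '1', 'release': '', 'build': ''}

print(re.match(PARSE, "2.2.1-rc1").groupdict())
# -> {'major': '2', 'minor': '2', 'patch': '1', 'release': 'rc', 'build': '1'}
```

With `commit = False` and `tag = False`, bumpversion only rewrites the configured files; the release commit itself (this one) is made manually.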
HISTORY.md

Lines changed: 5 additions & 0 deletions
```diff
@@ -1,3 +1,8 @@
+### V2.2.1 (June 2021)
+
+- **Support of Databricks Runtimes 6.4(ESR) and 7.3, 7.6, 8.0, 8.1, 8.2, 8.3 (both standard and ML)**
+- Security fixes for the Javascript Jupyterlab extension of ssh_ipykernel and databrickslabs-jupyterlab
+
 ### V2.2.0 (Apr 2021)
 
 - **Jupyter Lab support**
```

README.md

Lines changed: 10 additions & 5 deletions
````diff
@@ -2,6 +2,11 @@
 
 This package allows to connect to a remote Databricks cluster from a locally running JupyterLab.
 
+## >>> New minor release V2.2.1 (June 2021) <<<
+
+- Support of Databricks Runtimes 6.4(ESR) and 7.3, 7.6, 8.0, 8.1, 8.2, 8.3 (both standard and ML)
+- Upgrade to ssh_ipykernel 1.2.3 (security fixes for the Javascript Jupyterlab extension of ssh_ipykernel)
+- Security fixes for the Javascript Jupyterlab extension of databrickslabs-jupyterlab
 
 ## >>> New minor release V2.2 (Apr 2021) <<<
 
@@ -14,7 +19,7 @@ This package allows to connect to a remote Databricks cluster from a locally run
 ```bash
 (base)$ conda create -n dj python=3.8 # you might need to add "pywin32" if you are on Windows
 (base)$ conda activate dj
-(dj)$ pip install --upgrade databrickslabs-jupyterlab[cli]==2.2.0
+(dj)$ pip install --upgrade databrickslabs-jupyterlab[cli]==2.2.1
 ```
 
 The following packages get installed:
@@ -83,7 +88,7 @@ This package allows to connect to a remote Databricks cluster from a locally run
 
 ## 4 Running with docker
 
-A docker image ready for working with *Jupyterlab Integration* is available from Dockerhub. It is recommended to prepare your environment by pulling the repository: `docker pull bwalter42/databrickslabs_jupyterlab:2.2.0`
+A docker image ready for working with *Jupyterlab Integration* is available from Dockerhub. It is recommended to prepare your environment by pulling the repository: `docker pull bwalter42/databrickslabs_jupyterlab:2.2.1`
 
 There are two scripts in the folder `docker`:
 
@@ -103,7 +108,7 @@ Alternatively, under macOS and Linux one can use the following bash functions:
         -v $HOME/.ssh/:/home/dbuser/.ssh \
         -v $HOME/.databrickscfg:/home/dbuser/.databrickscfg \
         -v $(pwd):/home/dbuser/notebooks \
-        bwalter42/databrickslabs_jupyterlab:2.2.0 /opt/conda/bin/databrickslabs-jupyterlab $@
+        bwalter42/databrickslabs_jupyterlab:2.2.1 /opt/conda/bin/databrickslabs-jupyterlab $@
 }
 ```
 
@@ -118,7 +123,7 @@ Alternatively, under macOS and Linux one can use the following bash functions:
         -v $HOME/.ssh/:/home/dbuser/.ssh \
         -v $HOME/.databrickscfg:/home/dbuser/.databrickscfg \
         -v $(pwd):/home/dbuser/notebooks \
-        bwalter42/databrickslabs_jupyterlab:2.2.0 /opt/conda/bin/jupyter $@
+        bwalter42/databrickslabs_jupyterlab:2.2.1 /opt/conda/bin/jupyter $@
 }
 ```
 
@@ -166,7 +171,7 @@ in both commands.
 ```bash
 (base)$ conda create -n dj python=3.8 # you might need to add "pywin32" if you are on Windows
 (base)$ conda activate dj
-(dj)$ pip install --upgrade databrickslabs-jupyterlab[cli]==2.2.0
+(dj)$ pip install --upgrade databrickslabs-jupyterlab[cli]==2.2.1
 ```
 
 The prefix `(db-jlab)$` for all command examples in this document assumes that the conda enviromnent `db-jlab` is activated.
````

databrickslabs_jupyterlab/_version.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -12,5 +12,5 @@ def get_version(version):
     return VersionInfo(major, minor, patch, release, build)
 
 
-__version__ = "2.2.0"  # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
+__version__ = "2.2.1"  # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
 __version_info__ = get_version(__version__)
```

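For context, a hypothetical reconstruction of what `get_version` presumably does, consistent with the `return VersionInfo(...)` line shown above and with the bumpversion parse pattern from `.bumpversion.cfg`; the actual implementation lives earlier in `_version.py` and may differ:

```python
import re
from collections import namedtuple

VersionInfo = namedtuple("VersionInfo", ["major", "minor", "patch", "release", "build"])

# Same pattern as in .bumpversion.cfg; assumed here, not shown in the hunk above.
PARSE = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)\-{0,1}(?P<release>\D*)(?P<build>\d*)"


def get_version(version):
    """Split a version string into its named components."""
    major, minor, patch, release, build = re.match(PARSE, version).groups()
    return VersionInfo(major, minor, patch, release, build)


print(get_version("2.2.1"))
# VersionInfo(major='2', minor='2', patch='1', release='', build='')
```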
databrickslabs_jupyterlab/connect.py

Lines changed: 28 additions & 6 deletions
```diff
@@ -26,12 +26,31 @@
 
 from pyspark.conf import SparkConf  # pylint: disable=import-error,wrong-import-position
 
+NEW_DBUTILS = False
+
 with warnings.catch_warnings():
     warnings.simplefilter("ignore")
     # Suppress py4j loading message on stderr by redirecting sys.stderr
     stderr_orig = sys.stderr
     sys.stderr = io.StringIO()
-    from PythonShell import get_existing_gateway, RemoteContext  # pylint: disable=import-error
+    try:
+        # Up to DBR 7.x
+        from PythonShell import get_existing_gateway, RemoteContext  # pylint: disable=import-error,wrong-import-position
+    except:
+        # Above DBR 8.0
+        sys.path.insert(0, "/databricks/python_shell")
+        sys.path.insert(0, "/databricks/python_shell/scripts/")
+
+        from dbruntime.spark_connection import RemoteContext, get_existing_gateway  # pylint: disable=import-error,wrong-import-position
+
+    try:
+        # up to DBR 8.2
+        from dbutils import DBUtils  # pylint: disable=import-error,wrong-import-position
+        NEW_DBUTILS = False
+    except:
+        # above DBR 8.3
+        from dbruntime.dbutils import DBUtils  # pylint: disable=import-error,wrong-import-position
+        NEW_DBUTILS = True
 
 out = sys.stderr.getvalue()
 # Restore sys.stderr
@@ -40,8 +59,6 @@
 if not "py4j imported" in out:
     print(out, file=sys.stderr)
 
-from dbutils import DBUtils  # pylint: disable=import-error,wrong-import-position
-
 
 class JobInfo:
     """Job info class for Spark jobs
@@ -109,8 +126,13 @@ def new_group_id(self):
 
 
 class DbjlUtils:
-    def __init__(self, shell, entry_point):
-        self._dbutils = DBUtils(shell, entry_point)
+    def __init__(self, shell, entry_point, sc, sqlContext, displayHTML):
+        # ugly, but not possible to differentiate <= 8.2 from >= 8.3
+        try:
+            self._dbutils = DBUtils(shell, entry_point)
+        except:
+            self._dbutils = DBUtils(shell, entry_point, sc, sqlContext, displayHTML)
+
         self.fs = self._dbutils.fs
         self.secrets = self._dbutils.secrets
         self.notebook = Notebook()
@@ -464,7 +486,7 @@ def get_config(self):  # pylint: disable=unused-argument
 
     # Initialize dbutils
     #
-    dbutils = DbjlUtils(shell, entry_point)
+    dbutils = DbjlUtils(shell, entry_point, sc, sqlContext, shell.displayHTML)
 
     # Setting up Spark progress bar
     #
```

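The substance of the connect.py change is two compatibility tricks: choose the import location by trying the pre-DBR-8 module path first and falling back to the `dbruntime` layout, and choose the `DBUtils` constructor signature by trying the two-argument form first. A self-contained sketch of the pattern (the `DBUtils` stand-in class and the `make_dbutils` helper below are illustrative only, and the real code uses bare `except` rather than the narrower exceptions shown here):

```python
# 1) Import fallback: prefer the old location, fall back to the new one and
#    remember which variant was found (connect.py sets NEW_DBUTILS the same way).
try:
    from dbutils import DBUtils  # resolvable on a Databricks driver up to DBR 8.2
    NEW_DBUTILS = False
except ImportError:
    # The real code falls back to `from dbruntime.dbutils import DBUtils` here;
    # a stand-in class keeps this sketch runnable outside a Databricks cluster.
    class DBUtils:
        def __init__(self, shell, entry_point, sc=None, sqlContext=None, displayHTML=None):
            self.args = (shell, entry_point, sc, sqlContext, displayHTML)

    NEW_DBUTILS = True


# 2) Constructor fallback: DBUtils takes two arguments up to DBR 8.2 and five
#    from DBR 8.3 on, so DbjlUtils.__init__ simply tries the short form first.
def make_dbutils(shell, entry_point, sc, sqlContext, displayHTML):
    try:
        return DBUtils(shell, entry_point)
    except TypeError:
        return DBUtils(shell, entry_point, sc, sqlContext, displayHTML)


print(NEW_DBUTILS, make_dbutils("shell", "entry_point", "sc", "sqlContext", "displayHTML"))
```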
databrickslabs_jupyterlab_status/.bumpversion.cfg

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 2.2.0
+current_version = 2.2.1
 commit = False
 tag = False
 parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)\-{0,1}(?P<release>\D*)(?P<build>\d*)
```
