Commit 72a592a
RHAIENG-948: fix(jupyter datascience on IBM/Power): build failure due to AIPCC base (missing curl and openssl -devel packages) and AIPCC wheels (some prereqs are missing) (opendatahub-io#2678)
* RHAIENG-948: fix(ppc): build failure due to missing curl

  Add `libcurl-devel` to the build dependencies for `ppc64le` and `s390x` in the pyarrow and onnx builds.

  ```
  -- Building AWS C++ SDK from source
  CMake Error at /usr/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
    Could NOT find CURL (missing: CURL_LIBRARY CURL_INCLUDE_DIR)
  ```

* RHAIENG-948: fix(ppc): build failure due to missing openssl

  Add `openssl-devel` to the build dependencies for `ppc64le` and `s390x` in the pyarrow and onnx builds.

  ```
  -- Configuring done (1.8s)
  -- Generating done (0.1s)
  CMake Error at cmake_modules/ThirdpartyToolchain.cmake:5282 (set_property):
    The link interface of target "AWS::aws-c-cal" contains:

      OpenSSL::Crypto

    but the target was not found.  Possible reasons include:

      * There is a typo in the target name.
      * A find_package call is missing for an IMPORTED target.
      * An ALIAS target is missing.
  ```

* RHAIENG-948: fix(ppc): build failure due to missing cython in aipcc wheel index

  ```
  + pip install --no-cache-dir -r requirements-build.txt
  Looking in indexes: https://console.redhat.com/api/pypi/public-rhai/rhoai/3.0/cpu-ubi9/simple/
  Ignoring oldest-supported-numpy: markers 'python_version < "3.9"' don't match your environment
  ERROR: Could not find a version that satisfies the requirement cython>=0.29.31 (from versions: none)
  ERROR: No matching distribution found for cython>=0.29.31
  ```

* RHAIENG-948: fix(ppc): build failure due to missing numpy in aipcc wheel index

  ```
  + pip install --no-cache-dir -r requirements.txt
  Looking in indexes: https://console.redhat.com/api/pypi/public-rhai/rhoai/3.0/cpu-ubi9/simple/
  ERROR: Could not find a version that satisfies the requirement numpy>=1.22 (from versions: none)
  ERROR: No matching distribution found for numpy>=1.22
  subprocess exited with status 1
  ```

* RHAIENG-948: fix(ppc): build failure due to missing protobuf>=4.25.1 in aipcc wheel index

  ```
  + pip wheel . -w /root/onnx_wheel
  ...
  ERROR: Could not find a version that satisfies the requirement protobuf>=4.25.1 (from versions: none)
  ERROR: No matching distribution found for protobuf>=4.25.1
  [end of output]
  ```

* RHAIENG-948: fix(ppc): build failure due to missing numpy>=1.16.6 in aipcc wheel index

  ```
  + pip install --no-cache-dir /tmp/wheels/pyarrow-17.0.0-cp312-cp312-linux_ppc64le.whl
  Looking in indexes: https://console.redhat.com/api/pypi/public-rhai/rhoai/3.0/cpu-ubi9/simple/
  Processing /tmp/wheels/pyarrow-17.0.0-cp312-cp312-linux_ppc64le.whl
  INFO: pip is looking at multiple versions of pyarrow to determine which version is compatible with other requirements. This could take a while.
  ERROR: Could not find a version that satisfies the requirement numpy>=1.16.6 (from pyarrow) (from versions: none)
  ERROR: No matching distribution found for numpy>=1.16.6
  ```

* RHOAIENG-32541: chore(jupyter/datascience): add `subscription-manager refresh` step to CPU Dockerfile for base image

  https://issues.redhat.com/browse/RHOAIENG-32541

* RHAIENG-948: fix(ppc): build failure due to missing cryptography build dependencies in aipcc wheel index

  ```
  × Failed to download and build `cryptography==43.0.3`
  ├─▶ Failed to resolve requirements from `build-system.requires`
  ├─▶ No solution found when resolving: `maturin>=1, <2`, `cffi>=1.12
  │   ; platform_python_implementation != 'PyPy'`, `setuptools!=74.0.0,
  │   !=74.1.0, !=74.1.1, !=74.1.2, !=74.1.3, !=75.0.0, !=75.1.0, !=75.2.0`
  ╰─▶ Because only maturin==1.9.6 is available and maturin==1.9.6 has no wheels
      with a matching platform tag (e.g., `manylinux_2_34_ppc64le`), we can
      conclude that maturin==1.9.6 cannot be used.
      And because you require maturin==1.9.6, we can conclude that your
      requirements are unsatisfiable.

  hint: Wheels are available for `maturin` (v1.9.6) on the following platforms:
  `linux_aarch64`, `linux_x86_64`
  ```

* RHAIENG-948: fix(ppc): build failure due to missing zlib-devel to build pillow

  ```
  line 20, in run_setup
    return super().run_setup(setup_script)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/.tmpoAXecA/builds-v0/.tmpeN0QF3/lib64/python3.12/site-packages/setuptools/build_meta.py", line 317, in run_setup
    exec(code, locals())
  File "<string>", line 1108, in <module>
  RequiredDependencyException: The headers or library files could not be found for zlib,
  a required dependency when compiling Pillow from source.
  ```

* RHAIENG-948: fix(ppc): build failure due to missing libjpeg-turbo-devel to build pillow

  ```
  The headers or library files could not be found for jpeg,
  a required dependency when compiling Pillow from source.
  Please see the install instructions at:
  https://pillow.readthedocs.io/en/latest/installation/basic-installation.html
  ```

* RHAIENG-948: fix(ppc): build failure due to missing openssl-devel to build maturin

  ```
  Could not find openssl via pkg-config:
  pkg-config exited with status code 1
  > PKG_CONFIG_PATH=/usr/local/lib/pkgconfig/ PKG_CONFIG_ALLOW_SYSTEM_CFLAGS=1 pkg-config --libs --cflags openssl

  The system library `openssl` required by crate `openssl-sys` was not found.
  The file `openssl.pc` needs to be installed and the PKG_CONFIG_PATH environment
  variable must contain its parent directory.
  PKG_CONFIG_PATH contains the following:
      - /usr/local/lib/pkgconfig/

  HINT: you may need to install a package such as openssl, openssl-dev or openssl-devel.
  ...
  Could not find directory of OpenSSL installation, and this `-sys` crate cannot
  proceed without this knowledge. If OpenSSL is installed and this crate had
  trouble finding it, you can set the `OPENSSL_DIR` environment variable for the
  compilation process. Make sure you also have the development packages of
  openssl installed. For example, `libssl-dev` on Ubuntu or `openssl-devel` on Fedora.
  ...
  warning: build failed, waiting for other jobs to finish...
  💥 maturin failed
    Caused by: Failed to build a native library through cargo
    Caused by: Cargo build finished with "exit status:
  ```

* RHAIENG-948: fix(ppc): build failure due to missing fortran to build scikit-learn

  ```
  × Failed to download and build `scikit-learn==1.7.2`
  ├─▶ Failed to install requirements from `build-system.requires`
  ├─▶ Failed to build `scipy==1.16.3`
  ├─▶ The build backend returned an error
  ╰─▶ Call to `mesonpy.build_wheel` failed (exit status: 1)

  [stdout]
  + meson setup ...
  ../meson.build:88:0: ERROR: Unknown compiler(s): [['gfortran'], ['flang-new'], ['flang'], ['nvfortran'], ['pgfortran'], ['ifort'], ['ifx'], ['g95']]
  The following exception(s) were encountered:
  Running `gfortran --help` gave "[Errno 2] No such file or directory: 'gfortran'"
  ```
1 parent e9c915d commit 72a592a
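Most of the fixes above follow one pattern: a requirement that exists on PyPI is absent from the AIPCC index, so `--extra-index-url https://pypi.org/simple` is added as a fallback. The failure mode can be reproduced offline by pointing pip at an empty local directory in place of the limited index; `nonexistent-pkg-xyz` below is a deliberately made-up package name:

```shell
#!/usr/bin/env bash
set -Eeuo pipefail

# Simulate a wheel index that lacks the requested package by using an
# empty directory as the only source; no network access is needed.
empty_index=$(mktemp -d)
out=$(python3 -m pip install --no-cache-dir --no-index \
      --find-links "$empty_index" 'nonexistent-pkg-xyz>=1.0' 2>&1 || true)
if grep -q 'No matching distribution found' <<<"$out"; then
  echo "resolver failed as expected; a fallback --extra-index-url would fix this"
fi
```

Note that `--extra-index-url` adds a second index alongside the configured one, whereas `--index-url` would replace it; the commit uses the former so AIPCC wheels are still preferred when present.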

File tree: 1 file changed (+46, −6)

jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu

```diff
@@ -40,6 +40,13 @@ EOF
 # cpu-base #
 ####################
 FROM ${BASE_IMAGE} AS cpu-base
+USER 0
+RUN /bin/bash <<'EOF'
+set -Eeuxo pipefail
+if command -v subscription-manager &> /dev/null; then
+    subscription-manager identity &>/dev/null && subscription-manager refresh || echo "Not registered, skipping refresh."
+fi
+EOF
 
 WORKDIR /opt/app-root/bin
 
```
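The `A && B || C` chain in the added refresh step is a common shorthand, but it is not a true if/else: `C` also runs when `B` fails. A minimal runnable sketch of the same guard, with `subscription-manager` replaced by a hypothetical stub function so it works on any host:

```shell
#!/usr/bin/env bash
set -Eeuo pipefail

# Hypothetical stub standing in for the real RHEL subscription-manager:
# "identity" fails, simulating an unregistered host.
subscription-manager() {
  if [ "$1" = "identity" ]; then return 1; fi
  echo "refreshed"
}

# Same guard pattern as the Dockerfile step: only refresh when the tool
# exists and the host is registered.
if command -v subscription-manager &> /dev/null; then
  subscription-manager identity &>/dev/null && subscription-manager refresh || echo "Not registered, skipping refresh."
fi
# → Not registered, skipping refresh.
```

The caveat is that if `refresh` itself failed, the `|| echo` branch would also fire and mask the error; that is acceptable here because the refresh is best-effort.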
```diff
@@ -144,7 +151,13 @@ RUN --mount=type=cache,target=/root/.cache/pip \
 set -Eeuxo pipefail
 if [ "$TARGETARCH" = "ppc64le" ] || [ "$TARGETARCH" = "s390x" ]; then
 # Install build dependencies (shared for pyarrow and onnx)
-dnf install -y cmake make gcc-c++ pybind11-devel wget
+packages=(
+cmake make gcc-c++ pybind11-devel wget
+# pyarrow -DARROW_S3=ON pulls in AWS C++ SDK
+# libcurl-devel is required to build AWS C++ SDK
+libcurl-devel openssl-devel
+)
+dnf install -y "${packages[@]}"
 dnf clean all
 # Build and collect pyarrow wheel
 git clone --depth 1 --branch "apache-arrow-17.0.0" https://github.com/apache/arrow.git
```
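The `packages=( ... )` replacement relies on a bash feature worth noting: comment lines are legal inside an array literal, so each group of dependencies can be annotated in place. A small runnable sketch of the same pattern:

```shell
#!/usr/bin/env bash
set -Eeuo pipefail

packages=(
  cmake make gcc-c++ pybind11-devel wget
  # pyarrow's -DARROW_S3=ON pulls in the AWS C++ SDK,
  # which needs libcurl and OpenSSL headers to build
  libcurl-devel openssl-devel
)

# The real build step then runs: dnf install -y "${packages[@]}"
echo "count=${#packages[@]}"   # → count=7
printf '%s\n' "${packages[@]}"
```

The quoted `"${packages[@]}"` expansion passes each element as a separate argument, which is why the list can be split across lines and commented without changing what `dnf` sees.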
```diff
@@ -173,7 +186,8 @@ if [ "$TARGETARCH" = "ppc64le" ] || [ "$TARGETARCH" = "s390x" ]; then
 make -j$(nproc) VERBOSE=1
 make install -j$(nproc)
 cd ../../python
-pip install --no-cache-dir -r requirements-build.txt
+# aipcc index is missing cython, and maybe more
+pip install --no-cache-dir --extra-index-url https://pypi.org/simple -r requirements-build.txt
 PYARROW_WITH_PARQUET=1 \
 PYARROW_WITH_DATASET=1 \
 PYARROW_WITH_FILESYSTEM=1 \
```
```diff
@@ -224,10 +238,12 @@ if [ "${TARGETARCH}" = "ppc64le" ]; then
 cd onnx
 git checkout ${ONNX_VERSION}
 git submodule update --init --recursive
-pip install --no-cache-dir -r requirements.txt
+# aipcc index does not have numpy>=1.22
+pip install --no-cache-dir --extra-index-url https://pypi.org/simple -r requirements.txt
 CMAKE_ARGS="-DPython3_EXECUTABLE=$(which python3.12)"
 export CMAKE_ARGS
-pip wheel . -w /root/onnx_wheel
+# protobuf>=4.25.1
+pip wheel --extra-index-url https://pypi.org/simple . -w /root/onnx_wheel
 else
 echo "Skipping ONNX build on non-Power"
 mkdir -p /root/onnx_wheel
```
```diff
@@ -328,7 +344,8 @@ COPY --from=pyarrow-builder /tmp/wheels /tmp/wheels
 RUN /bin/bash <<'EOF'
 set -Eeuxo pipefail
 if [ "$TARGETARCH" = "ppc64le" ] || [ "$TARGETARCH" = "s390x" ]; then
-pip install --no-cache-dir /tmp/wheels/*.whl
+# aipcc is lacking numpy>=1.16.6
+pip install --no-cache-dir --extra-index-url https://pypi.org/simple /tmp/wheels/*.whl
 else
 echo "Skipping wheel install for $TARGETARCH"
 fi
```
```diff
@@ -342,7 +359,8 @@ COPY --from=onnx-builder /root/onnx_wheel/ /onnxwheels/
 RUN /bin/bash <<'EOF'
 set -Eeuxo pipefail
 if [ "${TARGETARCH}" = "ppc64le" ]; then
-pip install --no-cache-dir /onnxwheels/*.whl
+# aipcc is sure to lack something, so add extra index preemptively
+pip install --no-cache-dir --extra-index-url https://pypi.org/simple /onnxwheels/*.whl
 else
 echo "Skipping ONNX/OpenBLAS install on non-Power"
 fi
```
```diff
@@ -368,6 +386,26 @@ else
 fi
 EOF
 
+RUN /bin/bash <<'EOF'
+set -Eeuxo pipefail
+if [ "${TARGETARCH}" = "ppc64le" ]; then
+packages=(
+# required to compile pillow
+zlib-devel libjpeg-turbo-devel
+# optional pillow deps https://pillow.readthedocs.io/en/latest/installation/building-from-source.html#external-libraries
+#libtiff-devel libwebp-devel openjpeg2-devel lcms2-devel freetype-devel
+#libimagequant-devel harfbuzz-devel fribidi-devel
+
+# required to compile maturin
+openssl-devel
+
+# required to compile scikit-learn
+gcc-gfortran
+)
+dnf install -y "${packages[@]}"
+fi
+EOF
+
 USER 1001:0
 
 # Install Python packages and Jupyterlab extensions from pylock.toml
```
```diff
@@ -382,10 +420,12 @@ echo "Installing software and packages"
 # we often don't know the correct hashes and `--require-hashes` would therefore fail on non amd64, where building is common.
 if [ "$TARGETARCH" = "ppc64le" ] || [ "$TARGETARCH" = "s390x" ]; then
 # We need special flags and environment variables when building packages
+# aipcc does not have some dependencies to build cryptography==43.0.3 on ppc64le, so we need to use pypi.org/simple
 GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1 \
 CFLAGS="-O3" CXXFLAGS="-O3" \
 uv pip install --strict --no-deps --no-cache --no-config --no-progress \
 --verify-hashes --compile-bytecode --index-strategy=unsafe-best-match \
+--extra-index-url https://pypi.org/simple \
 --requirements=./pylock.toml
 else
 # This may have to download and compile some dependencies, and as we don't lock requirements from `build-system.requires`,
```
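Two details in this last hunk are easy to misread. First, `--index-strategy=unsafe-best-match` tells uv to consider candidates from all configured indexes rather than stopping at the first index that carries a package, which is what makes the added `--extra-index-url` effective here. Second, the `GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1` and `CFLAGS` prefixes scope those variables to the single `uv pip install` invocation. A runnable sketch of the env-prefix behavior (reusing `CFLAGS` from the diff):

```shell
#!/usr/bin/env bash
set -Eeuo pipefail

# A VAR=value prefix exports the variable to that one command only;
# the parent shell never sees it.
CFLAGS="-O3" bash -c 'echo "child sees CFLAGS=$CFLAGS"'
echo "parent sees CFLAGS=${CFLAGS:-<unset>}"
# → child sees CFLAGS=-O3
# → parent sees CFLAGS=<unset>
```

This is why the line-continuation layout matters: the prefixes and the `uv pip install` command must form one logical command line for the variables to apply to the build.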
