RHAIENG-948, RHAIENG-2042: chore(build-args): odh base image references to use c9s images instead of ubi9 #2687
Conversation
Walkthrough

This pull request updates Docker build configurations across multiple container images to migrate from UBI9 base images to custom C9S-based images and removes explicit UBI repository injection stages from Dockerfiles. Updates also include test file adjustments for file permission assertions.

Changes
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~15 minutes
Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 1
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (14)
- codeserver/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/pytorch/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/tensorflow/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (36)
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
🔇 Additional comments (13)
runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1)
1-1: LGTM: CPU c9s base image reference is consistent.

Name, Python version, and c9s qualifier all look correct for the minimal runtime CPU base.

jupyter/minimal/ubi9-python-3.12/build-args/rocm.conf (1)

1-1: LGTM: ROCm c9s base image and version tag look correct.

ROCm, Python 3.12, c9s, and the v6.3 tag align with the PR's migration pattern.

jupyter/minimal/ubi9-python-3.12/build-args/cuda.conf (1)

1-1: LGTM: CUDA c9s base image reference is correct.

CUDA, Python 3.12, c9s, and the v12.8 tag are all consistent with the intended base-image family.

runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1)

1-1: LGTM: Datascience CPU runtime now aligned on c9s base.

Uses the same cpu-py312-c9s:latest image as other CPU configs, which keeps runtimes consistent.

jupyter/rocm/pytorch/ubi9-python-3.12/build-args/rocm.conf (1)

1-1: LGTM: Jupyter ROCm PyTorch correctly switched to c9s base.

The ROCm c9s image and v6.3 tag match the pattern used in other ROCm build-args.

jupyter/tensorflow/ubi9-python-3.12/build-args/cuda.conf (1)

1-1: LGTM: TensorFlow CUDA build now on shared c9s base.

The CUDA py312 c9s v12.8 image aligns TensorFlow with the other CUDA-based images.

jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1)

1-1: LGTM: TrustyAI CPU config aligned to common c9s CPU base.

Uses the shared cpu-py312-c9s:latest image, which simplifies maintenance across CPU images.

jupyter/pytorch/ubi9-python-3.12/build-args/cuda.conf (1)

1-1: LGTM: PyTorch CUDA build-args correctly migrated to c9s.

The CUDA py312 c9s v12.8 image is consistent with the other CUDA-based Jupyter images.

jupyter/pytorch+llmcompressor/ubi9-python-3.12/build-args/cuda.conf (1)

1-1: LGTM!

The CUDA base image variant has been correctly updated from ubi9 to c9s while preserving the v12.8 version tag.

jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1)

1-2: LGTM!

The addition of the explanatory comment and the base image migration to c9s are both correct.

jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1)

1-1: LGTM!

The base image migration to c9s is correct and consistent with other CPU-based configurations in this PR.

runtimes/rocm-tensorflow/ubi9-python-3.12/build-args/rocm.conf (1)

1-1: LGTM!

The ROCm base image variant has been correctly updated from ubi9 to c9s while preserving the v6.3 version tag.

jupyter/rocm/tensorflow/ubi9-python-3.12/build-args/rocm.conf (1)

1-1: LGTM!

The ROCm base image migration to c9s is correct and consistent with the other ROCm configuration in this PR.
Actionable comments posted: 2
🧹 Nitpick comments (3)
jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm (1)
35-39: Update comment to clarify repository source after c9s migration.

Line 35 states "Inject the official UBI 9 repository configuration", but if `BASE_IMAGE` is being changed to c9s (per PR objectives), this comment becomes misleading. The configuration is still from UBI 9, but it is being used with a non-UBI (c9s) base image. Clarify that we're sourcing UBI repository configuration for use with the AIPCC c9s base image.

Suggested comment update:

```diff
-# Inject the official UBI 9 repository configuration into the AIPCC base image.
-# The Quay-based AIPCC image is "repo-less" by default (https://gitlab.com/redhat/rhel-ai/core/base-images/app#repositories), so dnf cannot upgrade or install packages.
-# By copying ubi.repo from the public UBI 9 image, we enable package management for upgrades and installations.
+# Inject the official UBI 9 repository configuration into the AIPCC c9s base image.
+# The Quay-based AIPCC image is "repo-less" by default (https://gitlab.com/redhat/rhel-ai/core/base-images/app#repositories), so dnf cannot upgrade or install packages.
+# By copying ubi.repo and GPG keys from the public UBI 9 image, we enable package management and verification for upgrades and installations.
```

jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (1)

35-39: Update comment to clarify repository source after c9s migration.

Same as the PyTorch variant: Line 35 states "Inject the official UBI 9 repository configuration", but if `BASE_IMAGE` is being changed to c9s (per PR objectives), this comment becomes misleading. Clarify that we're sourcing UBI repository configuration for use with the AIPCC c9s base image.

Suggested comment update (consistent with the PyTorch variant):

```diff
-# Inject the official UBI 9 repository configuration into the AIPCC base image.
-# The Quay-based AIPCC image is "repo-less" by default (https://gitlab.com/redhat/rhel-ai/core/base-images/app#repositories), so dnf cannot upgrade or install packages.
-# By copying ubi.repo from the public UBI 9 image, we enable package management for upgrades and installations.
+# Inject the official UBI 9 repository configuration into the AIPCC c9s base image.
+# The Quay-based AIPCC image is "repo-less" by default (https://gitlab.com/redhat/rhel-ai/core/base-images/app#repositories), so dnf cannot upgrade or install packages.
+# By copying ubi.repo and GPG keys from the public UBI 9 image, we enable package management and verification for upgrades and installations.
```

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1)

14-27: Remove redundant ARG TARGETARCH declaration.

Line 27 duplicates the ARG TARGETARCH declaration from line 14. While functionally harmless, this reduces clarity.

```
ARG TARGETARCH

WORKDIR /opt/app-root/bin
@@ -24,8 +24,6 @@
COPY --from=ubi-repos /etc/yum.repos.d/ubi.repo /etc/yum.repos.d/ubi.repo
COPY --from=ubi-repos /etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release /etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release

ARG TARGETARCH
```

Remove the second declaration at line 27.
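The injection pattern these nitpicks refer to can be sketched as a two-stage Dockerfile. The `ubi-repos` stage name, the two COPY destinations, and the `BASE_IMAGE` build arg all appear in the review text; everything else (the final dnf step, stage naming of `cpu-base`) is an illustrative assumption, not the project's exact Dockerfile:

```dockerfile
# Illustrative sketch of the repo-injection pattern, not the exact project Dockerfile.
ARG BASE_IMAGE

# Public UBI 9 image used only as a source of repo config and the release GPG key.
FROM registry.access.redhat.com/ubi9/ubi AS ubi-repos

# The AIPCC c9s base image is "repo-less": dnf has no repositories configured.
FROM ${BASE_IMAGE} AS cpu-base

# Copy the repo definition so dnf can resolve packages, and the GPG key so it
# can verify their signatures.
COPY --from=ubi-repos /etc/yum.repos.d/ubi.repo /etc/yum.repos.d/ubi.repo
COPY --from=ubi-repos /etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release /etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release

RUN dnf -y upgrade && dnf clean all
```

With both files in place, `dnf` inside the c9s-based stage can resolve UBI repositories and verify package signatures, which is what the comment-wording suggestions above describe.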
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (20)
- codeserver/ubi9-python-3.12/Dockerfile.cpu (2 hunks)
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cpu (1 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cuda (1 hunks)
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (49)
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
🔇 Additional comments (20)
jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm (1)
23-23: Approve: RPM-GPG-KEY addition for signature verification.

The added COPY instruction correctly brings the RPM-GPG-KEY from the ubi-repos stage, enabling package signature verification for dnf operations. This complements the existing ubi.repo copy (line 22) and aligns with the migration to c9s base images, where the target image is repo-less by default. The syntax and destination path follow the established pattern.

Verify that the destination path `/etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release` is consistent across all Dockerfiles updated in this PR, especially other ROCM, CUDA, and Jupyter variants mentioned in the summary.

runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (1)

23-23: Good addition for package signature verification.

The addition of the RPM-GPG-KEY-redhat-release COPY is appropriate and enables dnf to verify package signatures when the base image switches to c9s. This aligns with the pattern used across other Dockerfiles in the PR.

Verify that `dnf` is configured (or defaults) to use this GPG key for package signature verification during the `dnf upgrade` and `dnf install` commands at lines 39 and 48. If GPG verification requires explicit configuration, ensure it's in place.

jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm (2)

38-39: Approve: GPG key copy enables secure package verification.

Adding the RPM-GPG-KEY-redhat-release copy alongside the UBI repository config is the correct approach. This enables `dnf` to verify package signatures from UBI repositories even when running on a c9s base image (both are RHEL 9 / EL9 derivatives). The change is secure and necessary for the base image migration.

4-4: Verify that build-args files have been updated to use c9s BASE_IMAGE.

Ensure that the corresponding build-args files (referenced in the PR objectives as residing in `jupyter/rocm/pytorch/ubi9-python-3.12/build-args/`) have been updated to reference c9s variants of the ROCm base image instead of ubi9.

jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (2)

38-39: Approve: GPG key copy enables secure package verification.

Adding the RPM-GPG-KEY-redhat-release copy alongside the UBI repository config is the correct approach for the c9s base image migration. Both CentOS Stream 9 and RHEL 9 share the EL9 base, making UBI repo configurations and GPG keys compatible.

4-4: Verify that build-args files have been updated to use c9s BASE_IMAGE.

Ensure that the corresponding build-args files (referenced in the PR objectives) have been updated to reference c9s variants of the ROCm base image instead of ubi9.
runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (1)
23-23: GPG key provisioning placement is sound.

The addition of the RPM-GPG-KEY-redhat-release copy following the ubi.repo copy is correctly positioned to ensure signature verification is available for subsequent package operations.

jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu (1)

54-54: Proper positioning for signature verification.

The GPG key copy is correctly placed after repository configuration and before package upgrade operations.

runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (1)

25-25: GPG key copy correctly positioned in cuda-base stage.

The addition follows the pattern established across the repository and ensures signature verification is available for package operations.

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1)

25-25: GPG key copy is correctly positioned.

The addition follows the established pattern across the repository.

runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (1)

23-23: GPG key copy is correctly positioned before package operations.

The addition ensures signature verification is available during subsequent dnf operations.

jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (1)

23-23: GPG key provisioning is correctly positioned.

The copy follows the ubi.repo provisioning and precedes package upgrade operations.

jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda (1)

25-25: GPG key copy is correctly positioned in cuda-base stage.

The addition follows the established pattern and ensures signature verification availability.

jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda (1)

41-41: GPG key addition is correctly positioned.

The copy follows repository configuration and precedes package operations that require signature verification.

To ensure the changes are fully functional, please verify that the `ubi-repos` base image (registry.access.redhat.com/ubi9/ubi) contains the file `/etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release` at the specified path. This file should be present in standard UBI9 images, but it's worth confirming as part of the build validation.

runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1)
26-27: Consistent GPG key injection in cuda-base stage.

Same pattern as in the other files: the GPG key is properly positioned after repository configuration and before dnf operations.

jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (1)

40-41: GPG key properly injected into cuda-base stage.

Placement and pattern are consistent with the other Dockerfiles in the PR.

runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (1)

24-25: GPG key properly positioned in cuda-base stage.

Consistent with the pattern across all Dockerfiles in this PR.

codeserver/ubi9-python-3.12/Dockerfile.cpu (2)

24-25: GPG key injected into rpm-base stage for package verification.

Correctly positioned after repository configuration and before the `dnf` package fetch operation at line 36.

85-86: GPG key injected into cpu-base stage for package verification.

Correctly positioned after repository configuration and before the subsequent `dnf` operations at lines 102 and 111.

jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu (1)

63-64: GPG key path verified; the code change is correct.

The path `/etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release` is the standard location for Red Hat release signatures in RHEL 9 and UBI 9, and UBI 9 repository configurations use this same path for signature verification. The COPY instruction correctly ensures the dnf operations can verify package signatures in the cpu-base stage.
Actionable comments posted: 0
🧹 Nitpick comments (1)
tests/containers/base_image_test.py (1)
233-241: Consider removing unused UID data.

Since UID is no longer checked (lines 246-250), the following are now unused:

- the `expected_uid` variable (line 233)
- the fourth element (UID) in each list entry (lines 237-238, 241)

Apply this diff to clean up the unused data:

```diff
-    expected_uid = "1001"  # default
     expected_gid = "0"  # root
     # Directories to assert permissions and ownerships as we did in ODS-CI
     directories_to_check: list[list[str]] = [
-        [f"{app_root_path}/lib", "775", expected_gid, expected_uid],
+        [f"{app_root_path}/lib", "775", expected_gid],
     ]
     if not utils.is_rstudio_image(image):  # RStudio image doesn't have '/opt/app-root/share' directory
-        directories_to_check.append([f"{app_root_path}/share", "775", expected_gid, expected_uid])
+        directories_to_check.append([f"{app_root_path}/share", "775", expected_gid])
```
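The trimmed check can be sketched in isolation. The `stat --format='%a:%g'` invocation and the `[path, permissions, gid]` entries come from the test discussed above; the `matches_expected` helper and the hardcoded sample output are hypothetical, for illustration only:

```python
def matches_expected(stat_output: str, expected_mode: str, expected_gid: str) -> bool:
    """Parse `stat --format='%a:%g'` output and compare mode and group id.

    The format string is wrapped in single quotes, so the captured output looks
    like "'775:0'"; strip the quotes before splitting on the colon.
    """
    mode, gid = stat_output.strip().strip("'").split(":")
    return mode == expected_mode and gid == expected_gid


# Mimics the entries in directories_to_check after the suggested cleanup:
# [path, permissions, gid], with no UID element.
checks = [
    ["/opt/app-root/lib", "775", "0"],
    ["/opt/app-root/share", "775", "0"],
]
for path, mode, gid in checks:
    # In the real test, container.exec(["stat", "--format='%a:%g'", path])
    # produces this string; here it is hardcoded for illustration.
    output = "'775:0'"
    assert matches_expected(output, mode, gid), path
```

Because the UID field is simply never requested from `stat`, the check stays stable on OpenShift, where the UID is randomized but the GID is not.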
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (21)
- codeserver/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cpu (0 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- tests/containers/base_image_test.py (2 hunks)
💤 Files with no reviewable changes (20)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm
- codeserver/ubi9-python-3.12/Dockerfile.cpu
- rstudio/rhel9-python-3.12/Dockerfile.cpu
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm
- rstudio/rhel9-python-3.12/Dockerfile.cuda
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
🔇 Additional comments (2)
tests/containers/base_image_test.py (2)
236-236: LGTM! Type annotation corrected.

The type annotation fix from `list[str]` to `list[list[str]]` correctly reflects the actual data structure, which is a list of lists containing path, permissions, gid, and uid values.

246-250: LGTM! UID check removal aligns with OpenShift security model.

The change to ignore UID and check only permissions and GID is appropriate for OpenShift environments, where containers run with random UIDs (for security) but deterministic GIDs. This ensures the test remains stable across different OpenShift deployments.
Well, not a huge problem; we can start from centos image and install python ourselves, by simply stealing from either sclorg or aipcc. Either way, it's maybe for the best, because that way we can only install what AIPCC preinstalls and nothing extra. That's good for fast(er) feedback for RHDS work. |
Actionable comments posted: 0
♻️ Duplicate comments (3)
runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1)
1-1: Same `:latest` tag issue applies here.

This file has the same critical blocker as flagged in `codeserver/ubi9-python-3.12/build-args/cpu.conf`: the `:latest` tag doesn't exist during PR testing (only `:on-pr-{{revision}}` tags are created). Builds will fail when attempting to pull this base image during PR validation.

Ensure the same resolution strategy applies to all affected config files.

codeserver/ubi9-python-3.12/build-args/cpu.conf (1)

1-1: Critical `:latest` tag issue still present.

The previously flagged critical blocker remains unresolved: this `:latest` tag doesn't exist during PR testing and will cause build failures. Confirm the resolution strategy with maintainers before merge.

jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1)

1-1: Same `:latest` tag issue applies here.

This file has the same critical blocker as flagged in `codeserver/ubi9-python-3.12/build-args/cpu.conf`: the `:latest` tag doesn't exist during PR testing (only `:on-pr-{{revision}}` tags are created). Builds will fail when attempting to pull this base image during PR validation.

Ensure the same resolution strategy applies to all affected config files.
🧹 Nitpick comments (1)
tests/containers/base_image_test.py (1)
246-250: Clarify the comment wording and consider removing unused UID data. The comment on line 246 is confusing. It states "it's the gid that is nonrandom on openshift," but if gid is predictable (nonrandom), checking it makes sense. I believe you mean "uid is random/assigned by OpenShift, but gid is predictable," which is why you're removing UID from the check.

Additionally, `expected_uid` (line 233) is still defined and added to the data structure (lines 237, 241) but is no longer used in assertions. Consider either removing it from the data structure or adding a comment explaining why it's retained.

Apply this diff to clarify the comment:

```diff
- # ignore `:%u`, it does not matter what the uid is, it's the gid that is nonrandom on openshift
+ # ignore `:%u` - UID is assigned by OpenShift and varies; only GID should be validated
  _, output = container.exec(["stat", "--format='%a:%g'", f"{item[0]}"])
```

Also, please verify that skipping UID validation is appropriate for c9s base images in OpenShift environments. If UID is truly unpredictable in the new image context, consider documenting this behavior change or removing `expected_uid` from the data structure entirely.
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (35)
- codeserver/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- codeserver/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/pytorch/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/tensorflow/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cpu (0 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- tests/containers/base_image_test.py (2 hunks)
💤 Files with no reviewable changes (20)
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm
- rstudio/rhel9-python-3.12/Dockerfile.cpu
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- rstudio/rhel9-python-3.12/Dockerfile.cuda
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm
- codeserver/ubi9-python-3.12/Dockerfile.cpu
🚧 Files skipped from review as they are similar to previous changes (11)
- jupyter/rocm/tensorflow/ubi9-python-3.12/build-args/rocm.conf
- jupyter/tensorflow/ubi9-python-3.12/build-args/cuda.conf
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf
- jupyter/minimal/ubi9-python-3.12/build-args/cuda.conf
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf
- runtimes/rocm-tensorflow/ubi9-python-3.12/build-args/rocm.conf
- jupyter/rocm/pytorch/ubi9-python-3.12/build-args/rocm.conf
- jupyter/minimal/ubi9-python-3.12/build-args/rocm.conf
- jupyter/pytorch/ubi9-python-3.12/build-args/cuda.conf
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/build-args/cuda.conf
🧰 Additional context used
🧬 Code graph analysis (1)
tests/containers/base_image_test.py (1)
tests/containers/kubernetes_utils.py (1)
exec(282-320)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (46)
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
tests/containers/base_image_test.py (1)
236-236: LGTM! Type annotation now correctly reflects the data structure. The change from `list[str]` to `list[list[str]]` fixes an incorrect type annotation, as the data structure has always been a 2D list containing `[path, perms, gid, uid]` entries.
Actionable comments posted: 1
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (7)
- codeserver/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- rstudio/rhel9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- rstudio/rhel9-python-3.12/build-args/cpu.conf
🚧 Files skipped from review as they are similar to previous changes (5)
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf
- codeserver/ubi9-python-3.12/build-args/cpu.conf
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (35)
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1)
1-1: I'll attempt to verify the architecture support using alternative methods since `skopeo` is unavailable.

Verify c9s base image architecture support via registry or PR discussion.

CentOS Stream 9 (c9s) container images natively support x86_64, aarch64, ppc64le, and s390x architectures. However, the specific architecture support of the PR-built `odh-base-image-cpu-py312-c9s` image cannot be confirmed in this environment (skopeo and curl registry inspection are unavailable).

Since the PR comments mention an s390x compatibility error, verify that the image build pipeline includes all required architectures (s390x, ppc64le, amd64, arm64) by checking:
- The image build configuration or Dockerfile
- The PR comment thread for architecture-related failures
- The Quay.io registry manifest for this specific tag to confirm which architectures are actually published
```diff
@@ -1 +1 @@
-BASE_IMAGE=registry.access.redhat.com/ubi9/python-312:latest
+BASE_IMAGE=quay.io/opendatahub/odh-base-image-cpu-py312-c9s:on-pr-b725e0c555d008b873acd02c53ff23492606617b
```
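For context on how a `build-args` conf line like this one is typically consumed, each `KEY=VALUE` entry becomes a `--build-arg` flag on the build command line. The helper below is only a hypothetical sketch of that wiring (the function name and the `podman build` invocation are illustrative, not taken from this repository's Makefiles):

```python
import shlex

def build_args_from_conf(conf_text: str) -> list[str]:
    """Turn KEY=VALUE lines from a build-args conf file into --build-arg flags."""
    flags: list[str] = []
    for line in conf_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comment lines
            continue
        flags.extend(["--build-arg", line])
    return flags

conf = "BASE_IMAGE=quay.io/opendatahub/odh-base-image-cpu-py312-c9s:latest\n"
cmd = ["podman", "build", *build_args_from_conf(conf), "-f", "Dockerfile.cpu", "."]
print(shlex.join(cmd))
```

This is why an ephemeral `on-pr-…` tag in the conf file breaks later builds: the tag is baked straight into the `BASE_IMAGE` build argument that the Dockerfile's `FROM ${BASE_IMAGE}` resolves at build time.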
🧩 Analysis chain
Replace PR-tagged image with a stable tag before merging.
The image reference uses a temporary PR build tag (on-pr-b725e0c555d008b873acd02c53ff23492606617b). PR-tagged images are typically ephemeral and will break builds once the artifacts are cleaned up or expire.
Since the PR description indicates this depends on PR #2688, ensure that PR is merged and a stable release tag (e.g., latest, versioned tag, or commit SHA from main) is used instead.
Verify the dependency PR status and plan for updating to a stable image tag:
🏁 Script executed:
```shell
#!/bin/bash
# Check if the dependency PR #2688 is merged
gh pr view 2688 --repo opendatahub-io/notebooks --json state,mergedAt,title

# Check if the image with PR tag exists and is accessible
skopeo inspect docker://quay.io/opendatahub/odh-base-image-cpu-py312-c9s:on-pr-b725e0c555d008b873acd02c53ff23492606617b 2>&1 | head -20
```

Length of output: 228
Let me check the codebase to understand the image tagging strategy and see if there are other stable tags being used elsewhere:
🏁 Script executed:
```shell
#!/bin/bash
# Search for other image references in build-args to understand tagging pattern
find runtimes -name "*.conf" -type f -exec grep -H "quay.io\|image" {} \; | head -30

# Look for any documentation or README about image versions
find runtimes -name "README*" -o -name "*.md" | xargs grep -l "image\|tag\|version" 2>/dev/null | head -10
```

Length of output: 945
Replace PR-tagged image with a stable version tag to match repository pattern.
The image uses a temporary PR build tag (on-pr-b725e0c555d008b873acd02c53ff23492606617b), which is inconsistent with other base images in the repository that use stable versioned tags (e.g., v12.6, v6.2, v6.3). PR-tagged images are ephemeral and break builds once artifacts expire.
Since dependency PR #2688 is now merged (2025-11-19), a stable version tag should be available for odh-base-image-cpu-py312-c9s. Update both affected files to use a versioned tag before merging:
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf
🤖 Prompt for AI Agents
runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf around line 1: the
file references a PR-specific image tag which is ephemeral; replace the
PR-tagged image name with the repository's stable version tag for
odh-base-image-cpu-py312-c9s (the same versioning pattern used elsewhere, e.g.,
v12.6 or the merged dependency's released tag) and make the identical change in
runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf so both files reference
the stable versioned image tag.
Actionable comments posted: 2
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (7)
- codeserver/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- rstudio/rhel9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- rstudio/rhel9-python-3.12/build-args/cpu.conf
🚧 Files skipped from review as they are similar to previous changes (3)
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (49)
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: build (rocm-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-jupyter-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-llmcompressor-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (cuda-jupyter-pytorch-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-datascience-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (codeserver-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-runtime-tensorflow-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (jupyter-trustyai-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (rocm-jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
- GitHub Check: build (runtime-cuda-tensorflow-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/ppc64le, false) / build
- GitHub Check: build (cuda-jupyter-minimal-ubi9-python-3.12, 3.12, linux/arm64, false) / build
- GitHub Check: build (runtime-datascience-ubi9-python-3.12, 3.12, linux/s390x, false) / build
- GitHub Check: build (jupyter-minimal-ubi9-python-3.12, 3.12, linux/amd64, false) / build
🔇 Additional comments (1)
jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1)
1-1: Verify that the PR-specific image tag will be replaced with a stable release tag before merge.

The BASE_IMAGE references a PR-specific tag (`on-pr-b725e0c555d008b873acd02c53ff23492606617b`) that is used consistently across 7 image configuration files. PR-specific tags are ephemeral and typically become unavailable after the PR is merged or closed, which could cause build failures.

Other base image references in the codebase consistently use stable release tags (e.g., `v12.6`, `v12.8`, `v6.3`), establishing a clear pattern that should be followed here as well.

Additionally, per the architecture comments in rstudio configuration files (documenting `linux/arm64`, `linux/x86_64` support), confirm that this base image supports all required architectures for the project.

Files requiring updates:
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf
- codeserver/ubi9-python-3.12/build-args/cpu.conf
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf
- rstudio/rhel9-python-3.12/build-args/cpu.conf
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf
…images instead of ubi9 (#3020) This is to align ODH with AIPCC bases which use RHEL. Therefore, ODH should use CentOS Stream.
…on openshift (#3020)
… (#3020)
```
+ dnf install -y perl mesa-libGL skopeo gcc gcc-c++ make openssl-devel autoconf automake libtool cmake python3-devel pybind11-devel openblas-devel unixODBC-devel
Updating Subscription Management repositories.
Red Hat Universal Base Image 9 (RPMs) - BaseOS 1.4 MB/s | 433 kB 00:00
Red Hat Universal Base Image 9 (RPMs) - AppStre 6.4 MB/s | 2.2 MB 00:00
Red Hat Universal Base Image 9 (RPMs) - CodeRea 683 kB/s | 174 kB 00:00
Package mesa-libGL-24.2.8-3.el9_6.s390x is already installed.
Package skopeo-2:1.20.0-1.el9.s390x is already installed.
Package gcc-11.5.0-11.el9.s390x is already installed.
Error:
Problem: package mesa-dri-drivers-25.0.7-3.el9_7.s390x from ubi-9-appstream-rpms requires libLLVM.so.20.1()(64bit), but none of the providers can be installed
- package mesa-dri-drivers-25.0.7-3.el9_7.s390x from ubi-9-appstream-rpms requires libLLVM.so.20.1(LLVM_20.1)(64bit), but none of the providers can be installed
- cannot install both llvm-libs-20.1.8-3.el9.s390x from ubi-9-appstream-rpms and llvm-libs-19.1.7-2.el9.s390x from @System
- package mesa-libGL-25.0.7-3.el9_7.s390x from ubi-9-appstream-rpms requires libgallium-25.0.7.so()(64bit), but none of the providers can be installed
...
```
Actionable comments posted: 0
♻️ Duplicate comments (2)
runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1)
1-1: LGTM! Past concerns about PR-specific tags addressed. The base image reference now uses a stable `latest` tag instead of the PR-specific tag mentioned in previous reviews. The change is consistent with the migration to c9s base images.

Note: Consider using a versioned tag for reproducibility (same recommendation as in jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf).

runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1)

1-1: LGTM! Past concerns about PR-specific tags addressed. The base image reference now uses a stable `latest` tag instead of the PR-specific tag mentioned in previous reviews. The change is consistent with the migration to c9s base images.

Note: Consider using a versioned tag for reproducibility (same recommendation as in other cpu.conf files).
🧹 Nitpick comments (2)
jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1)
1-2: Consider using a versioned tag instead of `latest` for reproducibility. The change correctly migrates to the c9s base image. However, using the `latest` tag can lead to non-reproducible builds. Other base images in this PR use versioned tags (e.g., `v12.8`).

If a stable version tag is available, consider updating to use it:

```diff
 # Base Image : c9s with Python 3.12
-BASE_IMAGE=quay.io/opendatahub/odh-base-image-cpu-py312-c9s:latest
+BASE_IMAGE=quay.io/opendatahub/odh-base-image-cpu-py312-c9s:v1.0
```

tests/containers/base_image_test.py (1)
246-250: Clarify the comment explaining why UID is ignored. The functional change to ignore UID is correct for OpenShift environments where containers run with random UIDs. However, the comment explanation could be clearer.
The comment states "it's the gid that is nonrandom on openshift", but this phrasing is potentially confusing. On OpenShift:
- UIDs are assigned randomly by the platform (which is why we ignore them)
- GIDs are typically fixed to 0 (root group) to allow file access
Consider rewording for clarity:
```diff
- # ignore `:%u`, it does not matter what the uid is, it's the gid that is nonrandom on openshift
+ # ignore UID - OpenShift assigns random UIDs at runtime, but GID is fixed (typically 0)
  _, output = container.exec(["stat", "--format='%a:%g'", f"{item[0]}"])
```
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (35)
- codeserver/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- codeserver/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/minimal/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/pytorch/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/rocm/pytorch/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- jupyter/rocm/tensorflow/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- jupyter/tensorflow/ubi9-python-3.12/build-args/cuda.conf (1 hunks)
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cpu (0 hunks)
- rstudio/rhel9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- runtimes/datascience/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (0 hunks)
- runtimes/minimal/ubi9-python-3.12/build-args/cpu.conf (1 hunks)
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (0 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/build-args/rocm.conf (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (0 hunks)
- tests/containers/base_image_test.py (2 hunks)
💤 Files with no reviewable changes (20)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu
- runtimes/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda
- codeserver/ubi9-python-3.12/Dockerfile.cpu
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm
- jupyter/pytorch+llmcompressor/ubi9-python-3.12/Dockerfile.cuda
- rstudio/rhel9-python-3.12/Dockerfile.cuda
- jupyter/rocm/pytorch/ubi9-python-3.12/Dockerfile.rocm
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu
- jupyter/datascience/ubi9-python-3.12/Dockerfile.cpu
- jupyter/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu
- rstudio/rhel9-python-3.12/Dockerfile.cpu
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm
- jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda
- jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda
- jupyter/minimal/ubi9-python-3.12/Dockerfile.cuda
🚧 Files skipped from review as they are similar to previous changes (9)
- jupyter/rocm/tensorflow/ubi9-python-3.12/build-args/rocm.conf
- runtimes/rocm-tensorflow/ubi9-python-3.12/build-args/rocm.conf
- jupyter/minimal/ubi9-python-3.12/build-args/cuda.conf
- jupyter/trustyai/ubi9-python-3.12/build-args/cpu.conf
- jupyter/rocm/pytorch/ubi9-python-3.12/build-args/rocm.conf
- jupyter/minimal/ubi9-python-3.12/build-args/rocm.conf
- jupyter/datascience/ubi9-python-3.12/build-args/cpu.conf
- codeserver/ubi9-python-3.12/build-args/cpu.conf
- jupyter/pytorch/ubi9-python-3.12/build-args/cuda.conf
🧰 Additional context used
🧬 Code graph analysis (1)
tests/containers/base_image_test.py (1)
tests/containers/kubernetes_utils.py (1)
exec(282-320)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (22)
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-codeserver-datascience-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-tensorflow-rocm-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-pipeline-runtime-pytorch-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-minimal-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-trustyai-cpu-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-jupyter-pytorch-llmcompressor-cuda-py312-ubi9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cpu-py312-rhel9-on-pull-request
- GitHub Check: Red Hat Konflux / odh-workbench-rstudio-minimal-cuda-py312-rhel9-on-pull-request
- GitHub Check: build (cuda-rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
- GitHub Check: build (rstudio-rhel9-python-3.12, 3.12, linux/amd64, true) / build
🔇 Additional comments (3)
jupyter/pytorch+llmcompressor/ubi9-python-3.12/build-args/cuda.conf (1)
1-1: LGTM! Base image migration to c9s.

The change is consistent with the PR objective to migrate from UBI9 to c9s base images. Image verification is requested in the previous file review.
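For context, the change under review amounts to swapping the base-image reference inside the build-args file. A hypothetical before/after sketch — the variable name and tags are illustrative, and only the c9s image name is borrowed from the linked odh-base-image-cpu-py312-c9s work, so the actual values in this repository may differ:

```
# before (illustrative)
BASE_IMAGE=registry.access.redhat.com/ubi9/python-312:latest
# after (illustrative)
BASE_IMAGE=quay.io/opendatahub/odh-base-image-cpu-py312-c9s:latest
```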
tests/containers/base_image_test.py (1)
236-236: LGTM! Type annotation correctly reflects the data structure.

The updated type annotation list[list[str]] correctly represents the nested list structure used for directories_to_check.

jupyter/tensorflow/ubi9-python-3.12/build-args/cuda.conf (1)
1-1: I'll search for information about ODH c9s base images and any known issues with s390x compatibility mentioned in the PR. Let me search for information about ODH base image infrastructure and whether c9s images support s390x.
Based on the verification results and web search findings, I can now generate the rewritten review comment:
Images cannot be verified as accessible; clarify availability and architecture-support expectations.
The migration to c9s base images cannot be verified at this time because the referenced quay.io/opendatahub images are not accessible. However, CentOS Stream 9 is compatible with 64-bit x86, 64-bit ARM, IBM Z (s390x Z14+), and IBM POWER architectures, so s390x support is not inherently limited by c9s.
Before merging, confirm:
- Whether these ODH base images are intended to be private/internal or public
- Whether images are pre-release or already published
- If published, verify they exist and their supported architectures via quay.io repository inspection
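For the last point, the advertised architectures can be read out of an image's manifest list (OCI image index), for example the raw JSON returned by skopeo inspect --raw. A minimal sketch of that parsing step — the image name, digests, and architectures below are placeholders, not real ODH artifacts:

```python
import json

# Placeholder manifest list, shaped like the JSON that
# `skopeo inspect --raw docker://quay.io/<org>/<image>:<tag>` returns.
# The digests and architecture set here are illustrative only.
raw_index = json.dumps({
    "schemaVersion": 2,
    "mediaType": "application/vnd.oci.image.index.v1+json",
    "manifests": [
        {"digest": "sha256:aaa", "platform": {"architecture": "amd64", "os": "linux"}},
        {"digest": "sha256:bbb", "platform": {"architecture": "ppc64le", "os": "linux"}},
    ],
})

def supported_architectures(raw: str) -> set[str]:
    """Collect the architectures advertised by a manifest list / image index."""
    index = json.loads(raw)
    return {
        entry["platform"]["architecture"]
        for entry in index.get("manifests", [])
        if "platform" in entry
    }

print(sorted(supported_architectures(raw_index)))  # ['amd64', 'ppc64le']
```

Running the same function over the real quay.io indexes would show directly whether s390x and ppc64le entries are present.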
@jiridanek: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:
Full PR test history. Your PR dashboard. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here. |
[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: ide-developer

The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment.
https://issues.redhat.com/browse/RHAIENG-2042
Description
This aligns ODH with the AIPCC base images, which use RHEL; ODH should therefore use CentOS Stream.
Related to
Requires
Add support for ppc64le (and s390x, to do later) architectures to odh-base-image-cpu-py312-c9s #2688

How Has This Been Tested?
Self checklist (all need to be checked):
- Run make test (gmake on macOS) before asking for review
- Changes to Dockerfile.konflux files should be done in odh/notebooks and automatically synced to rhds/notebooks. For Konflux-specific changes, modify Dockerfile.konflux files directly in rhds/notebooks, as these require special attention in the downstream repository and flow to the upcoming RHOAI release.

Merge criteria:
Summary by CodeRabbit
Release Notes
✏️ Tip: You can customize this high-level summary in your review settings.