
Conversation


Copilot AI commented Oct 24, 2025

Fix hpic2 build failures and Spack configuration issues

Changes Made

  • Fixed hpic2 build failures by adding missing #include <iterator> in source files
  • Applied fix to all active update scripts
  • Added documentation (HPIC2_BUILD_FIX.md)
  • Moved unused_compilers.yaml out of active config directory
  • Removed gcc package dependency from hpic2deps to prevent conflicts
  • Added back gcc external specs for all referenced compiler versions
  • Enhanced setup.sh to clean up user-level compiler configurations
  • Fixed mfem git checkout directory issue
  • Added missing cstdint include patch for mfem
  • Fixed cublas library linking for CUDA-enabled hpic2 builds
  • Added HDF5_DIR to hpic2 cmake configuration
  • Added CUDA library path to cublas linker flags
  • Added SLURM sbatch script for batch job submission
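
A minimal sketch of how the missing-include fix can be automated in an update script. The file name here is a throwaway placeholder; the actual update scripts may patch the hpic2 sources differently.

```shell
# Demo on a temporary file: insert <iterator> before the first existing
# include, but only if it is not already present (so the fix is idempotent).
src=$(mktemp --suffix=.cpp)
printf '#include <vector>\nint main() { return 0; }\n' > "$src"
grep -q '#include <iterator>' "$src" || \
    sed -i '0,/#include/s//#include <iterator>\n&/' "$src"
grep -c '#include <iterator>' "$src"   # prints 1
rm -f "$src"
```

Guarding the `sed` call with `grep -q` keeps the patch safe to re-run on every update.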

Latest Addition

Created run_update.sbatch for submitting the campuscluster_update.py script as a SLURM batch job on the Illinois Campus Cluster. The script includes:

  • 48-hour time limit for long-running builds
  • 8 CPUs and 32GB memory allocation
  • Real-time output logging with unbuffered Python output
  • Separate stdout and stderr log files with job ID in filename

To use: sbatch run_update.sbatch
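
A sketch of what such a job script can look like, assuming only the directives listed above; the log-file naming and script path are illustrative, not necessarily what run_update.sbatch contains.

```shell
#!/bin/bash
#SBATCH --time=48:00:00          # 48-hour limit for long-running builds
#SBATCH --cpus-per-task=8        # 8 CPUs
#SBATCH --mem=32G                # 32 GB memory
#SBATCH --output=update_%j.out   # %j expands to the SLURM job ID
#SBATCH --error=update_%j.err    # separate stderr log, also tagged with job ID

# -u disables Python's output buffering so the log updates in real time
python3 -u campuscluster_update.py
```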

Previous Fix

Updated the cublas linking flags from -lcublas to -L$CUDA_HOME/lib64 -lcublas in all update scripts. The linker was unable to find the cublas library because only the library name was specified without the search path.

The -L$CUDA_HOME/lib64 flag tells the linker where to find CUDA libraries, which are typically installed in the lib64 subdirectory of the CUDA installation. The $CUDA_HOME environment variable is set when the CUDA module is loaded.

This fix has been applied to all four update scripts.
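
For illustration, the shape of the resulting cmake invocation. The default path is only for this demo; on the cluster, $CUDA_HOME is exported by `module load cuda`.

```shell
# CUDA_HOME is normally set by the CUDA module; the fallback is demo-only
CUDA_HOME=${CUDA_HOME:-/usr/local/cuda}
LINKER_FLAGS="-L$CUDA_HOME/lib64 -lcublas"
# The update scripts pass these flags to cmake; echoed here rather than run
echo cmake .. -DCMAKE_EXE_LINKER_FLAGS="$LINKER_FLAGS"
```

The `-L` search path must precede `-lcublas`, since the linker resolves `-l` options against the directories seen so far.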

Original prompt

This section details the original issue to resolve

<issue_title>Get Campuscluster_update.py fully working at a basic level on the ICC</issue_title>
<issue_description>Currently the script is failing to build the hpic2 executable, and failing to spack install all of the various versions in install_hpic2deps.sh file. </issue_description>

Comments on the Issue (you are @copilot in this section)

@Stephen-Armstrong Current Issue is located in this file: [output_update_spack_py_10_23_2025.log](https://github.com/user-attachments/files/23129257/output_update_spack_py_10_23_2025.log)


Copilot AI and others added 4 commits October 24, 2025 16:49
Co-authored-by: Stephen-Armstrong <58577114+Stephen-Armstrong@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Fix Campuscluster_update.py to build hpic2 executable" to "Fix hpic2 build failures by adding missing iterator includes" on Oct 24, 2025
Copilot finished work on behalf of Stephen-Armstrong October 24, 2025 16:58

@Stephen-Armstrong Stephen-Armstrong left a comment


@copilot When running ". setup.sh" the following error is generated, please fix it: "Error: the spec 'openmpi@5.0.1%gcc@13.3.0' depends on 'gcc@13.3.0', but there are multiple external specs that could satisfy the request [candidates are gcc@=13.3.0binutils+bootstrapgraphitemoldnvptxpiclibsprofiledstrip build_system=autotools build_type=RelWithDebInfo languages:='c,c++,fortran' platform=linux os=rhel9 target=x86_64, gcc@=13.3.0binutils+bootstrapgraphitemoldnvptxpiclibsprofiledstrip build_system=autotools build_type=RelWithDebInfo languages='c,c++,fortran' platform=linux os=rhel9 target=x86_64] [/projects/illinois/grants/dtra_ura_msee/share/spack/etc/spack/packages.yaml:5]"


Copilot AI commented Oct 24, 2025

@copilot When running ". setup.sh" the following error is generated, please fix it: "Error: the spec 'openmpi@5.0.1%gcc@13.3.0' depends on 'gcc@13.3.0', but there are multiple external specs that...

Fixed in commit 568751d. The issue was caused by unused_compilers.yaml being copied to Spack's config directory via the wildcard in setup.sh, creating duplicate gcc@13.3.0 specs. Moved the file to spack_config_old/ so it's no longer copied during setup.
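
A throwaway demo of the fix's mechanics, using placeholder paths: since setup.sh copies spack_config/*.yaml into Spack's config directory wholesale, the file carrying the duplicate gcc spec has to live outside that directory.

```shell
# Temporary directories stand in for the repo's config layout (names hypothetical)
tmp=$(mktemp -d)
mkdir -p "$tmp/spack_config" "$tmp/spack_config_old"
touch "$tmp/spack_config/packages.yaml" "$tmp/spack_config/unused_compilers.yaml"
# Move the offending file out of the wildcard-copied directory
mv "$tmp/spack_config/unused_compilers.yaml" "$tmp/spack_config_old/"
ls "$tmp/spack_config"   # only packages.yaml remains to be copied by setup.sh
rm -rf "$tmp"
```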

Copilot AI changed the title from "Fix hpic2 build failures by adding missing iterator includes" to "Fix hpic2 build failures and Spack configuration issues" on Oct 24, 2025
Copilot finished work on behalf of Stephen-Armstrong October 24, 2025 20:40

@Stephen-Armstrong Stephen-Armstrong left a comment


@copilot This error is still occurring: "Error: the spec 'openmpi@5.0.1%gcc@13.3.0' depends on 'gcc@13.3.0', but there are multiple external specs that could satisfy the request [candidates are gcc@=13.3.0binutils+bootstrapgraphitemoldnvptxpiclibsprofiledstrip build_system=autotools build_type=RelWithDebInfo languages:='c,c++,fortran' platform=linux os=rhel9 target=x86_64, gcc@=13.3.0binutils+bootstrapgraphitemoldnvptxpiclibsprofiledstrip build_system=autotools build_type=RelWithDebInfo languages='c,c++,fortran' platform=linux os=rhel9 target=x86_64] [/projects/illinois/grants/dtra_ura_msee/share/spack/etc/spack/packages.yaml:5]"


Copilot AI commented Oct 24, 2025

@copilot This error is still occurring: "Error: the spec 'openmpi@5.0.1%gcc@13.3.0' depends on 'gcc@13.3.0', but there are multiple external specs that could satisfy the request [candidates are gcc@=1...

Fixed in commit 3d278c3. The issue was that gcc was defined both as a package dependency in hpic2deps and as an external package in packages.yaml, conflicting with gcc being used as a compiler (via %gcc@13.3.0). Removed gcc from both locations since compilers are specified separately in Spack and shouldn't be listed as package dependencies.

Copilot finished work on behalf of Stephen-Armstrong October 24, 2025 20:52

@Stephen-Armstrong Stephen-Armstrong left a comment


@copilot When running ". setup.sh" this error is now generated: "Error: the spec 'openmpi@4.1.4%gcc@8.2.0' depends on 'gcc@8.2.0', but there is no such external spec in packages.yaml [/projects/illinois/grants/dtra_ura_msee/share/spack/etc/spack/packages.yaml:8]"


@Stephen-Armstrong Stephen-Armstrong left a comment


@copilot There is another error based on cublas and MFEM as shown in this file: output_update_spack_py_4_47_10_26_2025.log Please fix it.


Copilot AI commented Oct 26, 2025

@copilot There is another error based on cublas and MFEM as shown in this file: [output_update_spack_py_4_47_10_26_2025.log](https://github.com/lcpp-org/campuscluster_spack/blob/042e82eba0f4bb05190317...

Fixed in commit f1c38e2. The cublas linking error persisted because the previous CMakeLists.txt patching approach wasn't working correctly. Replaced it with -DCMAKE_EXE_LINKER_FLAGS="-lcublas" in the cmake command, which directly adds cublas to the linker flags without modifying source files. This is a more robust solution that ensures cublas is linked when building hpic2 with CUDA-enabled mfem.

Copilot finished work on behalf of Stephen-Armstrong October 26, 2025 21:52

@Stephen-Armstrong Stephen-Armstrong left a comment


@copilot There is an error where the c compiler can't find the cublas files to link to in hpic2, in this file: output_update_spack_py_10_11_10_28_2025.log


Copilot AI commented Oct 28, 2025

@copilot There is an error where the c compiler can't find the cublas files to link to in hpic2, in this file: [output_update_spack_py_10_11_10_28_2025.log](https://github.com/lcpp-org/campuscluster_s...

Fixed in commit aa7f889. The linker couldn't find the cublas library because the library search path wasn't specified. Changed -DCMAKE_EXE_LINKER_FLAGS="-lcublas" to -DCMAKE_EXE_LINKER_FLAGS="-L$CUDA_HOME/lib64 -lcublas" to include the CUDA library directory path. The $CUDA_HOME environment variable is set when the CUDA module is loaded.

Copilot finished work on behalf of Stephen-Armstrong October 28, 2025 15:25

@Stephen-Armstrong Stephen-Armstrong left a comment


@copilot Please write an sbatch file to run the contents of the update.sh file using the sbatch information from this file: run_E_and_b_calc.sbatch


Copilot AI commented Oct 28, 2025

@copilot Please write an sbatch file to run the contents of the update.sh file using the sbatch information from this file: [run_E_and_b_calc.sbatch](https://github.com/lcpp-org/PlasmaChem/blob/d518b6...

Created run_update.sbatch in commit 79e85eb. The sbatch script runs the campuscluster_update.py script as a SLURM batch job with:

  • 48-hour time limit
  • 8 CPUs and 32GB memory
  • Separate output/error logs with job ID
  • Python unbuffered output for real-time logging

Submit with: sbatch run_update.sbatch

Copilot finished work on behalf of Stephen-Armstrong October 28, 2025 21:24