├── LICENSE
├── README.md
├── examples
│   ├── cscs
│   │   ├── alps
│   │   │   ├── gh200
│   │   │   │   ├── craype_config
│   │   │   │   └── test_craype_config
│   │   │   └── mc
│   │   │       ├── craype_config
│   │   │       └── test_craype_config
│   │   └── daint
│   │       ├── gpu
│   │       │   ├── craype_config
│   │       │   ├── craype_config_no_cudaaware
│   │       │   ├── test_craype_config
│   │       │   └── test_craype_config_no_cudaaware
│   │       └── mc
│   │           └── craype_config
│   └── eurohpc
│       └── lumi
│           └── mi250x
│               ├── craype_config
│               └── test_craype_config
└── juhpc
/LICENSE: -------------------------------------------------------------------------------- 1 | BSD 3-Clause License 2 | 3 | Copyright (c) 2024, Samuel Omlin (CSCS), and contributors 4 | 5 | Redistribution and use in source and binary forms, with or without 6 | modification, are permitted provided that the following conditions are met: 7 | 8 | 1. Redistributions of source code must retain the above copyright notice, this 9 | list of conditions and the following disclaimer. 10 | 11 | 2. Redistributions in binary form must reproduce the above copyright notice, 12 | this list of conditions and the following disclaimer in the documentation 13 | and/or other materials provided with the distribution. 14 | 15 | 3. Neither the name of the copyright holder nor the names of its 16 | contributors may be used to endorse or promote products derived from 17 | this software without specific prior written permission. 18 | 19 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 20 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 21 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 22 | DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 23 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 24 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 25 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 26 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 27 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 28 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 29 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # $\textsf{\textbf{\color{purple}J\color{green}U\color{red}HPC}}$: Julia[up] for HPC 2 | 3 | 4 | ## Introduction: a community project for everyone - including end users 5 | 6 | JUHPC is an attempt to convert the numerous efforts at different HPC sites for defining a suitable Julia HPC setup into a community scripting project... *which suits everyone*. The objective is to gather all the experience of the Julia HPC community and transform it into a portable, automatic Julia HPC setup, enhanced and maintained jointly by the Julia HPC community. JUHPC can be used stand-alone (by end users) or as part of a recipe for automated software stack generation (by HPC sites), e.g., the generation of modules or [uenvs](https://eth-cscs.github.io/uenv/) (used on the ALPS supercomputer at the Swiss National Supercomputing Centre, see [here](https://confluence.cscs.ch/display/KB/UENV+user+environments)). 7 | 8 | An important lesson learned by the Julia HPC community for providing Julia at HPC sites is not to preinstall any packages site-wide. JUHPC pushes this insight even one step further and does not preinstall Julia either. 
Instead, Juliaup is leveraged and the installation of Juliaup, Julia and packages is preconfigured to be executed automatically by the end user. Furthermore, for maximal robustness, the preferences are created using the available API calls of the corresponding packages. 9 | 10 | Concretely, JUHPC creates an HPC setup for Juliaup, Julia and some HPC key packages (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl, ADIOS2.jl, ...), including 11 | - preferences for HPC key packages that require system libraries; 12 | - a wrapper for Juliaup that will install Juliaup (and the latest Julia) automatically in a predefined location (e.g., scratch) when the end user calls `juliaup` for the first time; 13 | - an activation script that sets environment variables for Juliaup, Julia and HPC key packages; 14 | - optional execution of a site-specific post-installation Julia script, using the project where preferences were set (e.g., to modify preferences or to create an uenv view equivalent to the activation script). 15 | 16 | Furthermore, JUHPC allows the creation of *multiple fully independent HPC setups that do not interfere with each other* (notably, `.bashrc` and `.profile` are not modified). Thus, it is, e.g., straightforward to create a Julia[up] HPC setup per architecture (in the examples below this is achieved based on the hostname). 17 | 18 | HPC sites can install the HPC setup(s) into a folder in a location accessible to all users (which can also be part, e.g., of a uenv). HPC end users can install the HPC setup(s) into any folder to their liking, accessible from the compute nodes; it is then enough to source the activate script in this folder in order to activate the HPC setup. 19 | 20 | 21 | ## Table of contents 22 | - [Introduction: a community project for everyone - including end users](#introduction-a-community-project-for-everyone---including-end-users) 23 | - [Usage](#usage) 24 | - [1. 
Export environment variables for the installation of some HPC key packages](#1-export-environment-variables-for-the-installation-of-some-hpc-key-packages) 25 | - [2. Call JUHPC](#2-call-juhpc) 26 | - [Examples: HPC setup installations on the ALPS supercomputer (CSCS)](#examples-hpc-setup-installations-on-the-alps-supercomputer-cscs) 27 | - [Example 1: using Cray Programming Environment](#example-1-using-cray-programming-environment) 28 | - [Example 2: using UENV](#example-2-using-uenv) 29 | - [Test of example 1](#test-of-example-1) 30 | - [Test of example 2](#test-of-example-2) 31 | - [Your contributions](#your-contributions) 32 | - [Contributors](#contributors) 33 | 34 | 35 | ## Usage 36 | Installing an HPC setup for Juliaup, Julia and some HPC key packages requires only two steps: 37 | 1. Export environment variables for the installation of some HPC key packages. 38 | 2. Call JUHPC. 39 | 40 | Details are given in the following two subsections. 41 | 42 | ### 1. Export environment variables for the installation of some HPC key packages 43 | 44 | - CUDA 45 | - `JUHPC_CUDA_HOME`: Activates the HPC setup for CUDA and is used for CUDA.jl runtime discovery (set as `CUDA_HOME` in the activate script). 46 | - `JUHPC_CUDA_RUNTIME_VERSION`: Used to set CUDA.jl preferences (fixes the runtime version, enabling pre-compilation on login nodes). 47 | 48 | - AMDGPU 49 | - `JUHPC_ROCM_HOME`: Activates the HPC setup for AMDGPU and is used for AMDGPU.jl runtime discovery (set as `ROCM_PATH` in the activate script). 50 | 51 | - MPI 52 | - `JUHPC_MPI_HOME`: Activates the HPC setup for MPI and is used to set MPI.jl preferences. Incompatible with `JUHPC_MPI_VENDOR`. 53 | - `JUHPC_MPI_VENDOR`: Activates the HPC setup for MPI and is used to set MPI.jl preferences (currently only "cray" is valid, see [here](https://juliaparallel.org/MPI.jl/stable/configuration/#Notes-about-vendor-provided-MPI-backends)). Incompatible with `JUHPC_MPI_HOME`. 
54 | - `JUHPC_MPI_EXEC`: Used to set MPI.jl preferences (exec command definition). Arguments are space separated, e.g., `"srun -C gpu"`. 55 | 56 | - HDF5 57 | - `JUHPC_HDF5_HOME`: Activates HPC setup for HDF5 and is used to set HDF5.jl preferences. 58 | 59 | - ADIOS2 60 | - `JUHPC_ADIOS2_HOME`: Activates HPC setup for ADIOS2 and is used to set ADIOS2.jl preferences. 61 | 62 | > [!NOTE] 63 | > The automatically defined preferences suitable for typical HPC needs can be modified with a post install Julia script (see keyword argument `--postinstall` in next section). Also preferences for other packages could be added this way if needed. Of course, any of these preferences can later be overwritten by local preferences. 64 | 65 | ### 2. Call JUHPC 66 | 67 | The `juhpc` bash script is called as follows: 68 | ```bash 69 | juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR [--postinstall=] [--verbose=] 70 | ``` 71 | I.e., it takes the following positional arguments: 72 | - `JUHPC_SETUP_INSTALLDIR`: the folder into which the HPC setup is installed, e.g., `"$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup"`. 73 | - `JULIAUP_INSTALLDIR`: the folder into which Juliaup and Julia will automatically be installed the first time the end user calls `juliaup`. *User environment variables should be escaped* in order not to have them expanded during HPC setup installation, but during its usage by the end user, e.g., `"\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup"`. 74 | 75 | > [!NOTE] 76 | > The above examples assume that `$SCRATCH/../julia` is a wipe out protected folder on scratch. 77 | 78 | > [!IMPORTANT] 79 | > Separate installation by `HOSTNAME` is required if different hosts with different architectures share the file system used for installation (e.g., daint and eiger on ALPS). 
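The escaping of user environment variables in `JULIAUP_INSTALLDIR` can be illustrated with a minimal, standalone shell snippet (the paths and variable values below are purely illustrative):

```shell
#!/bin/bash
# Illustration of install-time vs. deferred expansion (illustrative values only):
# an unescaped variable is expanded immediately, while a backslash-escaped one
# survives as a literal and is only expanded later, in the end user's shell.

SCRATCH="/scratch/alice"         # stands in for an end user's scratch path

expanded="$SCRATCH/juliaup"      # expanded now, at "installation time"
escaped="\$SCRATCH/juliaup"      # backslash keeps the literal $SCRATCH

echo "$expanded"                 # -> /scratch/alice/juliaup
echo "$escaped"                  # -> $SCRATCH/juliaup

# Later (e.g., when the end user's shell evaluates it), the literal expands:
eval "deferred=$escaped"
echo "$deferred"                 # -> /scratch/alice/juliaup
```

This is why the installation directory for the HPC setup itself is passed unescaped (it must resolve during installation), while the Juliaup directory is escaped so that each end user gets their own path.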
80 | 81 | Furthermore, it supports the following keyword arguments: 82 | - `--postinstall=`: site-specific post-installation Julia script, using the project where preferences were set (e.g., to modify preferences or to create an uenv view equivalent to the activation script). 83 | - `--verbose=`: verbosity of the output during the HPC setup installation: if set to `1`, the generated HPC package preferences and environment variables are printed; if set to `2`, all the output of all commands is printed in addition for debugging purposes (default verbosity is `0`). No matter which verbosity level is set, all output is written into a log file (`"$JUHPC_SETUP_INSTALLDIR/hpc_setup_install.log"`). 84 | 85 | 86 | ## Examples: HPC setup installations on the ALPS supercomputer (CSCS) 87 | 88 | In the following, you can find two examples of HPC setup installations. 89 | 90 | > [!TIP] 91 | > More examples are found in the folder [examples](/examples/). 92 | 93 | ### Example 1: using Cray Programming Environment 94 | 95 | ```bash 96 | # Load required modules (including correct CPU and GPU target modules) 97 | module load cray 98 | module switch PrgEnv-cray PrgEnv-gnu 99 | module load cudatoolkit craype-accel-nvidia90 100 | module load cray-hdf5-parallel 101 | module list 102 | 103 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 104 | export JUHPC_CUDA_HOME=$CUDA_HOME 105 | export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION 106 | export JUHPC_MPI_VENDOR="cray" 107 | export JUHPC_MPI_EXEC="srun -C gpu" 108 | export JUHPC_HDF5_HOME=$HDF5_DIR 109 | 110 | # Call JUHPC 111 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup 112 | JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup" 113 | VERSION="v0.4.0" 114 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 115 | bash -l ./juhpc 
$JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR 116 | ``` 117 | 118 | > [!TIP] 119 | > The corresponding file is found [here](/examples/cscs/alps/gh200/craype_config). 120 | 121 | ### Example 2: using UENV 122 | 123 | ```bash 124 | # UENV specific environment variables 125 | export ENV_MOUNT={{ env.mount }} # export ENV_MOUNT=/user-environment 126 | export ENV_META=$ENV_MOUNT/meta 127 | export ENV_EXTRA=$ENV_META/extra 128 | export ENV_JSON=$ENV_META/env.json 129 | 130 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 131 | export JUHPC_CUDA_HOME=$(spack -C $ENV_MOUNT/config location -i cuda) 132 | export JUHPC_CUDA_RUNTIME_VERSION=$(spack --color=never -C $ENV_MOUNT/config find cuda | \ 133 | perl -ne 'print $1 if /cuda@([\d.]+)/') 134 | export JUHPC_MPI_HOME=$(spack -C $ENV_MOUNT/config location -i cray-mpich) 135 | export JUHPC_MPI_EXEC="srun -C gpu" 136 | export JUHPC_HDF5_HOME=$(spack -C $ENV_MOUNT/config location -i hdf5) 137 | export JUHPC_ADIOS2_HOME=$(spack -C $ENV_MOUNT/config location -i adios2) 138 | 139 | # Call JUHPC 140 | JUHPC_SETUP_INSTALLDIR=$ENV_MOUNT/juhpc_setup 141 | JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%-*}/juliaup" 142 | JUHPC_POST_INSTALL_JL=$ENV_EXTRA/uenv_view.jl 143 | VERSION="v0.4.0" 144 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 145 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR --postinstall=$JUHPC_POST_INSTALL_JL --verbose=1 146 | ``` 147 | 148 | ### Test of example 1 149 | 150 | ```bash 151 | #!/bin/bash 152 | 153 | # Variable set in craype_config 154 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%-*}/juhpc_setup 155 | 156 | # Load required modules (including correct CPU and GPU target modules) 157 | module load cray 158 | module switch PrgEnv-cray PrgEnv-gnu 159 | module load cudatoolkit craype-accel-nvidia90 160 | module load cray-hdf5-parallel 161 | module list 162 | 163 
| # Activate the HPC setup environment variables 164 | . $JUHPC_SETUP_INSTALLDIR/activate 165 | 166 | # Call Juliaup to install Juliaup and latest Julia on scratch 167 | juliaup 168 | 169 | # Call Juliaup to see its options 170 | juliaup 171 | 172 | # Call Julia Pkg 173 | julia -e 'using Pkg; Pkg.status()' 174 | 175 | # Add CUDA.jl 176 | julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()' 177 | 178 | # Add MPI.jl 179 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 180 | 181 | # Add HDF5.jl 182 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 183 | 184 | # Test CUDA-aware MPI 185 | cd ~/cudaaware 186 | MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl 187 | ``` 188 | 189 | > [!TIP] 190 | > The corresponding file is found [here](/examples/cscs/alps/gh200/test_craype_config). 191 | 192 | 193 | ### Test of example 2 194 | 195 | ```bash 196 | #!/bin/bash 197 | 198 | # Start uenv with the view equivalent to the activation script 199 | uenv start --view=julia julia 200 | 201 | # Call Juliaup to install Juliaup and latest Julia on scratch 202 | juliaup 203 | 204 | # Call Juliaup to see its options 205 | juliaup 206 | 207 | # Call Julia Pkg 208 | julia -e 'using Pkg; Pkg.status()' 209 | 210 | # Add CUDA.jl 211 | julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()' 212 | 213 | # Add MPI.jl 214 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 215 | 216 | # Add HDF5.jl 217 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 218 | 219 | # Test CUDA-aware MPI 220 | cd ~/cudaaware 221 | MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl 222 | ``` 223 | 224 | 225 | ## Your contributions 226 | 227 | Any contribution is valuable to the Julia HPC community: 228 | - contribute the HPC setup config file for your cluster or supercomputer in the folder [examples](/examples/) as an example for others; 
229 | - open a PR with some enhancement or fix; 230 | - open an issue for a bug or an idea for enhancement. 231 | 232 | 233 | ## Contributors 234 | 235 | The initial version of JUHPC was contributed by Samuel Omlin, Swiss National Supercomputing Centre, ETH Zurich ([@omlins](https://github.com/omlins)). The following people have provided valuable contributions over the years in the effort of defining a suitable Julia HPC setup (this is based on the list found [here](https://github.com/hlrs-tasc/julia-on-hpc-systems); please add missing people or let us know; the list is ordered alphabetically): 236 | - Carsten Bauer ([@carstenbauer](https://github.com/carstenbauer)) 237 | - Alexander Bills ([@abillscmu](https://github.com/abillscmu)) 238 | - Johannes Blaschke ([@jblaschke](https://github.com/jblaschke)) 239 | - Valentin Churavy ([@vchuravy](https://github.com/vchuravy)) 240 | - Steffen Fürst ([@s-fuerst](https://github.com/s-fuerst)) 241 | - Mosè Giordano ([@giordano](https://github.com/giordano)) 242 | - C. Brenhin Keller ([@brenhinkeller](https://github.com/brenhinkeller)) 243 | - Mirek Kratochvíl ([@exaexa](https://github.com/exaexa)) 244 | - Pedro Ojeda ([@pojeda](https://github.com/pojeda)) 245 | - Samuel Omlin ([@omlins](https://github.com/omlins)) 246 | - Ludovic Räss ([@luraess](https://github.com/luraess)) 247 | - Erik Schnetter ([@eschnett](https://github.com/eschnett)) 248 | - Michael Schlottke-Lakemper ([@sloede](https://github.com/sloede)) 249 | - Dinindu Senanayake ([@DininduSenanayake](https://github.com/DininduSenanayake)) 250 | - Kjartan Thor Wikfeldt ([@wikfeldt](https://github.com/wikfeldt)) 251 | -------------------------------------------------------------------------------- /examples/cscs/alps/gh200/craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Author: Samuel Omlin, CSCS (omlins) 4 | # 5 | # Description: Definition of site specific variables and call of JUHPC. 
6 | # Site: ALPS:todi, Swiss National Supercomputing Centre (CSCS) 7 | # Base: craype 8 | 9 | 10 | # Load required modules (including correct CPU and GPU target modules) 11 | module load cray 12 | module switch PrgEnv-cray PrgEnv-gnu 13 | module load cudatoolkit craype-accel-nvidia90 14 | module load cray-hdf5-parallel 15 | module list 16 | 17 | 18 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 19 | export JUHPC_CUDA_HOME=$CUDA_HOME 20 | export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION 21 | export JUHPC_MPI_VENDOR="cray" 22 | export JUHPC_MPI_EXEC="srun -C gpu" 23 | export JUHPC_HDF5_HOME=$HDF5_DIR 24 | 25 | 26 | # Call JUHPC 27 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup 28 | JULIAUP_INSTALLDIR="\$SCRATCH/\${HOSTNAME%%-*}/juliaup" 29 | VERSION="v0.4.0" 30 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 31 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR --verbose=1 32 | -------------------------------------------------------------------------------- /examples/cscs/alps/gh200/test_craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Variable set in craype_config 4 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup # HPC setup installation environment variables must be expanded during installation. 5 | 6 | # Load required modules (including correct CPU and GPU target modules) 7 | module load cray 8 | module switch PrgEnv-cray PrgEnv-gnu 9 | module load cudatoolkit craype-accel-nvidia90 10 | module load cray-hdf5-parallel 11 | module list 12 | 13 | # Activate the HPC setup environment variables 14 | . 
$JUHPC_SETUP_INSTALLDIR/activate 15 | 16 | # Call juliaup to install juliaup and latest julia on scratch 17 | juliaup 18 | 19 | # Call juliaup to see its options 20 | juliaup 21 | 22 | # Call julia Pkg 23 | julia -e 'using Pkg; Pkg.status()' 24 | 25 | # Add CUDA.jl 26 | julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()' 27 | 28 | # Add MPI.jl 29 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 30 | 31 | # Add HDF5.jl 32 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 33 | 34 | # Test CUDA-aware MPI 35 | cd ~/cudaaware 36 | MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl 37 | -------------------------------------------------------------------------------- /examples/cscs/alps/mc/craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Author: Samuel Omlin, CSCS (omlins) 4 | # 5 | # Description: Definition of site specific variables and call of JUHPC. 
6 | # Site: ALPS:eiger, Swiss National Supercomputing Centre (CSCS) 7 | # Base: craype 8 | 9 | 10 | # Load required modules (including correct CPU and GPU target modules) 11 | module load cray 12 | module switch PrgEnv-cray PrgEnv-gnu 13 | module load cray-hdf5-parallel 14 | module list 15 | 16 | 17 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 18 | export JUHPC_MPI_VENDOR="cray" 19 | export JUHPC_MPI_EXEC="srun -C mc" 20 | export JUHPC_HDF5_HOME=$HDF5_DIR 21 | 22 | 23 | # Call JUHPC 24 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup 25 | JULIAUP_INSTALLDIR="\$SCRATCH/\${HOSTNAME%%-*}/juliaup" 26 | VERSION="v0.4.0" 27 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 28 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR --verbose=1 29 | -------------------------------------------------------------------------------- /examples/cscs/alps/mc/test_craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Variable set in craype_config 4 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%-*}/juhpc_setup # HPC setup installation environment variables must be expanded during installation. 5 | 6 | # Load required modules (including correct CPU and GPU target modules) 7 | module load cray 8 | module switch PrgEnv-cray PrgEnv-gnu 9 | module load cray-hdf5-parallel 10 | module list 11 | 12 | # Activate the HPC setup environment variables 13 | . 
$JUHPC_SETUP_INSTALLDIR/activate 14 | 15 | # Call juliaup to install juliaup and latest julia on scratch 16 | juliaup 17 | 18 | # Call juliaup to see its options 19 | juliaup 20 | 21 | # Call julia Pkg 22 | julia -e 'using Pkg; Pkg.status()' 23 | 24 | # Add MPI.jl 25 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 26 | 27 | # Add HDF5.jl 28 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 29 | -------------------------------------------------------------------------------- /examples/cscs/daint/gpu/craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Author: Samuel Omlin, CSCS (omlins) 4 | # 5 | # Description: Definition of site specific variables and call of JUHPC. 6 | # Site: Piz Daint:gpu, Swiss National Supercomputing Centre (CSCS) 7 | # Base: craype 8 | 9 | 10 | # Load required modules, including correct CPU and GPU target modules (NOTE: the same modules should be loaded when running julia - JUHPC can be used in a module build recipe...) 11 | module load daint-gpu 12 | module switch PrgEnv-cray PrgEnv-gnu 13 | module load cudatoolkit/11.2.0_3.39-2.1__gf93aa1c craype-accel-nvidia60 # Load latest available cudatoolkit 14 | module load cray-hdf5-parallel 15 | module list 16 | 17 | 18 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 19 | export JUHPC_CUDA_HOME=$CUDA_HOME 20 | export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION 21 | export JUHPC_MPI_HOME=$MPICH_DIR 22 | export JUHPC_MPI_EXEC="srun -C gpu" 23 | export JUHPC_HDF5_HOME=$HDF5_DIR 24 | 25 | 26 | # Create site-specific post-install script (currently MPIPreferences does not provide an option to set required preloads if not automatically detected; JUHPC_MPI_VENDOR fails on Piz Daint...) 
27 | JUHPC_POST_INSTALL_JL=./post_install.jl 28 | echo 'using Preferences 29 | set_preferences!("MPIPreferences", 30 | "preloads" => ["libcuda.so", "libcudart.so"], 31 | "preloads_env_switch" => "MPICH_RDMA_ENABLED_CUDA"; 32 | force=true 33 | )' > $JUHPC_POST_INSTALL_JL 34 | 35 | 36 | # Call JUHPC 37 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-gpu/juhpc_setup 38 | JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%[0-9]*}-gpu/juliaup" 39 | VERSION="v0.4.0" 40 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 41 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR --postinstall=$JUHPC_POST_INSTALL_JL 42 | -------------------------------------------------------------------------------- /examples/cscs/daint/gpu/craype_config_no_cudaaware: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Author: Samuel Omlin, CSCS (omlins) 4 | # 5 | # Description: Definition of site specific variables and call of JUHPC. 6 | # Site: Piz Daint:gpu, Swiss National Supercomputing Centre (CSCS) 7 | # Base: craype (not using CUDA-aware MPI because system CUDA is outdated as daint is to be decommissioned soon) 8 | 9 | 10 | # Load required modules, including correct CPU and GPU target modules (NOTE: the same modules should be loaded when running julia - JUHPC can be used in a module build recipe...) 
11 | module load daint-gpu 12 | module switch PrgEnv-cray PrgEnv-gnu 13 | module load cray-hdf5-parallel 14 | module list 15 | 16 | 17 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 18 | export JUHPC_MPI_HOME=$MPICH_DIR 19 | export JUHPC_MPI_EXEC="srun -C gpu" 20 | export JUHPC_HDF5_HOME=$HDF5_DIR 21 | 22 | # Create site-specific post-install script (currently MPIPreferences does not provide an option to set required preloads if not automatically detected; JUHPC_MPI_VENDOR fails on Piz Daint...) 23 | JUHPC_POST_INSTALL_JL=./post_install.jl 24 | echo 'using Pkg; Pkg.add("CUDA_Runtime_jll") 25 | using Preferences 26 | set_preferences!("CUDA_Runtime_jll", 27 | "version" => "11.8", 28 | "local" => false; 29 | force=true 30 | )' > $JUHPC_POST_INSTALL_JL # (Set to the highest possible CUDA runtime version that can work on daint using artifacts - "local" is set to false...) 31 | 32 | 33 | # Call JUHPC 34 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-gpu-nocudaaware/juhpc_setup 35 | JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%[0-9]*}-gpu-nocudaaware/juliaup" 36 | VERSION="v0.4.0" 37 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 38 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR --postinstall=$JUHPC_POST_INSTALL_JL --verbose=1 39 | -------------------------------------------------------------------------------- /examples/cscs/daint/gpu/test_craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Variable set in craype_config 4 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-gpu/juhpc_setup 5 | 6 | # Load required modules, including correct CPU and GPU target modules (NOTE: the same modules should be loaded when running julia - JUHPC can be used in a module build recipe...) 
7 | module load daint-gpu 8 | module switch PrgEnv-cray PrgEnv-gnu 9 | module load cudatoolkit/11.2.0_3.39-2.1__gf93aa1c craype-accel-nvidia60 # Load latest available cudatoolkit 10 | module load cray-hdf5-parallel 11 | module list 12 | 13 | # Activate the HPC setup environment variables 14 | . $JUHPC_SETUP_INSTALLDIR/activate 15 | 16 | # Call juliaup to install juliaup and latest julia on scratch 17 | juliaup 18 | 19 | # Call juliaup to see its options 20 | juliaup 21 | 22 | # Call julia Pkg 23 | julia -e 'using Pkg; Pkg.status()' 24 | 25 | # Add CUDA.jl 26 | julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()' 27 | 28 | # Add MPI.jl 29 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 30 | 31 | # Add HDF5.jl 32 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 33 | -------------------------------------------------------------------------------- /examples/cscs/daint/gpu/test_craype_config_no_cudaaware: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Variable set in craype_config 4 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-gpu-nocudaaware/juhpc_setup 5 | 6 | # Load required modules, including correct CPU and GPU target modules (NOTE: the same modules should be loaded when running julia - JUHPC can be used in a module build recipe...) 7 | module load daint-gpu 8 | module switch PrgEnv-cray PrgEnv-gnu 9 | module load cray-hdf5-parallel 10 | module list 11 | 12 | # Activate the HPC setup environment variables 13 | . 
$JUHPC_SETUP_INSTALLDIR/activate 14 | 15 | # Call juliaup to install juliaup and latest julia on scratch 16 | juliaup 17 | 18 | # Call juliaup to see its options 19 | juliaup 20 | 21 | # Call julia Pkg 22 | julia -e 'using Pkg; Pkg.status()' 23 | 24 | # Add CUDA.jl 25 | julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()' 26 | 27 | # Add MPI.jl 28 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 29 | 30 | # Add HDF5.jl 31 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 32 | -------------------------------------------------------------------------------- /examples/cscs/daint/mc/craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Author: Samuel Omlin, CSCS (omlins) 4 | # 5 | # Description: Definition of site specific variables and call of JUHPC. 6 | # Site: Piz Daint:mc, Swiss National Supercomputing Centre (CSCS) 7 | # Base: craype 8 | 9 | 10 | # Load required modules, including correct CPU and GPU target modules (NOTE: the same modules should be loaded when running julia - JUHPC can be used in a module build recipe...) 
11 | module load daint-mc 12 | module switch PrgEnv-cray PrgEnv-gnu 13 | module load cray-hdf5-parallel 14 | module list 15 | 16 | 17 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 18 | export JUHPC_MPI_HOME=$MPICH_DIR 19 | export JUHPC_MPI_EXEC="srun -C mc" 20 | export JUHPC_HDF5_HOME=$HDF5_DIR 21 | 22 | 23 | # Call JUHPC 24 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-mc/juhpc_setup 25 | JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%[0-9]*}-mc/juliaup" 26 | VERSION="v0.4.0" 27 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 28 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR 29 | -------------------------------------------------------------------------------- /examples/eurohpc/lumi/mi250x/craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Author: Samuel Omlin, CSCS (omlins) 4 | # 5 | # Description: Definition of site specific variables and call of JUHPC. 
6 | # Site: LUMI, EuroHPC JU 7 | # Base: craype (LUMI-flavored) 8 | 9 | # Load required modules (including correct CPU and GPU target modules) 10 | module load LUMI 11 | module load partition/G # loads CPU and GPU target modules (craype-x86-trento, craype-accel-amd-gfx90a) 12 | module load cpeGNU # LUMI-wrapper for PrgEnv-gnu 13 | module load rocm 14 | module load cray-hdf5-parallel 15 | module list 16 | 17 | # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl and ADIOS2.jl) 18 | export JUHPC_ROCM_HOME=$ROCM_PATH 19 | export JUHPC_MPI_VENDOR="cray" 20 | export JUHPC_MPI_EXEC="srun" 21 | export JUHPC_HDF5_HOME=$HDF5_DIR 22 | 23 | 24 | # Call JUHPC 25 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/${HOSTNAME%%[0-9]*}/juhpc_setup # SCRATCH is assumed to be defined in ~/.bashrc, e.g., SCRATCH=/scratch/project_465000105/$USER 26 | JULIAUP_INSTALLDIR="\$SCRATCH/\${HOSTNAME%%[0-9]*}/juliaup" 27 | VERSION="v0.4.0" 28 | wget https://raw.githubusercontent.com/JuliaParallel/JUHPC/$VERSION/juhpc -O ./juhpc 29 | bash -l ./juhpc $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR --verbose=1 30 | -------------------------------------------------------------------------------- /examples/eurohpc/lumi/mi250x/test_craype_config: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Variable set in craype_config 4 | JUHPC_SETUP_INSTALLDIR=$SCRATCH/uan/juhpc_setup 5 | 6 | # Load required modules (including correct CPU and GPU target modules) 7 | module load LUMI 8 | module load partition/G # loads CPU and GPU target modules (craype-x86-trento, craype-accel-amd-gfx90a) 9 | module load cpeGNU # LUMI-wrapper for PrgEnv-gnu 10 | module load rocm 11 | module load cray-hdf5-parallel 12 | module list 13 | 14 | # Activate the HPC setup environment variables 15 | . 
$JUHPC_SETUP_INSTALLDIR/activate 16 | 17 | # Call juliaup to install juliaup and latest julia on scratch 18 | juliaup 19 | 20 | # Call juliaup to see its options 21 | juliaup 22 | 23 | # Call julia Pkg 24 | julia -e 'using Pkg; Pkg.status()' 25 | 26 | # Add AMDGPU.jl 27 | julia -e 'using Pkg; Pkg.add("AMDGPU"); using AMDGPU; AMDGPU.versioninfo()' 28 | 29 | # Add MPI.jl 30 | julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()' 31 | 32 | # Add HDF5.jl 33 | julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()' 34 | 35 | # Test AMDGPU-aware MPI 36 | julia -e 'using Pkg; Pkg.add(["ImplicitGlobalGrid", "ParallelStencil"]);' 37 | cd ~/rocmaware 38 | MPICH_GPU_SUPPORT_ENABLED=1 IGG_ROCMAWARE_MPI=1 srun --time=00:09:00 -pdev-g -Aproject_465000557 -N2 -n2 --gpus-per-node=8 julia -O3 --check-bounds=no diffusion3D.jl -------------------------------------------------------------------------------- /juhpc: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # Description: 4 | # Create an HPC setup for juliaup, julia and some HPC key packages (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl, ADIOS2.jl, ...), including 5 | # - preferences for HPC key packages that require system libraries; 6 | # - a wrapper for juliaup that installs juliaup (and latest julia) in an appropriate location (e.g., scratch) if it is not already installed; 7 | # - an activation script that sets environment variables for juliaup, julia and HPC key packages; 8 | # - optionally executing a site-specific post-installation julia script, using the project where preferences were set (e.g., to modify preferences or to create an uenv view equivalent to the activation script).
9 | 10 | 11 | # Define log, info, error, cleanup, check functions and julia call for setting preferences 12 | 13 | export JUHPC_SETUP_INSTALLLOG="./hpc_setup_install.log" # must be defined at the very beginning 14 | 15 | export JUHPC="\e[1;34m[\e[0m \e[1;35mJ\e[0m\e[1;32mU\e[0m\e[1;31mH\e[0m\e[1;31mP\e[0m\e[1;31mC\e[0m\e[1;34m:\e[0m" 16 | 17 | export JUHPC_LOGO=$(cat <<'EOF' 18 | \e[1;35m ██╗\e[0m \e[1;32m██╗ ██╗\e[0m \e[1;31m██╗ ██╗██████╗ ██████╗\e[0m 19 | \e[1;35m ██║\e[0m \e[1;32m██║ ██║\e[0m \e[1;31m██║ ██║██╔══██╗██╔════╝\e[0m 20 | \e[1;35m ██║\e[0m \e[1;32m██║ ██║\e[0m \e[1;31m███████║██████╔╝██║ \e[0m 21 | \e[1;35m██╗ ██║\e[0m \e[1;32m██║ ██║\e[0m \e[1;31m██╔══██║██╔═══╝ ██║ \e[0m 22 | \e[1;35m╚█████╔╝\e[0m \e[1;32m╚██████╔╝\e[0m \e[1;31m██║ ██║██║ ╚██████╗\e[0m 23 | \e[1;35m ╚════╝ \e[0m \e[1;32m ╚═════╝ \e[0m \e[1;31m╚═╝ ╚═╝╚═╝ ╚═════╝\e[0m 24 | EOF 25 | ) 26 | 27 | print_logo() { 28 | echo -e "\n\n$JUHPC_LOGO\n\n" >&2 29 | } 30 | 31 | info() { 32 | local message="$1" 33 | local verbose_min="${2:-0}" # Default to 0 if not provided 34 | if [[ "$JUHPC_VERBOSE" -ge "$verbose_min" ]]; then 35 | echo -e "$JUHPC $message" >&2 36 | fi 37 | echo -e "$JUHPC $message" >> "$JUHPC_SETUP_INSTALLLOG" 38 | } 39 | 40 | cleanup() { 41 | info "cleaning up temporary juliaup installation in $TMP_JULIAUP_ROOTDIR." 42 | rm -rf "$TMP_JULIAUP_ROOTDIR" 43 | } 44 | 45 | error() { 46 | local message="$1" 47 | info "\e[1;31mERROR:\e[0m $message" 48 | cleanup 49 | exit 1 50 | } 51 | 52 | check_var() { 53 | for var_name in "$@"; do 54 | if [ -z "${!var_name}" ]; then 55 | error "$var_name is not set or empty." 56 | fi 57 | done 58 | } 59 | 60 | check_dir() { 61 | for dir_name in "$@"; do 62 | if [ -d "$dir_name" ]; then 63 | error "Directory $dir_name already exists. 
To remove it run:\n rm -rf \"$dir_name\"" 64 | fi 65 | done 66 | } 67 | 68 | julia_pref() { 69 | local cmd="$1" 70 | local progress="${2:-}" # The second argument is optional, default to an empty string if not provided 71 | 72 | if [[ "$JUHPC_VERBOSE" -gt 1 ]]; then 73 | julia --project="$JULIA_PREFDIR" -e "$cmd" 2>&1 | tee -a "$JUHPC_SETUP_INSTALLLOG" 74 | else 75 | julia --project="$JULIA_PREFDIR" -e "$cmd" >> "$JUHPC_SETUP_INSTALLLOG" 2>&1 76 | fi 77 | 78 | if [[ -n "$progress" ]]; then 79 | progress_bar "$progress" 80 | fi 81 | } 82 | 83 | percent() { 84 | local p=00$(($1*100000/$2)) 85 | printf -v "$3" %.2f ${p::-3}.${p: -3} 86 | } 87 | 88 | progress_bar() { 89 | local progress=$1 90 | local width=${2:-50} # Default to a width of 50 if not provided 91 | local percent_var 92 | percent "$progress" 100 percent_var # Always assume total is 100 93 | local filled_length=$((width * progress / 100)) 94 | local bar="" 95 | 96 | if [[ "$JUHPC_VERBOSE" -lt 2 ]]; then 97 | for ((i=0; i $JUHPC_SETUP_INSTALLLOG || { error "failed to install Juliaup (and Julia) in $TMP_JULIAUP_ROOTDIR."; } 179 | 180 | if [ ! -f "$TMP_JULIAUP_BINDIR/juliaup" ]; then error "temporary juliaup installation failed."; fi 181 | 182 | info "... done: temporary installation completed." 183 | 184 | 185 | # Create preferences for HPC key packages that require system libraries (MPI.jl, CUDA.jl, AMDGPU.jl, HDF5.jl, ADIOS2.jl, ...) 186 | 187 | info "Creating preferences for HPC key packages (this can take a few minutes, because some packages have to be temporarily installed)..." 188 | 189 | export JULIA_PREFDIR="$JUHPC_SETUP_INSTALLDIR/julia_preferences" 190 | export JULIA_PREF_PROJECT="$JULIA_PREFDIR/Project.toml" 191 | export JULIA_PREFS="$JULIA_PREFDIR/LocalPreferences.toml" 192 | mkdir -p "$JULIA_PREFDIR" || { error "failed to create directory: $JULIA_PREFDIR"; } 193 | 194 | progress_bar 1 # Initialize progress bar. 195 | julia_pref 'using Pkg' 10 # Initialize project. 
196 | 197 | julia_pref 'using Pkg; Pkg.add("Preferences")' 30 # Add the Preferences package in any case, so that post-install scripts that only want to set preferences do not have to install it. 198 | echo "[extras]" >> "$JULIA_PREF_PROJECT" 199 | 200 | if [ -n "${JUHPC_CUDA_HOME}" ]; then # Set preference for using the local CUDA runtime before any installation of CUDA.jl to avoid downloading artifacts 201 | echo 'CUDA_Runtime_jll = "76a88914-d11a-5bdc-97e0-2f5a05c973a2"' >> "$JULIA_PREF_PROJECT" 202 | 203 | julia_pref 'using Preferences; set_preferences!("CUDA_Runtime_jll", "local"=>true)' 35 204 | if [ -n "${JUHPC_CUDA_RUNTIME_VERSION}" ]; then 205 | julia_pref 'using Preferences; set_preferences!("CUDA_Runtime_jll", "version"=>join(split(ENV["JUHPC_CUDA_RUNTIME_VERSION"],".")[1:2],"."))' 40 206 | fi 207 | fi 208 | 209 | if [ -n "${JUHPC_ROCM_HOME}" ]; then # Set preference for using the local ROCm runtime before any installation of AMDGPU.jl to avoid downloading artifacts 210 | echo 'AMDGPU = "21141c5a-9bdb-4563-92ae-f87d6854732e"' >> "$JULIA_PREF_PROJECT" 211 | 212 | julia_pref 'using Preferences; set_preferences!("AMDGPU", "use_artifacts"=>false, "eager_gc"=>false)' 45 213 | fi 214 | 215 | if [ -n "${JUHPC_CUDA_HOME}" ]; then export CUDA_HOME="$JUHPC_CUDA_HOME"; fi 216 | if [ -n "${JUHPC_ROCM_HOME}" ]; then export ROCM_PATH="$JUHPC_ROCM_HOME"; fi 217 | 218 | julia_pref 'using Pkg; Pkg.add([p for (p,l) in [("MPIPreferences", "JUHPC_MPI_VENDOR"), ("MPIPreferences", "JUHPC_MPI_HOME"), ("CUDA", "JUHPC_CUDA_HOME"), ("AMDGPU", "JUHPC_ROCM_HOME"), ("HDF5", "JUHPC_HDF5_HOME")] if haskey(ENV,l) && ENV[l]!=""])' 70 219 | 220 | if [ -n "${JUHPC_CUDA_HOME}" ]; then # Set preference for using the local CUDA runtime in a more stable way (in case the previous setting is no longer valid) 221 | julia_pref 'using CUDA; CUDA.set_runtime_version!((VersionNumber(join(split(ENV[key],".")[1:2],".")) for key in ["JUHPC_CUDA_RUNTIME_VERSION"] if haskey(ENV,key) &&
ENV[key]!="")...; local_toolkit=true)' 80 222 | fi 223 | 224 | if [ -n "${JUHPC_ROCM_HOME}" ]; then # Set preference for using the local ROCm runtime in a more stable way (in case the previous setting is no longer valid) 225 | julia_pref 'using AMDGPU; AMDGPU.ROCmDiscovery.use_artifacts!(false)' 85 226 | fi 227 | 228 | if [ -n "${JUHPC_MPI_VENDOR}" ]; then 229 | check_var "JUHPC_MPI_EXEC" 230 | julia_pref 'using MPIPreferences; MPIPreferences.use_system_binary(mpiexec=split(ENV["JUHPC_MPI_EXEC"]), vendor=ENV["JUHPC_MPI_VENDOR"])' 90 231 | elif [ -n "${JUHPC_MPI_HOME}" ]; then 232 | check_var "JUHPC_MPI_EXEC" 233 | julia_pref 'using MPIPreferences; MPIPreferences.use_system_binary(mpiexec=split(ENV["JUHPC_MPI_EXEC"]), extra_paths=["$(ENV["JUHPC_MPI_HOME"])/lib"])' 90 234 | fi 235 | 236 | if [ -n "${JUHPC_HDF5_HOME}" ]; then 237 | julia_pref 'using HDF5; HDF5.API.set_libraries!("$(ENV["JUHPC_HDF5_HOME"])/lib/libhdf5.so", "$(ENV["JUHPC_HDF5_HOME"])/lib/libhdf5_hl.so")' 238 | fi 239 | 240 | if [ ! -s "$JULIA_PREFS" ]; then error "preferences file is missing or empty."; fi 241 | 242 | progress_bar 100 # Finalize progress bar. 243 | 244 | info "Preferences:\n$(cat "$JULIA_PREFS")" 1 245 | 246 | info "... done: preferences created." 247 | 248 | 249 | # Create a wrapper for juliaup that installs juliaup (and latest julia) on scratch if it is not already installed 251 | 251 | info "Creating wrapper for juliaup..."
252 | 253 | export JULIAUP_WRAPPER_BINDIR="$JUHPC_SETUP_INSTALLDIR/juliaup_wrapper" 254 | export JULIAUP_WRAPPER="$JULIAUP_WRAPPER_BINDIR/juliaup" 255 | mkdir -p "$JULIAUP_WRAPPER_BINDIR" || { error "failed to create directory: $JULIAUP_WRAPPER_BINDIR"; } 256 | 257 | julia -e ' 258 | println("""#!/bin/bash 259 | 260 | print_logo() { 261 | echo -e "\n\n$(ENV["JUHPC_LOGO"])\n\n" >&2 262 | } 263 | 264 | info() { 265 | local message="\$1" 266 | echo -e "$(ENV["JUHPC"]) \$message" >&2 267 | } 268 | 269 | error() { 270 | local message="\$1" 271 | info "\e[1;31mERROR:\e[0m \$message" 272 | exit 1 273 | } 274 | 275 | JULIAUP_EXE="$(ENV["JULIAUP_BINDIR"])/juliaup" 276 | 277 | if [ ! -f "\$JULIAUP_EXE" ]; then 278 | print_logo 279 | info "This is the first call to juliaup: installing juliaup and the latest julia..." 280 | sleep 2 281 | PATH_OLD="\$PATH" 282 | export PATH=\$(echo \$PATH | perl -pe "s|[^:]*juliaup(?:_wrapper)?[^:]*:?||g") # Remove all juliaup paths from PATH 283 | curl -fsSL https://install.julialang.org | sh -s -- --add-to-path=no --yes --path="$(ENV["JULIAUP_INSTALLDIR"])" --background-selfupdate 0 --startup-selfupdate 0 || { echo "Failed to install Juliaup (and Julia)." >&2; exit 1; } 284 | export PATH="\$PATH_OLD" 285 | if [ ! -f "\$JULIAUP_EXE" ]; then error "juliaup installation failed."; fi 286 | info "... done: installation completed (location: $(ENV["JULIAUP_INSTALLDIR"])). You can now use juliaup and julia normally." 287 | else 288 | exec "\$JULIAUP_EXE" "\$@" 289 | fi 290 | """) 291 | ' > "$JULIAUP_WRAPPER" 292 | 293 | if [ ! -s "$JULIAUP_WRAPPER" ]; then error "Juliaup wrapper is missing or empty."; fi 294 | chmod +x "$JULIAUP_WRAPPER" 295 | 296 | info "... done: wrapper created." 297 | 298 | 299 | # Create an opt-in wrapper for julia that calls the juliaup wrapper to install juliaup and latest julia on scratch if it is not already installed 300 | 301 | info "Creating opt-in wrapper for julia..." 
302 | 303 | export JULIA_WRAPPER_BINDIR="$JUHPC_SETUP_INSTALLDIR/julia_wrapper" 304 | export JULIA_WRAPPER="$JULIA_WRAPPER_BINDIR/julia" 305 | mkdir -p "$JULIA_WRAPPER_BINDIR" || { error "failed to create directory: $JULIA_WRAPPER_BINDIR"; } 306 | 307 | julia -e ' 308 | println("""#!/bin/bash 309 | 310 | info() { 311 | local message="\$1" 312 | echo -e "$(ENV["JUHPC"]) \$message" >&2 313 | } 314 | 315 | error() { 316 | local message="\$1" 317 | info "\e[1;31mERROR:\e[0m \$message" 318 | exit 1 319 | } 320 | 321 | JULIA_EXE="$(ENV["JULIAUP_BINDIR"])/julia" 322 | 323 | if [ ! -f "\$JULIA_EXE" ]; then 324 | info "Julia is not yet installed, calling juliaup to install it..." 325 | "$(ENV["JULIAUP_WRAPPER"])" || error "juliaup failed to install julia." 326 | fi 327 | 328 | exec "\$JULIA_EXE" "\$@" 329 | """) 330 | ' > "$JULIA_WRAPPER" 331 | 332 | if [ ! -s "$JULIA_WRAPPER" ]; then error "julia wrapper is missing or empty."; fi 333 | chmod +x "$JULIA_WRAPPER" 334 | 335 | info "... done: wrapper created." 336 | 337 | 338 | # Create an opt-in IJulia installer that also installs the basic Julia kernel; in addition, it can install further kernels depending on the JUHPC environment variables. 339 | 340 | info "Creating an opt-in IJulia installer..." 341 | 342 | export IJULIA_INSTALLER_BINDIR="$JUHPC_SETUP_INSTALLDIR/ijulia_installer" 343 | export IJULIA_INSTALLER="$IJULIA_INSTALLER_BINDIR/install_ijulia" 344 | mkdir -p "$IJULIA_INSTALLER_BINDIR" || { error "failed to create directory: $IJULIA_INSTALLER_BINDIR"; } 345 | 346 | 347 | julia -e ' 348 | println("""#!/bin/bash 349 | 350 | info() { 351 | local message="\$1" 352 | echo -e "$(ENV["JUHPC"]) \$message" >&2 353 | } 354 | 355 | info "Installing IJulia and the basic Julia kernel..." 356 | "$(ENV["JULIA_WRAPPER"])" -e '\'' 357 | using Pkg 358 | Pkg.add("IJulia") 359 | using IJulia 360 | installkernel("Julia") 361 | '\'' 362 | info "... done: installation completed."
363 | """) 364 | ' > "$IJULIA_INSTALLER" 365 | 366 | if [ -n "${JUHPC_CUDA_HOME}" ]; then 367 | julia -e 'println(""" 368 | info "Installing a kernel for Julia under Nsight Systems..." 369 | "$(ENV["JULIA_WRAPPER"])" -e '\'' 370 | using IJulia 371 | installkernel("Julia under Nsight Systems"; julia = \`$(ENV["JUHPC_CUDA_HOME"])/bin/nsys launch --trace=cuda julia -i --color=yes --project=@.\`) 372 | '\'' 373 | info "... done: installation completed." 374 | """) 375 | ' >> "$IJULIA_INSTALLER" 376 | fi 377 | 378 | if [ ! -s "$IJULIA_INSTALLER" ]; then error "IJulia installer is missing or empty."; fi 379 | chmod +x "$IJULIA_INSTALLER" 380 | 381 | info "... done: installer created." 382 | 383 | 384 | # Create an activation script that sets environment variables for juliaup, julia and HPC key packages 385 | 386 | info "Creating activate script..." 387 | 388 | export JULIAUP_DEPOT="$JULIAUP_INSTALLDIR/depot" 389 | export JULIA_DEPOT="$JULIAUP_INSTALLDIR/depot" 390 | export ACTIVATE_SCRIPT="$JUHPC_SETUP_INSTALLDIR/activate" 391 | 392 | julia -e 'println(""" 393 | info() { 394 | local message="\$1" 395 | echo -e "$(ENV["JUHPC"]) \$message" >&2 396 | } 397 | info "Activating HPC setup for juliaup, julia and HPC key packages requiring system libraries..." 398 | export PATH="$(ENV["JULIAUP_WRAPPER_BINDIR"]):$(ENV["JULIAUP_BINDIR"]):\$PATH" # The wrapper must be before the juliaup bindir 399 | info "PATH: \$PATH" 400 | export JULIAUP_DEPOT_PATH="$(ENV["JULIAUP_DEPOT"])" 401 | info "JULIAUP_DEPOT_PATH: \$JULIAUP_DEPOT_PATH" 402 | export JULIA_DEPOT_PATH="$(ENV["JULIA_DEPOT"])" 403 | info "JULIA_DEPOT_PATH: \$JULIA_DEPOT_PATH" 404 | export JULIA_LOAD_PATH=":$(ENV["JULIA_PREFDIR"])" 405 | info "JULIA_LOAD_PATH: \$JULIA_LOAD_PATH" 406 | $(haskey(ENV,"JUHPC_CUDA_HOME") && ENV["JUHPC_CUDA_HOME"] != "" ? 
""" 407 | export CUDA_HOME="$(ENV["JUHPC_CUDA_HOME"])" 408 | info "CUDA_HOME: \$CUDA_HOME" 409 | export JULIA_CUDA_MEMORY_POOL=none 410 | info "JULIA_CUDA_MEMORY_POOL: \$JULIA_CUDA_MEMORY_POOL" 411 | """ : "") 412 | $(haskey(ENV,"JUHPC_ROCM_HOME") && ENV["JUHPC_ROCM_HOME"] != "" ? """ 413 | export ROCM_PATH="$(ENV["JUHPC_ROCM_HOME"])" 414 | info "ROCM_PATH: \$ROCM_PATH" 415 | """ : "") 416 | $(haskey(ENV,"JUHPC_ADIOS2_HOME") && ENV["JUHPC_ADIOS2_HOME"] != "" ? """ 417 | export JULIA_ADIOS2_PATH="$(ENV["JUHPC_ADIOS2_HOME"])" 418 | info "JULIA_ADIOS2_PATH: \$JULIA_ADIOS2_PATH" 419 | """ : "") 420 | """)' > "$ACTIVATE_SCRIPT" 421 | 422 | if [ ! -s "$ACTIVATE_SCRIPT" ]; then error "Activate script is missing or empty."; fi 423 | 424 | info "Environment variables in activate script:\n$(grep '^export ' "$ACTIVATE_SCRIPT")" 1 425 | 426 | info "... done: activate script created." 427 | 428 | 429 | # Optionally execute a site-specific post installation julia script (if passed a third argument) 430 | 431 | if [ -n "${JUHPC_POST_INSTALL_JL}" ]; then 432 | info "Executing site-specific post-installation julia script (using the project where preferences were set)..." 433 | julia --project="$JULIA_PREFDIR" "$JUHPC_POST_INSTALL_JL" 2>&1 | tee -a "$JUHPC_SETUP_INSTALLLOG" 434 | info "... done: post-installation script completed." 435 | fi 436 | 437 | 438 | # Remove temporary juliaup installation and Manifest.toml in the preferences directory 439 | 440 | cleanup 441 | mv "$JULIA_PREFDIR/Manifest.toml" "$JULIA_PREFDIR/Manifest.toml.bak" || { error "failed to move Manifest.toml in $JULIA_PREFDIR"; } 442 | mv "$JUHPC_SETUP_INSTALLLOG" "$JUHPC_SETUP_INSTALLDIR/" || { error "failed to move $JUHPC_SETUP_INSTALLLOG to $JUHPC_SETUP_INSTALLDIR"; } 443 | 444 | info "... The installation of the HPC setup for juliaup, julia and HPC key packages is complete.\n\n" --------------------------------------------------------------------------------