Compiling and linking

 


Compiling and linking code which uses libraries provided on JASMIN


Introduction  

The Jaspy environment on JASMIN contains the GNU compilers and MPICH, plus a wide range of libraries from conda-forge that will interoperate with these.

Separately, we also provide the Intel OneAPI compilers and Intel MPI, and a much more limited set of libraries built with these (currently just the netCDF libraries and their dependencies).

In either case, to compile and link software that utilises these libraries, you need to:

  • ensure that you are using the correct compiler
  • point to the location of the libraries

This page provides details on how to do this, first for Jaspy/GNU and then for Intel OneAPI.

Using Jaspy and the GNU compilers  

Loading the compilers  

  • To ensure that you are using the correct compilers, simply use the command

    module load jaspy

    to load the Jaspy module or

    module load jaspy/3.11/v20240508

    to load a specific Jaspy version.

    This will put the directory containing the GNU compilers (gcc, g++, gfortran) into your PATH. That directory also contains the corresponding MPI compiler wrappers (mpicc, mpicxx, mpif77, mpif90), which you can use in place of gcc etc. when compiling parallel code.

  • Loading the module also sets the CONDA_PREFIX environment variable to point to the root of the relevant Jaspy installation, for example /apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815.

    (The directory which gets added to PATH as described above is equivalent to $CONDA_PREFIX/bin.)

  • You can use the which command to verify that you are using the correct compiler versions, for example, type:

    which gfortran

    This should report a path that is under the directory mentioned above.

  • If instead you see a compiler path under /usr/bin, then this is a different compiler version (provided by the operating system); it is neither compatible with the libraries in Jaspy nor supported by the JASMIN helpdesk for running user code. In that case, check your $PATH.
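
As a quick sanity check (a minimal sketch; the exact path reported will depend on which Jaspy version you load):

module load jaspy
# both the compilers and the MPI wrappers should resolve to $CONDA_PREFIX/bin
which gfortran   # expect a path under $CONDA_PREFIX/bin, not /usr/bin
which mpicc
echo $CONDA_PREFIX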

Pointing to the libraries  

  • To use the libraries under the Jaspy environment, you need to add the following additional flags to the compiler command line:

    • At the compilation stage, you need to point to the include/ subdirectory containing header files. If making use of the CONDA_PREFIX environment variable, this would mean using this compiler option:
    -I$CONDA_PREFIX/include
    • At the linking stage, you need to point to the lib/ subdirectory containing the libraries themselves. This requires the following linker option:
    -L$CONDA_PREFIX/lib

    This is in addition to the relevant -l options for the specific libraries that you want to link to (for example -lfftw3 to use libfftw3.so). A worked example is given after this list.

    • If you are compiling and linking in a single step, put both of the above options on the same command line.

    • If you are running the compiler indirectly via an installer for another package, rather than running the compiler commands directly yourself, note that these options are commonly specified via environment variables, which you would set before invoking the installer. For example:

    export CFLAGS="-I$CONDA_PREFIX/include"
    export LDFLAGS="-L$CONDA_PREFIX/lib"

    Note that the variable names might differ slightly between packages (for example COPTS instead of CFLAGS, or perhaps FFLAGS, F77FLAGS, or F90FLAGS for Fortran code), so check the instructions for the package that you are trying to build.
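
Putting the above together, here is a minimal sketch of compiling and linking against a Jaspy-provided library, using the FFTW example mentioned above (myprog.c is a hypothetical source file that includes fftw3.h):

module load jaspy
# compilation stage: point to the Jaspy header files
gcc -c myprog.c -I$CONDA_PREFIX/include
# linking stage: point to the Jaspy libraries and name the specific library
gcc -o myprog myprog.o -L$CONDA_PREFIX/lib -lfftw3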

Using *-config scripts  

As an alternative to pointing explicitly to the relevant include and lib directories, some of the software packages provide *-config executables which report the relevant flags to be used during compilation. This includes, for example, the netCDF C library and its Fortran / C++ wrappers, which provide nc-config, nf-config, and ncxx4-config respectively. See ls $CONDA_PREFIX/bin/*-config for a list of other similar executables. Provided that the directory containing these is in your PATH (as will be the case after loading the Jaspy module), the output of these commands can be captured by a script and used to provide the relevant compiler and linker options, without you needing to specify any paths explicitly. For example, a program myprog.c that uses the netCDF C library could be compiled and linked using:

# first set some variables using the nc-config script
cc=$(nc-config --cc)
cflags=$(nc-config --cflags)
libs=$(nc-config --libs)
# now use these variables to construct the relevant commands
$cc -c $cflags myprog.c
$cc $libs myprog.o -o myprog

If you are building a third-party package that depends on netCDF and which utilises the nc-config script in this way, then after you have loaded the Jaspy module, you should not need to do anything else in order to tell it where the library is located.

(The nc-config program can also provide other information about the build of the library that you are using; type nc-config --help for more info.)
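
For instance, building a hypothetical third-party package whose configure script calls nc-config might look like the following sketch (the package directory and --prefix are illustrative):

module load jaspy
# configure locates nc-config via $PATH, so no explicit paths are needed
./configure --prefix=$HOME/apps/mypackage
make
make install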

Using the Intel OneAPI compilers  

Components of the Intel OneAPI suite are provided for use with the Rocky 9 sci servers and Rocky 9 LOTUS nodes.

It is advisable to unload the Jaspy module when using the Intel compilers, to avoid any compatibility issues.

Loading the compilers  

If Jaspy is already loaded, start by typing

module unload jaspy

then:

module load oneapi/compilers

This will enable the following commands:

  • the Fortran compiler ifx
  • the C compiler icx
  • the C++ compiler icpx

(Typing ifort will give a deprecation warning, so use ifx instead.)

In addition to these, the OneAPI suite also includes an MPI implementation. You can load this by typing:

module load oneapi/mpi

This provides (amongst other things):

  • compiler wrappers mpif77, mpif90, mpicc, mpicxx
  • the run-time wrapper mpirun (which can also be invoked as mpiexec)
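
For example, a parallel program could be compiled and launched as follows (a minimal sketch; hello_mpi.c is a hypothetical MPI source file, and on LOTUS you would normally launch via the Slurm scheduler rather than interactively):

module load oneapi/compilers
module load oneapi/mpi
# compile and link using the MPI compiler wrapper
mpicc -o hello_mpi hello_mpi.c
# launch 4 MPI processes
mpirun -np 4 ./hello_mpi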

NetCDF libraries for use with Intel OneAPI  

CEDA has provided an installation of the netCDF C library that uses the OneAPI compilers, together with its dependencies (HDF5, pnetcdf). Fortran and C++ language wrappers are also provided.

This installation is built with support for parallel access, although the user code needs to request parallel mode when opening the file in order to utilise this. Parallel netCDF makes use of the Intel MPI library.

To use these netCDF libraries, type one (or more) of these module commands:

  • For the C library (this also includes the associated command-line utilities, ncdump etc.):
module load netcdf/intel2024.2.0/4.9.2
  • For the C library and C++ wrappers:
module load netcdf/intel2024.2.0/c++/4.3.1
  • For the C library and Fortran wrappers:
module load netcdf/intel2024.2.0/fortran/4.6.1

Loading these netCDF modules will also load the relevant compiler and MPI modules, so you do not need to load those explicitly.
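
For example (a minimal check; the exact module names listed will depend on the installed versions):

module load netcdf/intel2024.2.0/fortran/4.6.1
# the compiler and MPI modules should now appear alongside the netCDF module
module list
which ifx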

As in the case of GNU compilers described above, you will have two approaches available for compiling netCDF code: either to point to the libraries explicitly, or to use the *-config scripts. These are described in more detail below.

Pointing to the libraries  

Once you have loaded these modules, the environment variable NETCDF_ROOT is set for you, and as appropriate, NETCDFF_ROOT (for Fortran) and/or NETCDF_CXX4_ROOT (for C++). These variables are not generally used by the software, but may be useful to you when writing your own scripts in order to refer to the location of the libraries. They can be used analogously to how CONDA_PREFIX is used in the Jaspy/GNU examples above, as follows (assuming again that your program is called myprog):

  • C:
## compile:
icx -c myprog.c -I$NETCDF_ROOT/include

## link:
icx -o myprog myprog.o -L$NETCDF_ROOT/lib -lnetcdf
  • Fortran:
## compile:
ifx -c myprog.f -I$NETCDFF_ROOT/include  ## (also works with F90)

## link:
ifx -o myprog myprog.o -L$NETCDFF_ROOT/lib -lnetcdff -L$NETCDF_ROOT/lib -lnetcdf
  • C++:
## compile:
icpx -c myprog.cpp -I$NETCDF_ROOT/include -I$NETCDF_CXX4_ROOT/include

## link:
icpx -o myprog myprog.o -L$NETCDF_CXX4_ROOT/lib -lnetcdf_c++4 -L$NETCDF_ROOT/lib -lnetcdf
  • Parallel example (Fortran):
## compile and link:
mpif90 -o myprog myprog.F90 -I $NETCDFF_ROOT/include -L $NETCDFF_ROOT/lib -lnetcdff

## run:
mpirun -np 4 ./myprog

A runnable example script that demonstrates these with some test programs from Unidata can be found at:

   /apps/jasmin/supported/libs/netcdf/intel2024.2.0/intel_netcdf_examples.sh

If running the parallel test, ensure that you are using a filesystem that supports parallel writes.

Using the *-config scripts  

Just as described above when using Jaspy, you will have the directories containing executables nc-config, nf-config, and ncxx4-config in your $PATH, provided that you have loaded the relevant modules. (In fact, these will be in $NETCDF_ROOT/bin, $NETCDFF_ROOT/bin, $NETCDF_CXX4_ROOT/bin, but you shouldn’t need to refer to these paths explicitly.)

Follow the same approach as described above, capturing the output of these commands when run with the relevant command-line options (see the nc-config example earlier on this page). The build script for your application should then look exactly the same as if you were using Jaspy, apart from loading a different module at the start, even though the actual compiler options will be different.
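
For example, the nc-config recipe shown earlier carries over directly (a sketch, assuming the same hypothetical myprog.c as before):

module load netcdf/intel2024.2.0/4.9.2
# capture the compiler and flags reported by the nc-config script
cc=$(nc-config --cc)
cflags=$(nc-config --cflags)
libs=$(nc-config --libs)
# compile and link as in the Jaspy example
$cc -c $cflags myprog.c
$cc $libs myprog.o -o myprog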
