

How to use the IPSL models and tools at LSCE


The LSCE computing environment is detailed here: https://w3.lsce.ipsl.fr/informatique/util/index.php. You can only access this webpage via the LSCE network.

  • The interactive cluster

The network includes a cluster for interactive use. This cluster appears as a single machine called asterix.lscelb.extra.cea.fr, which can be shortened to asterix.lscelb.

Direct access to the cluster is only possible from the LSCE network or from the CCRT machines. The cluster can be accessed via the ssh and xdmcp protocols.
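
For example, a connection from a machine inside the LSCE network could look like the line below (mylogin is a placeholder for your own login):

ssh -X mylogin@asterix.lscelb.extra.cea.fr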

  • The computing cluster

The LSCE has a small computing cluster. Its users' manual is available here: https://w3.lsce.ipsl.fr/informatique/util/calcul/batch.php. This cluster appears as a single machine called obelix.lscelb.extra.cea.fr, which can be shortened to obelix.lscelb.

1. Modipsl and compiling

By default, compilation is done in MPI parallel mode. The compiler is ifort (the Intel compiler). The lxiv8 target in modipsl/util/AA_make.gdef is used on obelix. Currently only ORCHIDEE, LMDZ, IOIPSL and XIOS are installed on obelix.
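
As a rough sketch of the usual modipsl workflow on obelix, an extraction and compilation could look like the commands below. The repository URL, the configuration name ORCHIDEE_trunk and the config directory are given for illustration only; check the IGCMG documentation and modipsl/util/mod.def for the exact names.

svn co http://forge.ipsl.jussieu.fr/igcmg/svn/modipsl/trunk modipsl
cd modipsl/util
./model ORCHIDEE_trunk       # extract the chosen configuration (illustrative name)
./ins_make -t lxiv8          # install the makefiles for the lxiv8 target used on obelix
cd ../config/ORCHIDEE_OL     # directory name depends on the configuration
gmake                        # compile the model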

2. libIGCM and environment

libIGCM can be used on the LSCE computing cluster.

The default shell at LSCE is tcsh, whose syntax differs from the ksh syntax used by libIGCM. To set up your environment so that libIGCM runs correctly in ksh, the easiest way is to copy the file /home/users/igcmg/.bashrc into your $HOME.
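
A minimal sketch of this setup, assuming you have read access to the igcmg home directory:

cp /home/users/igcmg/.bashrc $HOME/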

3. Example of parallel MPI job

Here is an example of a simple job running the orchidee_ol executable. All input files and the executable must be present in the run directory before the job is submitted.

######################
## OBELIX      LSCE ##
######################
# Job name, send mail only on abort, merge stdout and stderr
#PBS -N MyTest
#PBS -m a
#PBS -j oe
# Queue and name of the job output file
#PBS -q medium
#PBS -o Script_Output_SECHSTOM.000001
# Run the job with ksh and request 4 MPI tasks on one node
#PBS -S /bin/ksh
#PBS -v BATCH_NUM_PROC_TOT=4
#PBS -l nodes=1:ppn=4

# Go to the directory the job was submitted from and launch the executable
cd $PBS_O_WORKDIR
mpirun -np ${BATCH_NUM_PROC_TOT} orchidee_ol

To submit the job, use the command qsub. You can then follow your simulation with the command qstat -u login.
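
For example, assuming the job script above was saved in a file named Job_MyTest (the file name and the login are placeholders):

qsub Job_MyTest        # submit the job to the batch system
qstat -u mylogin       # follow your running and queued jobs
qdel <job_id>          # cancel a job if needed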