How to use the IPSL models and tools on the obelix/LSCE cluster


The LSCE computing environment is detailed here: https://w3.lsce.ipsl.fr/informatique/util/index.php. This web page is only accessible from the LSCE network. Direct access to the cluster is only possible from LSCE or from the TGCC machines. The cluster can be accessed via the ssh and xdmcp protocols.
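For example, from a machine inside the LSCE network (a minimal sketch: yourlogin stands for your LSCE login, and the front-end name obelix is an assumption, use the host name given in the LSCE documentation):

ssh -X yourlogin@obelix    # -X enables X11 forwarding for graphical sessions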

1. Modipsl and compiling

Install and compile from obelix. By default, compilation is done in MPI parallel mode. The compiler is ifort (the Intel Fortran compiler). The lxiv8 target in modipsl/util/AA_make.gdef is used on obelix. Currently the components ORCHIDEE, LMDZ, IOIPSL and XIOS can be installed on obelix using the default compilation options in modipsl.
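A typical installation sequence looks as follows (a minimal sketch: the repository URL, the configuration name ORCHIDEE_trunk and the config/ORCHIDEE_OL directory are assumptions, check the documentation of the configuration you want to install; recent versions may drive compilation with a compile script instead of gmake):

svn co http://forge.ipsl.jussieu.fr/igcmg/svn/modipsl/trunk modipsl
cd modipsl/util
./model -h                  # list the available configurations
./model ORCHIDEE_trunk      # download a configuration (hypothetical name)
cd ../config/ORCHIDEE_OL    # directory depends on the configuration
gmake                       # compile; the lxiv8 target is used on obelix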

2. libIGCM and environment

libIGCM can be used on obelix. The computing, rebuild and time-series steps are performed; the pack, atlas and monitoring steps are not done.

The default shell at LSCE is tcsh, whose syntax differs from the ksh syntax used by libIGCM. No specific configuration of your account is needed to compile and run using libIGCM, but you can use the environment proposed on the shared account by copying the file /home/users/igcmg/.bashrc into your $HOME.
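For example (back up your own .bashrc first if it exists, then open a new bash session so the environment is taken into account):

cp /home/users/igcmg/.bashrc $HOME/.bashrc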

3. Disk space and archive directory

The home directory of each login in /home/user/ has very little space. You need write access to another disk, depending on the project you work on. All logins can write to /home/scratch01/login. Note that files older than 30 days can be deleted without warning on this disk.
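For example, to keep an eye on your scratch usage and on files approaching the 30-day limit (a minimal sketch; $USER expands to your login):

du -sh /home/scratch01/$USER                      # total size of your scratch directory
find /home/scratch01/$USER -type f -mtime +25     # files not modified for more than 25 days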

By default the archive directory on obelix is set to /home/scratch01/yourlogin, and the simulation output is stored there. To store your simulation on a permanent disk, set ARCHIVE=/home/diskXXX/yourlogin in the [UserChoices] section of config.card.

Example: config.card

[UserChoices]
JobName=test01
TagName=OL2
SpaceName=PROD
ExperimentName=clim
ARCHIVE=/home/orchidee02/login
...

Change login to your personal login. With the above config.card, the simulation output will be stored in /home/orchidee02/login/IGCM_OUT/OL2/PROD/clim/test01.

4. Example of parallel MPI job

Here is an example of a simple job running the orchidee_ol executable. All input files and the executable must be present in the working directory before the job is submitted.

######################
## OBELIX      LSCE ##
######################
#PBS -N MyTest
#PBS -m a
#PBS -j oe
#PBS -q medium
#PBS -o Script_Output
#PBS -S /bin/ksh
#PBS -l nodes=1:ppn=8


## First source the following to get access to the module command under ksh
source /usr/share/Modules/init/ksh

## Source the same modules as used during compilation. See the examples below depending on the configuration used.
# For ORCHIDEE_3 and more recent offline versions or coupled v6.2 and more recent versions:
#source ..../config/ORCHIDEE_OL/ARCH/arch-ifort_LSCE.env
#source ..../config/LMDZOR_v6/ARCH/arch-ifort_LSCE.env
# For ORCHIDEE_2_0, 2_1, 2_2 and coupled models v6.1.x:
#source /home/orchideeshare/igcmg/MachineEnvironment/obelix/env_atlas_obelix

## Go to current folder and execute
cd $PBS_O_WORKDIR
time mpirun orchidee_ol

Submit the job with the command qsub; you can follow your simulation with the command qstat -u login.
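For example (Job_MyTest is a hypothetical file name for the script above; replace login with your own login):

qsub Job_MyTest     # submit the job; MyTest is the name set by #PBS -N
qstat -u login      # list your queued and running jobs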

5. Specific installation of LMDZ