TP_NEMO

Make sure the proper libraries are in the proper places

File tree

Note that in the provided environment, XIOS and the software under INSTALL should already be compiled, and the arch file should already have been customized to match the following tree.

HOME
  |-- XIOS
  |-- INSTALL
  |-- NEMO
        |-- arch
             |-- arch-MY_ENV.fcm
             |-- arch-linux_gfortran.fcm
             |-- ...
        |-- src
        |-- tests
             |-- CANAL (compiled)
                   |-- MY_SRC
                        |-- usrdef_sbc.F90
                        |-- ...
                   |-- EXPREF
                   |-- EXP00
                        |-- nemo (link)
                        |-- namelist_cfg
                        |-- ...
                   |-- WORK
                   |-- BLD
             |-- ...
        |-- cfgs
             |-- ORCA2_ICE_PISCES (never compiled)
                   |-- EXPREF
                        |-- namelist_cfg
                        |-- ...
             |-- ...
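
A quick sanity check of this layout (a minimal sketch; it only verifies that the directories and arch files shown above exist):

# check that XIOS, INSTALL and NEMO sit next to each other in HOME
ls -d $HOME/XIOS $HOME/INSTALL $HOME/NEMO
# check that the arch files are in place
ls $HOME/NEMO/arch/arch-MY_ENV.fcm $HOME/NEMO/arch/arch-linux_gfortran.fcm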

Get NEMO code for users

# go to your home directory
cd $HOME
# get the 4.0.5 release (read-only checkout, no login required)
svn co http://forge.ipsl.jussieu.fr/ipsl/forge/projets/nemo/svn/NEMO/release/r4.0/r4.0.5 NEMO
# let's go!
cd NEMO

Get NEMO code for developers

# go to your home directory
cd $HOME
# my forge login
me=techene
# get the latest trunk revision
svn co svn+ssh://${me}@forge.ipsl.jussieu.fr/ipsl/forge/projets/nemo/svn/NEMO/trunk NEMO
# let's go!
cd NEMO
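
To check that the checkout went well (a quick sanity check; svn info prints the repository URL and the revision that was fetched):

# from HOME/NEMO: print the repository URL and checked-out revision
svn info .
# the sources should now be there
ls src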

Compile NEMO

generic example : ./makenemo -n MY_CONFIG -j 6 -m MY_ENV

  • Replace -n MY_CONFIG with -a followed by a configuration name found in NEMO/tests, or with -r followed by a name found in NEMO/cfgs.
  • Replace MY_ENV with linux_gfortran, or with whatever name you chose when you created your arch file.
# check where you are: you want to go from HOME to HOME/NEMO
pwd
# Let's go !
cd NEMO
# generic example : ./makenemo -n MY_CONFIG -j 6 -m MY_ENV 
./makenemo -h
# compile ORCA2 configuration in the pre-set environment
./makenemo -r ORCA2_ICE_PISCES -j 6 -m linux_gfortran
# compile CANAL configuration in the pre-set environment
./makenemo -a CANAL -j 6 -m linux_gfortran

The resulting executable is created in a directory associated with the MY_CONFIG configuration. In the generic example it is ./*/MY_CONFIG/EXP00/, where * stands for either tests or cfgs depending on where your configuration was originally located.
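
For the two configurations compiled above, this amounts to the following quick check (a minimal sketch, assuming both compilations succeeded; nemo is a link, as shown in the file tree above):

# from HOME/NEMO: check that both executables exist
ls -l ./cfgs/ORCA2_ICE_PISCES/EXP00/nemo
ls -l ./tests/CANAL/EXP00/nemo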

Run NEMO's ORCA2 configuration

# check where you are: you want to go from HOME/NEMO to HOME/NEMO/cfgs/ORCA2_ICE_PISCES/EXP00
pwd
# Let's go !
cd ./cfgs/ORCA2_ICE_PISCES/EXP00
# Check executable presence
ls -lrt
# Run NEMO on 4 MPI processes (default parameters)
mpirun -np 4 ./nemo
# Run NEMO with customized parameters
vi namelist_cfg
mpirun -np 4 ./nemo
# Check NEMO runtime (here on 8 MPI processes)
time mpirun -np 8 ./nemo

What happens? namelist_cfg is the parameter file: this is where you change parameter values and input files in order to customize your run.

  • model time step : rn_rdt is expressed in seconds

The default value for ORCA2 is rn_rdt = 5760 s. How many time steps does the model take in one day? One day is 86400 seconds, so 86400 / rn_rdt gives the number of time steps per day; for ORCA2 this is 86400 / 5760 = 15.
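
The same arithmetic can be done directly in the shell (a minimal sketch; the sed pattern assumes the namelist contains a line of the form "rn_rdt = 5760." as in the default ORCA2 namelist_cfg):

# extract the time step (in seconds) from the namelist
rdt=$(sed -n 's/^ *rn_rdt *= *\([0-9]*\).*/\1/p' namelist_cfg | head -1)
# number of time steps per day
echo $(( 86400 / rdt ))   # -> 15 for rn_rdt = 5760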

  • simulation duration : nn_itend is the number of time steps

The default number for ORCA2 is nn_itend = 5840. How many days does such a simulation last? nn_itend x rn_rdt / 86400 = 389 days, that is, about a year. In the .nc simulation output file "ORCA2_XXXXX_grid_T.nc", one time point is written every 5 days.
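
The duration check as shell arithmetic (integer division, so the result is truncated to whole days):

# nn_itend x rn_rdt / 86400 = number of simulated days
echo $(( 5840 * 5760 / 86400 ))   # -> 389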

  • layout and resolution : ln_read_cfg & cn_domcfg

By default the grid is specified by reading a domain configuration file: ln_read_cfg = .true. and cn_domcfg = "ORCA_R2_zps_domcfg". This configuration uses the global ORCA grid at roughly 2 degree average resolution; the resolution is deduced automatically from the file content. The file must be in HOME/NEMO/cfgs/ORCA2_ICE_PISCES/EXP00.
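
To see the grid the model will actually read, you can inspect the header of the domain file (a sketch assuming the netCDF command-line tools are available, which is usually the case where NEMO compiles):

# dump only the header: grid dimensions (x, y, vertical levels) and variables
ncdump -h ORCA_R2_zps_domcfg.nc | less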

  • wind stress files : sn_wndi & sn_wndj

The default forcing files come from an averaged climatology (sn_wndi = "u_10.15JUNE2009_fill"), but in this work we use real forcing files built from actual measurements from 1994 to 1997 (sn_wndi = "ERAi_1d_all_PAC_R2"). As a result, the ORCA2 tropical instabilities are expected to be "sharper" than the ones from the default experiment.
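
The switch can be made by editing namelist_cfg by hand, or with a one-liner (a minimal sketch, assuming the default file name appears verbatim in the namelist; sn_wndj may need the same treatment):

# back up the namelist, then replace the climatological wind file name
cp namelist_cfg namelist_cfg.bak
sed -i 's/u_10.15JUNE2009_fill/ERAi_1d_all_PAC_R2/' namelist_cfg
# check the result (edit sn_wndj the same way if needed)
grep sn_wnd namelist_cfg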

Run NEMO's CANAL configuration

# check where you are: you want to go from HOME/NEMO to HOME/NEMO/tests/CANAL/EXP00
pwd
# Let's go !
cd ./tests/CANAL/EXP00
# Check executable presence
ls -lrt
# Run NEMO on 4 MPI processes (default parameters)
mpirun -np 4 ./nemo
# Run NEMO with customized parameters
vi namelist_cfg
mpirun -np 4 ./nemo
# Check NEMO runtime (here on 8 MPI processes)
time mpirun -np 8 ./nemo

What happens? NEMO's sources (the *.F90 files) are in src and in ./tests/CANAL/MY_SRC. Here we may want to change them in order to implement new features, such as the shape of the boundary conditions. In that case we have to recompile NEMO each time a source file changes.
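
A typical edit-recompile-rerun cycle might look like this (a minimal sketch reusing the compile command from above; the build is incremental, so only the changed files and their dependents are recompiled):

# from HOME/NEMO: edit a CANAL source file, e.g. the surface forcing
vi ./tests/CANAL/MY_SRC/usrdef_sbc.F90
# recompile (much faster than the first build)
./makenemo -a CANAL -j 6 -m linux_gfortran
# rerun
cd ./tests/CANAL/EXP00
mpirun -np 4 ./nemo

The following parameters, by contrast, are read from namelist_cfg at runtime, so changing them only requires a rerun (a tweak example follows the list):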

  • change the model time step : rn_rdt
  • change the model resolution : rn_dx & rn_dy
  • add noise : ln_sshnoise
  • change the latitude of the experiment : rn_ppgphi0
  • change the wind : rn_u10 (amplitude) rn_windszx & rn_windszy (extent of the wind patch)
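
For instance, to switch on the noise on the initial sea surface height (a minimal sketch, assuming namelist_cfg contains a line of the form "ln_sshnoise = .false."):

# back up the namelist, then enable the SSH noise
cp namelist_cfg namelist_cfg.bak
sed -i 's/ln_sshnoise *= *.false./ln_sshnoise = .true./' namelist_cfg
# rerun with the new parameter
mpirun -np 4 ./nemo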

Stuff for PAC2

# get trunk
svn co svn+ssh://techene@forge.ipsl.jussieu.fr/ipsl/forge/projets/nemo/svn/NEMO/trunk ref_14312
# compile a new PAC2 configuration from the ORCA2 reference one
cd ref_14312
./makenemo -n PAC2 -r ORCA2_ICE_PISCES -m X64_JEANZAY -j 8 del_key "key_top"
# prepare PAC2 run
cd ./cfgs/PAC2/EXP00
# get input files
ln -s $WORK/FORCING/ORCA2_ICE_v4.2/* .
ln -s $WORK/M2_TP_OA/ERAi_1d_199*PAC* .
cp /gpfswork/rech/omr/uaw38ip/M2_TP_OA/data_1m_potential_temperature_PAC_R2.nc \
   /gpfswork/rech/omr/uaw38ip/M2_TP_OA/data_1m_salinity_PAC_R2.nc \
   /gpfswork/rech/omr/uaw38ip/M2_TP_OA/eddy_viscosity_2D_PAC_R2.nc \
   /gpfswork/rech/omr/uaw38ip/M2_TP_OA/eddy_viscosity_3D_PAC_R2.nc \
   /gpfswork/rech/omr/uaw38ip/M2_TP_OA/PAC_R2_zps_domcfg.nc \
   /gpfswork/rech/omr/uaw38ip/M2_TP_OA/resto_PAC_R2.nc \
   .
# set script and parameters
vi script.slurm
vi namelist_cfg
# run 
sbatch script.slurm
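
Once submitted, the job can be monitored with the usual SLURM commands (a sketch; ocean.output is NEMO's standard log file, written in the run directory):

# check the job state in the queue
squeue -u $USER
# follow the model log once the run has started
tail -f ocean.output
# NEMO reports problems with an "E R R O R" banner in ocean.output
grep 'E R R O R' ocean.output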
