Version 11 (modified by nicolasmartin, 3 years ago)

AGRIF nesting tools

  1. Preparing the child model
  2. Running the model

Last edition: 05/13/17 03:19:46 by nicolasmartin

AGRIF is a package for the integration of adaptive mesh refinement (AMR) features within a multidimensional model written in Fortran and discretized on a structured grid. This nesting capability, which allows resolution to be focused over a region of interest by introducing an additional grid, has been added to NEMO. In the current implementation, only horizontal refinement is available.

This page explains how to run an embedded model with the NEMO/AGRIF framework in a pre-defined configuration. It describes how to provide the grid coordinates, the surface forcing and the initial conditions required by each child model. To become familiar with NEMO/AGRIF, you should read it even if you are going to create your own configuration.


  • Nesting tools source code: attachment 1
  • Documentation (pdf): attachment 2
  • NEMO User's meeting 2006 presentation (pdf): attachment 3
  • Attached movie ITF (mpg): attachment 4

Preparing the child model

One of the features of the AGRIF package is its ability to handle fixed fine grids within the domain. This section presents the basic knowledge every user needs in order to define a child grid domain.

Refinement ratio

To implement an embedded model, the user has to specify the space and time refinement ratios. AGRIF accepts any integer value: both even and odd refinement ratios can be chosen, in space as well as in time. Keep in mind, however, that too large a value can cause numerical trouble; a value between 2 and 5 is generally an acceptable choice. The localization of fine grid points for odd and even refinement ratios is described in the figure below. Concerning the time refinement ratio, there is no specific criterion to determine its value, but the selected value must of course prevent the model from blowing up.

Child Grid Position

The nested grids are rectangular and aligned with the parent (coarser) grid within which they are nested. imin, imax, jmin and jmax refer to the positions of the child grid corners in terms of its parent grid, starting with the value 1 in each space direction. When choosing these positions, the user has to make sure that the fine grid is contained in the parent grid. (imin,jmin) is the lower left corner position and (imax,jmax) is the upper right corner position. The index of the first grid point inside the fine domain is given by the formulas:

  • ibeg = imin+ptx-1 in x-direction
  • jbeg = jmin+pty-1 in y-direction

where ptx and pty are the indexes of the first fine grid point inside the domain; those values depend on the staggered variable. In the NEMO framework there are two ghost cells around fine grids, so the ptx and pty values are the following:

point type  ptx  pty
T           3    3
U           2    3
V           3    2
F           2    2

This is quite an important feature to know in order to properly choose the integers imin, imax, jmin and jmax for your fine grid domain definition. Indeed, the index of the first coarse grid point inside the child domain is (imin+2,jmin+2) and not (imin,jmin). To sum up, the configuration of a given AGRIF zoom on a coarse domain for a 1:3 refinement ratio is described here:
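The index arithmetic above can be sketched as follows (illustrative Python; the function name is ours, the (ptx, pty) offsets come from the table):

```python
# Offsets (ptx, pty) for each staggered point type, taken from the table
# above (NEMO keeps two ghost cells around fine grids).
OFFSETS = {"T": (3, 3), "U": (2, 3), "V": (3, 2), "F": (2, 2)}

def first_fine_index(imin, jmin, point_type):
    """Index of the first fine grid point inside the domain:
    ibeg = imin + ptx - 1, jbeg = jmin + pty - 1."""
    ptx, pty = OFFSETS[point_type]
    return imin + ptx - 1, jmin + pty - 1

# Example: a child grid declared with imin=40, jmin=2
print(first_fine_index(40, 2, "T"))  # (42, 4)
print(first_fine_index(40, 2, "U"))  # (41, 4)
```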

Then, in order to prepare a simulation, the user must declare the fine grid characteristics in a dedicated input file, which has to be located in the directory where the code runs. For each fine grid and each space direction, a line indicates its position on its parent grid as well as its space and time refinement factors.

Typically a line contains this information in the following order: imin imax jmin jmax spacerefx spacerefy timeref. Let's consider an example with a root coarse grid (G0) which has two child grids (G1) and (G2), while (G1) has one child grid (G3). Space and time refinement factors are all equal to 3; in this case the file syntax is the following:

imin imax jmin jmax spacerefx spacerefy timeref
2                     (G0) has 2 child grids: (G1) and (G2)
40 70 2 30 3 3 3      positions and refinement factors for (G1)
110 130 50 80 3 3 3   positions and refinement factors for (G2)
1                     (G1) has 1 child grid: (G3)
20 40 8 30 3 3 3      positions and refinement factors for (G3)
0                     (G2) has no child grid
0                     (G3) has no child grid
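The layout of this file can be illustrated with a small parser (Python, purely illustrative; the function name and dictionary keys are ours, and the traversal order simply mirrors the example above: each grid's child-count line appears in the order the grids were declared):

```python
from collections import deque

def parse_grids(lines):
    """Parse fine-grid definition lines into a tree of grid dicts.

    Each parent contributes one line with its number of children,
    followed by one 7-number line (imin imax jmin jmax spacerefx
    spacerefy timeref) per child; child-count lines then follow in
    declaration order, as in the example above."""
    tokens = deque(t for line in lines for t in line.split())
    root = {"children": []}
    queue = deque([root])          # grids whose child count is still unread
    while queue and tokens:
        parent = queue.popleft()
        n_children = int(tokens.popleft())
        for _ in range(n_children):
            vals = [int(tokens.popleft()) for _ in range(7)]
            child = {"imin": vals[0], "imax": vals[1],
                     "jmin": vals[2], "jmax": vals[3],
                     "spaceref": (vals[4], vals[5]),
                     "timeref": vals[6], "children": []}
            parent["children"].append(child)
            queue.append(child)
    return root

# The example file: (G0) -> (G1), (G2); (G1) -> (G3)
example = ["2",
           "40 70 2 30 3 3 3",
           "110 130 50 80 3 3 3",
           "1",
           "20 40 8 30 3 3 3",
           "0",
           "0"]
tree = parse_grids(example)
print(len(tree["children"]))                  # 2  -> (G1) and (G2)
print(len(tree["children"][0]["children"]))   # 1  -> (G3) under (G1)
```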

One-way vs two-way nesting

Nested grid simulations can be produced using either 1-way or 2-way nesting. These options refer to how the coarse grid and the fine grid interact. In both modes, the fine grid boundary conditions are interpolated from the coarse grid. In a 1-way nest, this is the only information exchange between the grids (coarse-to-fine). In a 2-way nest, the fine grid solution is also used to update the coarse grid solution at coarse grid points that lie inside the fine grid (coarse-to-fine and fine-to-coarse). The user can choose between these two kinds of interaction by editing the Agrif_OPA_Update.F90 file located in the NESTING_SRC directory. This file contains a cpp key called TWO_WAY: #define TWO_WAY selects 2-way interactive grid nesting, whereas #undef TWO_WAY selects 1-way nesting.

Getting the lateral boundary conditions, winds and surface fluxes

This step makes use of the NEMO/AGRIF nesting tools package, downloadable at the top of this page. These tools are fully described in the attached documentation; please refer to it for a detailed presentation of the package.

The current chapter is just a reminder of how the nesting tools work.

The first stage consists in editing the Makefile corresponding to your architecture in the src directory, in order to properly set the path to your netcdf library with F90 enabled.


cd Nesting_tools/src
vi Makefile_g95

The various architectures currently supported are:

Architecture Compiler
PC/Linux ifort, pgf90, g95
Alpha Compaq f95
IBM xlf90
MacOS g95

Then, once the Makefile is properly edited, the user can compile the code and check the result in the bin directory. One should find five executables, namely create_coordinates.exe, create_bathy.exe, create_data.exe, create_restart.exe (if needed) and create_restart_trc.exe (when using PISCES for instance).


make -f Makefile_g95
cd ../bin

There is a sample namelist called agulhas to create forcing files for the Agulhas area. You may edit or copy this file to add your own forcing files, change the zoom area and adjust the interpolation parameters.


./create_coordinates.exe agulhas (step 1, always needed)

./create_bathy.exe agulhas (step 2, always needed)

./create_data.exe agulhas (step 3, not needed when using online interpolation)

This order must be kept, because create_bathy.exe requires a coordinates file built by create_coordinates.exe; in the same way, create_data.exe makes use of a bathymetry file produced by create_bathy.exe.

Both grid and bathymetry creation are needed to perform an AGRIF run. For the fluxes (step 3), the user is free to use either the nesting tools or online interpolation:

  • when using the nesting tools, every forcing file needed for a traditional NEMO run (i.e. without AGRIF) has to be created for the fine grids, except specific files like runoff.
  • when using online interpolation, the child grid will use the coarse grid input files during the run, performing an interpolation on the fly (bilinear or bicubic), so a weight file has to be provided (see On the Fly Interpolation).

In the case of a configuration with multiple nests, the input file has to be filled in in accordance with the prefixes given by the nesting tools code. Considering the example used in section 1.2, (G1) forcing files must be named with the 1_ prefix, (G2) with 2_ and (G3) with 3_. If the (G1) and (G2) definition lines were inverted, (G2) would correspond to the 1_ prefix and (G1) to 2_. When running the nesting tools program, if a whole set of 1_ prefix files is already present in the bin directory, the program will create 2_ prefix files by default. If there are more than two levels of zoom, at level n one just has to fill in the namelist file, replacing the forcing files by those created at level n-1.
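The prefix convention can be sketched as follows (illustrative Python; the function name and the example file name "coordinates.nc" are ours, and the grid names come from the example in section 1.2):

```python
# Forcing-file prefixes follow the declaration order of the child grids:
# the n-th declared grid gets the prefix "n_" on all of its files.
def forcing_prefixes(grid_names):
    """grid_names: child grids in the order they are declared."""
    return {name: f"{i}_" for i, name in enumerate(grid_names, start=1)}

prefixes = forcing_prefixes(["G1", "G2", "G3"])
# Hypothetical example: a coordinates file for (G2)
print(prefixes["G2"] + "coordinates.nc")  # 2_coordinates.nc
```

Inverting the declaration order of (G1) and (G2) in the input list would swap their prefixes, exactly as described above.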

At this point, the user should have filled in the grid definition file and built coordinates, bathymetry and forcing files for every grid it mentions.

Running the model

Input data

The data needed to run with OPA are of two types: the namelists and the forcing files (already created thanks to the nesting tools). Concerning namelists, one namelist per grid has to be provided; the naming convention is the same as the one for the forcing files.


cp namelist 1_namelist
cp namelist_ice 1_namelist_ice  (if LIM is included in the configuration)
vi 1_namelist

Then the user should edit those namelists to modify the number and duration of the time steps, so that these numbers match the coarse grid values while taking the time refinement into account. Some other coefficients linked to the horizontal resolution of the grid can also be tuned (mostly concerning diffusion and viscosity); indeed, each grid has its own set of coefficients.

        Coarse grid (namelist)  Fine grid (1_namelist)
nitend  52500                   157500
rdt     1800                    600

Most important namelist variables to edit, for a 1:3 time refinement ratio.
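The scaling shown in the table can be computed as follows (illustrative Python; the function name is ours, the values are those of the table):

```python
# For a time refinement ratio `timeref`, the child time step rdt is divided
# by the ratio and the number of steps nitend is multiplied by it, so both
# grids integrate over the same physical duration.
def child_time_settings(nitend, rdt, timeref):
    assert rdt % timeref == 0, "coarse rdt should be divisible by timeref"
    return nitend * timeref, rdt // timeref

# Values from the table above (1:3 time refinement):
print(child_time_settings(52500, 1800, 3))  # (157500, 600)
```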

The user now has to gather in one directory all the forcing files and namelists, as well as the opa executable and the grid definition file. The simulation is then ready to be carried out.

When using online interpolation, the forcing part of 1_namelist will look like the following (here with CORE forcing):

    &namsbc_core   ! namsbc_core: CORE bulk formulae
    !          ! file name     ! frequency (hours) ! variable ! time interp. ! clim  ! 'yearly' or ! weights  ! rotation !
    !          !               ! (if <0, months)   ! name     ! (logical)    ! (T/F) ! 'monthly'   ! filename ! pairing  !
       sn_wndi = 'u10_core'    , -1. , 'u10'    , .true. , .true. , 'yearly' , '' , ''
       sn_wndj = 'v10_core'    , -1. , 'v10'    , .true. , .true. , 'yearly' , '' , ''
       sn_qsr  = 'qsw_core'    , -1. , 'swdn'   , .true. , .true. , 'yearly' , '' , ''
       sn_qlw  = 'qlw_core'    , -1. , 'lwdn'   , .true. , .true. , 'yearly' , '' , ''
       sn_tair = 't2_core'     , -1. , 't2'     , .true. , .true. , 'yearly' , '' , ''
       sn_humi = 'q2_core'     , -1. , 'q2'     , .true. , .true. , 'yearly' , '' , ''
       sn_prec = 'precip_core' , -1. , 'precip' , .true. , .true. , 'yearly' , '' , ''
       sn_snow = 'snow_core'   , -1. , 'snow'   , .true. , .true. , 'yearly' , '' , ''

Only a weight file is then needed in the running directory, while the core forcing files correspond to the coarse grid input files.

Output data

Output files are NetCDF or DIMG (key_dimgout activated) files giving the diagnostics at the T, U, V and W points of the Arakawa C grid. For fine grid output files, the naming convention is exactly the same as the one mentioned above, namely a different prefix for each grid of the hierarchy. The output files contain the two ghost cells mentioned before at the boundaries, so depending on the visualization tool used, the user may see the value zero at the boundaries of the child grids. Note that the text file called ocean.output, containing all the information printed during the run, is also created for the fine grids.

Development Team

Organization        Name             Developments                                                                                  Contact
LMC/IMAG, Grenoble  Laurent Debreu   MPI implementation, numerical implementation, 2-way grid nesting                              Laurent.Debreu@…
LMC/IMAG, Grenoble  Cyril Mazauric   Installation environment, portability on various platforms, performance tests, computation issues   Cyril.Mazauric@…
LMC/IMAG, Grenoble  Florian Lemarié  Nesting tools, new AGRIF interpolation schemes (PPM/ENO), numerical issues                    Florian.Lemarie@…



Known Users

Organization            Name                                            Region of interest          Configuration
LEGI Grenoble FRANCE    Jean-Marc Molines                               Gulf of Mexico              NATL3/NATL4
CICESE Ensenada MEXICO  Julio Sheinbaum, Julio Candela, Julien Jouanno  Gulf of Mexico              NATL3
DFO St Johns CANADA     Andry Ratsimandresy, Fraser Davidson            Newfoundland, Labrador Sea  NOOFS based on ORCA025
LPO Brest               Virginie Thierry                                Bay of Biscay               NATL4
NOCS Southampton        Steve Alderson
IFM-GEOMAR Kiel         Arne Biastoch                                   South Africa                ORCA025
Mercator Ocean          Jerome Chanut                                   North Atlantic              CREG025, CREG12


  • AGRIF web page - (Laurent Debreu, LMC-IMAG)
  • Nesting Tool:
    • CROCO Matlab Toolbox - (Pierrick Penven, Patrick Marchesiello - IRD)
    • SCRIP package - (Los Alamos National Laboratory)
