===================
Interfacing options
===================

.. contents::
   :local:
   :depth: 1

Embedded zooms with AGRIF
=========================

.. contents::
   :local:

--------
Overview
--------

AGRIF (Adaptive Grid Refinement In Fortran) is a library that allows seamless space and time refinement over
rectangular regions in NEMO.
Refinement factors can be odd or even (usually lower than 5 to maintain stability).
The interaction between grids is "two-way" in the sense that the parent grid feeds the child grid open boundaries and
the child grid provides volume averages of prognostic variables once a given number of time steps is completed.
These pages provide guidelines on how to use AGRIF in NEMO.
For a more technical description of the library itself, please refer to http://agrif.imag.fr.

-----------
Compilation
-----------

Activating AGRIF requires appending the cpp key ``key_agrif`` at compilation time:

.. code-block:: sh

   ./makenemo add_key 'key_agrif'

Although this is transparent to users, the way the code is processed during compilation differs from
the standard case:
a preprocessing stage (the so-called "conv" program) translates the source code so that
saved arrays may be switched in memory space from one domain to another.

--------------------------------
Definition of the grid hierarchy
--------------------------------

An additional text file ``AGRIF_FixedGrids.in`` is required at run time.
This is where the grid hierarchy is defined.
An example of such a file, here taken from the ``ICEDYN`` test case, is given below::

   1
   34 63 34 63 3 3 3
   0

The first line indicates the number of zooms (1).
The second line contains the starting and ending indices in both directions on the root grid
(imin=34 imax=63 jmin=34 jmax=63) followed by the space and time refinement factors (3 3 3).
The last line is the number of child grids nested in the refined region (0).
A more complex example with telescoping grids can be found below and
in the ``AGRIF_DEMO`` reference configuration directory.
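
As an aside (not part of NEMO), the layout of this file can be made explicit with a short script;
``parse_fixed_grids`` below is a hypothetical helper written for this guide only, handling the
simple single-level case shown above:

.. code-block:: python

   def parse_fixed_grids(text):
       """Parse a minimal, single-level AGRIF_FixedGrids.in (illustrative only)."""
       lines = [line.split() for line in text.strip().splitlines()]
       n_zooms = int(lines[0][0])               # first line: number of zooms
       zooms = []
       for k in range(1, 1 + n_zooms):          # one line of bounds/factors per zoom
           imin, imax, jmin, jmax, rx, ry, rt = map(int, lines[k])
           zooms.append(dict(imin=imin, imax=imax, jmin=jmin, jmax=jmax,
                             rx=rx, ry=ry, rt=rt))
       n_children = int(lines[1 + n_zooms][0])  # last line: nested child grids
       return n_zooms, zooms, n_children

   example = "1\n34 63 34 63 3 3 3\n0"
   n_zooms, zooms, n_children = parse_fixed_grids(example)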

[Add some plots here with grid staggering and positioning ?]

When creating the nested domain, one must keep in mind that the child domain is shifted toward the north-east and
depends on the number of ghost cells, as illustrated by the (attempted) drawing below for nbghostcells=1 and
nbghostcells=3.
The grid refinement factor is 3 and nxfin is the number of child grid points in the i-direction.

.. image:: _static/agrif_grid_position.jpg

Note that rectangular regions must be defined so that they are connected to a single parent grid.
Hence, defining overlapping grids with the same refinement ratio will not work properly,
since boundary data exchange and update are only performed between root and child grids.
Use of east-west periodic or north-fold boundary conditions is not allowed in child grids either.
Defining, for instance, a circumpolar zoom in a global model is therefore not possible.

-------------
Preprocessing
-------------

Knowing the refinement factors and area, the ``NESTING`` pre-processing tool can help create the needed input files
(mesh file, restart, climatological and forcing files).
The key step is to ensure volume matching near the child grid interface,
which is done by invoking the ``Agrif_create_bathy.exe`` program.
You may use the namelists provided in the ``NESTING`` directory as a guide.
These correspond to the namelists used to create the ``AGRIF_DEMO`` inputs.

----------------
Namelist options
----------------

Each child grid expects to read its own namelist so that different numerical choices can be made
(these should be stored in the form ``1_namelist_cfg``, ``2_namelist_cfg``, etc., according to their rank in
the grid hierarchy).
Time steps and numbers of steps consistent with the chosen time refinement have to be provided.
Specific to AGRIF is the following block:

.. code-block:: fortran

   !-----------------------------------------------------------------------
   &namagrif      !  AGRIF zoom                                            ("key_agrif")
   !-----------------------------------------------------------------------
      ln_spc_dyn    = .true.  !  use 0 as special value for dynamics
      rn_sponge_tra = 2880.   !  coefficient for tracer   sponge layer [m2/s]
      rn_sponge_dyn = 2880.   !  coefficient for dynamics sponge layer [m2/s]
      ln_chk_bathy  = .false. !  =T  check the parent bathymetry
   /

where the sponge layer coefficients have to be chosen according to the child grid mesh size.
The sponge area is hard-coded in NEMO and spans sponge_area = 2 x refinement factor grid points,
from i=1+nbghostcells+1 to i=1+nbghostcells+sponge_area.
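
To make these constraints concrete, the following sketch (plain Python for illustration, not NEMO code;
the parent time step is an arbitrary example value) computes the child time step and the sponge extent for
a refinement factor of 3 and nbghostcells=1:

.. code-block:: python

   # Illustrative arithmetic for one child grid (example values, not NEMO code).
   parent_dt = 2880.0      # parent time step [s] (arbitrary example)
   time_ref = 3            # time refinement factor
   space_ref = 3           # space refinement factor
   nbghostcells = 1

   child_dt = parent_dt / time_ref           # child takes time_ref steps per parent step
   sponge_area = 2 * space_ref               # sponge width in child grid points
   i_first = 1 + nbghostcells + 1            # first sponge point
   i_last = 1 + nbghostcells + sponge_area   # last sponge point

   print(child_dt, sponge_area, i_first, i_last)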

----------
References
----------

`Debreu, L., P. Marchesiello, P. Penven and G. Cambon, 2012: Two-way nesting in split-explicit ocean models: Algorithms, implementation and validation. Ocean Modelling, 49-50, 1-21. <http://doi.org/10.1016/j.ocemod.2012.03.003>`_

`Penven, P., L. Debreu, P. Marchesiello and J. C. McWilliams, 2006: Evaluation and application of the ROMS 1-way embedding procedure to the central California upwelling system. Ocean Modelling, 12, 157-187. <http://doi.org/10.1016/j.ocemod.2005.05.002>`_

`Spall, M. A. and W. R. Holland, 1991: A Nested Primitive Equation Model for Oceanic Applications. J. Phys. Ocean., 21, 205-220. <https://doi.org/10.1175/1520-0485(1991)021\<0205:ANPEMF\>2.0.CO;2>`_

----

Online biogeochemistry coarsening
=================================

.. contents::
   :local:

.. role:: underline
   :class: underline

------------
Presentation
------------

A capability to coarsen the physics used to force a BGC model coupled to NEMO has been developed.
It allows running a BGC model coupled to OCE-SI3 'online' at a lower resolution,
reducing the CPU cost of the BGC model while preserving the effective resolution of the dynamics.

A presentation of the methodology is available [attachment:crs_wiki_1.1.pdf here].

-----------------------------------------------------
What is available and working for now in this version
-----------------------------------------------------

[To be completed]

----------------------------------------------
Description of the successful validation tests
----------------------------------------------

[To be completed]

-----------------------------------------------------------------
What is not working yet with online coarsening of biogeochemistry
-----------------------------------------------------------------

[To be completed]

*Should include a precise explanation of MPI decomposition problems too.*

--------------------------------------------
How to set up and use online biogeochemistry
--------------------------------------------

:underline:`How to activate coarsening?`

To activate the coarsening, ``key_crs`` should be added to the list of CPP keys.
This key only activates the coarsening of the dynamics.

Some parameters are available in the ``namelist_cfg``:

.. code-block:: fortran

   !-----------------------------------------------------------------------
   &namcrs        !  passive tracer coarsened online simulations
   !-----------------------------------------------------------------------
      nn_factx    = 3         !  Reduction factor of x-direction
      nn_facty    = 3         !  Reduction factor of y-direction
      nn_msh_crs  = 0         !  create (=1) a mesh file or not (=0)
      nn_crs_kz   = 3         !  0, volume-weighted MEAN of KZ
                              !  1, MAX of KZ
                              !  2, MIN of KZ
                              !  3, 10^(MEAN(LOG(KZ)))
                              !  4, MEDIAN of KZ
      ln_crs_wn   = .false.   !  wn coarsened (T) or computed using horizontal divergence (F)
      ln_crs_top  = .true.    !  online coarsening for the biogeochemistry
   /

- Only ``nn_factx = 3`` is available, and the coarsening only works for grids with a T-pivot point for
  the north-fold lateral boundary condition (ORCA025, ORCA12, ORCA36, ...).
- ``nn_msh_crs = 1`` activates the generation of the coarsened grid meshmask.
- ``nn_crs_kz`` is the operator used to coarsen the vertical mixing coefficient.
- ``ln_crs_wn``

  - when ``key_vvl`` is activated, this logical has no effect;
    the coarsened vertical velocities are computed using the horizontal divergence.
  - when ``key_vvl`` is not activated,

    - coarsened vertical velocities are computed using the horizontal divergence (``ln_crs_wn = .false.``)
    - or coarsened vertical velocities are computed with an average operator (``ln_crs_wn = .true.``)
- ``ln_crs_top = .true.`` should be activated to run the BGC model in coarsened space;
  this only works when ``key_top`` is in the cpp list, together with ``key_pisces`` or ``key_my_trc``.

:underline:`Choice of operator to coarsen KZ`

A sensitivity test has been done with an Age tracer to compare the different operators.
Options 3 and 4 seem to provide the best results.

Some results can be found [xxx here]
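
The five ``nn_crs_kz`` operators can be sketched as follows (illustrative Python written for this guide,
not the NEMO implementation; ``coarsen_kz`` is a hypothetical helper acting on the KZ values of one block of
fine-grid cells):

.. code-block:: python

   import math
   from statistics import median

   def coarsen_kz(kz_values, volumes, option):
       """Coarsen the vertical mixing coefficient KZ over one block of cells."""
       if option == 0:   # volume-weighted mean
           return sum(k * v for k, v in zip(kz_values, volumes)) / sum(volumes)
       if option == 1:   # maximum
           return max(kz_values)
       if option == 2:   # minimum
           return min(kz_values)
       if option == 3:   # 10^(mean of log10), i.e. the geometric mean
           return 10 ** (sum(math.log10(k) for k in kz_values) / len(kz_values))
       if option == 4:   # median
           return median(kz_values)
       raise ValueError("nn_crs_kz must be 0..4")

   kz = [1e-5, 1e-4, 1e-3]   # example KZ values in one coarsening block
   vol = [1.0, 1.0, 1.0]
   print(coarsen_kz(kz, vol, 3))   # geometric mean, ~1.0e-4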

:underline:`Example of xml files to output coarsened variables with XIOS`

In the [attachment:iodef.xml iodef.xml] file, a "nemo" context is defined and
some variables defined in [attachment:file_def.xml file_def.xml] are written on the ocean-dynamics grid.
To write variables on the coarsened grid, and in particular the passive tracers,
a "nemo_crs" context should be defined in [attachment:iodef.xml iodef.xml] and
the associated variables listed in [attachment:file_crs_def.xml file_crs_def.xml].

:underline:`Passive tracer initial conditions`

When initial conditions are provided in NetCDF files, the fields might be:

- on the coarsened grid
- or on another grid, to be
  interpolated `on-the-fly <http://forge.ipsl.jussieu.fr/nemo/wiki/Users/SetupNewConfiguration/Weight-creator>`_.
  Example of a namelist for PISCES:

   .. code-block:: fortran

      !-----------------------------------------------------------------------
      &namtrc_dta      !    Initialisation from data input file
      !-----------------------------------------------------------------------
      !
         sn_trcdta(1)  = 'DIC_REG1'        ,        -12        ,  'DIC'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(2)  = 'ALK_REG1'        ,        -12        ,  'ALK'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(3)  = 'O2_REG1'         ,        -1         ,  'O2'      ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(5)  = 'PO4_REG1'        ,        -1         ,  'PO4'     ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(7)  = 'Si_REG1'         ,        -1         ,  'Si'      ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(10) = 'DOC_REG1'        ,        -12        ,  'DOC'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(14) = 'Fe_REG1'         ,        -12        ,  'Fe'      ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         sn_trcdta(23) = 'NO3_REG1'        ,        -1         ,  'NO3'     ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , ''
         rn_trfac(1)   =   1.0e-06  !  multiplicative factor
         rn_trfac(2)   =   1.0e-06  !  -      -      -     -
         rn_trfac(3)   =  44.6e-06  !  -      -      -     -
         rn_trfac(5)   = 122.0e-06  !  -      -      -     -
         rn_trfac(7)   =   1.0e-06  !  -      -      -     -
         rn_trfac(10)  =   1.0e-06  !  -      -      -     -
         rn_trfac(14)  =   1.0e-06  !  -      -      -     -
         rn_trfac(23)  =   7.6e-06  !  -      -      -     -

         cn_dir        =  './'      !  root directory for the location of the data files

:underline:`PISCES forcing files`

They might also be on the coarsened grid.

:underline:`Perspectives`

For the future, a few options are on the table to implement coarsening for biogeochemistry in 4.0 and
future releases.
Those will be discussed in Autumn 2018.

----

Coupling with other models (OASIS, SAS, ...)
============================================

NEMO currently exploits OASIS-3-MCT to implement a generalised coupled interface
(`Coupled Formulation <http://forge.ipsl.jussieu.fr/nemo/doxygen/node50.html?doc=NEMO>`_).
It can be used to interface with most of the European atmospheric GCMs (ARPEGE, ECHAM, ECMWF, HadAM, HadGAM, LMDz)
as well as with WRF (the Weather Research and Forecasting model), and to implement the coupling of
two independent NEMO components, the ocean on one hand and sea ice plus other surface processes on the other
(`Standalone Surface Module - SAS <http://forge.ipsl.jussieu.fr/nemo/doxygen/node46.html?doc=NEMO>`_).

To enable the OASIS interface, the required compilation key is ``key_oasis3``.
The parameters to set are in section ``namsbc_cpl`` and, when using SAS, also in section ``namsbc_sas``.

----

With data assimilation
======================

.. contents::
   :local:

The assimilation interface to NEMO is split into three modules:

- OBS for the observation operator
- ASM for the application of increments and model bias correction (based on the assimilation increments)
- TAM for the tangent linear and adjoint model

Please see the `NEMO reference manual`_ for more details, including information about the input file formats and
the namelist settings.

--------------------------------------
Observation and model comparison (OBS)
--------------------------------------

The observation and model comparison code (OBS) reads in observation files (profile temperature and salinity,
sea surface temperature, sea level anomaly, sea ice concentration, and velocity) and
calculates an interpolated model-equivalent value at the observation location and nearest model timestep.
The resulting data are saved in a feedback file (or files).
The code was originally developed for use with the NEMOVAR data assimilation code, but
can be used for validation or verification of the model or of any other data assimilation system.
This is all controlled by the namelist.
To build with the OBS code active, ``key_diaobs`` must be set.

More details are in the `NEMO reference manual`_, chapter 12.

Standalone observation operator (SAO)
-------------------------------------

The OBS code can also be run after a model run, using saved NEMO model data.
This is accomplished using the standalone observation operator (SAO),
previously known as the offline observation operator.

To build the SAO, use makenemo.
This means compiling NEMO once (in the normal way) for the chosen configuration,
then including ``SAO`` at the end of the relevant line in the ``cfg.txt`` file,
and then recompiling with the replacement main program in ``./src/SAO``.
This is a special version of ``nemogcm.F90`` which doesn't run the model, but reads in the model fields and
observations and runs the OBS code.
See section 12.4 of the `NEMO reference manual`_.

-----------------------------------
Apply assimilation increments (ASM)
-----------------------------------

The ASM code adds the functionality to apply increments to the model variables:
temperature, salinity, sea surface height, velocity and sea ice concentration.
These are read into the model from a NetCDF file, which may be produced by separate data assimilation code.
The code can also output the model background fields that are used as an input to the data assimilation code.
This is all controlled by the namelist ``nam_asminc``.
To build the ASM code, ``key_asminc`` must be set.

More details are in the `NEMO reference manual`_, chapter 13.

--------------------------------
Tangent linear and adjoint (TAM)
--------------------------------

This is the tangent linear and adjoint code of NEMO, which is useful for 4D-Var assimilation.

----

Inputs-Outputs (using XIOS)
===========================

.. contents::
   :local:

| Output of diagnostics in NEMO is usually done using XIOS.
  This is an efficient way of writing diagnostics because the time averaging, file writing and even
  some simple arithmetic or regridding are carried out in parallel with the NEMO model run.
| This page gives a basic introduction to using XIOS with NEMO.
  Much more information is available from the XIOS homepage and from the `NEMO reference manual`_.

Use of XIOS for diagnostics is activated using the pre-compiler key ``key_iomput``.
The default version of XIOS is the 2.0 release.

------------------------------
Extracting and installing XIOS
------------------------------

1. Install the NetCDF4 library.
   If you want to use single-file output, you will need to compile the HDF & NetCDF libraries to allow parallel IO.
2. Download the version of XIOS that you wish to use.
   The recommended version is now XIOS 2.0:

   .. code-block:: console

      $ svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/branchs/xios-2.0 xios-2.0

   and follow the instructions in the `XIOS documentation`_ to compile it.
   If you find problems at this stage, support can be found by subscribing to the `XIOS users mailing list`_ and
   sending a mail message to it.

---------
Namelists
---------

XIOS is controlled using xml input files that should be copied to your model run directory before
running the model.
The exact setup differs slightly between the 1.0 and 2.0 releases.

An ``iodef.xml`` file is still required in the run directory.
For XIOS 2.0 the ``field_def.xml`` file has been further split into ``field_def-oce.xml`` (for physics),
``field_def-ice.xml`` (for ice) and ``field_def-bgc.xml`` (for biogeochemistry).
Also, the definition of the output files has been moved from the ``iodef.xml`` file into
separate ``file_definition.xml`` files, which are included in the ``iodef.xml`` file.
Note that the ``domain_def.xml`` file is also different for XIOS 2.0.
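
For orientation, a file definition entry typically looks like the hypothetical fragment below
(the ids ``file1`` and ``toce``, the output name ``thetao`` and the frequencies are placeholders;
check the file definition files shipped with your configuration for the exact ids and attributes):

.. code-block:: xml

   <file_definition type="multiple_file" sync_freq="1d" min_digits="4">
      <file id="file1" name_suffix="_grid_T" output_freq="1d" enabled=".TRUE.">
         <field field_ref="toce" name="thetao" />
      </file>
   </file_definition>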

-----
Modes
-----

Detached Mode
-------------

In detached mode the XIOS executable runs on separate cores from the NEMO model.
This is the recommended method for using XIOS in realistic model runs.
To use this mode, set ``using_server`` to ``true`` at the bottom of the ``iodef.xml`` file:

.. code-block:: xml

   <variable id="using_server" type="boolean">true</variable>

Make sure there is a copy of (or link to) your XIOS executable in the working directory, and
allocate processors to XIOS in your job submission script.
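
How processors are allocated to XIOS depends on your MPI implementation and batch system;
a typical MPMD launch line might look like the following (the core counts and executable paths are
placeholders to adapt to your setup):

.. code-block:: console

   $ mpirun -np 40 ./nemo : -np 2 ./xios_server.exe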

Attached Mode
-------------

In attached mode XIOS runs on each of the cores used by NEMO.
This method is less efficient than the detached mode, but can be more convenient for testing or
with small configurations.
To activate this mode, simply set ``using_server`` to false in the ``iodef.xml`` file

.. code-block:: xml

   <variable id="using_server" type="boolean">false</variable>

and don't allocate any cores to XIOS.
Note that, due to the different domain decompositions between XIOS and NEMO, if
the total number of cores is larger than the number of grid points in the j direction, the model run will fail.

------------------------------
Adding new diagnostics to NEMO
------------------------------

If you want to add a new diagnostic to the NEMO code, you will need to do the following:

1. Add any necessary code to calculate your new diagnostic in NEMO.
2. Send the field to XIOS using ``CALL iom_put( 'field_id', variable )``, where ``field_id`` is a unique id for
   your new diagnostic and ``variable`` is the fortran variable containing the data.
   This should be called at every model timestep, regardless of how often you want to output the field.
   No time averaging should be done in the model code.
3. If it is computationally expensive to calculate your new diagnostic, you should also use ``iom_use`` to
   determine whether it is requested in the current model run. For example,

   .. code-block:: fortran

      IF( iom_use('field_id') ) THEN
         ! Some expensive computation
         !...
         !...
         CALL iom_put('field_id', variable)
      ENDIF

4. Add a variable definition to the ``field_def.xml`` (or ``field_def-???.xml``) file.
5. Add the variable to the ``iodef.xml`` or ``file_definition.xml`` file.
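
Steps 4 and 5 each amount to one ``field`` element; the fragment below is a hypothetical example
(the ``long_name``, ``unit`` and ``grid_ref`` values must match your variable and grid):

.. code-block:: xml

   <!-- step 4: in field_def.xml (or field_def-oce.xml, etc.) -->
   <field id="field_id" long_name="My new diagnostic" unit="m/s" grid_ref="grid_T_2D" />

   <!-- step 5: inside a <file> element of a file_definition.xml file -->
   <field field_ref="field_id" />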

.. _NEMO reference manual:   http://forge.ipsl.jussieu.fr/nemo/doxygen/index.html?doc=NEMO
.. _XIOS documentation:      http://forge.ipsl.jussieu.fr/ioserver/wiki/documentation
.. _XIOS users mailing list: http://forge.ipsl.jussieu.fr/mailman/listinfo.cgi/xios-users