New URL for NEMO forge!   http://forge.nemo-ocean.eu

Since March 2022, along with the NEMO 4.2 release, code development has moved to a self-hosted GitLab.
This forge is now archived and remains online for reference.
Changeset 10201 for NEMO/trunk/doc/rst/source – NEMO

Timestamp:
2018-10-18T12:53:04+02:00
Author:
nicolasmartin
Message:

Various modifications related to the setting of a NEMO Quick Start Guide

  • Add missing namelist blocks from ICE and TOP
  • Create a hidden .global.rst to gather common URL links
  • Convert animated gif to frames images for PDF export
  • Place different README.rst appropriately in the code structure and refer to them with symbolic links in doc/rst/source
Location:
NEMO/trunk/doc/rst/source
Files:
114 added
1 deleted
3 edited

  • NEMO/trunk/doc/rst/source/NEMO_guide.rst

    r10186 r10201  
    44   contain the root `toctree` directive. 
    55 
    6 ================= 
     6################# 
    77Quick Start Guide 
    8 ================= 
     8################# 
     9 
     10.. 
     11   A hidden .global.rst should be included in every subfiles with `include` directive 
     12   It contains a list of common URL links  
     13      
     14.. include:: .global.rst 
     15 
     16.. include:: readme.rst 
     17 
     18Summary 
     19======= 
    920 
    1021.. toctree:: 
     
    1829   setup_configuration.rst 
    1930   interfacing_options.rst 
    20    references.rst 
     31   definitions.rst 
    2132 
    22 .. include:: readme.rst 
     33.. 
     34   For headings markup, this convention is recommended from Python’s Style Guide 
     35   # with overline, for parts 
     36   * with overline, for chapters 
     37   =, for sections 
     38   -, for subsections 
     39   ^, for subsubsections 
     40   ", for paragraphs 
    2341 
    24 Indices and tables 
    25 ================== 
    26  
    27 * :ref:`genindex` 
    28 * :ref:`modindex` 
    29 * :ref:`search` 
     42.. 
     43   Indices and tables 
     44   ================== 
     45   * :ref:`genindex` 
     46   * :ref:`modindex` 
     47   * :ref:`search` 
  • NEMO/trunk/doc/rst/source/conf.py

    r10186 r10201  
    2525 
    2626# The short X.Y version 
    27 version = '' 
     27version = '4.0' 
    2828# The full version, including alpha/beta/rc tags 
    29 release = '4.0' 
     29release = '4.0rc' 
    3030 
    3131 
     
    3939# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom 
    4040# ones. 
    41 extensions = [ 
    42 ] 
     41extensions = ['sphinxcontrib.bibtex'] 
    4342 
    4443# Add any paths that contain templates here, relative to this directory. 
  • NEMO/trunk/doc/rst/source/interfacing_options.rst

    r10186 r10201  
    33=================== 
    44 
     5.. include:: .global.rst 
     6 
    57.. contents:: 
    68   :local: 
    79   :depth: 1 
    8             
    9 Embedded zooms with AGRIF 
    10 ========================= 
    1110 
    12 .. contents:: 
    13    :local: 
    14  
    15 -------- 
    16 Overview 
    17 -------- 
    18  
    19 AGRIF (Adaptive Grid Refinement In Fortran) is a library that allows the seamless space and time refinement over 
    20 rectangular regions in NEMO. 
    21 Refinement factors can be odd or even (usually lower than 5 to maintain stability). 
    22 Interaction between grids is "two-way" in the sense that the parent grid feeds the child grid open boundaries and 
    23 the child grid provides volume averages of prognostic variables once a given number of time steps is completed. 
    24 These pages provide guidelines on how to use AGRIF in NEMO. 
    25 For a more technical description of the library itself, please refer to http://agrif.imag.fr. 
    26  
    27 ----------- 
    28 Compilation 
    29 ----------- 
    30  
    31 Activating AGRIF requires appending the cpp key ``key_agrif`` at compilation time:  
    32  
    33 .. code-block:: sh 
    34  
    35    ./makenemo add_key 'key_agrif' 
    36  
    37 Although this is transparent to users, the way the code is processed during compilation is different from 
    38 the standard case: 
    39 a preprocessing stage (the so-called "conv" program) translates the actual code so that 
    40 saved arrays may be switched in memory space from one domain to another. 
    41  
    42 -------------------------------- 
    43 Definition of the grid hierarchy 
    44 -------------------------------- 
    45  
    46 An additional text file ``AGRIF_FixedGrids.in`` is required at run time. 
    47 This is where the grid hierarchy is defined. 
    48 An example of such a file, here taken from the ``ICEDYN`` test case, is given below:: 
    49  
    50    1 
    51    34 63 34 63 3 3 3 
    52    0 
    53  
    54 The first line indicates the number of zooms (1). 
    55 The second line contains the starting and ending indices in both directions on the root grid 
    56 (imin=34 imax=63 jmin=34 jmax=63) followed by the space and time refinement factors (3 3 3). 
    57 The last line is the number of child grids nested in the refined region (0). 
    58 A more complex example with telescoping grids can be found below and 
    59 in the ``AGRIF_DEMO`` reference configuration directory. 
    60  
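
    A sketch of a telescoping hierarchy (the indices and factors below are illustrative, not the actual ``AGRIF_DEMO`` values): the format is recursive, each grid definition line being followed by the number of grids nested inside it.

    ```
    1
    34 63 34 63 3 3 3
    1
    10 20 10 20 3 3 3
    0
    ```

    Here the root grid holds one child, which itself holds one grandchild defined in the child's index space, and the final 0 closes the recursion.
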
    61 [Add some plots here with grid staggering and positioning ?] 
    62  
    63 When creating the nested domain, one must keep in mind that the child domain is shifted toward the north-east and 
    64 depends on the number of ghost cells, as illustrated by the (attempted) drawing below for nbghostcells=1 and 
    65 nbghostcells=3. 
    66 The grid refinement is 3 and nxfin is the number of child grid points in i-direction.   
    67  
    68 .. image:: _static/agrif_grid_position.jpg 
    69  
    70 Note that rectangular regions must be defined so that they are connected to a single parent grid. 
    71 Hence, defining overlapping grids with the same refinement ratio will not work properly, 
    72 since boundary data exchange and update are only performed between root and child grids. 
    73 Use of east-west periodic or north-fold boundary conditions is not allowed in child grids either. 
    74 Defining for instance a circumpolar zoom in a global model is therefore not possible.  
    75  
    76 ------------- 
    77 Preprocessing 
    78 ------------- 
    79  
    80 Knowing the refinement factors and area, a ``NESTING`` pre-processing tool can help create the needed input files 
    81 (mesh file, restart, climatological and forcing files). 
    82 The key is to ensure volume matching near the child grid interface, 
    83 a step done by invoking the ``Agrif_create_bathy.exe`` program. 
    84 You may use the namelists provided in the ``NESTING`` directory as a guide. 
    85 These correspond to the namelists used to create ``AGRIF_DEMO`` inputs. 
    86  
    87 ---------------- 
    88 Namelist options 
    89 ---------------- 
    90  
    91 Each child grid expects to read its own namelist so that different numerical choices can be made 
    92 (these should be stored in the form ``1_namelist_cfg``, ``2_namelist_cfg``, etc., according to their rank in 
    93 the grid hierarchy). 
    94 Time steps and numbers of steps consistent with the chosen time refinement have to be provided. 
    95 Specific to AGRIF is the following block: 
    96  
    97 .. code-block:: fortran 
    98  
    99    !----------------------------------------------------------------------- 
    100    &namagrif      !  AGRIF zoom                                            ("key_agrif") 
    101    !----------------------------------------------------------------------- 
    102       ln_spc_dyn    = .true.  !  use 0 as special value for dynamics 
    103       rn_sponge_tra = 2880.   !  coefficient for tracer   sponge layer [m2/s] 
    104       rn_sponge_dyn = 2880.   !  coefficient for dynamics sponge layer [m2/s] 
    105       ln_chk_bathy  = .false. !  =T  check the parent bathymetry 
    106    /              
    107  
    108 where sponge layer coefficients have to be chosen according to the child grid mesh size. 
    109 The sponge area is hard-coded in NEMO and applies over the following grid points: 
    110 2 x refinement factor (from i=1+nbghostcells+1 to i=1+nbghostcells+sponge_area)  
    111  
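
    To make the extent rule above concrete, here is a small arithmetic sketch in shell (the refinement factor and nbghostcells values are illustrative):

    ```shell
    # Sponge extent rule: sponge_area = 2 x refinement factor.
    # With a refinement factor of 3 and nbghostcells=1 (illustrative values),
    # the sponge covers i = 1+nbghostcells+1 .. 1+nbghostcells+sponge_area.
    rfactor=3
    nbghostcells=1
    sponge_area=$((2 * rfactor))
    imin=$((1 + nbghostcells + 1))
    imax=$((1 + nbghostcells + sponge_area))
    echo "sponge spans i=$imin to i=$imax"
    ```

    With these values the sponge spans i=3 to i=8.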
    112 ----------    
    113 References 
    114 ---------- 
    115  
    116 `Debreu, L., P. Marchesiello, P. Penven and G. Cambon, 2012: Two-way nesting in split-explicit ocean models: Algorithms, implementation and validation. Ocean Modelling, 49-50, 1-21. <http://doi.org/10.1016/j.ocemod.2012.03.003>`_ 
    117  
    118 `Penven, P., L. Debreu, P. Marchesiello and J. C. Mc Williams, 2006: Evaluation and application of the ROMS 1-way embedding procedure to the central california upwelling system. Ocean Modelling, 12, 157-187. <http://doi.org/10.1016/j.ocemod.2005.05.002>`_ 
    119  
    120 `Spall, M. A. and W. R. Holland, 1991: A Nested Primitive Equation Model for Oceanic Applications. J. Phys. Ocean., 21, 205-220. <https://doi.org/10.1175/1520-0485(1991)021\<0205:ANPEMF\>2.0.CO;2>`_ 
     11.. include:: zooms.rst 
    12112 
    12213---- 
    12314 
    124 Online biogeochemistry coarsening 
    125 ================================== 
    126  
    127 .. contents:: 
    128    :local: 
    129  
    130 .. role:: underline  
    131    :class: underline 
    132  
    133 ------------ 
    134 Presentation 
    135 ------------ 
    136  
    137 A capability to coarsen the physics forcing a BGC model coupled to NEMO has been developed. 
    138 This capability allows running 'online' a BGC model coupled to OCE-SI3 at a lower resolution, 
    139 reducing the CPU cost of the BGC model while preserving the effective resolution of the dynamics. 
    140  
    141 A presentation of the methodology is available [attachment:crs_wiki_1.1.pdf here]. 
    142  
    143 ----------------------------------------------------- 
    144 What is available and working for now in this version 
    145 ----------------------------------------------------- 
    146  
    147 [To be completed] 
    148  
    149 ---------------------------------------------- 
    150 Description of the successful validation tests 
    151 ---------------------------------------------- 
    152  
    153 [To be completed] 
    154  
    155 ------------------------------------------------------------------ 
    156 What is not working yet with online coarsening of biogeochemistry 
    157 ------------------------------------------------------------------ 
    158  
    159 [To be completed] 
    160  
    161 ''should include a precise explanation of MPI decomposition problems too'' 
    162  
    163 --------------------------------------------- 
    164 How to set up and use online biogeochemistry 
    165 --------------------------------------------- 
    166  
    167 :underline:`How to activate coarsening?` 
    168  
    169 To activate the coarsening, ``key_crs`` should be added to the list of CPP keys. 
    170 This key only activates the coarsening of dynamics. 
    171  
    172 Some parameters are available in the namelist_cfg: 
    173  
    174 .. code-block:: fortran 
    175  
    176    !----------------------------------------------------------------------- 
    177    &namcrs        !   passive tracer coarsened online simulations 
    178       nn_factx    = 3         !  Reduction factor of x-direction 
    179       nn_facty    = 3         !  Reduction factor of y-direction 
    180       nn_msh_crs  = 0         !  create (=1) a mesh file or not (=0) 
    181       nn_crs_kz   = 3         ! 0, volume-weighted MEAN of KZ 
    182                               ! 1, MAX of KZ 
    183                               ! 2, MIN of KZ 
    184                               ! 3, 10^(MEAN(LOG(KZ))) 
    185                               ! 4, MEDIAN of KZ 
    186       ln_crs_wn   = .false.   ! wn coarsened (T) or computed using horizontal divergence (F) 
    187                               ! 
    188       ln_crs_top  = .true.    ! online coarsening for the biogeochemistry 
    189    / 
    190  
    191 - Only ``nn_factx = 3`` is available and the coarsening only works for grids with a T-pivot point for 
    192   the north-fold lateral boundary condition (ORCA025, ORCA12, ORCA36, ...). 
    193 - ``nn_msh_crs = 1`` will activate the generation of the coarsened grid meshmask. 
    194 - ``nn_crs_kz`` is the operator to coarsen the vertical mixing coefficient.  
    195 - ``ln_crs_wn`` 
    196  
    197   - when ``key_vvl`` is activated, this logical has no effect; 
    198     the coarsened vertical velocities are computed using horizontal divergence. 
    199   - when ``key_vvl`` is not activated, 
    200  
    201     - coarsened vertical velocities are computed using horizontal divergence (``ln_crs_wn = .false.``)  
    202     - or coarsened vertical velocities are computed with an average operator (``ln_crs_wn = .true.``) 
    203 - ``ln_crs_top = .true.`` should be activated to run the BGC model in coarsened space; 
    204   it only works when ``key_top`` is in the cpp list, possibly together with ``key_pisces`` or ``key_my_trc``. 
    205  
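
    For a sense of the cost reduction only (the parent grid size here is hypothetical, not a supported configuration), the reduction factors shrink the horizontal grid seen by the BGC model as follows:

    ```shell
    # Illustrative arithmetic: a hypothetical 360 x 180 parent grid with
    # nn_factx = nn_facty = 3 gives a 120 x 60 coarsened grid,
    # i.e. 9 times fewer horizontal points for the BGC model.
    nn_factx=3
    nn_facty=3
    jpi_parent=360
    jpj_parent=180
    jpi_crs=$((jpi_parent / nn_factx))
    jpj_crs=$((jpj_parent / nn_facty))
    echo "coarsened grid: ${jpi_crs} x ${jpj_crs}"
    ```
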
    206 :underline:`Choice of operator to coarsen KZ` 
    207  
    208 A sensitivity test has been done with an Age tracer to compare the different operators. 
    209 Options 3 and 4 seem to provide the best results. 
    210  
    211 Some results can be found [xxx here] 
    212  
    213 :underline:`Example of xml files to output coarsened variables with XIOS` 
    214  
    215 In the [attachment:iodef.xml iodef.xml] file, a "nemo" context is defined and 
    216 some variables defined in [attachment:file_def.xml file_def.xml] are written on the ocean-dynamics grid. 
    217 To write variables on the coarsened grid, and in particular the passive tracers, 
    218 a "nemo_crs" context should be defined in [attachment:iodef.xml iodef.xml] and 
    219 the associated variables are listed in [attachment:file_crs_def.xml file_crs_def.xml]. 
    220  
    221 :underline:`Passive tracer initial conditions` 
    222  
    223 When initial conditions are provided in NetCDF files, the fields might be: 
    224  
    225 - on the coarsened grid 
    226 - or they can be on another grid and 
    227   interpolated `on-the-fly <http://forge.ipsl.jussieu.fr/nemo/wiki/Users/SetupNewConfiguration/Weight-creator>`_. 
    228   Example of a namelist for PISCES: 
    229    
    230    .. code-block:: fortran 
    231  
    232       !----------------------------------------------------------------------- 
    233       &namtrc_dta      !    Initialisation from data input file 
    234       !----------------------------------------------------------------------- 
    235       ! 
    236          sn_trcdta(1)  = 'DIC_REG1'        ,        -12        ,  'DIC'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    237          sn_trcdta(2)  = 'ALK_REG1'        ,        -12        ,  'ALK'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    238          sn_trcdta(3)  = 'O2_REG1'         ,        -1         ,  'O2'      ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    239          sn_trcdta(5)  = 'PO4_REG1'        ,        -1         ,  'PO4'     ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    240          sn_trcdta(7)  = 'Si_REG1'         ,        -1         ,  'Si'      ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    241          sn_trcdta(10) = 'DOC_REG1'        ,        -12        ,  'DOC'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    242          sn_trcdta(14) = 'Fe_REG1'         ,        -12        ,  'Fe'      ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    243          sn_trcdta(23) = 'NO3_REG1'        ,        -1         ,  'NO3'     ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
    244          rn_trfac(1)   =   1.0e-06  !  multiplicative factor 
    245          rn_trfac(2)   =   1.0e-06  !  -      -      -     - 
    246          rn_trfac(3)   =  44.6e-06  !  -      -      -     - 
    247          rn_trfac(5)   = 122.0e-06  !  -      -      -     - 
    248          rn_trfac(7)   =   1.0e-06  !  -      -      -     - 
    249          rn_trfac(10)  =   1.0e-06  !  -      -      -     - 
    250          rn_trfac(14)  =   1.0e-06  !  -      -      -     - 
    251          rn_trfac(23)  =   7.6e-06  !  -      -      -     - 
    252        
    253          cn_dir        =  './'      !  root directory for the location of the data files 
    254  
    255 :underline:`PISCES forcing files` 
    256  
    257 They might be on the coarsened grid. 
    258  
    259 :underline:`Perspectives` 
    260  
    261 For the future, a few options are on the table to implement coarsening for biogeochemistry in 4.0 and 
    262 future releases. 
    263 These will be discussed in autumn 2018. 
     15.. include:: coarsening.rst 
    26416 
    26517---- 
    26618 
    267 Coupling with other models (OASIS, SAS, ...) 
    268 ============================================ 
    269  
    270 NEMO currently exploits OASIS-3-MCT to implement a generalised coupled interface 
    271 (`Coupled Formulation <http://forge.ipsl.jussieu.fr/nemo/doxygen/node50.html?doc=NEMO>`_). 
    272 It can be used to interface with most European atmospheric GCMs (ARPEGE, ECHAM, ECMWF, HadAM, HadGAM, LMDz), 
    273 as well as to WRF (Weather Research and Forecasting Model), and to implement the coupling of 
    274 two independent NEMO components, ocean on one hand and sea-ice plus other surface processes on the other hand 
    275 (`Standalone Surface Module - SAS <http://forge.ipsl.jussieu.fr/nemo/doxygen/node46.html?doc=NEMO>`_). 
    276  
    277 To enable the OASIS interface the required compilation key is ``key_oasis3``. 
    278 The parameters to set are in section ``namsbc_cpl`` and, in case SAS is used, also in section ``namsbc_sas``. 
     19.. include:: coupling.rst 
    27920 
    28021---- 
    28122 
    282 With data assimilation 
    283 ====================== 
    284  
    285 .. contents:: 
    286    :local: 
    287  
    288 The assimilation interface to NEMO is split into three modules: 
    289 - OBS for the observation operator; 
    290 - ASM for the application of increments and model bias correction (based on the assimilation increments); 
    291 - TAM for the tangent linear and adjoint model. 
    292  
    293 Please see the `NEMO reference manual`_ for more details including information about the input file formats and 
    294 the namelist settings. 
    295  
    296 -------------------------------------- 
    297 Observation and model comparison (OBS) 
    298 -------------------------------------- 
    299  
    300 The observation and model comparison code (OBS) reads in observation files (profile temperature and salinity, 
    301 sea surface temperature, sea level anomaly, sea ice concentration, and velocity) and 
    302 calculates an interpolated model equivalent value at the observation location and nearest model timestep. 
    303 The resulting data are saved in a feedback file (or files). 
    304 The code was originally developed for use with the NEMOVAR data assimilation code, but 
    305 can be used for validation or verification of the model, or by any other data assimilation system. 
    306 This is all controlled by the namelist. 
    307 To build with the OBS code active ``key_diaobs`` must be set.  
    308  
    309 More details in the `NEMO reference manual`_ chapter 12. 
    310  
    311 Standalone observation operator (SAO) 
    312 ------------------------------------- 
    313  
    314 The OBS code can also be run after a model run using saved NEMO model data. 
    315 This is accomplished using the standalone observation operator (SAO) 
    316 (previously known as the offline observation operator). 
    317  
    318 To build the SAO use makenemo. 
    319 This means compiling NEMO once (in the normal way) for the chosen configuration. 
    320 Then include ``SAO`` at the end of the relevant line in the ``cfg.txt`` file. 
    321 Then recompile with the replacement main program in ``./src/SAO``. 
    322 This is a special version of ``nemogcm.F90`` which doesn't run the model, but reads in the model fields and 
    323 observations, and runs the OBS code. 
    324 See section 12.4 of the `NEMO reference manual`_. 
    325  
    326 ----------------------------------- 
    327 Apply assimilation increments (ASM) 
    328 ----------------------------------- 
    329  
    330 The ASM code adds the functionality to apply increments to the model variables: 
    331 temperature, salinity, sea surface height, velocity and sea ice concentration. 
    332 These are read into the model from a NetCDF file which may be produced by separate data assimilation code. 
    333 The code can also output model background fields which are used as an input to data assimilation code. 
    334 This is all controlled by the namelist ``nam_asminc``. 
    335 To build the ASM code ``key_asminc`` must be set. 
    336  
    337 More details in the `NEMO reference manual`_ chapter 13. 
    338  
    339 -------------------------------- 
    340 Tangent linear and adjoint (TAM) 
    341 -------------------------------- 
    342  
    343 This is the tangent linear and adjoint code of NEMO, which is useful for 4D-Var assimilation. 
     23.. include:: data_assimilation.rst 
    34424 
    34525---- 
    34626 
    347 Inputs-Outputs (using XIOS) 
    348 =========================== 
     27.. include:: input-output.rst 
    34928 
    350 .. contents:: 
    351    :local: 
     29---- 
    35230 
    353 | Output of diagnostics in NEMO is usually done using XIOS. 
    354   This is an efficient way of writing diagnostics because the time averaging, file writing and even 
    355   some simple arithmetic or regridding is carried out in parallel to the NEMO model run. 
    356 | This page gives a basic introduction to using XIOS with NEMO. 
    357   Much more information is available from the XIOS homepage and from the `NEMO reference manual`_. 
    358  
    359 Use of XIOS for diagnostics is activated using the pre-compiler key ``key_iomput``. 
    360 The default version of XIOS is the 2.0 release.  
    361  
    362 ------------------------------ 
    363 Extracting and installing XIOS 
    364 ------------------------------ 
    365  
    366 1. Install the NetCDF4 library. 
    367    If you want to use single file output you will need to compile the HDF & NetCDF libraries to allow parallel IO. 
    368 2. Download the version of XIOS that you wish to use. 
    369    The recommended version is now XIOS 2.0: 
    370     
    371    .. code-block:: console 
    372  
    373       $ svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/branchs/xios-2.0 xios-2.0 
    374  
    375    and follow the instructions in `XIOS documentation`_ to compile it. 
    376    If you find problems at this stage, support can be found by subscribing to the `XIOS users mailing list`_ and 
    377    sending a mail message to it.  
    378  
    379 --------- 
    380 Namelists 
    381 --------- 
    382  
    383 XIOS is controlled using xml input files that should be copied to your model run directory before 
    384 running the model. 
    385 The exact setup differs slightly between the 1.0 and 2.0 releases. 
    386  
    387 An ``iodef.xml`` file is still required in the run directory. 
    388 For XIOS 2.0 the ``field_def.xml`` file has been further split into ``field_def-oce.xml`` (for physics), 
    389 ``field_def-ice.xml`` (for ice) and ``field_def-bgc.xml`` (for biogeochemistry). 
    390 Also the definition of the output files has been moved from the ``iodef.xml`` file into 
    391 separate ``file_definition.xml`` files which are included in the ``iodef.xml`` file. 
    392 Note that the ``domain_def.xml`` file is also different for XIOS 2.0. 
    393  
    394 ----- 
    395 Modes 
    396 ----- 
    397  
    398 Detached Mode 
    399 ------------- 
    400  
    401 In detached mode the XIOS executable is executed on separate cores from the NEMO model. 
    402 This is the recommended method for using XIOS for realistic model runs. 
    403 To use this mode set ``using_server`` to ``true`` at the bottom of the ``iodef.xml`` file: 
    404  
    405 .. code-block:: xml 
    406  
    407    <variable id="using_server" type="boolean">true</variable> 
    408  
    409 Make sure there is a copy (or link to) your XIOS executable in the working directory and 
    410 in your job submission script allocate processors to XIOS. 
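
    A minimal job-script fragment for detached mode might look as follows; the XIOS installation path and the core counts are assumptions to adapt to your machine:

    ```shell
    # Link the XIOS server binary into the run directory
    # (the path below is a placeholder, not a real installation path).
    ln -sf /path/to/xios-2.0/bin/xios_server.exe .
    # Then launch NEMO and XIOS on separate cores with an MPMD command line,
    # e.g. (core counts are illustrative):
    #    mpirun -np 28 ./nemo : -np 4 ./xios_server.exe
    ```
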
    411  
    412 Attached Mode 
    413 ------------- 
    414  
    415 In attached mode XIOS runs on each of the cores used by NEMO. 
    416 This method is less efficient than the detached mode but can be more convenient for testing or 
    417 with small configurations. 
    418 To activate this mode simply set ``using_server`` to false in the ``iodef.xml`` file 
    419  
    420 .. code-block:: xml 
    421  
    422    <variable id="using_server" type="boolean">false</variable> 
    423  
    424 and don't allocate any cores to XIOS. 
    425 Note that, due to the different domain decompositions between XIOS and NEMO, if 
    426 the total number of cores is larger than the number of grid points in the j direction then the model run will fail. 
    427  
    428 ------------------------------ 
    429 Adding new diagnostics to NEMO 
    430 ------------------------------ 
    431  
    432 If you want to add a new diagnostic to the NEMO code you will need to do the following: 
    433  
    434 1. Add any necessary code to calculate your new diagnostic in NEMO. 
    435 2. Send the field to XIOS using ``CALL iom_put( 'field_id', variable )`` where ``field_id`` is a unique id for 
    436    your new diagnostic and ``variable`` is the Fortran variable containing the data. 
    437    This should be called at every model timestep regardless of how often you want to output the field. 
    438    No time averaging should be done in the model code.  
    439 3. If it is computationally expensive to calculate your new diagnostic you should also use ``iom_use`` to 
    440    determine if it is requested in the current model run. For example, 
    441     
    442    .. code-block:: fortran 
    443  
    444       IF( iom_use('field_id') ) THEN 
    445          ! Some expensive computation 
    446          !... 
    447          !... 
    448          CALL iom_put('field_id', variable) 
    449       ENDIF 
    450  
    451 4. Add a variable definition to the ``field_def.xml`` (or ``field_def-???.xml``) file 
    452 5. Add the variable to the ``iodef.xml`` or ``file_definition.xml`` file. 
    453  
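
    Steps 4 and 5 can be sketched as follows; the ``id``s, ``grid_ref`` and file attributes are illustrative assumptions, not definitions shipped with NEMO:

    ```xml
    <!-- field_def.xml (step 4): declare the new diagnostic -->
    <field id="field_id" long_name="my new diagnostic" unit="m" grid_ref="grid_T_2D"/>

    <!-- file_definition.xml (step 5): request it in an output file -->
    <file id="file_mydiag" name="mydiag" output_freq="1d">
       <field field_ref="field_id"/>
    </file>
    ```
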
    454 .. _NEMO reference manual:   http://forge.ipsl.jussieu.fr/nemo/doxygen/index.html?doc=NEMO 
    455 .. _XIOS documentation:      http://forge.ipsl.jussieu.fr/ioserver/wiki/documentation 
    456 .. _XIOS users mailing list: http://forge.ipsl.jussieu.fr/mailman/listinfo.cgi/xios-users 
     31.. include:: tracers.rst 