New URL for NEMO forge!   http://forge.nemo-ocean.eu

Since March 2022, along with the NEMO 4.2 release, code development has moved to a self-hosted GitLab.
This forge is now archived and remains online for historical reference.

Changeset 10186 for NEMO/trunk


Timestamp:
2018-10-09T18:46:38+02:00 (6 years ago)
Author:
nicolasmartin
Message:

Preliminary implementation of a NEMO Quick Start Guide via RST files and Sphinx installation

Location:
NEMO/trunk/doc/rst
Files:
20 added
4 moved

  • NEMO/trunk/doc/rst/source/interfacing_options.rst

    r10182 r10186  
     1=================== 
     2Interfacing options 
     3=================== 
     4 
     5.. contents:: 
     6   :local: 
     7   :depth: 1 
     8            
     9Embedded zooms with AGRIF 
     10========================= 
     11 
     12.. contents:: 
     13   :local: 
     14 
     15-------- 
     16Overview 
     17-------- 
     18 
      19AGRIF (Adaptive Grid Refinement In Fortran) is a library that enables seamless space and time refinement over 
      20rectangular regions in NEMO. 
      21Refinement factors can be odd or even (usually lower than 5 to maintain stability). 
      22Interaction between grids is "two-way" in the sense that the parent grid feeds the child grid open boundaries and 
      23the child grid provides volume averages of prognostic variables once a given number of time steps is completed. 
      24These pages provide guidelines on how to use AGRIF in NEMO. 
     25For a more technical description of the library itself, please refer to http://agrif.imag.fr. 
     26 
     27----------- 
     28Compilation 
     29----------- 
     30 
      31Activating AGRIF requires appending the CPP key ``key_agrif`` at compilation time: 
     32 
     33.. code-block:: sh 
     34 
     35   ./makenemo add_key 'key_agrif' 
     36 
      37Although this is transparent to users, the way the code is processed during compilation differs from 
      38the standard case: 
      39a preprocessing stage (the so-called "conv" program) translates the code so that 
      40saved arrays may be switched in memory space from one domain to another. 
     41 
     42-------------------------------- 
     43Definition of the grid hierarchy 
     44-------------------------------- 
     45 
     46An additional text file ``AGRIF_FixedGrids.in`` is required at run time. 
     47This is where the grid hierarchy is defined. 
     48An example of such a file, here taken from the ``ICEDYN`` test case, is given below:: 
     49 
     50   1 
     51   34 63 34 63 3 3 3 
     52   0 
     53 
     54The first line indicates the number of zooms (1). 
     55The second line contains the starting and ending indices in both directions on the root grid 
     56(imin=34 imax=63 jmin=34 jmax=63) followed by the space and time refinement factors (3 3 3). 
      57The last line gives the number of child grids nested in the refined region (0). 
     58A more complex example with telescoping grids can be found below and 
     59in the ``AGRIF_DEMO`` reference configuration directory. 
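
As an illustration, a hypothetical telescoping hierarchy, in which the zoom above itself contains one further 1:3 nested grid, could read as follows (the indices of the inner grid are invented for this sketch)::

   1
   34 63 34 63 3 3 3
   1
   12 25 12 25 3 3 3
   0

Following the recursive format, the third line gives the number of grids nested within the first zoom (1), the next line their indices and refinement factors, and the last line the number of grids nested one level deeper (0).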
     60 
     61[Add some plots here with grid staggering and positioning ?] 
     62 
      63When creating the nested domain, one must keep in mind that the child domain is shifted toward the north-east and 
      64depends on the number of ghost cells, as illustrated by the drawing below for nbghostcells=1 and 
      65nbghostcells=3. 
      66The grid refinement factor is 3 and nxfin is the number of child grid points in the i-direction. 
     67 
     68.. image:: _static/agrif_grid_position.jpg 
     69 
      70Note that rectangular regions must be defined so that they are connected to a single parent grid. 
      71Hence, defining overlapping grids with the same refinement ratio will not work properly, 
      72since boundary data exchange and update are only performed between root and child grids. 
      73Use of east-west periodic or north-fold boundary conditions is not allowed in child grids either. 
      74Defining, for instance, a circumpolar zoom in a global model is therefore not possible. 
     75 
     76------------- 
     77Preprocessing 
     78------------- 
     79 
      80Knowing the refinement factors and area, the ``NESTING`` pre-processing tool can help to create the needed input files 
     81(mesh file, restart, climatological and forcing files). 
     82The key is to ensure volume matching near the child grid interface, 
     83a step done by invoking the ``Agrif_create_bathy.exe`` program. 
     84You may use the namelists provided in the ``NESTING`` directory as a guide. 
     85These correspond to the namelists used to create ``AGRIF_DEMO`` inputs. 
     86 
     87---------------- 
     88Namelist options 
     89---------------- 
     90 
      91Each child grid expects to read its own namelist so that different numerical choices can be made 
      92(these should be stored in the form ``1_namelist_cfg``, ``2_namelist_cfg``, etc. according to their rank in 
      93the grid hierarchy). 
      94Time steps and numbers of steps consistent with the chosen time refinement have to be provided. 
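
As an illustration of this consistency requirement, with a time refinement factor of 3 and a parent time step of 5400 s, the child namelist ``1_namelist_cfg`` would set (values purely illustrative)::

   rn_rdt = 1800.   !  child time step = parent time step / time refinement factor
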
     95Specific to AGRIF is the following block: 
     96 
     97.. code-block:: fortran 
     98 
     99   !----------------------------------------------------------------------- 
     100   &namagrif      !  AGRIF zoom                                            ("key_agrif") 
     101   !----------------------------------------------------------------------- 
     102      ln_spc_dyn    = .true.  !  use 0 as special value for dynamics 
     103      rn_sponge_tra = 2880.   !  coefficient for tracer   sponge layer [m2/s] 
     104      rn_sponge_dyn = 2880.   !  coefficient for dynamics sponge layer [m2/s] 
     105      ln_chk_bathy  = .false. !  =T  check the parent bathymetry 
     106   /              
     107 
      108where the sponge layer coefficients have to be chosen according to the child grid mesh size. 
      109The sponge area is hard-coded in NEMO and applies to the following grid points: 
      1102 x refinement factor (i.e. from i=1+nbghostcells+1 to i=1+nbghostcells+sponge_area) 
     111 
     112----------    
     113References 
     114---------- 
     115 
     116`Debreu, L., P. Marchesiello, P. Penven and G. Cambon, 2012: Two-way nesting in split-explicit ocean models: Algorithms, implementation and validation. Ocean Modelling, 49-50, 1-21. <http://doi.org/10.1016/j.ocemod.2012.03.003>`_ 
     117 
     118`Penven, P., L. Debreu, P. Marchesiello and J. C. Mc Williams, 2006: Evaluation and application of the ROMS 1-way embedding procedure to the central california upwelling system. Ocean Modelling, 12, 157-187. <http://doi.org/10.1016/j.ocemod.2005.05.002>`_ 
     119 
     120`Spall, M. A. and W. R. Holland, 1991: A Nested Primitive Equation Model for Oceanic Applications. J. Phys. Ocean., 21, 205-220. <https://doi.org/10.1175/1520-0485(1991)021\<0205:ANPEMF\>2.0.CO;2>`_ 
     121 
     122---- 
     123 
      124Online biogeochemistry coarsening 
     125================================== 
     126 
     127.. contents:: 
     128   :local: 
     129 
     130.. role:: underline  
     131   :class: underline 
     132 
     133------------ 
     134Presentation 
     135------------ 
     136 
      137A capacity to coarsen the physics used to force a BGC model coupled to NEMO has been developed. 
      138This capacity allows running 'online' a BGC model coupled to OCE-SI3 at a lower resolution, 
      139in order to reduce the CPU cost of the BGC model while preserving the effective resolution of the dynamics. 
      140 
      141A presentation of the methodology is available [attachment:crs_wiki_1.1.pdf here]. 
     142 
     143----------------------------------------------------- 
     144What is available and working for now in this version 
     145----------------------------------------------------- 
     146 
     147[To be completed] 
     148 
     149---------------------------------------------- 
     150Description of the successful validation tests 
     151---------------------------------------------- 
     152 
     153[To be completed] 
     154 
     155------------------------------------------------------------------ 
      156What is not working yet with online coarsening of biogeochemistry 
     157------------------------------------------------------------------ 
     158 
     159[To be completed] 
     160 
      161*A precise explanation of the MPI decomposition problems should also be included here.* 
     162 
     163--------------------------------------------- 
      164How to set up and use online biogeochemistry 
     165--------------------------------------------- 
     166 
     167:underline:`How to activate coarsening?` 
     168 
      169To activate the coarsening, ``key_crs`` should be added to the list of CPP keys. 
      170This key only activates the coarsening of dynamics. 
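
For instance, starting from a reference configuration (the configuration name below is illustrative):

.. code-block:: sh

   ./makenemo -r 'ORCA2_LIM3_PISCES' -n 'MY_CRS_CFG' add_key 'key_crs'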
     171 
      172Some parameters are available in the ``namelist_cfg``: 
     173 
     174.. code-block:: fortran 
     175 
      176   !----------------------------------------------------------------------- 
      177   &namcrs        !   passive tracer coarsened online simulations         ("key_crs") 
      178   !----------------------------------------------------------------------- 
      179      nn_factx    = 3         !  reduction factor of x-direction 
      180      nn_facty    = 3         !  reduction factor of y-direction 
      181      nn_msh_crs  = 0         !  create (=1) a mesh file or not (=0) 
      182      nn_crs_kz   = 3         !  0, volume-weighted MEAN of KZ 
      183                              !  1, MAX of KZ 
      184                              !  2, MIN of KZ 
      185                              !  3, 10^(MEAN(LOG(KZ))) 
      186                              !  4, MEDIAN of KZ 
      187      ln_crs_wn   = .false.   !  wn coarsened (T) or computed using horizontal divergence (F) 
      188      ln_crs_top  = .true.    !  coarsening online for the biogeochemistry 
      189   / 
     190 
     191- Only ``nn_factx = 3`` is available and the coarsening only works for grids with a T-pivot point for 
     192  the north-fold lateral boundary condition (ORCA025, ORCA12, ORCA36, ...). 
     193- ``nn_msh_crs = 1`` will activate the generation of the coarsened grid meshmask. 
     194- ``nn_crs_kz`` is the operator to coarsen the vertical mixing coefficient.  
     195- ``ln_crs_wn`` 
     196 
     197  - when ``key_vvl`` is activated, this logical has no effect; 
     198    the coarsened vertical velocities are computed using horizontal divergence. 
     199  - when ``key_vvl`` is not activated, 
     200 
     201    - coarsened vertical velocities are computed using horizontal divergence (``ln_crs_wn = .false.``)  
     202    - or coarsened vertical velocities are computed with an average operator (``ln_crs_wn = .true.``) 
      203- ``ln_crs_top = .true.``: should be activated to run the BGC model in coarsened space; 
      204  it thus only works when ``key_top`` is in the CPP list, possibly together with ``key_pisces`` or ``key_my_trc``. 
     205 
      206:underline:`Choice of operator to coarsen KZ` 
     207 
      208A sensitivity test has been done with an age tracer to compare the different operators. 
      209Options 3 and 4 seem to provide the best results. 
     210 
     211Some results can be found [xxx here] 
     212 
     213:underline:`Example of xml files to output coarsened variables with XIOS` 
     214 
      215In the [attachment:iodef.xml iodef.xml] file, a "nemo" context is defined and 
      216some variables defined in [attachment:file_def.xml file_def.xml] are written on the ocean-dynamics grid. 
      217To write variables on the coarsened grid, and in particular the passive tracers, 
      218a "nemo_crs" context should be defined in [attachment:iodef.xml iodef.xml] and 
      219the associated variables listed in [attachment:file_crs_def.xml file_crs_def.xml]. 
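
A minimal sketch of the extra context declaration (schematic only; the actual field and file definitions depend on your setup)::

   <context id="nemo_crs">
      ...   <!-- field and file definitions for the coarsened grid -->
   </context>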
     220 
      221:underline:`Passive tracer initial conditions` 
     222 
      223When initial conditions are provided in NetCDF files, the fields may be: 
      224 
      225- on the coarsened grid, 
      226- or on another grid, to be 
      227  interpolated `on-the-fly <http://forge.ipsl.jussieu.fr/nemo/wiki/Users/SetupNewConfiguration/Weight-creator>`_. 
      228  An example namelist for PISCES: 
     229   
     230   .. code-block:: fortran 
     231 
     232      !----------------------------------------------------------------------- 
     233      &namtrc_dta      !    Initialisation from data input file 
     234      !----------------------------------------------------------------------- 
     235      ! 
     236         sn_trcdta(1)  = 'DIC_REG1'        ,        -12        ,  'DIC'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     237         sn_trcdta(2)  = 'ALK_REG1'        ,        -12        ,  'ALK'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     238         sn_trcdta(3)  = 'O2_REG1'         ,        -1         ,  'O2'      ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     239         sn_trcdta(5)  = 'PO4_REG1'        ,        -1         ,  'PO4'     ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     240         sn_trcdta(7)  = 'Si_REG1'         ,        -1         ,  'Si'      ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     241         sn_trcdta(10) = 'DOC_REG1'        ,        -12        ,  'DOC'     ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     242         sn_trcdta(14) = 'Fe_REG1'         ,        -12        ,  'Fe'      ,    .false.   , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     243         sn_trcdta(23) = 'NO3_REG1'        ,        -1         ,  'NO3'     ,    .true.    , .true. , 'yearly'  , 'reshape_REG1toeORCA075_bilin.nc'       , ''   , '' 
     244         rn_trfac(1)   =   1.0e-06  !  multiplicative factor 
     245         rn_trfac(2)   =   1.0e-06  !  -      -      -     - 
     246         rn_trfac(3)   =  44.6e-06  !  -      -      -     - 
     247         rn_trfac(5)   = 122.0e-06  !  -      -      -     - 
     248         rn_trfac(7)   =   1.0e-06  !  -      -      -     - 
     249         rn_trfac(10)  =   1.0e-06  !  -      -      -     - 
     250         rn_trfac(14)  =   1.0e-06  !  -      -      -     - 
     251         rn_trfac(23)  =   7.6e-06  !  -      -      -     - 
     252       
     253         cn_dir        =  './'      !  root directory for the location of the data files 
     254 
     255:underline:`PISCES forcing files` 
     256 
      257They may also be provided on the coarsened grid. 
     258 
     259:underline:`Perspectives` 
     260 
      261For the future, a few options are on the table to implement coarsening for biogeochemistry in 4.0 and 
      262future releases. 
      263These will be discussed in Autumn 2018. 
     264 
     265---- 
     266 
     267Coupling with other models (OASIS, SAS, ...) 
     268============================================ 
     269 
      270NEMO currently exploits OASIS-3-MCT to implement a generalised coupled interface 
      271(`Coupled Formulation <http://forge.ipsl.jussieu.fr/nemo/doxygen/node50.html?doc=NEMO>`_). 
      272It can be used to interface with most European atmospheric GCMs (ARPEGE, ECHAM, ECMWF, HadAM, HadGAM, LMDz), 
      273as well as with WRF (Weather Research and Forecasting Model), and to implement the coupling of 
      274two independent NEMO components, ocean on one hand and sea ice plus other surface processes on the other 
      275(`Standalone Surface Module - SAS <http://forge.ipsl.jussieu.fr/nemo/doxygen/node46.html?doc=NEMO>`_). 
      276 
      277To enable the OASIS interface, the required compilation key is ``key_oasis3``. 
      278The parameters to set are in section ``namsbc_cpl`` and, when using SAS, also in section ``namsbc_sas``. 
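
A minimal sketch of enabling the interface at build time (the configuration names are illustrative):

.. code-block:: sh

   ./makenemo -r 'ORCA2_ICE_PISCES' -n 'MY_CPL_CFG' add_key 'key_oasis3'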
     279 
     280---- 
     281 
     282With data assimilation 
     283====================== 
     284 
     285.. contents:: 
     286   :local: 
     287 
      288The assimilation interface to NEMO is split into three modules: 

      289- OBS for the observation operator; 
      290- ASM for the application of increments and model bias correction (based on the assimilation increments); 
      291- TAM for the tangent linear and adjoint model. 
     292 
     293Please see the `NEMO reference manual`_ for more details including information about the input file formats and 
     294the namelist settings. 
     295 
     296-------------------------------------- 
     297Observation and model comparison (OBS) 
     298-------------------------------------- 
     299 
     300The observation and model comparison code (OBS) reads in observation files (profile temperature and salinity, 
     301sea surface temperature, sea level anomaly, sea ice concentration, and velocity) and 
      302calculates an interpolated model-equivalent value at the observation location and nearest model timestep. 
      303The resulting data are saved in a feedback file (or files). 
      304The code was originally developed for use with the NEMOVAR data assimilation code, but 
      305can be used for validation or verification of the model, or with any other data assimilation system. 
      306This is all controlled by the namelist. 
      307To build with the OBS code active, ``key_diaobs`` must be set. 
     308 
     309More details in the `NEMO reference manual`_ chapter 12. 
     310 
     311Standalone observation operator (SAO) 
     312------------------------------------- 
     313 
      314The OBS code can also be run after a model run, using saved NEMO model data. 
      315This is accomplished using the standalone observation operator (SAO), 
      316previously known as the offline observation operator. 
     317 
      318To build the SAO, use makenemo. 
      319This means compiling NEMO once (in the normal way) for the chosen configuration. 
      320Then include ``SAO`` at the end of the relevant line in the ``cfg.txt`` file, 
      321and recompile with the replacement main program in ``./src/SAO``. 
      322This is a special version of ``nemogcm.F90`` which does not run the model, but reads in the model fields and 
      323observations and runs the OBS code. 
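
For example, the relevant line in ``cfg.txt`` might become (a sketch; the component list varies with the configuration)::

   ORCA2_ICE_PISCES OCE TOP ICE SAO
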
     324See section 12.4 of the `NEMO reference manual`_. 
     325 
     326----------------------------------- 
     327Apply assimilation increments (ASM) 
     328----------------------------------- 
     329 
     330The ASM code adds the functionality to apply increments to the model variables: 
     331temperature, salinity, sea surface height, velocity and sea ice concentration. 
      332These are read into the model from a NetCDF file, which may be produced by separate data assimilation code. 
      333The code can also output model background fields, which are used as an input to data assimilation code. 
      334This is all controlled by the namelist ``nam_asminc``. 
      335To build the ASM code, ``key_asminc`` must be set. 
     336 
     337More details in the `NEMO reference manual`_ chapter 13. 
     338 
     339-------------------------------- 
     340Tangent linear and adjoint (TAM) 
     341-------------------------------- 
     342 
      343This is the tangent linear and adjoint code of NEMO, which is useful for 4D-VAR assimilation. 
     344 
     345---- 
     346 
     347Inputs-Outputs (using XIOS) 
     348=========================== 
     349 
     350.. contents:: 
     351   :local: 
     352 
     353| Output of diagnostics in NEMO is usually done using XIOS. 
     354  This is an efficient way of writing diagnostics because the time averaging, file writing and even 
     355  some simple arithmetic or regridding is carried out in parallel to the NEMO model run. 
     356| This page gives a basic introduction to using XIOS with NEMO. 
      357  Much more information is available from the XIOS homepage and from the `NEMO reference manual`_. 
     358 
     359Use of XIOS for diagnostics is activated using the pre-compiler key ``key_iomput``. 
     360The default version of XIOS is the 2.0 release.  
     361 
     362------------------------------ 
     363Extracting and installing XIOS 
     364------------------------------ 
     365 
     3661. Install the NetCDF4 library. 
      367   If you want to use single-file output, you will need to compile the HDF and NetCDF libraries with parallel I/O support. 
     3682. Download the version of XIOS that you wish to use. 
     369   The recommended version is now XIOS 2.0: 
     370    
     371   .. code-block:: console 
     372 
      373      $ svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/branchs/xios-2.0 xios-2.0 
     374 
     375   and follow the instructions in `XIOS documentation`_ to compile it. 
     376   If you find problems at this stage, support can be found by subscribing to the `XIOS users mailing list`_ and 
     377   sending a mail message to it.  
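
   Compilation itself typically goes through the ``make_xios`` script shipped with XIOS; the architecture name below is site-specific and purely illustrative:

   .. code-block:: console

      $ cd xios-2.0
      $ ./make_xios --arch MY_ARCH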
     378 
     379--------- 
     380Namelists 
     381--------- 
     382 
     383XIOS is controlled using xml input files that should be copied to your model run directory before 
     384running the model. 
     385The exact setup differs slightly between 1.0 and 2.0 releases. 
     386 
     387An ``iodef.xml`` file is still required in the run directory. 
     388For XIOS 2.0 the ``field_def.xml`` file has been further split into ``field_def-oce.xml`` (for physics), 
     389``field_def-ice.xml`` (for ice) and ``field_def-bgc.xml`` (for biogeochemistry). 
     390Also the definition of the output files has been moved from the ``iodef.xml`` file into 
     391separate ``file_definition.xml`` files which are included in the ``iodef.xml`` file. 
     392Note that the ``domain_def.xml`` file is also different for XIOS 2.0. 
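
For instance, the inclusion can be sketched as follows in ``iodef.xml`` (the file name follows the description above; exact attributes depend on your setup)::

   <file_definition src="./file_definition.xml"/>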
     393 
     394----- 
     395Modes 
     396----- 
     397 
     398Detached Mode 
     399------------- 
     400 
     401In detached mode the XIOS executable is executed on separate cores from the NEMO model. 
     402This is the recommended method for using XIOS for realistic model runs. 
     403To use this mode set ``using_server`` to ``true`` at the bottom of the ``iodef.xml`` file: 
     404 
     405.. code-block:: xml 
     406 
     407   <variable id="using_server" type="boolean">true</variable> 
     408 
      409Make sure there is a copy of (or link to) your XIOS executable in the working directory, and 
      410allocate processors to XIOS in your job submission script. 
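
For example, with a generic MPI launcher, a run with NEMO on 120 cores and XIOS on 8 separate cores might be launched as follows (core counts and launcher syntax are illustrative and site-specific):

.. code-block:: console

   $ mpirun -np 120 ./nemo : -np 8 ./xios_server.exe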
     411 
     412Attached Mode 
     413------------- 
     414 
     415In attached mode XIOS runs on each of the cores used by NEMO. 
     416This method is less efficient than the detached mode but can be more convenient for testing or 
     417with small configurations. 
     418To activate this mode simply set ``using_server`` to false in the ``iodef.xml`` file 
     419 
     420.. code-block:: xml 
     421 
     422   <variable id="using_server" type="boolean">false</variable> 
     423 
     424and don't allocate any cores to XIOS. 
      425Note that, due to the different domain decompositions between XIOS and NEMO, if 
      426the total number of cores is larger than the number of grid points in the j-direction then the model run will fail. 
     427 
     428------------------------------ 
     429Adding new diagnostics to NEMO 
     430------------------------------ 
     431 
      432If you want to add a new diagnostic to the NEMO code, you will need to do the following: 
      433 
      4341. Add any necessary code to calculate your new diagnostic in NEMO. 
      4352. Send the field to XIOS using ``CALL iom_put( 'field_id', variable )``, where ``field_id`` is a unique id for 
      436   your new diagnostic and ``variable`` is the Fortran variable containing the data. 
      437   This should be called at every model timestep, regardless of how often you want to output the field. 
      438   No time averaging should be done in the model code. 
      4393. If it is computationally expensive to calculate your new diagnostic, you should also use ``iom_use`` to 
      440   determine whether it is requested in the current model run. For example, 
      441    
      442   .. code-block:: fortran 
      443 
      444      IF( iom_use('field_id') ) THEN 
      445         ! Some expensive computation 
      446         ! ... 
      447         ! ... 
      448         CALL iom_put( 'field_id', variable ) 
      449      ENDIF 
     450 
      4514. Add a variable definition to the ``field_def.xml`` (or ``field_def-???.xml``) file. 
      4525. Add the variable to the ``iodef.xml`` or ``file_definition.xml`` file, as sketched below. 
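
As an illustration, the two additions might be sketched as follows (attribute values are hypothetical and depend on your variable and desired output)::

   <!-- field_def.xml -->
   <field id="field_id" long_name="My new diagnostic" unit="W/m2" grid_ref="grid_T_2D"/>

   <!-- file_definition.xml -->
   <file id="my_file" output_freq="1d">
      <field field_ref="field_id"/>
   </file>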
     453 
     454.. _NEMO reference manual:   http://forge.ipsl.jussieu.fr/nemo/doxygen/index.html?doc=NEMO 
     455.. _XIOS documentation:      http://forge.ipsl.jussieu.fr/ioserver/wiki/documentation 
     456.. _XIOS users mailing list: http://forge.ipsl.jussieu.fr/mailman/listinfo.cgi/xios-users 
  • NEMO/trunk/doc/rst/source/reference_configurations.rst

    r10182 r10186  
     1===================== 
     2Build a configuration 
     3===================== 
     4 
     5.. contents:: 
     6   :local: 
     7   :depth: 1 
     8       
     9.. role:: underline  
     10   :class: underline 
     11 
     12Official configurations 
     13======================= 
     14 
     15| NEMO is distributed with some reference configurations allowing both the user to set up a first application and 
     16  the developer to validate their developments. 
     17| :underline:`The NEMO System Team is in charge of these configurations`. 
     18 
     19+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     20|                      | OPA | SI3 | TOP | PISCES | AGRIF | Inputs                        | 
     21+======================+=====+=====+=====+========+=======+===============================+ 
     22| `AGRIF_DEMO`_        |  X  |  X  |     |        |   X   | - `AGRIF_DEMO_v4.0.tar`_      | 
     23|                      |     |     |     |        |       | - `ORCA2_ICE_v4.0.tar`_       | 
     24+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     25| `AMM12`_             |  X  |     |     |        |       | `AMM12_v4.0.tar`_             | 
     26+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     27| `C1D_PAPA`_          |  X  |     |     |        |       | `INPUTS_C1D_PAPA_v4.0.tar`_   | 
     28+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     29| `GYRE_BFM`_          |  X  |     |  X  |        |       | ``-``                         | 
     30+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     31| `GYRE_PISCES`_       |  X  |     |  X  |   X    |       | ``-``                         | 
     32+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     33| `ORCA2_ICE_PISCES`_  |  X  |  X  |  X  |   X    |       | - `ORCA2_ICE_v4.0.tar`_       | 
     34|                      |     |     |     |        |       | - `INPUTS_PISCES_v4.0.tar`_   | 
     35+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     36| `ORCA2_OFF_PISCES`_  |     |     |  X  |   X    |       | - `INPUTS_PISCES_v4.0.tar`_   | 
     37|                      |     |     |     |        |       | - `ORCA2_OFF_v4.0.tar`_       | 
     38+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     39| `ORCA2_OFF_TRC`_     |     |     |  X  |        |       | `ORCA2_OFF_v4.0.tar`_         | 
     40+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     41| `ORCA2_SAS_ICE`_     |     |  X  |     |        |       | - `ORCA2_ICE_v4.0.tar`_       | 
     42|                      |     |     |     |        |       | - `INPUTS_SAS_v4.0.tar`_      | 
     43+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     44| `SPITZ12`_           |  X  |  X  |     |        |       | `SPITZ12_v4.0.tar`_           | 
     45+----------------------+-----+-----+-----+--------+-------+-------------------------------+ 
     46 
     47---------- 
     48AGRIF_DEMO 
     49---------- 
     50 
     51.. image:: _static/AGRIF_DEMO.jpg 
     52 
     53``AGRIF_DEMO`` is based on the ``ORCA2_LIM3_PISCES`` global 2° configuration but 
     54it includes 3 online nested grids that demonstrate the overall capabilities of AGRIF in a realistic context, 
     55including nesting sea ice models. 
     56 
     57The configuration includes a 1:1 grid in the Pacific and two successively nested grids with odd and 
      58even refinement ratios over the Arctic Ocean. 
     59The finest grid spanning the whole Svalbard archipelago is of particular interest to check that 
     60sea ice coupling is done properly. 
      61The 1:1 grid, used alone, serves as a benchmark to check that the solution is not corrupted by grid exchanges. 
     62 
      63Note that since grids interact only at the baroclinic time level, 
      64numerically exact results cannot be achieved in the 1:1 case. 
      65One has to switch to a fully explicit free surface scheme, in place of the split-explicit one, in order to 
      66retrieve perfect reproducibility. 
     67 
      68The corresponding ``AGRIF_FixedGrids.in`` file is given by:: 
     69 
     70   2 
     71   42 82 49 91 1 1 1 
     72   122 153 110 143 4 4 4 
     73   0 
     74   1 
     75   38 80 71 111 3 3 3 
     76   0 
     77 
     78----- 
     79AMM12 
     80----- 
     81 
      82``AMM12``, for *Atlantic Margin Model at 12 km*, is a `regional model`_ covering the Northwest European Shelf domain on 
      83a regular lat-lon grid at approximately 12 km horizontal resolution. 
     84The key ``key_amm_12km`` is used to create the correct dimensions of the AMM domain. 
     85 
     86| This configuration tests several features of NEMO functionality specific to the shelf seas. 
     87| In particular, the AMM uses s-coordinates in the vertical rather than z-coordinates and is forced with 
      88  tidal lateral boundary conditions using a Flather boundary condition from the BDY module (``key_bdy``). 
     89 
     90The AMM configuration uses the GLS (``key_zdfgls``) turbulence scheme, 
     91the VVL non-linear free surface (``key_vvl``) and time-splitting (``key_dynspg_ts``). 
     92 
     93In addition to the tidal boundary condition, the model may also take open boundary conditions from 
     94a North Atlantic model. 
      95Boundaries may be completely omitted by removing the BDY key (``key_bdy``) in ``./cfgs/AMM12/cpp_AMM12_fcm``. 
     96 
     97Sample surface fluxes, river forcing and a sample initial restart file are included to test a realistic model run. 
     98The Baltic boundary is included within the river input file and is specified as a river source. 
     99Unlike ordinary river points the Baltic inputs also include salinity and temperature data. 
     100 
     101-------- 
     102C1D_PAPA 
     103-------- 
     104 
     105``C1D_PAPA`` is a 1D configuration (one water column called NEMO1D, activated with CPP key ``key_c1d``), 
     106located at the `PAPA station 145W-50N <http://www.pmel.noaa.gov/OCS/Papa/index-Papa.shtml>`_. 
     107 
      108| NEMO1D is useful to test the vertical physics in NEMO 
      109  (turbulent closure schemes, solar penetration, ocean/atmosphere interactions, ...). 
      110| The size of the horizontal domain is 3x3 grid points. 
     111 
      112This reference configuration uses a 75-level vertical grid (1 m resolution at the surface), 
      113the GLS (``key_zdfgls``) turbulence scheme with the k-epsilon closure and the CORE bulk formulae. 
      114The atmospheric forcing comes from the ECMWF operational analysis, with a modification of the long- and 
      115short-wave fluxes, rescaled to a frequency of 1 h. One year is simulated in the outputs, 
      116see below (June 15, 2010 to June 14, 2011). 
     117 
      118`Reffray 2015`_ describes some tests of the vertical physics using this configuration. 
     119 
     120The inputs tar file includes: 
     121 
     122- forcing files covering the years 2010 and 2011 (``forcing_PAPASTATION_1h_y201*.nc``) 
      123- an initialization file for June 15, 2010, deduced from observed data and the Levitus 2009 climatology 
      124  (``init_PAPASTATION_m06d15.nc``) 
      125- a surface chlorophyll file (``chlorophyll_PAPASTATION.nc``) deduced from SeaWiFS data. 
     126 
     127-------- 
     128GYRE_BFM 
     129-------- 
     130 
      131``GYRE_BFM`` is the same configuration as `GYRE_PISCES`_, except that PISCES is replaced by 
      132the BFM biogeochemical model in coupled mode. 
     133 
     134----------- 
     135GYRE_PISCES 
     136----------- 
     137 
      138| Idealized configuration representing double gyres in the Northern Hemisphere on a beta-plane, with 
      139  a regular grid spacing at 1° horizontal resolution (and possible use as a benchmark by 
      140  easily increasing the grid size), 101 vertical levels, forced with analytical heat, freshwater and 
      141  wind-stress fields. 
      142| This configuration is coupled to the `PISCES biogeochemical model`_. 
     143 
     144Running GYRE as a benchmark 
     145--------------------------- 
     146 
      147This simple configuration can be used as a benchmark since it is easy to increase the resolution 
      148(in which case the outputs have no physical meaning): 
     149 
     1501. Choose the grid size 
     151 
     152   In ``./cfgs/GYRE/EXP00``, edit your ``namelist_cfg`` file to change the ``jp_cfg``, ``jpi``, ``jpj``, 
     153   ``jpk`` variables in &namcfg: 
     154 
     155   +------------+---------+---------+---------+------------------+---------------+ 
     156   | ``jp_cfg`` | ``jpi`` | ``jpj`` | ``jpk`` | Number of points | Equivalent to | 
     157   +============+=========+=========+=========+==================+===============+ 
     158   | 1          | 30      | 20      | 101     | 60600            | GYRE 1°       | 
     159   +------------+---------+---------+---------+------------------+---------------+ 
     160   | 25         | 750     | 500     | 101     | 37875000         | ORCA 1/2°     | 
     161   +------------+---------+---------+---------+------------------+---------------+ 
     162   | 50         | 1500    | 1000    | 101     | 151500000        | ORCA 1/4°     | 
     163   +------------+---------+---------+---------+------------------+---------------+ 
     164   | 150        | 4500    | 3000    | 101     | 1363500000       | ORCA 1/12°    | 
     165   +------------+---------+---------+---------+------------------+---------------+ 
     166   | 200        | 6000    | 4000    | 101     | 2424000000       | ORCA 1/16°    | 
     167   +------------+---------+---------+---------+------------------+---------------+ 
     168 
      1692. In ``namelist_cfg`` again, avoid problems in the physics (the results will not be physically meaningful) by setting ``nn_bench = 1`` in &namctl: 
     170 
     171.. code-block:: fortran 
     172    
     173   nn_bench    =    1     !  Bench mode (1/0): CAUTION use zero except for bench 
     174 
      1753. If you increase the domain size, you may need to decrease the time step (for stability) by changing the ``rn_rdt`` value in &namdom (e.g. for ``jp_cfg = 150``, the ORCA12 equivalent, use ``rn_rdt = 1200``): 
     176 
     177.. code-block:: fortran 
     178    
     179   rn_rdt      = 1200.     !  time step for the dynamics 
     180 
      1814. Optional, in order to increase the number of MPI communications for benchmark purposes: 
      182   you can change the number of sub-timesteps computed in the time-splitting scheme at each iteration. 
      183   First change the list of active CPP keys for your experiment, 
      184   in ``cfgs/"your configuration name"/cpp_"your configuration name".fcm``: 
      185   replace ``key_dynspg_flt`` by ``key_dynspg_ts`` and recompile/create your executable again: 
      186    
      187   .. code-block:: sh 
      188    
      189      makenemo [...] add_key 'key_dynspg_ts' del_key 'key_dynspg_flt' 
     190 
     191In your ``namelist_cfg`` file, edit the &namsplit namelist by adding the following line:  
     192 
     193.. code-block:: fortran 
     194    
      195   nn_baro       =    30               !  Number of iterations of the barotropic mode 
     196 
      197``nn_baro = 30`` is more or less a minimum (we usually use 30 to 60), 
      198so increasing the ``nn_baro`` value will increase the number of MPI communications. 
     199 
     200The GYRE CPP keys, namelists and scripts can be explored in the ``GYRE`` configuration directory 
     201(``./cfgs/GYRE`` and ``./cfgs/GYRE/EXP00``). 
     202 
      203Monthly mean outputs of a 1-year run can be found 
      204`here <http://prodn.idris.fr/thredds/catalog/ipsl_public/reee451/NEMO_OUT/GYRE/catalog.html>`_. 
     205 
     206---------------- 
     207ORCA2_ICE_PISCES 
     208---------------- 
     209 
     210ORCA is the generic name given to global ocean configurations. 
      211Its specificity lies in the horizontal curvilinear mesh used to overcome the North Pole singularity found in 
      212geographical meshes. 
      213SI3 (Sea Ice modelling Integrated Initiative) is a thermodynamic-dynamic sea ice model specifically designed for 
     214climate studies. 
     215A brief description of the model is given here. 
     216 
     217:underline:`Space-time domain` 
     218 
     219The horizontal resolution available through the standard configuration is ORCA2. 
      220It is based on a 2-degree Mercator mesh (i.e. the meridional scale factor varies as the cosine of the latitude). 
      221In the northern hemisphere the mesh has two poles, so that the ratio of anisotropy is nearly one everywhere. 
      222The mean grid spacing is about 2/3 of the nominal value: for example, it is 1.3 degrees for ORCA2. 
      223Other resolutions (ORCA4, ORCA05 and ORCA025) are running or under development within specific projects. 
      224In the coarse-resolution versions (i.e. ORCA2 and ORCA4) the meridional grid spacing is increased near 
      225the equator to improve the equatorial dynamics. 
      226Figures of the mesh and bathymetry in PDF format can be found and downloaded here. 
     227The sea-ice model runs on the same grid. 
     228 
      229The vertical domain spreads from the surface to a depth of 5000 m. 
      230There are 31 levels, with 10 levels in the top 100 m. 
      231The vertical mesh is deduced from a mathematical function of z. 
      232The ocean surface corresponds to the w-level k=1, and the ocean bottom to the w-level k=31. 
      233The last T-level (k=31) is thus always in the ground. The depths of the vertical levels and 
      234the associated scale factors can be viewed. 
     235Higher vertical resolution is used in ORCA025 and ORCA12 (see `DRAKKAR project <http://www.drakkar-ocean.eu>`_). 
     236 
      237The time step depends on the resolution. It is 1h36' for ORCA2, so that there are 15 time steps in one day. 
     238 
     239:underline:`Ocean Physics (for ORCA2)` 
     240 
      241- horizontal diffusion on momentum: the eddy viscosity coefficient depends on the geographical position. 
      242  It is taken as 40000 m2/s, reduced in the equatorial regions (2000 m2/s) except near the western boundaries. 
      243- isopycnal diffusion on tracers: the diffusion acts along the isopycnal surfaces (neutral surfaces) with 
      244  an eddy diffusivity coefficient of 2000 m2/s. 
      245- eddy-induced velocity parametrization with a coefficient that depends on the growth rate of 
      246  baroclinic instabilities (it usually varies from 15 m2/s to 3000 m2/s). 
      247- lateral boundary conditions: zero fluxes of heat and salt and no-slip conditions are applied through 
      248  lateral solid boundaries. 
      249- bottom boundary condition: zero fluxes of heat and salt are applied through the ocean bottom. 
      250  The Beckmann [19XX] simple bottom boundary layer parameterization is applied along continental slopes. 
      251  A linear friction is applied on momentum. 
      252- convection: the vertical eddy viscosity and diffusivity coefficients are increased to 1 m2/s in case of 
      253  static instability. 
      254- forcings: the ocean receives heat, freshwater, and momentum fluxes from the atmosphere and/or the sea ice. 
      255  The solar radiation penetrates the top metres of the ocean. 
      256  The downward irradiance I(z) is formulated with two extinction coefficients [Paulson and Simpson, 1977], 
      257  whose values correspond to Type I water in Jerlov's classification (i.e. the most transparent water). 
     258 
     259ORCA2_ICE_PISCES is a reference configuration with the following characteristics: 
     260 
     261- global ocean configuration 
     262- based on a tri-polar ORCA grid, with a 2° horizontal resolution 
     263- 31 vertical levels 
     264- forced with climatological surface fields 
     265- coupled to the sea-ice model SI3. 
     266- coupled to TOP passive tracer transport module and `PISCES biogeochemical model`_. 
     267 
     268:underline:`AGRIF demonstrator` 
     269 
     270| From the ``ORCA2_ICE_PISCES`` configuration, a demonstrator using AGRIF nesting can be activated. 
     271  It includes the global ``ORCA2_ICE_PISCES`` configuration and a nested grid in the Agulhas region. 
     272| To set up this configuration, after extracting NEMO: 
     273 
      274- Build your AGRIF configuration directory from ORCA2_ICE_PISCES, with the ``key_agrif`` CPP key activated: 
     275 
     276.. code-block:: console 
     277                 
     278   $ ./makenemo -r 'ORCA2_ICE_PISCES' -n 'AGRIF' add_key 'key_agrif' 
     279 
      280- Using the ``ORCA2_ICE_PISCES`` input files and namelists, the AGRIF test configuration is then ready to run. 
     281 
     282:underline:`On-The-Fly Interpolation` 
     283 
      284| NEMO makes it possible to interpolate input data on the fly during a run. 
      285  If you want to use this option, you need files giving information on the weights, which must have been created beforehand. 
      286| You can find 
      287  `here <http://prodn.idris.fr/thredds/catalog/ipsl_public/reee512/ORCA2_ONTHEFLY/WEIGHTS/catalog.html>`_ 
      288  two weights files: ``bil_weights`` for scalar fields (bilinear interpolation) and ``bic_weights`` for 
      289  vector fields (bicubic interpolation). 
      290| The data files used are `COREII forcing <http://data1.gdfl.noaa.gov/nomads/forms/mom4/COREv2>`_ extrapolated over 
      291  continents, ready to be used with the on-the-fly option: 
      292  `COREII`_ forcing files extrapolated over continents. 
     293 
     294---------------- 
     295ORCA2_OFF_PISCES 
     296---------------- 
     297 
      298``ORCA2_OFF_PISCES`` uses the ORCA2 configuration in which the `PISCES biogeochemical model`_ is run 
      299standalone, using pre-calculated dynamical fields. 
     300 
      301See `ORCA2_ICE_PISCES`_ for a general description of ORCA2. 
     302 
      303The input files for PISCES are needed; in addition, the dynamical fields are used as input. 
      304They come from 2000 years of an ORCA2_LIM climatological run using ERA40 atmospheric forcing. 
     305 
     306------------- 
     307ORCA2_OFF_TRC 
     308------------- 
     309 
      310``ORCA2_OFF_TRC`` uses the ORCA2_LIM configuration in which the passive tracer transport module TOP has been 
      311activated in standalone mode, using pre-calculated dynamical fields. 
     312 
     313See `ORCA2_ICE_PISCES`_ for general description of ORCA2. 
     314 
      315In ``namelist_top_cfg``, different passive tracers can be activated (cfc11, cfc12, sf6, c14, age) or my_trc, 
      316a user-defined tracer. 
     317 
      318The dynamical fields used as input come from 2000 years of an ORCA2_LIM climatological run using 
      319ERA40 atmospheric forcing. 
     320 
     321------------- 
     322ORCA2_SAS_ICE 
     323------------- 
     324 
      325``ORCA2_SAS_ICE`` is a demonstrator of SAS (the Stand-Alone Surface module) based on the ORCA2_LIM configuration. 
     326 
      327The standalone surface module allows surface elements such as sea ice, iceberg drift and surface fluxes to 
      328be run using prescribed model state fields. 
      329For example, it can be used to inter-compare different bulk formulae or to adjust the parameters of 
      330a given bulk formula. 
     331 
      332See `ORCA2_ICE_PISCES`_ for a general description of ORCA2. 
     333 
      334The same input files as for `ORCA2_ICE_PISCES`_ are needed, plus fields from a previous ORCA2_LIM run. 
     335 
      336More information on the input and configuration files can be found in the `NEMO Reference manual`_. 
     337 
     338------- 
     339SPITZ12 
     340------- 
     341 
     342``SPITZ12`` 
     343 
     344Unsupported configurations 
     345========================== 
     346 
      347Other configurations are developed and used by some projects with "NEMO inside"; 
      348these projects are welcome to publicize them here: http://www.nemo-ocean.eu/projects/add-project 
      349 
      350:underline:`Obviously these "project configurations" are not under the NEMO System Team's responsibility`. 
     351 
     352.. _regional model:               http://www.tandfonline.com/doi/pdf/10.1080/1755876X.2012.11020128 
     353.. _AMM12_v4.0.tar:               http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/AMM12_v4.0.tar 
     354.. _PISCES biogeochemical model:  http://www.geosci-model-dev.net/8/2465/2015 
     355.. _INPUTS_PISCES_v4.0.tar:       http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/INPUTS_PISCES_v4.0.tar 
     356.. _ORCA2_OFF_v4.0.tar:           http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/ORCA2_OFF_v4.0.tar 
     357.. _ORCA2_ICE_v4.0.tar:           http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/ORCA2_ICE_v4.0.tar 
     358.. _INPUTS_SAS_v4.0.tar:          http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/INPUTS_SAS_v4.0.tar 
     359.. _NEMO Reference manual:        http://forge.ipsl.jussieu.fr/nemo/doxygen/index.html?doc=NEMO 
     360.. _INPUTS_C1D_PAPA_v4.0.tar:     http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/INPUTS_C1D_PAPA_v4.0.tar 
     361.. _Reffray 2015:                 http://www.geosci-model-dev.net/8/69/2015 
     362.. _COREII:                       http://prodn.idris.fr/thredds/catalog/ipsl_public/reee512/ORCA2_ONTHEFLY/FILLED_FILES/catalog.html 
     363.. _SPITZ12_v4.0.tar:             http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/SPITZ12_v4.0.tar 
     364.. _AGRIF_DEMO_v4.0.tar:          http://prodn.idris.fr/thredds/fileServer/ipsl_public/romr005/Online_forcing_archives/AGRIF_DEMO_v4.0.tar 
  • NEMO/trunk/doc/rst/source/setup_configuration.rst

    r10182 r10186  
     1============================== 
     2Setting up a new configuration 
     3============================== 
     4 
     5.. contents:: 
     6 
     7Starting from an existing configuration 
     8======================================= 
     9 
     10There are three options to build a new configuration from an existing one: 
     11 
     12--------------------------------------------- 
     13Option 1: Duplicate an existing configuration 
     14--------------------------------------------- 
     15 
      16The NEMO so-called Reference Configurations cover a number of major features for NEMO setup 
      17(global, regional, 1D, using an embedded zoom with AGRIF, ...). 
      18 
      19One can create a new configuration by duplicating one of the reference configurations 
      20(``ORCA2_LIM3_PISCES`` in the following example): 
     21 
     22.. code-block:: sh 
     23 
      24   makenemo -n 'ORCA2_LIM3_PISCES_MINE' -r 'ORCA2_LIM3_PISCES' 
     25 
     26------------------------------------ 
     27Option 2: Duplicate with differences 
     28------------------------------------ 
     29 
     30Create and compile a new configuration based on a reference configuration 
     31(``ORCA2_LIM3_PISCES`` in the following example) but with different pre-processor options. 
      32For this, use ``add_key`` or ``del_key`` as required, e.g.: 
     33 
     34.. code-block:: sh 
     35 
      36   makenemo -n 'ORCA2_LIM3_PISCES_MINE' -r 'ORCA2_LIM3_PISCES' del_key 'key_iomput' add_key 'key_xios' 
     37 
     38--------------------------------------------------------- 
     39Option 3: Use the SIREN tools to subset an existing model 
     40--------------------------------------------------------- 
     41 
     42Define a regional configuration which is a sub- or super-set of an existing configuration. 
     43 
     44This last option employs the SIREN software tools that are included in the standard distribution. 
     45The software is written in Fortran 95 and available in the ``./tools/SIREN`` directory. 
     46SIREN allows you to create your own regional configuration embedded in a wider one. 
     47 
     48SIREN is a set of programs to create all the input files you need to run a NEMO regional configuration. 
     49As a basic demonstrator, a set of GLORYS files (GLObal ReanalYSis on the ORCA025 grid), 
     50as well as examples of namelists are available `here`_. 
     51 
     52`SIREN documentation`_ 
     53 
     54Any questions or comments regarding the use of SIREN should be posted in `the corresponding forum`_. 
     55 
     56Creating a completely new configuration 
     57======================================= 
     58 
     59From NEMO version 4.0 there are two ways to build configurations from scratch. 
     60The appropriate method to use depends largely on the target configuration. 
     61Method 1 is for more complex/realistic global or regional configurations and 
     62method 2 is intended for simpler, idealised configurations whose 
     63domains and characteristics can be described in simple geometries and formulae. 
     64 
     65---------------------------------------------------- 
     66Option 1: Create and use a domain configuration file 
     67---------------------------------------------------- 
     68 
     69This method is used by each of the reference configurations, 
     70so that downloading their input files linked to their description can help. 
      71Although you are starting from scratch, it is advisable to create the directory structure for your new configuration by 
      72duplicating the reference configuration closest to your target application. 
      73For example, if your application requires both sea ice and passive tracers, 
      74then use ``ORCA2_LIM3_PISCES`` as a template, 
      75and execute the following command to build your ``MY_NEW_CONFIG`` configuration: 
     76 
     77.. code-block:: sh 
     78 
      79   makenemo -n 'MY_NEW_CONFIG' -r 'ORCA2_LIM3_PISCES' 
     80 
     81where ``MY_NEW_CONFIG`` can be substituted with a suitably descriptive name for your new configuration.   
     82 
     83The purpose of this step is simply to create and populate the appropriate ``WORK``, ``MY_SRC`` and 
     84``EXP00`` subdirectories for your new configuration. 
     85Other choices for the base reference configuration might be 
     86 
     87- ``GYRE``  - If your target application is ocean-only 
     88- ``AMM12`` - If your target application is regional with open boundaries 
     89 
     90All the domain information for your new configuration will be contained within 
     91a netcdf file called ``domain_cfg.nc`` which you will need to create and 
     92place in the ``./cfgs/MY_NEW_CONFIG/EXP00`` sub-directory. 
     93Firstly though, ensure that your configuration is set to use such a file by checking that 
     94 
     95.. code-block:: fortran 
     96 
     97   ln_read_cfg = .true. 
     98 
     99in ``./cfgs/MY_NEW_CONFIG/EXP00/namelist_cfg`` 
     100 
      101Create the ``domain_cfg.nc`` file, which must contain the following fields: 
     102 
     103.. code-block:: c++ 
     104 
     105   int    ORCA, ORCA_index                  /* configuration name, configuration resolution                 */ 
     106   int    jpiglo, jpjglo, jpkglo            /* global domain sizes                                          */ 
     107   int    jperio                            /* lateral global domain b.c.                                   */ 
     108   int    ln_zco, ln_zps, ln_sco            /* flags for z-coord, z-coord with partial steps and s-coord    */ 
     109   int    ln_isfcav                         /* flag  for ice shelf cavities                                 */ 
     110   double glamt, glamu, glamv, glamf        /* geographic position                                          */ 
     111   double gphit, gphiu, gphiv, gphif        /* geographic position                                          */ 
     112   double iff, ff_f, ff_t                   /* Coriolis parameter (if not on the sphere)                    */ 
     113   double e1t, e1u, e1v, e1f                /* horizontal scale factors                                     */ 
     114   double e2t, e2u, e2v, e2f                /* horizontal scale factors                                     */ 
     115   double ie1e2u_v, e1e2u, e1e2v            /* U and V surfaces (if grid size reduction in some straits)    */ 
     116   double e3t_1d, e3w_1d                    /* reference vertical scale factors at T and W points           */ 
     117   double e3t_0, e3u_0, e3v_0, e3f_0, e3w_0 /* vertical scale factors 3D coordinate at T,U,V,F and W points */ 
     118   double e3uw_0, e3vw_0                    /* vertical scale factors 3D coordinate at UW and VW points     */ 
     119   int    bottom_level, top_level           /* last wet T-points, 1st wet T-points (for ice shelf cavities) */ 
     120 
     121There are two options for creating a domain_cfg.nc file: 
     122 
     123- Users can use tools of their own choice to build a ``domain_cfg.nc`` with all mandatory fields. 
     124- Users can adapt and apply the supplied tool available in ``./tools/DOMAINcfg``. 
      125  This tool is based on code extracted from NEMO version 3.6 and allows the same choices for 
      126  the horizontal and vertical grids that were available internally in that version. 
     127  See ``./tools/DOMAINcfg/README`` for details. 
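
A typical sequence to build and run the supplied tool might look as follows (a sketch; the architecture name is site-specific):

.. code-block:: sh

   cd ./tools
   ./maketools -m 'MY_ARCH' -n DOMAINcfg
   cd ./DOMAINcfg
   ./make_domain_cfg.exe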
     128 
      129-------------------------------------------------------------------------------- 
      130Option 2: Adapt the usr_def configuration module of NEMO for your own purposes 
      131-------------------------------------------------------------------------------- 
     132 
      133This method is intended for easily configuring simple/idealised configurations which 
      134are often used as demonstrators or for process evaluation and comparison. 
      135This method can be used whenever the domain geometry has a simple mathematical description and 
      136the ocean initial state and boundary forcing are described analytically. 
     137As a start, consider the case of starting a completely new ocean-only test case based on 
     138the ``LOCK_EXCHANGE`` example. 
     139[Note: we probably need an even more basic example than this with only one namelist and 
     140minimal changes to the usrdef modules] 
     141 
     142Firstly, construct the directory structure, starting in the ``cfgs`` directory: 
     143 
     144.. code-block:: sh 
     145 
     146   ./makenemo -n 'MY_NEW_TEST' -t 'LOCK_EXCHANGE' 
     147 
     148where the ``-t`` option has been used to locate the new configuration in the ``tests`` subdirectory 
     149(it is recommended practice to keep full configurations and idealised cases clearly distinguishable). 
     150This command will have created (amongst others) the following files and directories:: 
     151 
     152   ./tests/MY_NEW_TEST: 
     153   BLD     MY_SRC cpp_MY_NEW_TEST.fcm 
     154   EXP00   WORK 
     155   # 
     156   ./tests/MY_NEW_TEST/EXP00: 
     157   context_nemo.xml       domain_def_nemo.xml 
     158   field_def_nemo-opa.xml file_def_nemo-opa.xml 
     159   iodef.xml 
     160   namelist_cfg 
     161   namelist_ref 
     162   # 
     163   ./tests/MY_NEW_TEST/MY_SRC: 
     164   usrdef_hgr.F90    usrdef_nam.F90 usrdef_zgr.F90 
     165   usrdef_istate.F90 usrdef_sbc.F90 zdfini.F90 
     166 
     167The key to setting up an idealised configuration lies in adapting a small set of short Fortran 90 modules which 
     168should be dropped into the ``MY_SRC`` directory. 
      169Here the ``LOCK_EXCHANGE`` example uses 5 such routines, but the full set that is available in 
     170the ``src/OCE/USR`` directory is:: 
     171 
     172   ./src/OCE/USR: 
     173   usrdef_closea.F90 usrdef_istate.F90 usrdef_zgr.F90 
     174   usrdef_fmask.F90  usrdef_nam.F90 
     175   usrdef_hgr.F90    usrdef_sbc.F90 
     176 
     177Before discussing these in more detail it is worth noting the various namelist controls that 
     178engage the different user-defined aspects. 
     179These controls are set using two new logical switches or are implied by the settings of existing ones. 
     180For example, the mandatory requirement for an idealised configuration is to provide routines which 
     181define the horizontal and vertical domains. 
     182Templates for these are provided in the ``usrdef_hgr.F90`` and ``usrdef_zgr.F90`` modules. 
     183The application of these modules is activated whenever: 
     184 
     185.. code-block:: fortran 
     186 
     187   ln_read_cfg = .false. 
     188 
     189in any configuration's ``namelist_cfg`` file. 
     190This setting also activates the reading of an optional ``nam_usrdef`` namelist which can be used to 
     191supply configuration specific settings. 
     192These need to be declared and read in the ``usrdef_nam.F90`` module. 
     193 
      194Another explicit control is available in the ``namsbc`` namelist, which activates the use of analytical forcing, 
      195with: 
     196 
     197.. code-block:: fortran 
     198 
     199   ln_usr = .true. 
     200 
     201Other usrdef modules are activated by less explicit means. 
     202For example, code in ``usrdef_istate.F90`` is used to define initial temperature and salinity fields if 
     203 
     204.. code-block:: fortran 
     205 
     206   ln_tsd_init   = .false. 
     207 
     208in the ``namtsd`` namelist. 
     209The remaining modules, namely:: 
     210 
     211   usrdef_closea.F90 usrdef_fmask.F90 
     212 
     213are specific to ORCA configurations and set local variations of some specific fields for 
     214the various resolutions of the global models. 
     215They do not need to be considered here in the context of idealised cases but it is worth noting that all 
     216configuration specific code has now been isolated in the usrdef modules. 
     217In the case of these last two modules, they are activated only if an ORCA configuration is detected. 
     218Currently this requires a specific integer variable named ``ORCA`` to be set in a ``domain_cfg.nc`` file. 
      219[Note: this would be less confusing if the cn_cfg string were read directly as a character attribute from 
      220the ``domain_cfg.nc`` file] 
     221 
     222So, in most cases, the set up of idealised model configurations can be completed by 
     223copying the template routines from ``./src/OCE/USR`` into 
     224your new ``./cfgs/MY_NEW_TEST/MY_SRC`` directory and editing the appropriate modules as needed. 
     225The default set are those used for the GYRE reference configuration. 
     226The contents of ``MY_SRC`` directories from other idealised configurations may provide more convenient templates if 
     227they share common characteristics with your target application. 
     228 
      229Whatever the starting point, it should not require too many changes or additional lines of code to 
     230produce routines in ``./src/OCE/USR`` that define analytically the domain, 
     231the initial state and the surface boundary conditions for your new configuration. 
     232 
     233To summarize, the base set of modules is: 
     234 
     235- ``usrdef_hgr.F90``   : define horizontal grid 
     236- ``usrdef_zgr.F90``   : define vertical grid 
     237- ``usrdef_sbc.F90``   : provides at each time-step the surface boundary condition, i.e. the momentum, heat and freshwater fluxes 
     238- ``usrdef_istate.F90``: defines initialization of the dynamics and tracers 
     239- ``usrdef_nam.F90``   : configuration-specific namelist processing to set any associated run-time parameters 
     240 
     241with two specialised ORCA modules 
     242(not related to idealised configurations but used to isolate configuration specific code that is used in 
     243ORCA2 reference configurations and established global configurations using the ORCA tripolar grid): 
     244 
     245- ``usrdef_fmask.F90`` : only used in ORCA CONFIGURATIONS for alteration of f-point land/ocean mask in some straits 
     246- ``usrdef_closea.F90``: only used in ORCA CONFIGURATIONS for specific treatments associated with closed seas 
     247 
      248From version 4.0, the NEMO release includes a ``tests`` subdirectory containing available and 
      249up-to-date test cases built by the community. 
      250These are not fully supported like the NEMO reference configurations, but should provide a source of raw material. 
     251 
     252.. _here:                     http://prodn.idris.fr/thredds/catalog/ipsl_public/rron463/catalog.html?dataset=DatasetScanipsl_public/rron463/INPUT_SIREN.tar 
     253.. _the corresponding forum:  http://forge.ipsl.jussieu.fr/nemo/discussion/forum/2 
     254.. _SIREN documentation:      http://forge.ipsl.jussieu.fr/nemo/doxygen/index.html 
  • NEMO/trunk/doc/rst/source/test_cases.rst

    r10182 r10186  
     1====================== 
     2Explore the test cases 
     3====================== 
     4 
     5- ``tests/ICEDYN``: 
     6   
     7  [Clement to add an illustration here ?] 
     8 
      9  This is an east-west + north-south periodic channel. 
      10  The configuration includes an AGRIF zoom (1:3) in the middle of the basin to test how 
      11  an ice patch is advected through it, but one can also test the advection schemes (Prather and Ultimate-Macho) by 
      12  removing ``key_agrif`` from the CPP keys, as sketched below. 
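
  For example, following the ``makenemo`` usage described in the setup pages, one might build the test case without AGRIF as (the configuration name is illustrative):

  .. code-block:: sh

     ./makenemo -n 'MY_ICEDYN' -t 'ICEDYN' del_key 'key_agrif'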
     13 
     14- ``tests/VORTEX``: 
     15   
      16  This test case illustrates the propagation of an anticyclonic eddy over a beta plane and a flat bottom. 
     17  It is implemented here with an online refined subdomain (1:3) out of which the vortex propagates. 
     18  It serves as a benchmark for quantitative estimates of nesting errors as in Debreu et al. (2012), 
     19  Penven et al. (2006) or Spall and Holland (1991). 
     20  The animation below (sea level anomaly in meters) illustrates with two 1:2 successively nested grids how 
     21  the vortex smoothly propagates out of the refined grids. 
     22   
     23  .. image:: _static/VORTEX_anim.gif 