
Changeset 4219 for branches


Timestamp:
2013-11-15T16:25:37+01:00
Author:
acc
Message:

Branch dev_r3948_NOC_FK. Ready for merge; now includes trunk changes up to revision 4119 and has been SETTE tested. SETTE scripts now include options for testing in detached mode.

Location:
branches/2013/dev_r3948_NOC_FK
Files:
1 deleted
53 edited
2 copied

  • branches/2013/dev_r3948_NOC_FK/DOC/NEMO_book.tex

    r3294 r4219  
    2323\usepackage[margin=10pt,font={small},labelsep=colon,labelfont={bf}]{caption} % Gives small font for captions 
    2424\usepackage{enumitem}                          % allows non-bold description items 
     25\usepackage{longtable}                         % allows multipage tables 
    2526%\usepackage{colortbl}                           % gives coloured panels behind table columns 
    2627 
  • branches/2013/dev_r3948_NOC_FK/DOC/TexFiles/Chapters/Chap_DIA.tex

    r3940 r4219  
    11% ================================================================ 
    2 % Chapter Ñ I/O & Diagnostics 
     2% Chapter I/O & Diagnostics 
    33% ================================================================ 
    44\chapter{Output and Diagnostics (IOM, DIA, TRD, FLO)} 
     
    1616 
    1717The model outputs are of three types: the restart file, the output listing,  
    18 and the output file(s). The restart file is used internally by the code when  
     18and the diagnostic output file(s). The restart file is used internally by the code when  
    1919the user wants to start the model with initial conditions defined by a  
    2020previous simulation. It contains all the information that is necessary in  
     
    2525that it is saved in the same binary format as the one used by the computer  
    2626that is to read it (in particular, 32 bits binary IEEE format must not be used for  
    27 this file). The output listing and file(s) are predefined but should be checked  
     27this file).  
     28 
     29The output listing and file(s) are predefined but should be checked  
    2830and adapted to the user's needs if necessary. The output listing is stored in 
    2931the $ocean.output$ file. The information is printed from within the code on the  
     
    3133"\textit{grep -i numout}" in the source code directory. 
    3234 
    33 By default, outpout files are written in NetCDF format but an IEEE output format, called DIMG, can be choosen when defining \key{dimgout}. Since version 3.2, when defining \key{iomput}, an I/O server has been added which provides more flexibility in the choice of the fields to be outputted as well as how the writing work is distributed over the processors in massively parallel computing. The complete description of the use of this I/O server is presented in next section. If neither \key{iomput} nor \key{dimgout} are defined, NEMO is producing NetCDF with the old IOIPSL library which has been kept for compatibility and its easy installation, but it is quite inefficient on parrallel machines. If \key{iomput} is not defined, output files are defined in the \mdl{diawri} module and containing mean (or instantaneous if \key{diainstant} is defined) values over a period of nn\_write time-step (namelist parameter).  
     35By default, diagnostic output files are written in NetCDF format but an IEEE binary output format, called DIMG, can be chosen by defining \key{dimgout}.  
     36 
     37Since version 3.2, when defining \key{iomput}, an I/O server has been added which provides more flexibility in the choice of the fields to be written, as well as in how the writing work is distributed over the processors in massively parallel computing. The complete description of the use of this I/O server is presented in the next section.  
     38 
     39By default, if neither \key{iomput} nor \key{dimgout} are defined, NEMO produces NetCDF output with the old IOIPSL library, which has been kept for compatibility and ease of installation. However, the IOIPSL library is quite inefficient on parallel machines and, since version 3.2, many diagnostic options have been added presuming the use of \key{iomput}. The usefulness of the default IOIPSL-based option is expected to diminish with each new release. If \key{iomput} is not defined, output files and their content are defined in the \mdl{diawri} module and contain mean (or instantaneous if \key{diainstant} is defined) values over a regular period of nn\_write time-steps (namelist parameter).  
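For illustration, a minimal sketch of the corresponding namelist entry (assuming it sits in the standard \textit{namrun} block of the reference namelist; the values shown are indicative only):
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   &namrun
      cn_exp   = "ORCA2"   ! experiment name used to build output file names
      nn_write = 60        ! write frequency of the default output files (in time-steps)
   /
\end{verbatim}
}}\end{alltt}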
    3440 
    3541%\gmcomment{                    % start of gmcomment 
     
    4248 
    4349 
    44 Since version 3.2, iomput is the NEMO output interface. It was designed to be simple to use, flexible and efficient. The two main purposes of iomput are: \\ 
    45 (1) the complete and flexible control of the output files through an external xml file defined by the user \\ 
    46 (2) to achieve high performance outputs through the distribution (or not) of all tasks related to output files on dedicated processes. \\ 
    47 The first functionality allows the user to specify, without touching anything into the code, the way he want to output data: \\ 
    48 - choice of output frequencies that can be different for each file (including real months and years) \\ 
    49 - choice of file contents: decide which data will be written in which file (the same data can be outputted in different files)  \\ 
    50 - possibility to split output files at a choosen frequency \\ 
    51 - possibility to extract a vertical or an horizontal subdomain  \\ 
    52 - choice of the temporal operation to perform: average, accumulate, instantaneous, min, max and once  \\ 
    53 - extremely large choice of data available   \\ 
    54 - redefine variables name and long\_name  \\ 
    55 In addition, iomput allows the user to output any variable (scalar, 2D or 3D) in the code in a very easy way. All details of iomput functionalities are listed in the following subsections. Example of the iodef.xml files that control the outputs can be found here: NEMOGCM/CONFIG/ORCA2\_LIM/EXP00/iodef*.xml 
    56  
    57 The second functionality targets outputs performances when running on a very large number of processes. First, iomput provides the possibility to dedicate N specific processes (in addition to NEMO processes) to write the outputs, where N is big enough (and defined by the user) to suppress the bottle neck associated with the the writing of the output files. Since version 3.5, this interface depends on an external code called \href{http://forge.ipsl.jussieu.fr/ioserver}{XIOS}. This new IO server takes advantage of the new functionalitiy of NetCDF4 that allows the user to write files in parallel and therefore to bypass the rebuilding phase. Note that writting in parallel into the same NetCDF files requires that your NetCDF4 library is linked to an HDF5 library that has been correctly compiled (i.e. with the configure option $--$enable-parallel). Note that the files created by iomput trough xios are incompatible with NetCDF3. All post-processsing and visualization tools must therefore be compatible with NetCDF4 and not only NetCDF3. 
    58  
    59 \subsection{Basic knowledge} 
    60  
     50Since version 3.2, iomput is the NEMO output interface of choice. It has been designed to be simple to use, flexible and efficient. The two main purposes of iomput are:  
     51\begin{enumerate} 
     52\item The complete and flexible control of the output files through external XML files adapted by the user from standard templates.  
     53\item To achieve high performance and scalable output through the optional distribution of all diagnostic output related tasks to dedicated processes.  
     54\end{enumerate} 
     55The first functionality allows the user to specify, without code changes or recompilation, aspects of the diagnostic output stream (a minimal file-definition sketch is given after the list below), such as: 
     56\begin{itemize} 
     57\item The choice of output frequencies that can be different for each file (including real months and years). 
     58\item The choice of file contents; includes complete flexibility over which data are written in which files (the same data can be written in different files).  
     59\item The possibility to split output files at a chosen frequency. 
     60\item The possibility to extract a vertical or a horizontal subdomain. 
     61\item The choice of the temporal operation to perform, e.g.: average, accumulate, instantaneous, min, max and once. 
     62\item Control over metadata via a large XML "database" of possible output fields. 
     63\end{itemize} 
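As a flavour of what this looks like in practice, a file definition might request different output frequencies for each file together with a monthly split (the attribute names are those listed in the reference tables at the end of this section; the values are illustrative only):
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   <file_definition type="multiple_file" sync_freq="10d" min_digits="4">
      <file id="file1" output_freq="1d" split_freq="1mo" />  <!-- daily means, split monthly -->
      <file id="file2" output_freq="5d"                  />  <!-- 5-day means                -->
   </file_definition>
\end{verbatim}
}}\end{alltt}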
     64In addition, iomput allows the user to add the output of any new variable (scalar, 2D or 3D) in the code in a very easy way. All details of iomput functionalities are listed in the following subsections. Examples of the XML files that control the outputs can be found in: 
     65\begin{alltt} 
     66\begin{verbatim} 
     67  NEMOGCM/CONFIG/ORCA2_LIM/EXP00/iodef.xml 
     68  NEMOGCM/CONFIG/SHARED/field_def.xml 
     69  and 
     70  NEMOGCM/CONFIG/SHARED/domain_def.xml. 
     71\end{verbatim} 
     72\end{alltt} 
     73 
     74The second functionality targets output performance when running in parallel (\key{mpp\_mpi}). Iomput provides the possibility to specify N dedicated I/O processes (in addition to the NEMO processes) to collect and write the outputs. With an appropriate choice of N by the user, the bottleneck associated with the writing of the output files can be greatly reduced.  
     75 
     76Since version 3.5, the iom\_put interface depends on an external code called \href{http://forge.ipsl.jussieu.fr/ioserver}{XIOS}. This new IO server can take advantage of the parallel I/O functionality of NetCDF4 to create a single output file and therefore to bypass the rebuilding phase. Note that writing in parallel into the same NetCDF files requires that your NetCDF4 library is linked to an HDF5 library that has been correctly compiled (i.e. with the configure option $--$enable-parallel). Note that the files created by iomput through XIOS are incompatible with NetCDF3. All post-processing and visualization tools must therefore be compatible with NetCDF4 and not only NetCDF3. 
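As an illustration only (compiler wrappers, installation paths and versions are site-specific and not prescribed by NEMO), a parallel-enabled HDF5 build might be configured along these lines:
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   # illustrative build of HDF5 with parallel (MPI) support
   ./configure --enable-parallel --prefix=$HOME/local/hdf5 CC=mpicc
   make
   make install
\end{verbatim}
}}\end{alltt}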
     77 
     78Even if not using the parallel I/O functionality of NetCDF4, using N dedicated I/O servers, where N is typically much less than the number of NEMO processors, will reduce the number of output files created. This can greatly reduce the post-processing burden usually associated with using large numbers of NEMO processors. Note that for smaller configurations, the rebuilding phase can be avoided, even without a parallel-enabled NetCDF4 library, simply by employing only one dedicated I/O server. 
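For example (a sketch only; the MPMD launch syntax is discussed further below and varies between systems), a modest configuration could be run with a single dedicated server:
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   mpirun -np 32 ./nemo.exe : -np 1 ./xios_server.exe
\end{verbatim}
}}\end{alltt}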
     79 
     80\subsection{XIOS: the IO\_SERVER} 
     81 
     82\subsubsection{Attached or detached mode?} 
     83 
     84Iomput is based on \href{http://forge.ipsl.jussieu.fr/ioserver/wiki}{XIOS}, the io\_server developed by Yann Meurdesoif from IPSL. The behaviour of the io subsystem is controlled by settings in the external XML files listed above. Key settings in the iodef.xml file are {\tt using\_server} and the {\tt type} tag associated with each defined file. The {\tt using\_server} setting determines whether or not the server will be used in ''attached mode'' (as a library) [{\tt false}] or in ''detached mode'' (as an external executable on N additional, dedicated cpus) [{\tt true}]. The ''attached mode'' is simpler to use but much less efficient for massively parallel applications. The type of each file can be either ''multiple\_file'' or ''one\_file''. 
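A minimal sketch of these two settings (the variable syntax follows the iodef.xml templates distributed with NEMO; the values shown are purely illustrative):
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   <!-- xios context of iodef.xml: run XIOS in detached mode -->
   <variable id="using_server" type="boolean">true</variable>

   <!-- nemo context: request a single file written in parallel -->
   <file id="file1" type="one_file" output_freq="1d" ... />
\end{verbatim}
}}\end{alltt}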
     85 
     86In attached mode and if the type of file is ''multiple\_file'', then each NEMO process will also act as an IO server and produce its own set of output files. Superficially, this emulates the standard behaviour in previous versions. However, the subdomain written out by each process does not correspond to the {\tt jpi x jpj x jpk} domain actually computed by the process (although it may if {\tt jpni=1}). Instead each process will have collected and written out a number of complete longitudinal strips. If the ''one\_file'' option is chosen then all processes will collect their longitudinal strips and write (in parallel) to a single output file.  
     87 
     88In detached mode and if the type of file is ''multiple\_file'', then each stand-alone XIOS process will collect data for a range of complete longitudinal strips and write to its own set of output files. If the ''one\_file'' option is chosen then all XIOS processes will collect their longitudinal strips and write (in parallel) to a single output file. Note that running in detached mode requires launching a Multiple Process Multiple Data (MPMD) parallel job. The following subsection provides a typical example but the syntax will vary in different MPP environments. 
     89 
     90\subsubsection{Number of cpu used by XIOS in detached mode} 
     91 
     92The number of cores used by XIOS is specified when launching the model. The number of cores dedicated to XIOS should be from ~1/10 to ~1/50 of the number of cores dedicated to NEMO. Some manufacturers suggest using O($\sqrt{N}$) dedicated IO processors for N processors but this is a general recommendation and not specific to NEMO. It is difficult to provide precise recommendations because the optimal choice will depend on the particular hardware properties of the target system (parallel filesystem performance, available memory, memory bandwidth etc.) and the volume and frequency of data to be created. Here is an example of 2 cpus for the io\_server and 62 cpus for NEMO using mpirun: 
     93 
     94\texttt{ mpirun -np 62 ./nemo.exe : -np 2 ./xios\_server.exe } 
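The equivalent MPMD launch on other systems will differ; for instance, on Cray systems a command of roughly the following form might be used (illustrative only, check your site documentation):

\texttt{ aprun -n 62 ./nemo.exe : -n 2 ./xios\_server.exe }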
     95 
     96\subsubsection{Control of XIOS: the XIOS context in iodef.xml} 
     97 
     98As well as the {\tt using\_server} flag, other controls on the use of XIOS are set in the XIOS context in iodef.xml (a short example context is sketched after the following table). See the XML basics section below for more details on XML syntax and rules. 
     99 
     100\begin{tabular}{|p{4cm}|p{6.0cm}|p{2.0cm}|} 
     101   \hline 
     102   variable name &  
     103   description &  
     104   example \\  
     105   \hline    
     106   \hline 
     107   buffer\_size &  
     108   buffer size used by XIOS to send data from NEMO to XIOS. Larger is more efficient. Note that needed/used buffer sizes are summarized at the end of the job &  
     109   25000000 \\  
     110   \hline    
     111   buffer\_server\_factor\_size &  
     112   ratio between NEMO and XIOS buffer size. Should be 2. &  
     113   2 \\  
     114   \hline 
     115   info\_level &  
     116   verbosity level (0 to 100) &  
     117   0 \\  
     118   \hline 
     119   using\_server &  
     120   activate attached (false) or detached (true) mode &  
     121   true \\  
     122   \hline 
     123   using\_oasis &  
     124   XIOS is used with OASIS (true) or not (false) &  
     125   false \\  
     126   \hline 
     127   oasis\_codes\_id &  
     128   when using oasis, define the identifier of NEMO in the namcouple. Note that the identifier of XIOS is xios.x &  
     129   oceanx \\  
     130   \hline    
     131\end{tabular} 
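
By way of illustration, such a context might be sketched as follows (the grouping and type attributes follow the iodef.xml template distributed with NEMO and XIOS; the values are examples only):
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   <context id="xios">
      <variable_definition>
         <variable id="buffer_size"               type="integer">25000000</variable>
         <variable id="buffer_server_factor_size" type="integer">2</variable>
         <variable id="info_level"                type="integer">0</variable>
         <variable id="using_server"              type="boolean">true</variable>
         <variable id="using_oasis"               type="boolean">false</variable>
         <variable id="oasis_codes_id"            type="string" >oceanx</variable>
      </variable_definition>
   </context>
\end{verbatim}
}}\end{alltt}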
     132 
     133 
     134\subsection{Practical issues} 
     135 
     136\subsubsection{Installation} 
     137 
     138As mentioned, XIOS is supported separately and must be downloaded and compiled before it can be used with NEMO. See the installation guide on the \href{http://forge.ipsl.jussieu.fr/ioserver/wiki}{XIOS} wiki for help and guidance. NEMO will need to link to the compiled XIOS library. The  
     139\href{http://www.nemo-ocean.eu/Using-NEMO/User-Guides/Basics/XIOS-IO-server-installation-and-use}{XIOS with NEMO} guide provides an example illustration of how this can be achieved. 
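For instance, the link is typically achieved through the NEMO FCM arch file; a hypothetical fragment (the \%XIOS\_HOME path and the exact variable names should be adapted to your own arch file) might look like:
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   %XIOS_HOME           /path/to/compiled/xios
   %XIOS_INC            -I%XIOS_HOME/inc
   %XIOS_LIB            -L%XIOS_HOME/lib -lxios
   %USER_INC            %XIOS_INC %NCDF_INC
   %USER_LIB            %XIOS_LIB %NCDF_LIB
\end{verbatim}
}}\end{alltt}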
     140 
     141\subsubsection{Add your own outputs} 
     142 
     143It is very easy to add your own outputs with iomput. Many standard fields and diagnostics are already prepared (i.e., steps 1 to 3 below have been done) and simply need to be activated by including the required output in a file definition in iodef.xml (step 4). To add new output variables, all 4 of the following steps must be taken. 
     144\begin{description} 
     145\item[1.] in NEMO code, add a \\ 
     146\texttt{      CALL iom\_put( 'identifier', array ) } \\ 
     147where you want to output a 2D or 3D array (a combined Fortran sketch is given after this list). 
     148 
     149\item[2.] If necessary, add \\ 
     150\texttt{   USE iom\ \ \ \ \ \ \ \ \ \ \ \ ! I/O manager library }  \\ 
     151to the list of used modules in the upper part of your module.  
     152 
     153\item[3.] in the field\_def.xml file, add the definition of your variable using the same identifier you used in the f90 code (see subsequent sections for details of the XML syntax and rules). For example: 
     154\vspace{-20pt} 
     155\begin{alltt}  {{\scriptsize 
     156\begin{verbatim} 
     157   <field_definition> 
     158      <!-- T grid --> 
     159 
     160     <field_group id="grid_T" grid_ref="grid_T_3D"> 
     161      ... 
     162      <field id="identifier" long_name="blabla" ... />    
     163      ... 
     164   </field_definition>  
     165\end{verbatim} 
     166}}\end{alltt}  
     167Note your definition must be added to the field\_group whose reference grid is consistent with the size of the array passed to iomput. The grid\_ref attribute refers to definitions set in iodef.xml which, in turn, reference grids and axes either defined in the code (iom\_set\_domain\_attr and iom\_set\_axis\_attr in iom.F90) or defined in the domain\_def.xml file. E.g.: 
     168\vspace{-20pt} 
     169\begin{alltt}  {{\scriptsize 
     170\begin{verbatim} 
     171     <grid id="grid_T_3D" domain_ref="grid_T" axis_ref="deptht"/> 
     172\end{verbatim} 
     173}}\end{alltt}  
     174Note that if your array is computed within the surface module every nn\_fsbc time-step,  
     175add the field definition within the field\_group defined with the id ''SBC'': $<$field\_group id=''SBC''...$>$ which has been defined with the correct frequency of operations (iom\_set\_field\_attr in iom.F90) 
     176 
     177\item[4.] add your field to one of the output files defined in iodef.xml (again see subsequent sections for syntax and rules)   \\ 
     178\vspace{-20pt} 
     179\begin{alltt}  {{\scriptsize 
     180\begin{verbatim} 
     181   <file id="file1" .../>    
     182      ... 
     183      <field field_ref="identifier" />    
     184      ... 
     185   </file>    
     186\end{verbatim} 
     187}}\end{alltt}  
     188 
     189\end{description} 
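
Putting steps 1 and 2 together, a minimal Fortran sketch (the routine, module and variable names here are purely illustrative) could look like:
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   SUBROUTINE dia_myvar( kt )                ! hypothetical user diagnostic routine
      USE iom                                ! I/O manager library          (step 2)
      USE par_oce                            ! ocean parameters (jpi, jpj)
      INTEGER, INTENT(in) ::   kt            ! ocean time-step index
      REAL ::   zmyvar(jpi,jpj)              ! 2D work array to be output
      !                                      ! ... compute zmyvar here ...
      CALL iom_put( 'identifier', zmyvar )   ! step 1: pass the field to iomput
   END SUBROUTINE dia_myvar
\end{verbatim}
}}\end{alltt}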
     190\subsection{XML fundamentals} 
    61191 
    62192\subsubsection{ XML basic rules} 
     
    72202 
    73203The XML file used in XIOS is structured by 7 families of tags: context, axis, domain, grid, field, file and variable. Each tag family has a hierarchy of three flavors (except for context): 
    74 \begin{description} 
    75 \item[root]: declaration of the root element that can contain element groups or elements, for example : $<$file\_definition ...$/>$ \\ 
    76 \item[group]: declaration of a group element that can contain element groups or elements, for example : $<$file\_group ...$/>$ \\ 
    77 \item[element]: declaration of an element that can contain elements, for example : $<$file ...$/>$  \\ 
    78 \end{description} 
     204\\ 
     205\begin{tabular}{|p{3.0cm}|p{4.5cm}|p{4.5cm}|} 
     206   \hline 
     207   flavor & 
     208   description & 
     209   example \\ 
     210   \hline 
     211   \hline 
     212   root & 
     213   declaration of the root element that can contain element groups or elements & 
     214   {\scriptsize \verb? < file_definition ... >?} \\ 
     215   \hline 
     216   group & 
     217   declaration of a group element that can contain element groups or elements & 
     218   {\scriptsize \verb? < file_group ... >?} \\ 
     219   \hline 
     220   element & 
     221   declaration of an element that can contain elements & 
     222   {\scriptsize \verb? < file ... >?} \\ 
     223   \hline 
     224\end{tabular} 
     225\\ 
    79226 
    80227Each element may have several attributes. Some attributes are mandatory, others are optional but have a default value, and others are completely optional. Id is a special attribute used to identify an element or a group of elements. It must be unique for a kind of element. It is optional, but no reference to the corresponding element can be made if it is not defined. 
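For instance (a purely illustrative pairing, using identifiers that appear elsewhere in this chapter), an id declared on one element can then be referenced from another:
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   <axis  id="deptht" long_name="Vertical T levels" ... />
   <field id="toce"   axis_ref="deptht"             ... />
\end{verbatim}
}}\end{alltt}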
    81228 
    82 The XML file is split into context tags that are used to isolate IO definition from different codes or different parts of a code. No interference is possible between 2 different contexts. Each context has its own calendar and an associated timestep. In NEMO, we used the following contexts (that can be defined in any order): 
    83 \begin{description} 
    84 \item[contex xios]: context containing informations for XIOS \\ 
    85 \verb?   <context id="xios" ... ?  
    86 \item[context nemo]: contex containing IO informations for NEMO (mother grid when using AGRIF) \\ 
    87 \verb?   <context id="nemo" ... ?  
    88 \item[context 1\_nemo]: contex containing IO informations for NEMO child grid 1 (when using AGRIF) \\ 
    89 \verb?   <context id="1_nemo" ... ?   
    90 \item[context n\_nemo]: contex containing IO informations for NEMO child grid n (when using AGRIF) \\ 
    91 \verb?   <context id="n_nemo" ... ? 
    92 \end{description} 
    93  
    94 Each context tag related to NEMO (mother or child grids) is divided into 5 parts (that can be defined in any order): 
    95 \begin{description} 
    96 \item[field definition]: define all variables that can potentially be outputted \\ 
    97 \verb?   <field_definition ... ? 
    98 \item[file definition]: define the netcdf files to be created and the variables they will contain \\ 
    99 \verb?   <file_definition ... ?  
    100 \item[axis definitions]: define vertical axis \\ 
    101 \verb?   <axis_definition ... ? 
    102 \item[domain definitions]: define the horizontal grids \\ 
    103 \verb?   <domain_definition ... ? 
    104 \item[grid definitions]: define the 2D and 3D grids (association of an axis and a domain) \\ 
    105 \verb?   <grid_definition ... ?  
    106 \end{description} 
    107  
    108 the xios context contains only 1 tag: 
    109 \begin{description} 
    110 \item[variable definition]: define variables needed by xios. This can be seen as a kind of namelist for xios. \\ 
    111 \verb?   <variable_definition ... ?  
    112 \end{description} 
    113  
    114 The XML file can be split in different parts to improve its readability and facilitate its use. The inclusing of XML files into the main XML file can be done through the attribute src: \\ 
    115 \verb?   <context src="./nemo_def.xml" /> ?  
    116 In NEMO, by default, the field and domain définition is done in 2 séparate files: \\ 
    117 NEMOGCM/CONFIG/SHARED/field\_def.xml and \\ 
    118 NEMOGCM/CONFIG/SHARED/domain\_def.xml that are included in the main iodef.xml file through the following commands: \\ 
    119 \verb?   <field_definition src="./field_def.xml" /> ? \\ 
    120 \verb?   <domain_definition src="./domain_def.xml" /> ?  
     229The XML file is split into context tags that are used to isolate IO definitions from different codes or different parts of a code. No interference is possible between 2 different contexts. Each context has its own calendar and an associated timestep. In NEMO, we use the following contexts (which can be defined in any order; a skeleton layout combining them is sketched after the tables below):\\ 
     230\\ 
     231\begin{tabular}{|p{3.0cm}|p{4.5cm}|p{4.5cm}|} 
     232   \hline 
     233   context & 
     234   description & 
     235   example \\ 
     236   \hline 
     237   \hline 
     238   context xios & 
     239   context containing information for XIOS & 
     240   {\scriptsize \verb? <context id="xios" ...  ?} \\ 
     241   \hline 
     242   context nemo & 
     243   context containing IO information for NEMO (mother grid when using AGRIF) & 
     244   {\scriptsize \verb? <context id="nemo" ... ?} \\ 
     245   \hline 
     246   context 1\_nemo & 
     247   context containing IO information for NEMO child grid 1 (when using AGRIF) & 
     248   {\scriptsize \verb? <context id="1_nemo" ...  ?} \\ 
     249   \hline 
     250   context n\_nemo & 
     251   context containing IO information for NEMO child grid n (when using AGRIF) & 
     252   {\scriptsize \verb? <context id="n_nemo" ...  ?} \\ 
     253   \hline 
     254\end{tabular} 
     255\\ 
     256 
     257\noindent The xios context contains only 1 tag: 
     258\\ 
     259\begin{tabular}{|p{3.0cm}|p{4.5cm}|p{4.5cm}|} 
     260   \hline 
     261   context tag & 
     262   description & 
     263   example \\ 
     264   \hline 
     265   \hline 
     266   variable\_definition & 
     267   define variables needed by XIOS. This can be seen as a kind of namelist for XIOS. & 
     268   {\scriptsize \verb? <variable_definition ... ?} \\ 
     269   \hline 
     270\end{tabular} 
     271\\ 
     272 
     273\noindent Each context tag related to NEMO (mother or child grids) is divided into 5 parts (that can be defined in any order):\\ 
     274\\ 
     275\begin{tabular}{|p{3.0cm}|p{4.5cm}|p{4.5cm}|} 
     276   \hline 
     277   context tag & 
     278   description & 
     279   example \\ 
     280   \hline 
     281   \hline 
     282   field\_definition & 
     283   define all variables that can potentially be outputted & 
     284   {\scriptsize \verb? <field_definition ... ?} \\ 
     285   \hline 
     286   file\_definition & 
     287   define the netcdf files to be created and the variables they will contain & 
     288   {\scriptsize \verb? <file_definition ... ?} \\ 
     289   \hline 
     290   axis\_definition & 
     291   define vertical axis & 
     292   {\scriptsize \verb? <axis_definition ... ?} \\ 
     293   \hline 
     294   domain\_definition & 
     295   define the horizontal grids & 
     296   {\scriptsize \verb? <domain_definition ... ?} \\ 
     297   \hline 
     298   grid\_definition & 
     299   define the 2D and 3D grids (association of an axis and a domain) & 
     300   {\scriptsize \verb? <grid_definition ... ?} \\ 
     301   \hline 
     302\end{tabular} 
     303\\ 
     304 
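To fix ideas, the overall layout of a main iodef.xml file can then be sketched as follows (content elided; the src includes used here are explained in the next subsection):
\begin{alltt}  {{\scriptsize
\begin{verbatim}
   <?xml version="1.0"?>
   <simulation>
      <context id="xios">
         <variable_definition>  <!-- ... XIOS control variables ... -->  </variable_definition>
      </context>
      <context id="nemo">
         <field_definition  src="./field_def.xml"  />
         <file_definition>      <!-- ... files to be written ...    -->  </file_definition>
         <axis_definition>      <!-- ... vertical axes ...          -->  </axis_definition>
         <domain_definition src="./domain_def.xml" />
         <grid_definition>      <!-- ... 2D and 3D grids ...        -->  </grid_definition>
      </context>
   </simulation>
\end{verbatim}
}}\end{alltt}
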
     305\subsubsection{Nesting XML files} 
     306 
     307The XML file can be split into different parts to improve its readability and facilitate its use. The inclusion of XML files into the main XML file can be done through the attribute src: \\ 
     308{\scriptsize \verb? <context src="./nemo_def.xml" /> ?}\\ 
     309  
     310\noindent In NEMO, by default, the field and domain definitions are given in 2 separate files: 
     311{\scriptsize \tt 
     312\begin{verbatim} 
     313NEMOGCM/CONFIG/SHARED/field_def.xml 
     314and 
     315NEMOGCM/CONFIG/SHARED/domain_def.xml  
     316\end{verbatim} 
     317} 
     318\noindent that are included in the main iodef.xml file through the following commands: \\ 
     319{\scriptsize \verb? <field_definition src="./field_def.xml" /> ? \\ 
     320\verb? <domain_definition src="./domain_def.xml" /> ? } 
    121321 
    122322 
    123323\subsubsection{Use of inheritance} 
    124324 
    125 XML extensively uses the concept of inheritance. XML has a based tree structure with a parent-child oriented relation: all children inherit attributes from parent, but an attribute defined in a child replace the inherited attribute value. Note that the special attribute ''id'' is never inherited.  \\ 
     325XML extensively uses the concept of inheritance. XML has a tree-based structure with a parent-child oriented relation: all children inherit attributes from their parent, but an attribute defined in a child replaces the inherited attribute value. Note that the special attribute ''id'' is never inherited.  \\ 
    126326\\ 
    127 example 1: Direct inheritance. \\ 
     327example 1: Direct inheritance. 
     328\vspace{-20pt} 
    128329\begin{alltt}  {{\scriptsize     
    129330\begin{verbatim} 
    130331   <field_definition operation="average" > 
    131       <field id="sst"                    />   <!-- averaged      sst -->  
    132       <field id="sss" operation="instant"/>   <!-- instantaneous sss -->  
     332     <field id="sst"                    />   <!-- averaged      sst -->  
     333     <field id="sss" operation="instant"/>   <!-- instantaneous sss -->  
    133334   </field_definition>  
    134335\end{verbatim} 
     
    140341for example output instantaneous values instead of average values. \\ 
    141342\\ 
    142 example 2: Inheritance by reference. \\ 
     343example 2: Inheritance by reference. 
     344\vspace{-20pt} 
    143345\begin{alltt}  {{\scriptsize 
    144346\begin{verbatim} 
    145347   <field_definition> 
    146       <field id="sst" long_name="sea surface temperature" />    
    147       <field id="sss" long_name="sea surface salinity"    />   
     348     <field id="sst" long_name="sea surface temperature" />    
     349     <field id="sss" long_name="sea surface salinity"    />   
    148350   </field_definition>       
    149351 
    150352   <file_definition> 
    151       <file id="myfile" output_freq="1d" />    
    152             <field field_ref="sst"                            />  <!-- default def --> 
    153             <field field_ref="sss" long_name="my description" />  <!-- overwrite   --> 
    154       </file>    
     353     <file id="myfile" output_freq="1d" />    
     354       <field field_ref="sst"                            />  <!-- default def --> 
     355       <field field_ref="sss" long_name="my description" />  <!-- overwrite   --> 
     356     </file>    
    155357   </file_definition>  
    156358\end{verbatim} 
    157359}}\end{alltt}  
    158 Inherite (and overwrite, if needed) the attributes of a tag you are refering to. 
    159  
    160 \subsubsection{Use of Group} 
    161  
    162 Groups can be used fort 2 purposes. \\ 
    163  
    164 First, the group can be used to define common attributes to be shared by the elements of the group through the inheritance. In the following example, we define a group of field that will share a common grid ''grid\_T\_2D''. Note that for the field ''toce'', we overwrite the grid definition inherited from the group by ''grid\_T\_3D''. 
     360Inherit (and overwrite, if needed) the attributes of a tag you are referring to. 
     361 
     362\subsubsection{Use of Groups} 
     363 
     364Groups can be used for 2 purposes. Firstly, the group can be used to define common attributes to be shared by the elements of the group through inheritance. In the following example, we define a group of fields that will share a common grid ''grid\_T\_2D''. Note that for the field ''toce'', we overwrite the grid definition inherited from the group by ''grid\_T\_3D''. 
     365\vspace{-20pt} 
    165366\begin{alltt}  {{\scriptsize 
    166367\begin{verbatim} 
    167368   <field_group id="grid_T" grid_ref="grid_T_2D"> 
    168       <field id="toce" long_name="temperature"             unit="degC" grid_ref="grid_T_3D"/> 
    169       <field id="sst"  long_name="sea surface temperature" unit="degC"                     /> 
    170       <field id="sss"  long_name="sea surface salinity"    unit="psu"                      /> 
    171       <field id="ssh"  long_name="sea surface height"      unit="m"                        /> 
     369    <field id="toce" long_name="temperature"             unit="degC" grid_ref="grid_T_3D"/> 
     370    <field id="sst"  long_name="sea surface temperature" unit="degC"                     /> 
     371    <field id="sss"  long_name="sea surface salinity"    unit="psu"                      /> 
     372    <field id="ssh"  long_name="sea surface height"      unit="m"                        /> 
    172373         ... 
    173374\end{verbatim} 
    174375}}\end{alltt}  
    175376 
    176 Second, the group can be used to replace a list of elements. Several examples of groups of fields are proposed at the end of the file \\ 
    177 NEMOGCM/CONFIG/SHARED/field\_def.xml. For example, a short list of usual variables related to the U grid: 
     377Secondly, the group can be used to replace a list of elements. Several examples of groups of fields are proposed at the end of the file {\tt CONFIG/SHARED/field\_def.xml}. For example, a short list of the usual variables related to the U grid: 
     378\vspace{-20pt} 
    178379\begin{alltt}  {{\scriptsize 
    179380\begin{verbatim} 
    180381   <field_group id="groupU" > 
    181       <field field_ref="uoce"  /> 
    182       <field field_ref="suoce" /> 
    183       <field field_ref="utau"  /> 
     382    <field field_ref="uoce"  /> 
     383    <field field_ref="suoce" /> 
     384    <field field_ref="utau"  /> 
    184385   </field_group> 
    185386\end{verbatim} 
    186387}}\end{alltt}  
    187 that can be directly include in a file through the following syntaxe: 
     388that can be directly included in a file through the following syntax: 
     389\vspace{-20pt} 
    188390\begin{alltt}  {{\scriptsize 
    189391\begin{verbatim} 
    190392   <file id="myfile_U" output_freq="1d" />    
    191       <field_group group_ref="groupU"/>   
    192       <field field_ref="uocetr_eff"  />  <!-- add another field --> 
     393    <field_group group_ref="groupU"/>   
     394    <field field_ref="uocetr_eff"  />  <!-- add another field --> 
    193395   </file>    
    194396\end{verbatim} 
     
    197399\subsection{Detailed functionalities } 
    198400 
    199 The file NEMOGCM/CONFIG/ORCA2\_LIM/iodef\_demo.xml provides several examples of the use of the new functionalities offered by the XML interface of XIOS.  
     401The file {\tt NEMOGCM/CONFIG/ORCA2\_LIM/iodef\_demo.xml} provides several examples of the use of the new functionalities offered by the XML interface of XIOS.  
    200402 
    201403\subsubsection{Define horizontal subdomains} 
    202 Horizontal subdomains are defined through the attributs zoom\_ibegin, zoom\_jbegin, zoom\_ni, zoom\_nj of the tag family domain. It must therefore be done in the domain part of the XML file. For example, in NEMOGCM/CONFIG/SHARED/domain\_def.xml, we provide the following example of a definition of a 5 by 5 box with the bottom left corner at point (10,10). 
     404Horizontal subdomains are defined through the attributes zoom\_ibegin, zoom\_jbegin, zoom\_ni, zoom\_nj of the tag family domain. This must therefore be done in the domain part of the XML file. For example, in {\tt CONFIG/SHARED/domain\_def.xml}, we provide the following definition of a 5 by 5 box with the bottom left corner at point (10,10). 
     405\vspace{-20pt} 
    203406\begin{alltt}  {{\scriptsize 
    204407\begin{verbatim} 
    205408   <domain_group id="grid_T"> 
    206       <domain id="myzoom" zoom_ibegin="10" zoom_jbegin="10" zoom_ni="5" zoom_nj="5" /> 
     409    <domain id="myzoom" zoom_ibegin="10" zoom_jbegin="10" zoom_ni="5" zoom_nj="5" /> 
    207410\end{verbatim} 
    208411}}\end{alltt}  
    209412The use of this subdomain is done through the redefinition of the attribute domain\_ref of the tag family field. For example: 
     413\vspace{-20pt} 
    210414\begin{alltt}  {{\scriptsize 
    211415\begin{verbatim} 
     
    216420}}\end{alltt}  
    217421Moorings are seen as an extreme case corresponding to a 1 by 1 subdomain. The Equatorial section, the TAO, RAMA and PIRATA moorings are already registered in the code and can therefore be outputted without needing to know their (i,j) position in the grid. These predefined domains can be activated by the use of specific domain\_ref: ''EqT'', ''EqU'' or ''EqW'' for the equatorial sections and the mooring position for TAO, RAMA and PIRATA followed by ''T'' (for example: ''8s137eT'', ''1.5s80.5eT'' ...) 
     422\vspace{-20pt} 
    218423\begin{alltt}  {{\scriptsize 
    219424\begin{verbatim} 
     
    227432\subsubsection{Define vertical zooms} 
    228433Vertical zooms are defined through the attributes zoom\_begin and zoom\_end of the tag family axis. This must therefore be done in the axis part of the XML file. For example, in NEMOGCM/CONFIG/ORCA2\_LIM/iodef\_demo.xml, we provide the following example: 
     434\vspace{-20pt} 
    229435\begin{alltt}  {{\scriptsize 
    230436\begin{verbatim} 
     
    235441}}\end{alltt}  
    236442The use of this vertical zoom is done through the redefinition of the attribute axis\_ref of the tag family field. For example: 
     443\vspace{-20pt} 
    237444\begin{alltt}  {{\scriptsize 
    238445\begin{verbatim} 
     
    246453 
    247454The output file names are defined by the attributes ''name'' and ''name\_suffix'' of the tag family file. For example: 
     455\vspace{-20pt} 
    248456\begin{alltt}  {{\scriptsize 
    249457\begin{verbatim} 
     
    258466\end{verbatim} 
    259467}}\end{alltt}  
    260 However it is also often very convienent to define the file name with the name of the experience, the output file frequency and the date of the beginning and the end of the simulation (which are informations stored either in the namelist or in the XML file). To do so, we added the following rule: if the id of the tag file is ''fileN''(where N = 1 to 99) or one of the predefined section or mooring (see next subsection), the following part of the name and the name\_suffix (that can be inherited) will be automatically replaced by: \\ 
     468However it is often very convenient to define the file name with the name of the experiment, the output file frequency and the dates of the beginning and the end of the simulation (information that is stored either in the namelist or in the XML file). To do so, we added the following rule: if the id of the tag file is ''fileN'' (where N = 1 to 99) or one of the predefined sections or moorings (see next subsection), the following placeholder strings in the name and the name\_suffix (that can be inherited) will be automatically replaced by:\\ 
    261469\\ 
    262470\begin{tabular}{|p{4cm}|p{8cm}|} 
    263471   \hline 
    264    \centering part of the name automatically to be replaced & 
    265    by \\ 
     472   \centering placeholder string & automatically  replaced by \\ 
    266473   \hline 
    267474   \hline 
    268475   \centering @expname@ & 
    269    the experience name (from cn\_exp in the namelist) \\ 
     476   the experiment name (from cn\_exp in the namelist) \\ 
    270477   \hline 
    271478   \centering @freq@ & 
     
    284491   ending date of the simulation (from nn\_date0 and nn\_itend in the namelist). \verb?yyyymmdd_hh:mm:ss? format \\ 
    285492   \hline 
    286 \end{tabular} 
     493\end{tabular}\\ 
    287494\\ 
    288495 
    289 For example,  
    290  
    291 \begin{alltt}  {{\scriptsize 
     496\noindent For example,  
     497{{\scriptsize 
    292498\begin{verbatim} 
    293499   <file id="myfile_hzoom" name="myfile_@expname@_@startdate@_freq@freq@" output_freq="1d" > 
    294500\end{verbatim} 
    295 }}\end{alltt}  
    296  
    297 With, in the namelist: 
    298  
    299 \begin{alltt}  {{\scriptsize 
     501}} 
     502\noindent with the namelist: 
     503{{\scriptsize 
    300504\begin{verbatim} 
    301505   cn_exp      =  "ORCA2" 
     
    303507   ln_rstart   = .false. 
    304508\end{verbatim} 
    305 }}\end{alltt}  
    306  
    307 will give the following file name radical: 
    308  
    309 \begin{alltt}  {{\scriptsize 
     509}} 
     510\noindent will give the following file name radical: 
     511{{\scriptsize 
    310512\begin{verbatim} 
    311513   myfile_ORCA2_19891231_freq1d  
    312514\end{verbatim} 
    313 }}\end{alltt}  
    314  
     515}} 
    315516 
    316517\subsubsection{Other controls of the xml attributes from NEMO} 
    317518 
    318 The values of some attributes are automatically defined by NEMO (and any definition given in the xml file is overwritten). By convention, these attributes are defined to ''auto'' (for string) or ''0000'' (for integer) in the xml file (but this is not necessary).  
    319  
    320 Here is the list of these attributes: \\ 
     519The values of some attributes are defined by subroutine calls within NEMO (calls to iom\_set\_domain\_attr, iom\_set\_axis\_attr and iom\_set\_field\_attr in iom.F90). Any definition given in the xml file will be overwritten. By convention, these attributes are set to ''auto'' (for strings) or ''0000'' (for integers) in the xml file (but this is not necessary).  
     520 
     521Here is the list of these attributes:\\ 
    321522\\ 
    322523\begin{tabular}{|l|c|c|c|} 
     
    343544 
    344545 
     546\subsection{XML reference tables} 
     547 
    345548\subsubsection{Tag list} 
    346549 
    347  
    348 \begin{tabular}{|p{2cm}|p{2.5cm}|p{3.5cm}|p{2cm}|p{2cm}|} 
     550\begin{longtable}{|p{2.2cm}|p{2.5cm}|p{3.5cm}|p{2.2cm}|p{1.6cm}|} 
    349551   \hline 
    350552   tag name &  
     
    352554   accepted attribute &  
    353555   child of & 
    354    parent of \\ 
    355    \hline    
     556   parent of \endhead 
    356557   \hline    
    357558   simulation &  
     
    362563   \hline    
    363564   context & 
    364    encapsulates parts of the xml file dédicated to different codes or different parts of a code & 
     565   encapsulates parts of the xml file dedicated to different codes or different parts of a code & 
    365566   id (''xios'', ''nemo'' or ''n\_nemo'' for the nth AGRIF zoom), src, time\_origin & 
    366567   simulation & 
    367    all root tags: ...\_definition \\ 
     568   all root tags: ... \_definition \\ 
    368569   \hline    
    369570   \hline    
     
    389590   file\_definition &  
    390591   encapsulates the definition of all the files that will be outputted & 
    391    enabled, min\_digits, name, name\_suffix, output\_level, split\_format, split\_freq, sync\_freq, type, src & 
     592   enabled, min\_digits, name, name\_suffix, output\_level, split\_freq\_format, split\_freq, sync\_freq, type, src & 
    392593   context &  
    393594   file or file\_group \\ 
     
    395596   file\_group &  
    396597   encapsulates a group of files that will be outputted & 
    397    enabled, description, id, min\_digits, name, name\_suffix, output\_freq, output\_level, split\_format, split\_freq, sync\_freq, type, src & 
     598   enabled, description, id, min\_digits, name, name\_suffix, output\_freq, output\_level, split\_freq\_format, split\_freq, sync\_freq, type, src & 
    398599   file\_definition, file\_group &  
    399600   file or file\_group \\ 
    400601   \hline    
    401602   file &  
    402    defile the contentof a file to be outputted & 
    403    enabled, description, id, min\_digits, name, name\_suffix, output\_freq, output\_level, split\_format, split\_freq, sync\_freq, type, src & 
     603   define the contents of a file to be outputted & 
     604   enabled, description, id, min\_digits, name, name\_suffix, output\_freq, output\_level, split\_freq\_format, split\_freq, sync\_freq, type, src & 
    404605   file\_definition, file\_group &  
    405606   field \\ 
    406    \hline    
    407 \end{tabular} 
    408 \begin{tabular}{|p{2cm}|p{2.5cm}|p{3.5cm}|p{2cm}|p{2cm}|} 
    409    \hline 
    410    tag name &  
    411    description &  
    412    accepted attribute &  
    413    child of & 
    414    parent of \\ 
    415    \hline    
    416607   \hline    
    417608   axis\_definition &  
     
    434625   \hline    
    435626   \hline    
    436    domain\_definition &  
     627   domain\_\-definition &  
    437628   define all the horizontal domains potentially used by the variables & 
    438629   src & 
    439630   context &  
    440    domain\_group, domain \\ 
     631   domain\_\-group, domain \\ 
    441632   \hline    
    442633   domain\_group &  
    443634   encapsulates a group of horizontal domains & 
    444635   id, lon\_name, src, zoom\_ibegin, zoom\_jbegin, zoom\_ni, zoom\_nj & 
    445    domain\_definition, domain\_group &  
    446    domain\_group, domain \\ 
     636   domain\_\-definition, domain\_group &  
     637   domain\_\-group, domain \\ 
    447638   \hline    
    448639   domain &  
    449640   define an horizontal domain & 
    450641   id, lon\_name, src, zoom\_ibegin, zoom\_jbegin, zoom\_ni, zoom\_nj & 
    451    domain\_definition, domain\_group &  
     642   domain\_\-definition, domain\_group &  
    452643   none \\ 
    453644   \hline    
     
    471662   none \\ 
    472663   \hline    
    473 \end{tabular} 
     664\end{longtable} 
    474665 
    475666 
    476667\subsubsection{Attributes list} 
    477668 
    478 \begin{tabular}{|p{2cm}|p{4cm}|p{4cm}|p{2cm}|} 
    479    \hline 
     669\begin{longtable}{|p{2.2cm}|p{4cm}|p{3.8cm}|p{2cm}|} 
     670   \hline 
     671   attribute name &  
     672   description &  
     673   example &  
     674   accepted by \endhead 
     675   \hline    
     676   axis\_ref &  
     677   refers to the id of a vertical axis &  
     678   axis\_ref="deptht" &  
     679   field, grid families \\  
     680   \hline    
     681   enabled &  
     682   switch on/off the output of a field or a file &  
     683   enabled=".TRUE." &  
     684   field, file families \\  
     685   \hline    
     686   default\_value &  
     687   missing\_value definition &  
     688   default\_value="1.e20" &  
     689   field family \\  
     690   \hline    
     691   description &  
     692   just for information, not used &  
     693   description="ocean T grid variables" &  
     694   all tags \\  
     695   \hline    
     696   domain\_ref &  
     697   refers to the id of a domain &  
     698   domain\_ref="grid\_T" &  
     699   field or grid families \\  
     700   \hline    
     701   field\_ref &  
     702   id of the field we want to add in a file &  
     703   field\_ref="toce" &  
     704   field \\  
     705   \hline    
     706   grid\_ref &  
     707   refers to the id of a grid &  
     708   grid\_ref="grid\_T\_2D" &  
     709   field family \\  
     710   \hline    
     711   group\_ref &  
     712   refer to a group of variables &  
     713   group\_ref="mooring" &  
     714   field\_group \\  
     715   \hline    
     716   id &  
     717   allows a tag to be identified &  
     718   id="nemo" & 
     719   accepted by all tags except simulation \\  
     720   \hline    
     721   level &  
     722   output priority of a field: 0 (high) to 10 (low)&  
     723   level="1" &  
     724   field family \\  
     725   \hline    
     726   long\_name &  
     727   define the long\_name attribute in the NetCDF file &  
     728   long\_name="Vertical T levels" &  
     729   field \\  
     730   \hline    
     731   min\_digits &  
     732   specify the minimum number of digits used for the core number in the name of the NetCDF file &  
     733   min\_digits="4" &  
     734   file family \\  
     735   \hline    
     736   name &  
     737   name of a variable or a file. If the name of a file is undefined, its id is used as a name &  
     738   name="tos" &  
     739   field or file families \\  
     740   \hline    
     741   name\_suffix &  
     742   suffix to be inserted after the name and before the cpu number and the ''.nc'' termination of a file &  
     743   name\_suffix="\_myzoom" &  
     744   file family \\  
     745   \hline    
    480746   attribute name &  
    481747   description &  
     
    484750   \hline    
    485751   \hline    
    486    axis\_ref &  
    487    refers to the id of a vertical axis &  
    488    axis\_ref="deptht" &  
    489    field, grid families \\  
    490    \hline    
    491    enabled &  
    492    switch on/off the output of a field or a file &  
    493    enabled=".TRUE." &  
    494    field, file families \\  
    495    \hline    
    496    default\_value &  
    497    missing\_value definition &  
    498    default\_value="1.e20" &  
     752   operation &  
     753   type of temporal operation: average, accumulate, instantaneous, min, max and once &  
     754   operation="average" &  
    499755   field family \\  
    500756   \hline    
    501    description &  
    502    just for information, not used &  
    503    description="ocean T grid variables" &  
    504    all tags \\  
    505    \hline    
    506    domain\_ref &  
    507    refers to the id of a domain &  
    508    domain\_ref="grid\_T" &  
    509    field or grid families \\  
    510    \hline    
    511    field\_ref= &  
    512    id of the field we want to add in a file &  
    513    field\_ref="toce" &  
     757   output\_freq &  
     758   operation frequency. Units can be ts (timestep), y, mo, d, h, mi, s. &  
     759   output\_freq="1d12h" &  
     760   field family \\  
     761   \hline    
     762   output\_level &  
     763   output priority of variables in a file: 0 (high) to 10 (low). All variables listed in the file with a level smaller than or equal to output\_level will be output. Other variables won't be output even if they are listed in the file. &   
     764   output\_level="10"&  
     765   file family \\  
     766   \hline    
     767   positive &  
     768   convention used for the orientation of the vertical axis (positive downward in \NEMO). &  
     769   positive="down" &  
     770   axis family \\  
     771   \hline    
     772   prec &  
     773   output precision: real 4 or real 8 &  
     774   prec="4" &  
     775   field family \\  
     776   \hline    
     777   split\_freq &  
     778   frequency at which to temporally split output files. Units can be ts (timestep), y, mo, d, h, mi, s. Useful for long runs to prevent over-sized output files.&  
     779   split\_freq="1mo" &  
     780   file family \\  
     781   \hline    
     782   split\_freq\-\_format &  
     783   date format used in the name of temporally split output files. Can be specified  
     784   using the following tokens: \%y, \%mo, \%d, \%h, \%mi and \%s &  
     785   split\_freq\_format= "\%y\%mo\%d" &  
     786   file family \\  
     787   \hline    
     788   src &  
     789   allows a file to be included &  
     790   src="./field\_def.xml" &  
     791   accepted by all tags except simulation \\  
     792   \hline    
     793   standard\_name &  
     794   define the standard\_name attribute in the NetCDF file &  
     795   standard\_name= "Eastward\_Sea\_Ice\_Transport" &  
    514796   field \\  
    515797   \hline    
    516    grid\_ref &  
    517    refers to the id of a grid &  
    518    grid\_ref="grid\_T\_2D" &  
    519    field family \\  
    520    \hline    
    521    group\_ref &  
    522    refer to a group of variables &  
    523    group\_ref="mooring" &  
    524    field\_group \\  
    525    \hline    
    526    id &  
    527    allow to identify a tag &  
    528    id="nemo" & 
    529    accepted by all tags except simulation \\  
    530    \hline    
    531    level &  
    532    output priority of a field: 0 (high) to 10 (low)&  
    533    level="1" &  
    534    field family \\  
    535    \hline    
    536    long\_name &  
    537    define the long\_name attribute in the NetCDF file &  
    538    long\_name="Vertical T levels" &  
    539    field \\  
    540    \hline    
    541    min\_digits &  
    542    specify the minimum of digits used in the core number in the name of the NetCDF file &  
    543    min\_digits="4" &  
     798   sync\_freq &  
     799   NetCDF file synchronization frequency (update of the time\_counter). Units can be ts (timestep), y, mo, d, h, mi, s. &  
     800   sync\_freq="10d" &  
    544801   file family \\  
    545802   \hline    
    546    name &  
    547    name of a variable or a file. If the name of a file is undefined, its id is used as a name &  
    548    name="tos" &  
    549    field or file families \\  
    550    \hline    
    551    name\_suffix &  
    552    suffix to be inserted after the name and before the cpu number and the ''.nc'' termination of a file &  
    553    name\_suffix="\_myzoom" &  
    554    file family \\  
    555    \hline    
    556 \end{tabular} 
    557 \begin{tabular}{|p{2cm}|p{4cm}|p{4cm}|p{2cm}|} 
    558    \hline 
    559803   attribute name &  
    560804   description &  
     
    563807   \hline    
    564808   \hline    
    565    operation &  
    566    type of temporal operation: average, accumulate, instantaneous, min, max and once &  
    567    operation="average" &  
    568    field family \\  
    569    \hline    
    570    output\_freq &  
    571    operation frequency. units can be ts (timestep), y, mo, d, h, mi, s. &  
    572    output\_freq="1d12h" &  
    573    field family \\  
    574    \hline    
    575    output\_level &  
    576    output priority of variables in a file: 0 (high) to 10 (low). All variables listed in the file with a level smaller or equal to output\_level will be output. Other variables won't be output even if they are listed in the file. &   
    577    output\_level="10"&  
    578    file family \\  
    579    \hline    
    580    positive &  
    581    convention used for the orientation of vertival axis (positive downward in \NEMO). &  
    582    positive="down" &  
    583    axis family \\  
    584    \hline    
    585    prec &  
    586    output precision: real 4 or real 8 &  
    587    prec="4" &  
    588    field family \\  
    589    \hline    
    590    split\_format &  
    591    date format used in the name of splitted output files. can be spécified using the following syntaxe: \%y, \%mo, \%d, \%h \%mi and \%s &  
    592    split\_format="\%yy\%mom\%dd" &  
    593    file family \\  
    594    \hline    
    595    split\_freq &  
    596    split output files frequency. units can be ts (timestep), y, mo, d, h, mi, s. &  
    597    split\_freq="1mo" &  
    598    file family \\  
    599    \hline    
    600    src &  
    601    allow to include a file &  
    602    src="./field\_def.xml" &  
    603    accepted by all tags except simulation \\  
    604    \hline    
    605    standard\_name &  
    606    define the standard\_name attribute in the NetCDF file &  
    607    standard\_name="Eastward\_Sea\_Ice\_Transport" &  
    608    field \\  
    609    \hline    
    610    sync\_freq &  
    611    NetCDF file synchronization frequency (update of the time\_counter). units can be ts (timestep), y, mo, d, h, mi, s. &  
    612    sync\_freq="10d" &  
    613    file family \\  
    614    \hline    
    615 \end{tabular} 
    616 \begin{tabular}{|p{2cm}|p{4cm}|p{4cm}|p{2cm}|} 
    617    \hline 
    618    attribute name &  
    619    description &  
    620    example &  
    621    accepted by \\  
    622    \hline    
    623    \hline    
    624809   time\_origin &  
    625810   specify the origin of the time counter &  
     
    628813   \hline    
    629814   type (1)&  
    630    specify if the output files must be splitted (multiple\_file) or not (one\_file) &  
     815   specify if the output files are to be split spatially (multiple\_file) or not (one\_file) &  
    631816   type="multiple\_file" &  
    632817   file family \\  
     
    662847   domain family \\  
    663848   \hline    
    664 \end{tabular} 
    665  
    666 \subsection{XIOS: the IO\_SERVER} 
    667  
    668 \subsubsection{Attached or detached mode?} 
    669  
    670 Iomput is based on \href{http://forge.ipsl.jussieu.fr/ioserver/wiki}{XIOS}, the io\_server developed by Yann Meurdesoif from IPSL. This server can be used in ''attached mode'' (as a library) or in ''detached mode'' (as an external executable on n cpus). The ''attached mode'' is simpler to use but much less efficient. If the type of file is ''multiple\_file'', then in attached(detached) mode, each NEMO(XIOS) process will output its own subdomain: if NEMO(XIOS) is runnning on N cores, the ouput files will be splitted into N files. If the type of file is ''one\_file'', the output files will be directly recombined into one unique file either in ''detached mode'' or ''attached mode''.   
    671  
    672 \subsubsection{Control of xios: the xios context in iodef.xml} 
    673  
    674 The control of the use of xios is done through the xios context in iodef.xml. 
    675  
    676 \begin{tabular}{|p{3cm}|p{6.5cm}|p{2.5cm}|} 
    677    \hline 
    678    variable name &  
    679    description &  
    680    example \\  
    681    \hline    
    682    \hline 
    683    buffer\_size &  
    684    buffer size used by XIOS to send data from NEMO to XIOS. Larger values are more efficient. Note that the needed/used buffer sizes are summarized at the end of the job &  
    685    25000000 \\  
    686    \hline    
    687    buffer\_server\_factor\_size &  
    688    ratio between NEMO and XIOS buffer size. Should be 2. &  
    689    2 \\  
    690    \hline 
    691    info\_level &  
    692    verbosity level (0 to 100) &  
    693    0 \\  
    694    \hline 
    695    using\_server &  
    696    use attached (false) or detached (true) mode &  
    697    true \\  
    698    \hline 
    699    using\_oasis &  
    700    XIOS is used with OASIS (true) or not (false) &  
    701    false \\  
    702    \hline 
    703    oasis\_codes\_id &  
    704    when using OASIS, defines the identifier of NEMO in the namcouple. Note that the identifier of XIOS is xios.x &  
    705    oceanx \\  
    706    \hline    
    707 \end{tabular} 
    708  
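To make this table concrete, here is a sketch of the corresponding xios context in iodef.xml. The variable names and example values are those listed above; the variable\_definition/variable layout and the type names are indicative only and should be checked against the iodef.xml distributed with your configuration.
\vspace{-20pt} 
\begin{alltt}  {{\scriptsize 
\begin{verbatim} 
   <context id="xios"> 
      <variable_definition> 
         <!-- layout and type names are indicative: check your iodef.xml --> 
         <variable id="buffer_size"               type="integer">25000000</variable> 
         <variable id="buffer_server_factor_size" type="integer">2</variable> 
         <variable id="info_level"                type="integer">0</variable> 
         <variable id="using_server"              type="boolean">true</variable> 
         <variable id="using_oasis"               type="boolean">false</variable> 
         <variable id="oasis_codes_id"            type="string" >oceanx</variable> 
      </variable_definition> 
   </context> 
\end{verbatim} 
}}\end{alltt}  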
    709 \subsubsection{Number of CPUs used by XIOS in detached mode} 
    710  
    711 The number of cores used by XIOS is specified only when launching the model. The number of cores dedicated to XIOS should be from about 1/10 to 1/50 of the number of cores dedicated to NEMO (depending on the amount of data to be written). Here is an example using mpirun, with 62 CPUs for opa and 2 CPUs for the io\_server: 
    712  
    713 \texttt{ mpirun -np 62 ./nemo.exe : -np 2 ./xios\_server.exe } 
    714  
    715 \subsection{Practical issues} 
    716  
    717 \subsubsection{Add your own outputs} 
    718  
    719 It is very easy to add your own outputs with iomput. Four steps must be followed. 
    720 \begin{description} 
    721 \item[1-] in NEMO code, add a \\ 
    722 \texttt{      CALL iom\_put( 'identifier', array ) } \\ 
    723 where you want to output a 2D or 3D array. 
    724  
    725 \item[2-] don't forget to add \\ 
    726 \texttt{   USE iom            ! I/O manager library }  \\ 
    727 in the list of used modules in the upper part of your module.  
    728  
    729 \item[3-] in the field\_definition part of the XML file, add the definition of your variable using the same identifier you used in the F90 code. 
    730 \vspace{-20pt} 
    731 \begin{alltt}  {{\scriptsize 
    732 \begin{verbatim} 
    733    <field_definition> 
    734       ... 
    735       <field id="identifier" long_name="blabla" ... />    
    736       ... 
    737    </field_definition>  
    738 \end{verbatim} 
    739 }}\end{alltt}  
    740 The attributes axis\_ref and grid\_ref must be consistent with the size of the array passed to iom\_put. 
    741 If your array is computed within the surface module every nn\_fsbc time step,  
    742 add the field definition within the group defined with the id ``SBC'': $<$group id=''SBC''...$>$ (see the sketch after this list). 
    743  
    744 \item[4-] add your field in one of the output files   \\ 
    745 \vspace{-20pt} 
    746 \begin{alltt}  {{\scriptsize 
    747 \begin{verbatim} 
    748    <file id="file1" ... >    
    749       ... 
    750       <field ref="identifier" />    
    751       ... 
    752    </file>    
    753 \end{verbatim} 
    754 }}\end{alltt}  
    755  
    756 \end{description} 
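As referenced in step 3 above, here is a minimal sketch of a field added to the ``SBC'' group of the field definitions so that it is handled every nn\_fsbc time step. The field identifier is purely illustrative, and the enclosing element is written here with the field\_group tag used elsewhere in field\_def.xml; check the actual group definition in your field\_def.xml.
\vspace{-20pt} 
\begin{alltt}  {{\scriptsize 
\begin{verbatim} 
   <field_definition> 
      ... 
      <field_group id="SBC" ... > 
         <!-- illustrative identifier: must match the one passed to iom_put --> 
         <field id="identifier" long_name="blabla" ... /> 
      </field_group> 
      ... 
   </field_definition> 
\end{verbatim} 
}}\end{alltt}  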
     849\end{longtable} 
     850 
    757851 
    758852 
     
    809903domain size in any dimension. The algorithm used is: 
    810904 
     905\vspace{-20pt} 
    811906\begin{alltt}  {{\scriptsize  
    812907\begin{verbatim} 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/ARCH/arch-ALTIX_NAUTILUS_MPT.fcm

    r3922 r4219  
    4747%LD                  ifort 
    4848%FPPFLAGS            -P -C -traditional 
    49 %LDFLAGS             -lmpi -lstdc++ 
     49%LDFLAGS             -lmpi -lstdc++ -lcurl 
    5050%AR                  ar  
    5151%ARFLAGS             -r 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/ARCH/arch-X64_CURIE.fcm

    r3922 r4219  
    2929#  - fcm variables are starting with a % (and not a $) 
    3030# 
    31 %NCDF_HOME           /usr/local/netcdf-4.2_hdf5 
    32 %HDF5_HOME           /usr/local/hdf5-1.8.8 
     31%NCDF_HOME           /usr/local/netcdf-4.2_hdf5_parallel 
     32%HDF5_HOME           /usr/local/hdf5-1.8.9_parallel 
    3333%XIOS_HOME           $WORKDIR/now/models/xios 
    3434%OASIS_HOME          $WORKDIR/now/models/oa3mct 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/AMM12/EXP00/iodef.xml

    r3940 r4219  
    6262      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    6363      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     64      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    6465   </axis_definition>  
    6566     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/GYRE/EXP00/iodef.xml

    r4188 r4219  
    101101      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    102102      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     103      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    103104   </axis_definition>  
    104105     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/GYRE_BFM/EXP00/iodef.xml

    r3940 r4219  
    6262      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    6363      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     64      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    6465   </axis_definition>  
    6566     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/GYRE_PISCES/EXP00/iodef.xml

    r4188 r4219  
    137137      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    138138      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     139      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    139140   </axis_definition>  
    140141     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_LIM/EXP00/iodef_ar5.xml

    r3940 r4219  
    248248      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    249249      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     250      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    250251   </axis_definition>  
    251252     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_LIM/EXP00/iodef_default.xml

    • Property svn:mime-type deleted
    • Property svn:keywords set to Id
    r4188 r4219  
    138138      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    139139      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     140      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    140141   </axis_definition>  
    141142     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_LIM/EXP00/iodef_demo.xml

    • Property svn:mime-type deleted
    • Property svn:keywords set to Id
    r3940 r4219  
    4444   <!-- mooring: automatic definition of the file name suffix based on id="0n180wT"  --> 
    4545   <!-- include a group of variables. see field_def.xml for mooring variables definition  --> 
    46    <file id="0n180wT" > 
    47      <field_group group_ref="mooring"/>   
     46   <file id="0n180wT"> 
     47     <field_group group_ref="mooring" domain_ref="0n180wT" />   
    4848   </file> 
    4949    
     
    5353     <field_group id="EqT" domain_ref="EqT" > 
    5454       <field field_ref="toce" name="votemper" axis_ref="deptht_myzoom"  /> 
     55       <field field_ref="sss" /> 
    5556     </field_group> 
    5657   </file> 
     
    8586      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    8687      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     88      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    8789   </axis_definition>  
    8890     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_LIM/EXP00/iodef_oldstyle.xml

    • Property svn:mime-type deleted
    • Property svn:keywords set to Id
    r3940 r4219  
    116116      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    117117      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     118      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    118119   </axis_definition>  
    119120     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_LIM_CFC_C14b/EXP00/iodef.xml

    r3771 r4219  
    2121    --> 
    2222     
    23     <file_definition type="multiple_file" sync_freq="1d" min_digits="4"> 
     23    <file_definition type="multiple_file" name="@expname@_@freq@_@startdate@_@enddate@" sync_freq="1d" min_digits="4"> 
    2424     
    2525      <file_group id="1h" output_freq="1h"  output_level="10" enabled=".TRUE."/> <!-- 1h files --> 
     
    6060      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    6161      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     62      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    6263   </axis_definition>  
    6364     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_LIM_PISCES/EXP00/iodef.xml

    r4188 r4219  
    2121    --> 
    2222     
    23     <file_definition type="multiple_file" sync_freq="1d" min_digits="4"> 
     23    <file_definition type="multiple_file" name="@expname@_@freq@_@startdate@_@enddate@" sync_freq="10d" min_digits="4"> 
    2424     
    2525      <file_group id="1h" output_freq="1h"  output_level="10" enabled=".TRUE."/> <!-- 1h files --> 
     
    3131      <file_group id="1d" output_freq="1d"  output_level="10" enabled=".TRUE."> <!-- 1d files --> 
    3232 
    33    <file id="1d_grid_T" name="auto" description="ocean T grid variables" > 
     33   <file id="file1" name_suffix="_grid_T" description="ocean T grid variables" > 
    3434     <field field_ref="sst"          name="sosstsst"  /> 
    3535     <field field_ref="sss"          name="sosaline"  /> 
     
    3737   </file> 
    3838 
    39    <file id="1d_grid_U" name="auto" description="ocean U grid variables" > 
     39   <file id="file2" name_suffix="_grid_U" description="ocean U grid variables" > 
    4040     <field field_ref="suoce"         name="vozocrtx"  /> 
    4141   </file> 
    4242    
    43    <file id="1d_grid_V" name="auto" description="ocean V grid variables" > 
     43   <file id="file3" name_suffix="_grid_V" description="ocean V grid variables" > 
    4444     <field field_ref="svoce"         name="vomecrty"  /> 
    4545   </file> 
     
    4949      <file_group id="5d" output_freq="5d"  output_level="10" enabled=".TRUE.">  <!-- 5d files -->    
    5050 
    51    <file id="5d_grid_T" name="auto" description="ocean T grid variables" > 
     51   <file id="file4" name_suffix="_grid_T" description="ocean T grid variables" > 
    5252     <field field_ref="toce"         name="votemper"  /> 
    5353     <field field_ref="soce"         name="vosaline"  /> 
     
    7272   </file> 
    7373    
    74    <file id="5d_grid_U" name="auto" description="ocean U grid variables" > 
     74   <file id="file5" name_suffix="_grid_U" description="ocean U grid variables" > 
    7575     <field field_ref="uoce"         name="vozocrtx"  /> 
    7676     <field field_ref="uoce_eiv"     name="vozoeivu"  /> 
     
    8181   </file> 
    8282    
    83    <file id="5d_grid_V" name="auto" description="ocean V grid variables" > 
     83   <file id="file6" name_suffix="_grid_V" description="ocean V grid variables" > 
    8484     <field field_ref="voce"         name="vomecrty"  /> 
    8585     <field field_ref="voce_eiv"     name="vomeeivv"  /> 
     
    9090   </file> 
    9191    
    92    <file id="5d_grid_W" name="auto" description="ocean W grid variables" > 
     92   <file id="file7" name_suffix="_grid_W" description="ocean W grid variables" > 
    9393     <field field_ref="woce"         name="vovecrtz" /> 
    9494     <field field_ref="avt"          name="votkeavt" /> 
     
    9797   </file> 
    9898    
    99    <file id="5d_icemod" name="auto" description="ice variables" > 
     99   <file id="file8" name_suffix="_icemod" description="ice variables" > 
    100100     <field field_ref="ice_pres"                     /> 
    101101     <field field_ref="snowthic_cea" name="isnowthi" /> 
     
    117117      <file_group id="1m" output_freq="1mo" output_level="10" enabled=".TRUE."> <!-- real monthly files --> 
    118118 
    119    <file id="1m_ptrc_T" name="auto" description="pisces sms variables" > 
     119   <file id="file9" name_suffix="_ptrc_T" description="pisces sms variables" > 
    120120          <field field_ref="DIC"      /> 
    121121          <field field_ref="Alkalini" /> 
     
    129129   </file> 
    130130    
    131    <file id="1m_diad_T" name="auto" description="additional pisces diagnostics" > 
     131   <file id="file10" name_suffix="_diad_T" description="additional pisces diagnostics" > 
    132132          <field field_ref="Cflx"     /> 
    133133          <field field_ref="Dpco2"    /> 
     
    142142      <file_group id="1y"  output_freq="1y" output_level="10" enabled=".TRUE."> <!-- real yearly files --> 
    143143 
    144    <file id="1y_ptrc_T" name="auto" description="pisces sms variables" > 
     144   <file id="file11" name_suffix="_ptrc_T" description="pisces sms variables" > 
    145145          <field field_ref="DIC"      /> 
    146146          <field field_ref="Alkalini" /> 
     
    169169   </file> 
    170170 
    171    <file id="1y_diad_T" name="auto" description="additional pisces diagnostics" > 
     171   <file id="file12" name_suffix="_diad_T" description="additional pisces diagnostics" > 
    172172          <field field_ref="PH"       /> 
    173173          <field field_ref="CO3"      /> 
     
    239239      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    240240      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     241      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    241242   </axis_definition>  
    242243     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_OFF_PISCES/EXP00/iodef.xml

    r3771 r4219  
    2121    --> 
    2222     
    23     <file_definition type="multiple_file" sync_freq="1d" min_digits="4"> 
     23    <file_definition type="multiple_file" name="@expname@_@freq@_@startdate@_@enddate@" sync_freq="1d" min_digits="4"> 
    2424     
    2525      <file_group id="1h" output_freq="1h"  output_level="10" enabled=".TRUE."/> <!-- 1h files --> 
     
    3535      <file_group id="1m" output_freq="1mo" output_level="10" enabled=".TRUE."> <!-- real monthly files --> 
    3636 
    37    <file id="1m_ptrc_T" name="auto" description="pisces sms variables" > 
     37   <file id="file1" name_suffix="_ptrc_T" description="pisces sms variables" > 
    3838          <field field_ref="DIC"      /> 
    3939          <field field_ref="Alkalini" /> 
     
    4747   </file> 
    4848    
    49    <file id="1m_diad_T" name="auto" description="additional pisces diagnostics" > 
     49   <file id="file2" name_suffix="_diad_T" description="additional pisces diagnostics" > 
    5050          <field field_ref="Cflx"     /> 
    5151          <field field_ref="Dpco2"    /> 
     
    6060      <file_group id="1y"  output_freq="1y" output_level="10" enabled=".TRUE."> <!-- real yearly files --> 
    6161 
    62    <file id="1y_ptrc_T" name="auto" description="pisces sms variables" > 
     62   <file id="file3" name_suffix="_ptrc_T" description="pisces sms variables" > 
    6363          <field field_ref="DIC"      /> 
    6464          <field field_ref="Alkalini" /> 
     
    8787   </file> 
    8888 
    89    <file id="1y_diad_T" name="auto" description="additional pisces diagnostics" > 
     89   <file id="file4" name_suffix="_diad_T" description="additional pisces diagnostics" > 
    9090          <field field_ref="PH"       /> 
    9191          <field field_ref="CO3"      /> 
     
    157157      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    158158      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     159      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    159160   </axis_definition>  
    160161     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/ORCA2_SAS_LIM/EXP00/iodef.xml

    r3771 r4219  
    2121    --> 
    2222     
    23     <file_definition type="multiple_file" sync_freq="1d" min_digits="4"> 
     23    <file_definition type="multiple_file" name="@expname@_@freq@_@startdate@_@enddate@" sync_freq="1d" min_digits="4"> 
    2424     
    2525      <file_group id="1h" output_freq="1h"  output_level="10" enabled=".TRUE."/> <!-- 1h files --> 
     
    6060      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    6161      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     62      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    6263   </axis_definition>  
    6364     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/CONFIG/SHARED/field_def.xml

    r4188 r4219  
    227227      </field_group> 
    228228 
     229      <!-- variables available with iceberg trajectories --> 
     230      <field_group id="icbvar" domain_ref="grid_T"  >  
     231        <field id="berg_melt"          long_name="icb melt rate of icebergs"                     unit="kg/m2/s"   /> 
     232        <field id="berg_buoy_melt"     long_name="icb buoyancy component of iceberg melt rate"   unit="kg/m2/s"   /> 
     233        <field id="berg_eros_melt"     long_name="icb erosion component of iceberg melt rate"    unit="kg/m2/s"   /> 
     234        <field id="berg_conv_melt"     long_name="icb convective component of iceberg melt rate" unit="kg/m2/s"   /> 
     235        <field id="berg_virtual_area"  long_name="icb virtual coverage by icebergs"              unit="m2"        /> 
     236        <field id="bits_src"           long_name="icb mass source of bergy bits"                 unit="kg/m2/s"   /> 
     237        <field id="bits_melt"          long_name="icb melt rate of bergy bits"                   unit="kg/m2/s"   /> 
     238        <field id="bits_mass"          long_name="icb bergy bit density field"                   unit="kg/m2"     /> 
     239        <field id="berg_mass"          long_name="icb iceberg density field"                     unit="kg/m2"     /> 
     240        <field id="calving"            long_name="icb calving mass input"                        unit="kg/s"      /> 
     241        <field id="berg_floating_melt" long_name="icb melt rate of icebergs + bits"              unit="kg/m2/s"   /> 
     242        <field id="berg_real_calving"  long_name="icb calving into iceberg class"                unit="kg/s"     axis_ref="icbcla" /> 
     243        <field id="berg_stored_ice"    long_name="icb accumulated ice mass by class"             unit="kg"       axis_ref="icbcla" /> 
     244      </field_group> 
     245 
    229246      <!-- ptrc on T grid --> 
    230247 
     
    255272       <field id="NH4"      long_name="Ammonium Concentration"                   unit="mmol/m3" /> 
    256273 
     274       <!-- PISCES with Kriest parametisation : variables available with key_kriest --> 
     275       <field id="Num"      long_name="Number of organic particles"              unit="nbr" /> 
     276 
     277       <!-- PISCES light : variables available with key_pisces_reduced --> 
    257278       <field id="DET"      long_name="Detritus"                                 unit="mmol-N/m3" /> 
    258279       <field id="DOM"      long_name="Dissolved Organic Matter"                 unit="mmol-N/m3" /> 
    259280 
     281       <!-- CFC11 : variables available with key_cfc --> 
    260282       <field id="CFC11"    long_name="CFC-11 Concentration"                     unit="umol/L" /> 
     283       <!-- Bomb C14 : variables available with key_c14b --> 
    261284       <field id="C14B"     long_name="Bomb C14 Concentration"                   unit="ration" /> 
    262285     </field_group> 
    263286 
    264       <!-- diad on T grid : variables available with key_diatrc --> 
    265  
     287      <!-- PISCES additional diagnostics on T grid  --> 
    266288     <field_group id="diad_T" grid_ref="grid_T_2D"> 
    267289       <field id="PH"          long_name="PH"                                      unit="-"          grid_ref="grid_T_3D" /> 
     
    317339       <field id="Heup"        long_name="Euphotic layer depth"                    unit="m"                            /> 
    318340       <field id="Irondep"     long_name="Iron deposition from dust"               unit="mol/m2/s"                     /> 
    319        <field id="Ironsed"     long_name="Iron deposition from sediment"           unit="mol/m2/s"  grid_ref="grid_T_3D"  /> 
    320  
    321        <field id="FNO3PHY"  long_name="FNO3PHY"                             unit="-"  grid_ref="grid_T_3D" />  
    322        <field id="FNH4PHY"  long_name="FNH4PHY"                             unit="-"  grid_ref="grid_T_3D" />  
    323        <field id="FNH4NO3"  long_name="FNH4NO3"                             unit="-"  grid_ref="grid_T_3D" />  
    324        <field id="TNO3PHY"  long_name="TNO3PHY"                             unit="-"  />  
    325        <field id="TNH4PHY"  long_name="TNH4PHY"                             unit="-"  />  
    326        <field id="TPHYDOM"  long_name="TPHYDOM"                             unit="-"  />  
    327        <field id="TPHYNH4"  long_name="TPHYNH4"                             unit="-"  />  
    328        <field id="TPHYZOO"  long_name="TPHYZOO"                             unit="-"  />  
    329        <field id="TPHYDET"  long_name="TPHYDET"                             unit="-"  />  
    330        <field id="TDETZOO"  long_name="TDETZOO"                             unit="-"  />  
    331        <field id="TZOODET"  long_name="TZOODET"                             unit="-"  />  
    332        <field id="TZOOBOD"  long_name="TZOOBOD"                             unit="-"  />  
    333        <field id="TZOONH4"  long_name="TZOONH4"                             unit="-"  />  
    334        <field id="TZOODOM"  long_name="TZOODOM"                             unit="-"  />  
    335        <field id="TNH4NO3"  long_name="TNH4NO3"                             unit="-"  />  
    336        <field id="TDOMNH4"  long_name="TDOMNH4"                             unit="-"  />  
    337        <field id="TDETNH4"  long_name="TDETNH4"                             unit="-"  />  
    338        <field id="TPHYTOT"  long_name="TPHYTOT"                             unit="-"  />  
    339        <field id="TZOOTOT"  long_name="TZOOTOT"                             unit="-"  />  
    340        <field id="SEDPOC"   long_name="SEDPOC"                              unit="-"  />  
    341        <field id="TDETSED"  long_name="TDETSED"                             unit="-"  />  
    342  
    343        <field id="qtrCFC11"     long_name="Air-sea flux of CFC-11"                   unit="mol/m2/s"   /> 
    344        <field id="qintCFC11"    long_name="Cumulative air-sea flux of CFC-11"        unit="mol/m2"     /> 
    345        <field id="qtrC14b"      long_name="Air-sea flux of Bomb C14"                 unit="mol/m2/s"   /> 
    346        <field id="qintC14b"     long_name="Cumulative air-sea flux of Bomb C14"      unit="mol/m2"     /> 
    347        <field id="fdecay"       long_name="Radiactive decay of Bomb C14"             unit="mol/m3"  grid_ref="grid_T_3D"  /> 
     341       <field id="Ironsed"     long_name="Iron deposition from sediment"           unit="mol/m2/s"  grid_ref="grid_T_3D "/> 
     342 
     343       <!-- PISCES with Kriest parametisation : variables available with key_kriest --> 
     344       <field id="POCFlx"      long_name="Particulate organic C flux"              unit="mol/m2/s"   grid_ref="grid_T_3D" /> 
     345       <field id="NumFlx"      long_name="Particle number flux"                    unit="nbr/m2/s"   grid_ref="grid_T_3D" /> 
     346       <field id="SiFlx"       long_name="Biogenic Si flux"                        unit="mol/m2/s"   grid_ref="grid_T_3D" /> 
     347       <field id="CaCO3Flx"    long_name="CaCO3 flux"                              unit="mol/m2/s"   grid_ref="grid_T_3D" /> 
     348       <field id="xnum"        long_name="Number of particles in aggregats"        unit="-"          grid_ref="grid_T_3D" /> 
     349       <field id="W1"          long_name="sinking speed of mass flux"              unit="m2/s"       grid_ref="grid_T_3D" /> 
     350       <field id="W2"          long_name="sinking speed of number flux"            unit="m2/s"       grid_ref="grid_T_3D" /> 
     351 
     352       <!-- PISCES light : variables available with key_pisces_reduced --> 
     353       <field id="FNO3PHY"     long_name="FNO3PHY"                                 unit="-"          grid_ref="grid_T_3D" />  
     354       <field id="FNH4PHY"     long_name="FNH4PHY"                                 unit="-"          grid_ref="grid_T_3D" />  
     355       <field id="FNH4NO3"     long_name="FNH4NO3"                                 unit="-"          grid_ref="grid_T_3D" />  
     356       <field id="TNO3PHY"     long_name="TNO3PHY"                                 unit="-"  />  
     357       <field id="TNH4PHY"     long_name="TNH4PHY"                                 unit="-"  />  
     358       <field id="TPHYDOM"     long_name="TPHYDOM"                                 unit="-"  />  
     359       <field id="TPHYNH4"     long_name="TPHYNH4"                                 unit="-"  />  
     360       <field id="TPHYZOO"     long_name="TPHYZOO"                                 unit="-"  />  
     361       <field id="TPHYDET"     long_name="TPHYDET"                                 unit="-"  />  
     362       <field id="TDETZOO"     long_name="TDETZOO"                                 unit="-"  />  
     363       <field id="TZOODET"     long_name="TZOODET"                                 unit="-"  />  
     364       <field id="TZOOBOD"     long_name="TZOOBOD"                                 unit="-"  />  
     365       <field id="TZOONH4"     long_name="TZOONH4"                                 unit="-"  />  
     366       <field id="TZOODOM"     long_name="TZOODOM"                                 unit="-"  />  
     367       <field id="TNH4NO3"     long_name="TNH4NO3"                                 unit="-"  />  
     368       <field id="TDOMNH4"     long_name="TDOMNH4"                                 unit="-"  />  
     369       <field id="TDETNH4"     long_name="TDETNH4"                                 unit="-"  />  
     370       <field id="TPHYTOT"     long_name="TPHYTOT"                                 unit="-"  />  
     371       <field id="TZOOTOT"     long_name="TZOOTOT"                                 unit="-"  />  
     372       <field id="SEDPOC"      long_name="SEDPOC"                                  unit="-"  />  
     373       <field id="TDETSED"     long_name="TDETSED"                                 unit="-"  />  
     374 
     375       <!-- CFC11 : variables available with key_cfc --> 
     376       <field id="qtrCFC11"    long_name="Air-sea flux of CFC-11"                   unit="mol/m2/s"   /> 
     377       <field id="qintCFC11"   long_name="Cumulative air-sea flux of CFC-11"        unit="mol/m2"     /> 
     378       <!-- Bomb C14 : variables available with key_c14b --> 
     379       <field id="qtrC14b"     long_name="Air-sea flux of Bomb C14"                 unit="mol/m2/s"   /> 
     380       <field id="qintC14b"    long_name="Cumulative air-sea flux of Bomb C14"      unit="mol/m2"     /> 
     381       <field id="fdecay"      long_name="Radiactive decay of Bomb C14"             unit="mol/m3"  grid_ref="grid_T_3D"  /> 
    348382     </field_group> 
    349383 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/NST_SRC/agrif_opa_sponge.F90

    r3918 r4219  
    185185      INTEGER  :: ji,jj,jk 
    186186      INTEGER  :: ispongearea, ilci, ilcj 
    187       REAL(wp) :: z1spongearea 
    188       REAL(wp), POINTER, DIMENSION(:,:) :: zlocalviscsponge 
     187      LOGICAL  :: ll_spdone 
     188      REAL(wp) :: z1spongearea, zramp 
     189      REAL(wp), POINTER, DIMENSION(:,:) :: ztabramp 
    189190 
    190191#if defined SPONGE || defined SPONGE_TOP 
    191  
    192       CALL wrk_alloc( jpi, jpj, zlocalviscsponge ) 
    193  
    194       ispongearea  = 2 + 2 * Agrif_irhox() 
    195       ilci = nlci - ispongearea 
    196       ilcj = nlcj - ispongearea  
    197       z1spongearea = 1._wp / REAL( ispongearea - 2 ) 
    198       spbtr2(:,:) = 1. / ( e1t(:,:) * e2t(:,:) ) 
     192      ll_spdone=.TRUE. 
     193      IF (( .NOT. spongedoneT ).OR.( .NOT. spongedoneU )) THEN 
     194         ! Define ramp from boundaries towards domain interior 
     195         ! at T-points 
     196         ! Store it in ztabramp 
     197         ll_spdone=.FALSE. 
     198 
     199         CALL wrk_alloc( jpi, jpj, ztabramp ) 
     200 
     201         ispongearea  = 2 + 2 * Agrif_irhox() 
     202         ilci = nlci - ispongearea 
     203         ilcj = nlcj - ispongearea  
     204         z1spongearea = 1._wp / REAL( ispongearea - 2 ) 
     205         spbtr2(:,:) = 1. / ( e1t(:,:) * e2t(:,:) ) 
     206 
     207         ztabramp(:,:) = 0. 
     208 
     209         IF( (nbondi == -1) .OR. (nbondi == 2) ) THEN 
     210            DO jj = 1, jpj 
     211               IF ( umask(2,jj,1) == 1._wp ) THEN 
     212                 DO ji = 2, ispongearea                   
     213                    ztabramp(ji,jj) = ( ispongearea-ji ) * z1spongearea 
     214                 END DO 
     215               ENDIF 
     216            ENDDO 
     217         ENDIF 
     218 
     219         IF( (nbondi == 1) .OR. (nbondi == 2) ) THEN 
     220            DO jj = 1, jpj 
     221               IF ( umask(nlci-2,jj,1) == 1._wp ) THEN 
     222                  DO ji = ilci+1,nlci-1 
     223                     zramp = (ji - (ilci+1) ) * z1spongearea 
     224                     ztabramp(ji,jj) = MAX( ztabramp(ji,jj), zramp ) 
     225                  ENDDO 
     226               ENDIF 
     227            ENDDO 
     228         ENDIF 
     229 
     230         IF( (nbondj == -1) .OR. (nbondj == 2) ) THEN 
     231            DO ji = 1, jpi 
     232               IF ( vmask(ji,2,1) == 1._wp ) THEN 
     233                  DO jj = 2, ispongearea 
     234                     zramp = ( ispongearea-jj ) * z1spongearea 
     235                     ztabramp(ji,jj) = MAX( ztabramp(ji,jj), zramp ) 
     236                  END DO 
     237               ENDIF 
     238            ENDDO 
     239         ENDIF 
     240 
     241         IF( (nbondj == 1) .OR. (nbondj == 2) ) THEN 
     242            DO ji = 1, jpi 
     243               IF ( vmask(ji,nlcj-2,1) == 1._wp ) THEN 
     244                  DO jj = ilcj+1,nlcj-1 
     245                     zramp = (jj - (ilcj+1) ) * z1spongearea 
     246                     ztabramp(ji,jj) = MAX( ztabramp(ji,jj), zramp ) 
     247                  END DO 
     248               ENDIF 
     249            ENDDO 
     250         ENDIF 
     251 
     252      ENDIF 
    199253 
    200254      ! Tracers 
    201255      IF( .NOT. spongedoneT ) THEN 
    202          zlocalviscsponge(:,:) = 0. 
    203256         spe1ur(:,:) = 0. 
    204257         spe2vr(:,:) = 0. 
    205258 
    206259         IF( (nbondi == -1) .OR. (nbondi == 2) ) THEN 
    207             DO ji = 2, ispongearea 
    208                zlocalviscsponge(ji,:) = visc_tra * ( ispongearea-ji ) * z1spongearea 
    209             ENDDO 
    210             spe1ur(2:ispongearea-1,:      ) = 0.5 * ( zlocalviscsponge(2:ispongearea-1,:      )   & 
    211                &                         +            zlocalviscsponge(3:ispongearea  ,:      ) ) & 
    212                &                         * e2u(2:ispongearea-1,:      ) / e1u(2:ispongearea-1,:      ) 
    213             spe2vr(2:ispongearea  ,1:jpjm1) = 0.5 * ( zlocalviscsponge(2:ispongearea  ,1:jpjm1)   & 
    214                &                         +            zlocalviscsponge(2:ispongearea,2  :jpj  ) ) & 
    215                &                         * e1v(2:ispongearea  ,1:jpjm1) / e2v(2:ispongearea  ,1:jpjm1) 
     260            spe1ur(2:ispongearea-1,:       ) = visc_tra                                        & 
     261               &                             *    0.5 * (  ztabramp(2:ispongearea-1,:      )   & 
     262               &                                         + ztabramp(3:ispongearea  ,:      ) ) & 
     263               &                             * e2u(2:ispongearea-1,:) / e1u(2:ispongearea-1,:) 
     264 
     265            spe2vr(2:ispongearea  ,1:jpjm1 ) = visc_tra                                        & 
     266               &                             *    0.5 * (  ztabramp(2:ispongearea  ,1:jpjm1)   & 
     267               &                                         + ztabramp(2:ispongearea,2  :jpj  ) ) & 
     268               &                             * e1v(2:ispongearea,1:jpjm1) / e2v(2:ispongearea,1:jpjm1) 
    216269         ENDIF 
    217270 
    218271         IF( (nbondi == 1) .OR. (nbondi == 2) ) THEN 
    219             DO ji = ilci+1,nlci-1 
    220                zlocalviscsponge(ji,:) = visc_tra * (ji - (ilci+1) ) * z1spongearea 
    221             ENDDO 
    222    
    223             spe1ur(ilci+1:nlci-2,:      ) = 0.5 * (  zlocalviscsponge(ilci+1:nlci-2,:)    &  
    224                &                          +          zlocalviscsponge(ilci+2:nlci-1,:) )  & 
    225                &                          * e2u(ilci+1:nlci-2,:) / e1u(ilci+1:nlci-2,:) 
    226  
    227             spe2vr(ilci+1:nlci-1,1:jpjm1) = 0.5 * (  zlocalviscsponge(ilci+1:nlci-1,1:jpjm1)    &  
    228                &                            +        zlocalviscsponge(ilci+1:nlci-1,2:jpj  )  ) &  
    229                &                                   * e1v(ilci+1:nlci-1,1:jpjm1) / e2v(ilci+1:nlci-1,1:jpjm1) 
     272            spe1ur(ilci+1:nlci-2,:        ) = visc_tra                                   & 
     273               &                            * 0.5 * (  ztabramp(ilci+1:nlci-2,:      )   &  
     274               &                                     + ztabramp(ilci+2:nlci-1,:      ) ) & 
     275               &                            * e2u(ilci+1:nlci-2,:) / e1u(ilci+1:nlci-2,:) 
     276 
     277            spe2vr(ilci+1:nlci-1,1:jpjm1  )  = visc_tra                                  & 
     278               &                            * 0.5 * (  ztabramp(ilci+1:nlci-1,1:jpjm1)   &  
     279               &                                     + ztabramp(ilci+1:nlci-1,2:jpj  ) ) &  
     280               &                            * e1v(ilci+1:nlci-1,1:jpjm1) / e2v(ilci+1:nlci-1,1:jpjm1) 
    230281         ENDIF 
    231282 
    232283         IF( (nbondj == -1) .OR. (nbondj == 2) ) THEN 
    233             DO jj = 2, ispongearea 
    234                zlocalviscsponge(:,jj) = visc_tra * ( ispongearea-jj ) * z1spongearea 
    235             ENDDO 
    236             spe1ur(1:jpim1,2:ispongearea  ) = 0.5 * ( zlocalviscsponge(1:jpim1,2:ispongearea  ) &  
    237                &                            +         zlocalviscsponge(2:jpi  ,2:ispongearea) ) & 
     284            spe1ur(1:jpim1,2:ispongearea  ) = visc_tra                                     & 
     285               &                            * 0.5 * (  ztabramp(1:jpim1,2:ispongearea  )   &  
     286               &                                     + ztabramp(2:jpi  ,2:ispongearea  ) ) & 
    238287               &                            * e2u(1:jpim1,2:ispongearea) / e1u(1:jpim1,2:ispongearea) 
    239288    
    240             spe2vr(:      ,2:ispongearea-1) = 0.5 * ( zlocalviscsponge(:,2:ispongearea-1)       & 
    241                &                            +         zlocalviscsponge(:,3:ispongearea  )     ) & 
     289            spe2vr(:      ,2:ispongearea-1) = visc_tra                                     & 
     290               &                            * 0.5 * (  ztabramp(:      ,2:ispongearea-1)   & 
     291               &                                     + ztabramp(:      ,3:ispongearea  ) ) & 
    242292               &                            * e1v(:,2:ispongearea-1) / e2v(:,2:ispongearea-1) 
    243293         ENDIF 
    244294 
    245295         IF( (nbondj == 1) .OR. (nbondj == 2) ) THEN 
    246             DO jj = ilcj+1,nlcj-1 
    247                zlocalviscsponge(:,jj) = visc_tra * (jj - (ilcj+1) ) * z1spongearea 
    248             ENDDO 
    249             spe1ur(1:jpim1,ilcj+1:nlcj-1) = 0.5 * ( zlocalviscsponge(1:jpim1,ilcj+1:nlcj-1)   & 
    250                &                          +         zlocalviscsponge(2:jpi  ,ilcj+1:nlcj-1) ) & 
     296            spe1ur(1:jpim1,ilcj+1:nlcj-1) = visc_tra                                   & 
     297               &                          * 0.5 * (  ztabramp(1:jpim1,ilcj+1:nlcj-1)   & 
     298               &                                   + ztabramp(2:jpi  ,ilcj+1:nlcj-1) ) & 
    251299               &                                * e2u(1:jpim1,ilcj+1:nlcj-1) / e1u(1:jpim1,ilcj+1:nlcj-1) 
    252             spe2vr(:      ,ilcj+1:nlcj-2) = 0.5 * ( zlocalviscsponge(:,ilcj+1:nlcj-2      )   & 
    253                &                          +         zlocalviscsponge(:,ilcj+2:nlcj-1)     )   & 
     300 
     301            spe2vr(:      ,ilcj+1:nlcj-2) = visc_tra                                   & 
     302               &                          * 0.5 * (  ztabramp(:      ,ilcj+1:nlcj-2)   & 
     303               &                                   + ztabramp(:      ,ilcj+2:nlcj-1) ) & 
    254304               &                                * e1v(:,ilcj+1:nlcj-2) / e2v(:,ilcj+1:nlcj-2) 
    255305         ENDIF 
     
    259309      ! Dynamics 
    260310      IF( .NOT. spongedoneU ) THEN 
    261          zlocalviscsponge(:,:) = 0. 
    262311         spe1ur2(:,:) = 0. 
    263312         spe2vr2(:,:) = 0. 
    264313 
    265314         IF( (nbondi == -1) .OR. (nbondi == 2) ) THEN 
    266             DO ji = 2, ispongearea 
    267                zlocalviscsponge(ji,:) = visc_dyn * ( ispongearea-ji ) * z1spongearea 
    268             ENDDO 
    269             spe1ur2(2:ispongearea-1,:      ) = 0.5 * ( zlocalviscsponge(2:ispongearea-1,:      ) & 
    270                                              &     +   zlocalviscsponge(3:ispongearea,:    ) ) 
    271             spe2vr2(2:ispongearea  ,1:jpjm1) = 0.5 * ( zlocalviscsponge(2:ispongearea  ,1:jpjm1) & 
    272                                              &     +   zlocalviscsponge(2:ispongearea,2:jpj) )  
     315            spe1ur2(2:ispongearea-1,:      ) = visc_dyn                                   & 
     316               &                             * 0.5 * (  ztabramp(2:ispongearea-1,:      ) & 
     317               &                                      + ztabramp(3:ispongearea  ,:      ) ) 
     318            spe2vr2(2:ispongearea  ,1:jpjm1) = visc_dyn                                   & 
     319               &                             * 0.5 * (  ztabramp(2:ispongearea  ,1:jpjm1) & 
     320               &                                      + ztabramp(2:ispongearea  ,2:jpj  ) )  
    273321         ENDIF 
    274322 
    275323         IF( (nbondi == 1) .OR. (nbondi == 2) ) THEN 
    276             DO ji = ilci+1,nlci-1 
    277                zlocalviscsponge(ji,:) = visc_dyn * (ji - (ilci+1) ) * z1spongearea 
    278             ENDDO 
    279             spe1ur2(ilci+1:nlci-2,:      ) = 0.5 * (  zlocalviscsponge(ilci+1:nlci-2,:) & 
    280                                            &        + zlocalviscsponge(ilci+2:nlci-1,:) )   
    281             spe2vr2(ilci+1:nlci-1,1:jpjm1) = 0.5 * (  zlocalviscsponge(ilci+1:nlci-1,1:jpjm1) & 
    282                                            &        + zlocalviscsponge(ilci+1:nlci-1,2:jpj  )  )  
     324            spe1ur2(ilci+1:nlci-2  ,:      ) = visc_dyn                                   & 
     325               &                             * 0.5 * (  ztabramp(ilci+1:nlci-2, :       ) & 
     326               &                                      + ztabramp(ilci+2:nlci-1, :       ) )                       
     327            spe2vr2(ilci+1:nlci-1  ,1:jpjm1) = visc_dyn                                   & 
     328               &                             * 0.5 * (  ztabramp(ilci+1:nlci-1,1:jpjm1  ) & 
     329               &                                      + ztabramp(ilci+1:nlci-1,2:jpj    ) )  
    283330         ENDIF 
    284331 
    285332         IF( (nbondj == -1) .OR. (nbondj == 2) ) THEN 
    286             DO jj = 2, ispongearea 
    287                zlocalviscsponge(:,jj) = visc_dyn * ( ispongearea-jj ) * z1spongearea 
    288             ENDDO 
    289             spe1ur2(1:jpim1,2:ispongearea  ) = 0.5 * ( zlocalviscsponge(1:jpim1,2:ispongearea) & 
    290                                              &      + zlocalviscsponge(2:jpi,2:ispongearea) )  
    291             spe2vr2(:      ,2:ispongearea-1) = 0.5 * ( zlocalviscsponge(:,2:ispongearea-1)     & 
    292                                              &      + zlocalviscsponge(:,3:ispongearea)     ) 
     333            spe1ur2(1:jpim1,2:ispongearea  ) = visc_dyn                                   &   
     334               &                             * 0.5 * (  ztabramp(1:jpim1,2:ispongearea  ) & 
     335               &                                      + ztabramp(2:jpi  ,2:ispongearea  ) )  
     336            spe2vr2(:      ,2:ispongearea-1) = visc_dyn                                   & 
     337               &                             * 0.5 * (  ztabramp(:      ,2:ispongearea-1) & 
     338               &                                      + ztabramp(:      ,3:ispongearea  ) ) 
    293339         ENDIF 
    294340 
    295341         IF( (nbondj == 1) .OR. (nbondj == 2) ) THEN 
    296             DO jj = ilcj+1,nlcj-1 
    297                zlocalviscsponge(:,jj) = visc_dyn * (jj - (ilcj+1) ) * z1spongearea 
    298             ENDDO 
    299             spe1ur2(1:jpim1,ilcj+1:nlcj-1) = 0.5 * ( zlocalviscsponge(1:jpim1,ilcj+1:nlcj-1) & 
    300                                            &         + zlocalviscsponge(2:jpi,ilcj+1:nlcj-1) )  
    301             spe2vr2(:      ,ilcj+1:nlcj-2) = 0.5 * ( zlocalviscsponge(:,ilcj+1:nlcj-2      ) & 
    302                                            &         + zlocalviscsponge(:,ilcj+2:nlcj-1)     ) 
     342            spe1ur2(1:jpim1,ilcj+1:nlcj-1  ) = visc_dyn                                   & 
     343               &                             * 0.5 * (  ztabramp(1:jpim1,ilcj+1:nlcj-1  ) & 
     344               &                                      + ztabramp(2:jpi  ,ilcj+1:nlcj-1  ) )  
     345            spe2vr2(:      ,ilcj+1:nlcj-2  ) = visc_dyn                                   & 
     346               &                             * 0.5 * (  ztabramp(:      ,ilcj+1:nlcj-2  ) & 
     347               &                                      + ztabramp(:      ,ilcj+2:nlcj-1  ) ) 
    303348         ENDIF 
    304349         spongedoneU = .TRUE. 
     
    306351      ENDIF 
    307352      ! 
    308       CALL wrk_dealloc( jpi, jpj, zlocalviscsponge ) 
     353      IF (.NOT.ll_spdone) CALL wrk_dealloc( jpi, jpj, ztabramp ) 
    309354      ! 
    310355#endif 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/ASM/asminc.F90

    r3785 r4219  
    682682      ! used to prevent the applied increments taking the temperature below the local freezing point  
    683683 
    684 #if defined key_cice  
    685         fzptnz(:,:,:) = -1.8_wp 
    686 #else  
    687         DO jk = 1, jpk 
    688            DO jj = 1, jpj 
    689               DO ji = 1, jpk 
    690                  fzptnz (ji,jj,jk) = ( -0.0575_wp + 1.710523e-3_wp * SQRT( tsn(ji,jj,jk,jp_sal) )                   &  
    691                                                   - 2.154996e-4_wp *       tsn(ji,jj,jk,jp_sal)   ) * tsn(ji,jj,jk,jp_sal)  &  
    692                                                   - 7.53e-4_wp * fsdepw(ji,jj,jk)       ! (pressure in dbar)  
    693               END DO 
    694            END DO 
    695         END DO 
    696 #endif  
     684      DO jk=1, jpkm1 
     685         fzptnz (:,:,jk) = tfreez( tsn(:,:,jk,jp_sal), fsdept(:,:,jk) ) 
     686      ENDDO 
    697687 
    698688      IF ( ln_asmiau ) THEN 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/BDY/bdydyn.F90

    r3294 r4219  
    3030   USE lbclnk          ! ocean lateral boundary conditions (or mpp link) 
    3131   USE in_out_manager  ! 
     32   USE domvvl          ! variable volume 
    3233 
    3334   IMPLICIT NONE 
     
    8485      pu2d(:,:) = 0.e0 
    8586      pv2d(:,:) = 0.e0 
    86       DO jk = 1, jpkm1   !! Vertically integrated momentum trends 
    87           pu2d(:,:) = pu2d(:,:) + fse3u(:,:,jk) * umask(:,:,jk) * ua(:,:,jk) 
    88           pv2d(:,:) = pv2d(:,:) + fse3v(:,:,jk) * vmask(:,:,jk) * va(:,:,jk) 
    89       END DO 
    90       pu2d(:,:) = pu2d(:,:) * phur(:,:) 
    91       pv2d(:,:) = pv2d(:,:) * phvr(:,:) 
     87      IF (lk_vvl) THEN 
     88         DO jk = 1, jpkm1   !! Vertically integrated momentum trends 
     89            pu2d(:,:) = pu2d(:,:) + fse3u_a(:,:,jk) * umask(:,:,jk) * ua(:,:,jk) 
     90            pv2d(:,:) = pv2d(:,:) + fse3v_a(:,:,jk) * vmask(:,:,jk) * va(:,:,jk) 
     91         END DO 
     92         pu2d(:,:) = pu2d(:,:) / ( hu_0(:,:) + sshu_a(:,:) + 1._wp - umask(:,:,1) ) 
     93         pv2d(:,:) = pv2d(:,:) / ( hv_0(:,:) + sshv_a(:,:) + 1._wp - vmask(:,:,1) ) 
     94      ELSE 
     95         DO jk = 1, jpkm1   !! Vertically integrated momentum trends 
     96            pu2d(:,:) = pu2d(:,:) + fse3u(:,:,jk) * umask(:,:,jk) * ua(:,:,jk) 
     97            pv2d(:,:) = pv2d(:,:) + fse3v(:,:,jk) * vmask(:,:,jk) * va(:,:,jk) 
     98         END DO 
     99         pu2d(:,:) = pu2d(:,:) * phur(:,:) 
     100         pv2d(:,:) = pv2d(:,:) * phvr(:,:) 
     101      ENDIF 
    92102      DO jk = 1 , jpkm1 
    93          ua(:,:,jk) = ua(:,:,jk) - pu2d(:,:) 
    94          va(:,:,jk) = va(:,:,jk) - pv2d(:,:) 
     103         ua(:,:,jk) = ua(:,:,jk) - pu2d(:,:) * umask(:,:,jk) 
     104         va(:,:,jk) = va(:,:,jk) - pv2d(:,:) * vmask(:,:,jk) 
    95105      END DO 
    96106 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/C1D/step_c1d.F90

    r3680 r4219  
    5959 
    6060                             indic = 0                ! reset to no error condition 
     61      IF( kstp == nit000 )   CALL iom_init            ! iom_put initialization (must be done after nemo_init for AGRIF+XIOS+OASIS) 
    6162      IF( kstp /= nit000 )   CALL day( kstp )         ! Calendar (day was already called at nit000 in day_init) 
    62                              CALL iom_setkt( kstp )   ! say to iom that we are at time step kstp 
     63                             CALL iom_setkt( kstp - nit000 + 1 )   ! say to iom that we are at time step kstp 
    6364 
    6465      !>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 
     
    106107      !<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< 
    107108                         CALL dia_wri( kstp )       ! ocean model: outputs 
     109      IF( lk_diahth  )   CALL dia_hth( kstp )       ! Thermocline depth (20°C) 
     110 
    108111 
    109112#if defined key_top 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DIA/diadct.F90

    r3680 r4219  
    4242#endif 
    4343#if defined key_lim3 
    44   USE ice_3 
     44  USE par_ice 
     45  USE ice 
    4546#endif 
    4647  USE domvvl 
     
    484485                 ijglo = secs(jsec)%listPoint(jpt)%J + jpjzoom - 1 + njmpp - 1 
    485486                 WRITE(numout,*)'         # I J : ',iiglo,ijglo 
     487                 CALL FLUSH(numout) 
    486488              ENDDO 
    487489           ENDIF 
     
    606608     
    607609     !! * Local variables 
    608      INTEGER             :: jk, jseg, jclass,                    &!loop on level/segment/classes   
     610     INTEGER             :: jk, jseg, jclass,jl,                 &!loop on level/segment/classes/ice categories 
    609611                            isgnu, isgnv                          !  
    610612     REAL(wp)            :: zumid, zvmid,                        &!U/V velocity on a cell segment  
     
    771773    
    772774              zTnorm=zumid_ice*e2u(k%I,k%J)+zvmid_ice*e1v(k%I,k%J) 
    773     
     775 
     776#if defined key_lim2    
    774777              transports_2d(1,jsec,jseg) = transports_2d(1,jsec,jseg) + (zTnorm)*   &  
    775778                                   (1.0 - frld(sec%listPoint(jseg)%I,sec%listPoint(jseg)%J))  &  
     
    778781              transports_2d(2,jsec,jseg) = transports_2d(2,jsec,jseg) + (zTnorm)*   &  
    779782                                    (1.0 -  frld(sec%listPoint(jseg)%I,sec%listPoint(jseg)%J)) 
     783#endif 
     784#if defined key_lim3 
     785              DO jl=1,jpl 
     786                 transports_2d(1,jsec,jseg) = transports_2d(1,jsec,jseg) + (zTnorm)*     & 
     787                                   a_i(sec%listPoint(jseg)%I,sec%listPoint(jseg)%J,jl) * & 
     788                                  ( ht_i(sec%listPoint(jseg)%I,sec%listPoint(jseg)%J,jl) +  & 
     789                                    ht_s(sec%listPoint(jseg)%I,sec%listPoint(jseg)%J,jl) ) 
     790                                    
     791                 transports_2d(2,jsec,jseg) = transports_2d(2,jsec,jseg) + (zTnorm)*   & 
     792                                   a_i(sec%listPoint(jseg)%I,sec%listPoint(jseg)%J,jl) 
     793              ENDDO 
     794#endif 
    780795    
    781796           ENDIF !end of ice case 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DIA/diahsb.F90

    r3625 r4219  
    2121   USE bdy_par         ! (for lk_bdy) 
    2222   USE timing          ! preformance summary 
     23   USE lib_fortran 
     24   USE sbcrnf 
    2325 
    2426   IMPLICIT NONE 
     
    3335   REAL(dp)                                ::   surf_tot   , vol_tot             ! 
    3436   REAL(dp)                                ::   frc_t      , frc_s     , frc_v   ! global forcing trends 
     37   REAL(dp)                                ::   frc_wn_t      , frc_wn_s ! global forcing trends 
    3538   REAL(dp)                                ::   fact1                            ! conversion factors 
    3639   REAL(dp)                                ::   fact21    , fact22               !     -         - 
     
    3841   REAL(dp), DIMENSION(:,:)  , ALLOCATABLE ::   surf      , ssh_ini              ! 
    3942   REAL(dp), DIMENSION(:,:,:), ALLOCATABLE ::   hc_loc_ini, sc_loc_ini, e3t_ini  ! 
     43   REAL(dp), DIMENSION(:,:)  , ALLOCATABLE ::   ssh_hc_loc_ini, ssh_sc_loc_ini 
    4044 
    4145   !! * Substitutions 
     
    6771      INTEGER    ::   jk                          ! dummy loop indice 
    6872      REAL(dp)   ::   zdiff_hc    , zdiff_sc      ! heat and salt content variations 
     73      REAL(dp)   ::   zdiff_hc1   , zdiff_sc1     ! heat and salt content variations of ssh 
    6974      REAL(dp)   ::   zdiff_v1    , zdiff_v2      ! volume variation 
     75      REAL(dp)   ::   zerr_hc1    , zerr_sc1      ! Non conservation due to free surface 
    7076      REAL(dp)   ::   z1_rau0                     ! local scalars 
    7177      REAL(dp)   ::   zdeltat                     !    -     - 
    7278      REAL(dp)   ::   z_frc_trd_t , z_frc_trd_s   !    -     - 
    7379      REAL(dp)   ::   z_frc_trd_v                 !    -     - 
     80      REAL(dp)   ::   z_wn_trd_t , z_wn_trd_s   !    -     - 
     81      REAL(dp)   ::   z_ssh_hc , z_ssh_sc   !    -     - 
    7482      !!--------------------------------------------------------------------------- 
    7583      IF( nn_timing == 1 )   CALL timing_start('dia_hsb') 
     
    7987      ! ------------------------- ! 
    8088      z1_rau0 = 1.e0 / rau0 
    81       z_frc_trd_v = z1_rau0 * SUM( - ( emp(:,:) - rnf(:,:) ) * surf(:,:) )     ! volume fluxes 
    82       z_frc_trd_t =           SUM( sbc_tsc(:,:,jp_tem) * surf(:,:) )     ! heat fluxes 
    83       z_frc_trd_s =           SUM( sbc_tsc(:,:,jp_sal) * surf(:,:) )     ! salt fluxes 
     89      z_frc_trd_v = z1_rau0 * glob_sum( - ( emp(:,:) - rnf(:,:) ) * surf(:,:) )     ! volume fluxes 
     90      z_frc_trd_t =           glob_sum( sbc_tsc(:,:,jp_tem) * surf(:,:) )     ! heat fluxes 
     91      z_frc_trd_s =           glob_sum( sbc_tsc(:,:,jp_sal) * surf(:,:) )     ! salt fluxes 
     92      ! Add runoff heat & salt input 
     93      IF( ln_rnf    )   z_frc_trd_t = z_frc_trd_t + glob_sum( rnf_tsc(:,:,jp_tem) * surf(:,:) ) 
     94      IF( ln_rnf_sal)   z_frc_trd_s = z_frc_trd_s + glob_sum( rnf_tsc(:,:,jp_sal) * surf(:,:) ) 
    8495      ! Add penetrative solar radiation 
    85       IF( ln_traqsr )   z_frc_trd_t = z_frc_trd_t + r1_rau0_rcp * SUM( qsr     (:,:) * surf(:,:) ) 
     96      IF( ln_traqsr )   z_frc_trd_t = z_frc_trd_t + r1_rau0_rcp * glob_sum( qsr     (:,:) * surf(:,:) ) 
    8697      ! Add geothermal heat flux 
    87       IF( ln_trabbc )   z_frc_trd_t = z_frc_trd_t + r1_rau0_rcp * SUM( qgh_trd0(:,:) * surf(:,:) ) 
    88       IF( lk_mpp ) THEN 
    89          CALL mpp_sum( z_frc_trd_v ) 
    90          CALL mpp_sum( z_frc_trd_t ) 
    91       ENDIF 
     98      IF( ln_trabbc )   z_frc_trd_t = z_frc_trd_t +  glob_sum( qgh_trd0(:,:) * surf(:,:) ) 
     99      IF( .NOT. lk_vvl ) THEN 
     100         z_wn_trd_t = - glob_sum( surf(:,:) * wn(:,:,1) * tsb(:,:,1,jp_tem) ) 
     101         z_wn_trd_s = - glob_sum( surf(:,:) * wn(:,:,1) * tsb(:,:,1,jp_sal) ) 
     102      ENDIF 
     103 
    92104      frc_v = frc_v + z_frc_trd_v * rdt 
    93105      frc_t = frc_t + z_frc_trd_t * rdt 
    94106      frc_s = frc_s + z_frc_trd_s * rdt 
     107      !                                          ! Advection flux through fixed surface (z=0) 
     108      IF( .NOT. lk_vvl ) THEN 
     109         frc_wn_t = frc_wn_t + z_wn_trd_t * rdt 
     110         frc_wn_s = frc_wn_s + z_wn_trd_s * rdt 
     111      ENDIF 
    95112 
    96113      ! ----------------------- ! 
     
    100117      zdiff_hc = 0.d0 
    101118      zdiff_sc = 0.d0 
     119 
    102120      ! volume variation (calculated with ssh) 
    103       zdiff_v1 = SUM( surf(:,:) * tmask(:,:,1) * ( sshn(:,:) - ssh_ini(:,:) ) ) 
     121      zdiff_v1 = glob_sum( surf(:,:) * ( sshn(:,:) - ssh_ini(:,:) ) ) 
     122 
     123      ! heat & salt content variation (associated with ssh) 
     124      IF( .NOT. lk_vvl ) THEN 
     125         z_ssh_hc = glob_sum( surf(:,:) * ( tsn(:,:,1,jp_tem) * sshn(:,:) - ssh_hc_loc_ini(:,:) ) ) 
     126         z_ssh_sc = glob_sum( surf(:,:) * ( tsn(:,:,1,jp_sal) * sshn(:,:) - ssh_sc_loc_ini(:,:) ) ) 
     127      ENDIF 
     128 
    104129      DO jk = 1, jpkm1 
    105          ! volume variation (calculated with scale factors) 
    106          zdiff_v2 = zdiff_v2 + SUM( surf(:,:) * tmask(:,:,jk)   & 
     130        ! volume variation (calculated with scale factors) 
     131         zdiff_v2 = zdiff_v2 + glob_sum( surf(:,:) * tmask(:,:,jk)   & 
    107132            &                       * ( fse3t_n(:,:,jk)         & 
    108133            &                           - e3t_ini(:,:,jk) ) ) 
    109134         ! heat content variation 
    110          zdiff_hc = zdiff_hc + SUM( surf(:,:) * tmask(:,:,jk)          & 
     135         zdiff_hc = zdiff_hc + glob_sum( surf(:,:) * tmask(:,:,jk)          & 
    111136            &                       * ( fse3t_n(:,:,jk) * tsn(:,:,jk,jp_tem)   & 
    112137            &                           - hc_loc_ini(:,:,jk) ) ) 
    113138         ! salt content variation 
    114          zdiff_sc = zdiff_sc + SUM( surf(:,:) * tmask(:,:,jk)          & 
     139         zdiff_sc = zdiff_sc + glob_sum( surf(:,:) * tmask(:,:,jk)          & 
    115140            &                       * ( fse3t_n(:,:,jk) * tsn(:,:,jk,jp_sal)   & 
    116141            &                           - sc_loc_ini(:,:,jk) ) ) 
    117142      ENDDO 
    118143 
    119       IF( lk_mpp ) THEN 
    120          CALL mpp_sum( zdiff_hc ) 
    121          CALL mpp_sum( zdiff_sc ) 
    122          CALL mpp_sum( zdiff_v1 ) 
    123          CALL mpp_sum( zdiff_v2 ) 
    124       ENDIF 
    125  
    126144      ! Substract forcing from heat content, salt content and volume variations 
    127145      zdiff_v1 = zdiff_v1 - frc_v 
    128       zdiff_v2 = zdiff_v2 - frc_v 
     146      IF( lk_vvl )   zdiff_v2 = zdiff_v2 - frc_v 
    129147      zdiff_hc = zdiff_hc - frc_t 
    130148      zdiff_sc = zdiff_sc - frc_s 
     149      IF( .NOT. lk_vvl ) THEN 
     150         zdiff_hc1 = zdiff_hc + z_ssh_hc  
     151         zdiff_sc1 = zdiff_sc + z_ssh_sc 
     152         zerr_hc1  = z_ssh_hc - frc_wn_t 
     153         zerr_sc1  = z_ssh_sc - frc_wn_s 
     154      ENDIF 
    131155       
    132156      ! ----------------------- ! 
     
    134158      ! ----------------------- ! 
    135159      zdeltat  = 1.e0 / ( ( kt - nit000 + 1 ) * rdt ) 
    136       WRITE(numhsb , 9020) kt , zdiff_hc / vol_tot , zdiff_hc * fact1  * zdeltat,                                & 
    137          &                      zdiff_sc / vol_tot , zdiff_sc * fact21 * zdeltat, zdiff_sc * fact22 * zdeltat,   & 
    138          &                      zdiff_v1           , zdiff_v1 * fact31 * zdeltat, zdiff_v1 * fact32 * zdeltat,   & 
    139          &                      zdiff_v2           , zdiff_v2 * fact31 * zdeltat, zdiff_v2 * fact32 * zdeltat 
     160      IF( lk_vvl ) THEN 
     161         WRITE(numhsb , 9020) kt , zdiff_hc / vol_tot , zdiff_hc * fact1  * zdeltat,                                & 
     162            &                      zdiff_sc / vol_tot , zdiff_sc * fact21 * zdeltat, zdiff_sc * fact22 * zdeltat,   & 
     163            &                      zdiff_v1           , zdiff_v1 * fact31 * zdeltat, zdiff_v1 * fact32 * zdeltat,   & 
     164            &                      zdiff_v2           , zdiff_v2 * fact31 * zdeltat, zdiff_v2 * fact32 * zdeltat 
     165      ELSE 
     166         WRITE(numhsb , 9030) kt , zdiff_hc1 / vol_tot , zdiff_hc1 * fact1  * zdeltat,                                & 
     167            &                      zdiff_sc1 / vol_tot , zdiff_sc1 * fact21 * zdeltat, zdiff_sc1 * fact22 * zdeltat,   & 
     168            &                      zdiff_v1            , zdiff_v1  * fact31 * zdeltat, zdiff_v1  * fact32 * zdeltat,   & 
     169            &                      zerr_hc1 / vol_tot  , zerr_sc1 / vol_tot 
     170      ENDIF 
    140171 
    141172      IF ( kt == nitend ) CLOSE( numhsb ) 
     
    144175 
    1451769020  FORMAT(I5,11D15.7) 
     1779030  FORMAT(I5,10D15.7) 
    146178      ! 
    147179   END SUBROUTINE dia_hsb 
     
    179211 
    180212      IF( .NOT. ln_diahsb )   RETURN 
     213      IF( .NOT. lk_mpp_rep ) & 
      214        CALL ctl_stop (' Your global mpp_sum is performed in single precision - 64 bits -', & 
     215             &         ' whereas the global sum to be precise must be done in double precision ',& 
     216             &         ' please add key_mpp_rep') 
    181217 
    182218      ! ------------------- ! 
    183219      ! 1 - Allocate memory ! 
    184220      ! ------------------- ! 
    185       ALLOCATE( hc_loc_ini(jpi,jpj,jpk), STAT=ierror ) 
     221      ALLOCATE( hc_loc_ini(jpi,jpj,jpk), sc_loc_ini(jpi,jpj,jpk), & 
     222         &      ssh_hc_loc_ini(jpi,jpj), ssh_sc_loc_ini(jpi,jpj), & 
     223         &      e3t_ini(jpi,jpj,jpk)                            , & 
     224         &      surf(jpi,jpj),  ssh_ini(jpi,jpj), STAT=ierror ) 
    186225      IF( ierror > 0 ) THEN 
    187226         CALL ctl_stop( 'dia_hsb: unable to allocate hc_loc_ini' )   ;   RETURN 
    188       ENDIF 
    189       ALLOCATE( sc_loc_ini(jpi,jpj,jpk), STAT=ierror ) 
    190       IF( ierror > 0 ) THEN 
    191          CALL ctl_stop( 'dia_hsb: unable to allocate sc_loc_ini' )   ;   RETURN 
    192       ENDIF 
    193       ALLOCATE( e3t_ini(jpi,jpj,jpk)   , STAT=ierror ) 
    194       IF( ierror > 0 ) THEN 
    195          CALL ctl_stop( 'dia_hsb: unable to allocate e3t_ini' )      ;   RETURN 
    196       ENDIF 
    197       ALLOCATE( surf(jpi,jpj)          , STAT=ierror ) 
    198       IF( ierror > 0 ) THEN 
    199          CALL ctl_stop( 'dia_hsb: unable to allocate surf' )         ;   RETURN 
    200       ENDIF 
    201       ALLOCATE( ssh_ini(jpi,jpj)       , STAT=ierror ) 
    202       IF( ierror > 0 ) THEN 
    203          CALL ctl_stop( 'dia_hsb: unable to allocate ssh_ini' )      ;   RETURN 
    204227      ENDIF 
    205228 
     
    214237      cl_name    = 'heat_salt_volume_budgets.txt'                         ! name of output file 
    215238      surf(:,:) = e1t(:,:) * e2t(:,:) * tmask(:,:,1) * tmask_i(:,:)      ! masked surface grid cell area 
    216       surf_tot  = SUM( surf(:,:) )                                       ! total ocean surface area 
     239      surf_tot  = glob_sum( surf(:,:) )                                       ! total ocean surface area 
    217240      vol_tot   = 0.d0                                                   ! total ocean volume 
    218241      DO jk = 1, jpkm1 
    219          vol_tot  = vol_tot + SUM( surf(:,:) * tmask(:,:,jk)     & 
    220             &                      * fse3t_n(:,:,jk)         ) 
     242         vol_tot  = vol_tot + glob_sum( surf(:,:) * tmask(:,:,jk)     & 
     243            &                         * fse3t_n(:,:,jk)         ) 
    221244      END DO 
    222       IF( lk_mpp ) THEN  
    223          CALL mpp_sum( vol_tot ) 
    224          CALL mpp_sum( surf_tot ) 
    225       ENDIF 
    226245 
    227246      CALL ctl_opn( numhsb , cl_name , 'UNKNOWN' , 'FORMATTED' , 'SEQUENTIAL' , 1 , numout , lwp , 1 ) 
    228       !                   12345678901234567890123456789012345678901234567890123456789012345678901234567890 -> 80 
    229       WRITE( numhsb, 9010 ) "kt   |     heat content budget     |            salt content budget             ",   & 
    230          !                                                   123456789012345678901234567890123456789012345 -> 45 
    231          &                                                  "|            volume budget (ssh)             ",   & 
    232          !                                                   678901234567890123456789012345678901234567890 -> 45 
    233          &                                                  "|            volume budget (e3t)             " 
    234       WRITE( numhsb, 9010 ) "     |      [C]         [W/m2]     |     [psu]        [mmm/s]          [SV]     ",   & 
    235          &                                                  "|     [m3]         [mmm/s]          [SV]     ",   & 
    236          &                                                  "|     [m3]         [mmm/s]          [SV]     " 
    237  
     247      IF( lk_vvl ) THEN 
     248         !                   12345678901234567890123456789012345678901234567890123456789012345678901234567890 -> 80 
     249         WRITE( numhsb, 9010 ) "kt   |     heat content budget     |            salt content budget             ",   & 
     250            !                                                   123456789012345678901234567890123456789012345 -> 45 
     251            &                                                  "|            volume budget (ssh)             ",   & 
     252            !                                                   678901234567890123456789012345678901234567890 -> 45 
     253            &                                                  "|            volume budget (e3t)             " 
     254         WRITE( numhsb, 9010 ) "     |      [C]         [W/m2]     |     [psu]        [mmm/s]          [SV]     ",   & 
     255            &                                                  "|     [m3]         [mmm/s]          [SV]     ",   & 
     256            &                                                  "|     [m3]         [mmm/s]          [SV]     " 
     257      ELSE 
     258         !                   12345678901234567890123456789012345678901234567890123456789012345678901234567890 -> 80 
     259         WRITE( numhsb, 9011 ) "kt   |     heat content budget     |            salt content budget             ",   & 
     260            !                                                   123456789012345678901234567890123456789012345 -> 45 
     261            &                                                  "|            volume budget (ssh)             ",   & 
     262            !                                                   678901234567890123456789012345678901234567890 -> 45 
     263            &                                                  "|  Non conservation due to free surface      " 
     264         WRITE( numhsb, 9011 ) "     |      [C]         [W/m2]     |     [psu]        [mmm/s]          [SV]     ",   & 
     265            &                                                  "|     [m3]         [mmm/s]          [SV]     ",   & 
     266            &                                                  "|  [heat - C]     [salt - psu]                " 
     267      ENDIF 
    238268      ! --------------- ! 
    239269      ! 3 - Conversions ! (factors will be multiplied by duration afterwards) 
     
    261291      frc_t = 0.d0                                           ! heat content   -    -   -    -    
    262292      frc_s = 0.d0                                           ! salt content   -    -   -    -          
     293      IF( .NOT. lk_vvl ) THEN 
     294         ssh_hc_loc_ini(:,:) = tsn(:,:,1,jp_tem) * ssh_ini(:,:)   ! initial heat content associated with ssh 
     295         ssh_sc_loc_ini(:,:) = tsn(:,:,1,jp_sal) * ssh_ini(:,:)   ! initial salt content associated with ssh 
     296         frc_wn_t = 0.d0 
     297         frc_wn_s = 0.d0 
     298      ENDIF 
    263299      ! 
    2643009010  FORMAT(A80,A45,A45) 
     3019011  FORMAT(A80,A45,A45) 
    265302      ! 
    266303   END SUBROUTINE dia_hsb_init 
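
Note on the pattern above: throughout diahsb.F90 this changeset replaces a processor-local SUM(...) followed by an explicit CALL mpp_sum(...) with a single call to glob_sum, so the global reduction (and its reproducible key_mpp_rep variant) is handled in one place. The following minimal sketch is not NEMO code; glob_sum_sketch, wp and dp are local stand-ins used only to illustrate the idea of accumulating the sum at higher precision before returning it in working precision.

   MODULE globsum_sketch
      IMPLICIT NONE
      INTEGER, PARAMETER :: wp = SELECTED_REAL_KIND( 6)   ! working precision (assumed single here)
      INTEGER, PARAMETER :: dp = SELECTED_REAL_KIND(12)   ! accumulation precision
   CONTAINS
      FUNCTION glob_sum_sketch( pfield ) RESULT( psum )
         REAL(wp), DIMENSION(:,:), INTENT(in) ::   pfield   ! local 2D field (one subdomain)
         REAL(wp)                             ::   psum
         ! accumulate in higher precision; in the real routine an MPI reduction
         ! over all subdomains would complete the global sum
         psum = REAL( SUM( REAL( pfield, dp ) ), wp )
      END FUNCTION glob_sum_sketch
   END MODULE globsum_sketch

   PROGRAM demo_globsum
      USE globsum_sketch
      IMPLICIT NONE
      REAL(wp) ::   surf(4,3), ztotal
      surf   = 1.0_wp                        ! dummy cell areas
      ztotal = glob_sum_sketch( surf )       ! one call replaces SUM(...) + mpp_sum(...)
      PRINT *, 'total surface = ', ztotal
   END PROGRAM demo_globsum
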
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DOM/closea.F90

    r3632 r4219  
    108108            ncsi1(2)   =  97  ;  ncsj1(2)   = 107 
    109109            ncsi2(2)   = 103  ;  ncsj2(2)   = 111 
    110             ncsir(2,1) = 110  ;  ncsjr(2,1) = 111 
    111             !                                            ! Black Sea 1 : west part of the Black Sea  
    112             ncsnr(3)   = 1    ; ncstt(3)   =   2            !            (ie west of the cyclic b.c.) 
    113             ncsi1(3)   = 174  ; ncsj1(3)   = 107            ! put in Med Sea 
    114             ncsi2(3)   = 181  ; ncsj2(3)   = 112 
    115             ncsir(3,1) = 171  ; ncsjr(3,1) = 106  
    116             !                                            ! Black Sea 2 : est part of the Black Sea  
    117             ncsnr(4)   =   1  ;  ncstt(4)   =   2           !               (ie est of the cyclic b.c.) 
    118             ncsi1(4)   =   2  ;  ncsj1(4)   = 107           ! put in Med Sea 
    119             ncsi2(4)   =   6  ;  ncsj2(4)   = 112 
    120             ncsir(4,1) = 171  ;  ncsjr(4,1) = 106  
     110            ncsir(2,1) = 110  ;  ncsjr(2,1) = 111            
     111            !                                            ! Black Sea (crossed by the cyclic boundary condition) 
     112            ncsnr(3:4) =   4  ;  ncstt(3:4) =   2           ! put in Med Sea (north of Aegean Sea) 
     113            ncsir(3:4,1) = 171;  ncsjr(3:4,1) = 106         ! 
     114            ncsir(3:4,2) = 170;  ncsjr(3:4,2) = 106  
     115            ncsir(3:4,3) = 171;  ncsjr(3:4,3) = 105  
     116            ncsir(3:4,4) = 170;  ncsjr(3:4,4) = 105  
     117            ncsi1(3)   = 174  ;  ncsj1(3)   = 107           ! 1 : west part of the Black Sea       
     118            ncsi2(3)   = 181  ;  ncsj2(3)   = 112           !            (ie west of the cyclic b.c.) 
     119            ncsi1(4)   =   2  ;  ncsj1(4)   = 107           ! 2 : east part of the Black Sea  
     120            ncsi2(4)   =   6  ;  ncsj2(4)   = 112           !           (ie east of the cyclic b.c.) 
     121              
     122           
     123 
    121124            !                                        ! ======================= 
    122125         CASE ( 4 )                                  !  ORCA_R4 configuration 
     
    372375      REAL(wp), DIMENSION(jpi,jpj), INTENT(inout) ::   p_rnfmsk   ! river runoff mask (rnfmsk array) 
    373376      ! 
    374       INTEGER  ::   jc, jn      ! dummy loop indices 
    375       INTEGER  ::   ii, ij      ! temporary integer 
     377      INTEGER  ::   jc, jn, ji, jj      ! dummy loop indices 
    376378      !!---------------------------------------------------------------------- 
    377379      ! 
     
    379381         IF( ncstt(jc) >= 1 ) THEN            ! runoff mask set to 1 at closed sea outflows 
    380382             DO jn = 1, 4 
    381                ii = mi0( ncsir(jc,jn) ) 
    382                ij = mj0( ncsjr(jc,jn) ) 
    383                p_rnfmsk(ii,ij) = MAX( p_rnfmsk(ii,ij), 1.0_wp ) 
     383                DO jj =    mj0( ncsjr(jc,jn) ), mj1( ncsjr(jc,jn) ) 
     384                   DO ji = mi0( ncsir(jc,jn) ), mi1( ncsir(jc,jn) ) 
     385                      p_rnfmsk(ji,jj) = MAX( p_rnfmsk(ji,jj), 1.0_wp ) 
     386                   END DO 
     387                END DO 
    384388            END DO  
    385389         ENDIF  
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DOM/daymod.F90

    r3851 r4219  
    238238               nday_year = 1 
    239239               nsec_year = ndt05 
     240               IF( nsec1jan000 >= 2 * (2**30 - nsecd * nyear_len(1) / 2 ) ) THEN   ! test integer 4 max value 
     241                  CALL ctl_stop( 'The number of seconds between Jan. 1st 00h of nit000 year and Jan. 1st 00h ',   & 
     242                     &           'of the current year is exceeding the INTEGER 4 max VALUE: 2^31-1 -> 68.09 years in seconds', & 
     243                     & 'You must do a restart at higher frequency (or remove this STOP and recompile everything in I8)' ) 
     244               ENDIF 
    240245               nsec1jan000 = nsec1jan000 + nsecd * nyear_len(1) 
    241246               IF( nleapy == 1 )   CALL day_mth 
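
The test added above (and repeated in the SAS copy of daymod.F90 later in this changeset) stops the run before the cumulative second counter nsec1jan000 wraps a default 4-byte integer. Below is a self-contained sketch of the same guard, with dummy values chosen so the check fires; none of it is NEMO code.

   PROGRAM overflow_guard_sketch
      IMPLICIT NONE
      INTEGER, PARAMETER ::   nsecd = 86400        ! seconds per day
      INTEGER            ::   nsec1jan000, nyear_len
      nsec1jan000 = 2120000000                     ! dummy counter close to HUGE(1) = 2**31-1
      nyear_len   = 365
      ! same test as in the diff: 2*(2**30 - n/2) avoids forming 2**31 itself
      IF( nsec1jan000 >= 2 * ( 2**30 - nsecd * nyear_len / 2 ) ) THEN
         PRINT *, 'second counter about to exceed HUGE(1): restart more often or use 8-byte integers'
         STOP
      ENDIF
      nsec1jan000 = nsec1jan000 + nsecd * nyear_len
   END PROGRAM overflow_guard_sketch
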
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DOM/domvvl.F90

    r3294 r4219  
    192192      INTEGER  ::   iku, ikv     ! local integers     
    193193      INTEGER  ::   ii0, ii1, ij0, ij1   ! temporary integers 
    194       REAL(wp) ::   zvt          ! local scalars 
     194      REAL(wp) ::   zvt, zvtip1, zvtjp1  ! local scalars 
    195195      !!---------------------------------------------------------------------- 
    196196      ! 
     
    202202         WRITE(numout,*) '~~~~~~~~~ ' 
    203203         pe3u_b(:,:,jpk) = fse3u_0(:,:,jpk) 
    204          pe3v_b(:,:,jpk) = fse3u_0(:,:,jpk) 
     204         pe3v_b(:,:,jpk) = fse3v_0(:,:,jpk) 
    205205      ENDIF 
    206206       
     
    208208         DO jj = 2, jpjm1 
    209209            DO ji = fs_2, fs_jpim1 
    210                zvt = fse3t_b(ji,jj,jk) * e1e2t(ji,jj) 
    211                pe3u_b(ji,jj,jk) = 0.5_wp * ( zvt + fse3t_b(ji+1,jj,jk) * e1e2t(ji+1,jj) ) / ( e1u(ji,jj) * e2u(ji,jj) ) 
    212                pe3v_b(ji,jj,jk) = 0.5_wp * ( zvt + fse3t_b(ji,jj+1,jk) * e1e2t(ji,jj+1) ) / ( e1v(ji,jj) * e2v(ji,jj) ) 
     210               zvt    = ( fse3t_b(ji  ,jj  ,jk) - fse3t_0(ji  ,jj  ,jk) ) * e1e2t(ji  ,jj  ) 
     211               zvtip1 = ( fse3t_b(ji+1,jj  ,jk) - fse3t_0(ji+1,jj  ,jk) ) * e1e2t(ji+1,jj  ) 
     212               zvtjp1 = ( fse3t_b(ji  ,jj+1,jk) - fse3t_0(ji  ,jj+1,jk) ) * e1e2t(ji  ,jj+1) 
     213               pe3u_b(ji,jj,jk) = fse3u_0(ji,jj,jk) + 0.5_wp * ( zvt + zvtip1 ) / ( e1u(ji,jj) * e2u(ji,jj) ) 
     214               pe3v_b(ji,jj,jk) = fse3v_0(ji,jj,jk) + 0.5_wp * ( zvt + zvtjp1 ) / ( e1v(ji,jj) * e2v(ji,jj) ) 
    213215            END DO 
    214216         END DO 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DOM/domzgr.F90

    r3926 r4219  
    11021102      INTEGER  ::   iip1, ijp1, iim1, ijm1   ! temporary integers 
    11031103      REAL(wp) ::   zrmax, ztaper   ! temporary scalars 
    1104       REAL(wp) ::   zrfact   ! temporary scalars 
    1105       REAL(wp), POINTER, DIMENSION(:,:  ) :: ztmpi1, ztmpi2, ztmpj1, ztmpj2 
    1106  
    1107       ! 
    1108       REAL(wp), POINTER, DIMENSION(:,:  ) :: zenv, zri, zrj, zhbat 
     1104      ! 
     1105      REAL(wp), POINTER, DIMENSION(:,:  ) :: zenv, ztmp, zmsk, zri, zrj, zhbat 
    11091106 
    11101107      NAMELIST/namzgr_sco/ln_s_sh94, ln_s_sf12, ln_sigcrit, rn_sbot_min, rn_sbot_max, rn_hc, rn_rmax,rn_theta, & 
     
    11141111      IF( nn_timing == 1 )  CALL timing_start('zgr_sco') 
    11151112      ! 
    1116       CALL wrk_alloc( jpi, jpj,      ztmpi1, ztmpi2, ztmpj1, ztmpj2         ) 
    1117       CALL wrk_alloc( jpi, jpj,      zenv, zri, zrj, zhbat     ) 
    1118      ! 
     1113      CALL wrk_alloc( jpi, jpj,      zenv, ztmp, zmsk, zri, zrj, zhbat                           ) 
     1114      ! 
    11191115      REWIND( numnam )                       ! Read Namelist namzgr_sco : sigma-stretching parameters 
    11201116      READ  ( numnam, namzgr_sco ) 
     
    11631159      !                                        ! ============================= 
    11641160      ! use r-value to create hybrid coordinates 
    1165 !     DO jj = 1, jpj 
    1166 !        DO ji = 1, jpi 
    1167 !           zenv(ji,jj) = MAX( bathy(ji,jj), 0._wp ) 
    1168 !        END DO 
    1169 !     END DO 
    1170 !     CALL lbc_lnk( zenv, 'T', 1._wp ) 
    1171       zenv(:,:) = bathy(:,:) 
     1161      DO jj = 1, jpj 
     1162         DO ji = 1, jpi 
     1163            zenv(ji,jj) = MAX( bathy(ji,jj), rn_sbot_min ) 
     1164         END DO 
     1165      END DO 
    11721166      !  
    11731167      ! Smooth the bathymetry (if required) 
     
    11771171      jl = 0 
    11781172      zrmax = 1._wp 
    1179       !      
    1180       ! set scaling factor used in reducing vertical gradients 
    1181       zrfact = ( 1._wp - rn_rmax ) / ( 1._wp + rn_rmax )  
    1182       ! 
    1183       ! initialise temporary evelope depth arrays 
    1184       ztmpi1(:,:) = zenv(:,:) 
    1185       ztmpi2(:,:) = zenv(:,:) 
    1186       ztmpj1(:,:) = zenv(:,:) 
    1187       ztmpj2(:,:) = zenv(:,:) 
    1188       ! 
    1189       ! initialise temporary r-value arrays 
    1190       zri(:,:) = 1._wp 
    1191       zrj(:,:) = 1._wp 
    1192       !                                                            ! ================ ! 
    1193       DO WHILE( jl <= 10000 .AND. ( zrmax - rn_rmax ) > 1.e-8_wp ) !  Iterative loop  ! 
    1194          !                                                         ! ================ ! 
     1173      !                                                     ! ================ ! 
     1174      DO WHILE( jl <= 10000 .AND. zrmax > rn_rmax )         !  Iterative loop  ! 
     1175         !                                                  ! ================ ! 
    11951176         jl = jl + 1 
    11961177         zrmax = 0._wp 
    1197          ! we set zrmax from previous r-values (zri abd zrj) first 
    1198          ! if set after current r-value calculation (as previously) 
    1199          ! we could exit DO WHILE prematurely before checking r-value 
    1200          ! of current zenv 
    1201          DO jj = 1, nlcj 
    1202             DO ji = 1, nlci 
    1203                zrmax = MAX( zrmax, ABS(zri(ji,jj)), ABS(zrj(ji,jj)) ) 
    1204             END DO 
    1205          END DO 
    1206          zri(:,:) = 0._wp 
    1207          zrj(:,:) = 0._wp 
     1178         zmsk(:,:) = 0._wp 
    12081179         DO jj = 1, nlcj 
    12091180            DO ji = 1, nlci 
    12101181               iip1 = MIN( ji+1, nlci )      ! force zri = 0 on last line (ji=ncli+1 to jpi) 
    12111182               ijp1 = MIN( jj+1, nlcj )      ! force zrj = 0 on last raw  (jj=nclj+1 to jpj) 
    1212                IF( (zenv(ji,jj) > 0._wp) .AND. (zenv(iip1,jj) > 0._wp)) THEN 
    1213                   zri(ji,jj) = ( zenv(iip1,jj  ) - zenv(ji,jj) ) / ( zenv(iip1,jj  ) + zenv(ji,jj) ) 
    1214                END IF 
    1215                IF( (zenv(ji,jj) > 0._wp) .AND. (zenv(ji,ijp1) > 0._wp)) THEN 
    1216                   zrj(ji,jj) = ( zenv(ji  ,ijp1) - zenv(ji,jj) ) / ( zenv(ji  ,ijp1) + zenv(ji,jj) ) 
    1217                END IF 
    1218                IF( zri(ji,jj) >  rn_rmax )   ztmpi1(ji  ,jj  ) = zenv(iip1,jj  ) * zrfact 
    1219                IF( zri(ji,jj) < -rn_rmax )   ztmpi2(iip1,jj  ) = zenv(ji  ,jj  ) * zrfact  
    1220                IF( zrj(ji,jj) >  rn_rmax )   ztmpj1(ji  ,jj  ) = zenv(ji  ,ijp1) * zrfact 
    1221                IF( zrj(ji,jj) < -rn_rmax )   ztmpj2(ji  ,ijp1) = zenv(ji  ,jj  ) * zrfact 
     1183               zri(ji,jj) = ABS( zenv(iip1,jj  ) - zenv(ji,jj) ) / ( zenv(iip1,jj  ) + zenv(ji,jj) ) 
     1184               zrj(ji,jj) = ABS( zenv(ji  ,ijp1) - zenv(ji,jj) ) / ( zenv(ji  ,ijp1) + zenv(ji,jj) ) 
     1185               zrmax = MAX( zrmax, zri(ji,jj), zrj(ji,jj) ) 
     1186               IF( zri(ji,jj) > rn_rmax )   zmsk(ji  ,jj  ) = 1._wp 
     1187               IF( zri(ji,jj) > rn_rmax )   zmsk(iip1,jj  ) = 1._wp 
     1188               IF( zrj(ji,jj) > rn_rmax )   zmsk(ji  ,jj  ) = 1._wp 
     1189               IF( zrj(ji,jj) > rn_rmax )   zmsk(ji  ,ijp1) = 1._wp 
    12221190            END DO 
    12231191         END DO 
    12241192         IF( lk_mpp )   CALL mpp_max( zrmax )   ! max over the global domain 
     1193         ! lateral boundary condition on zmsk: keep 1 along closed boundary (use of MAX) 
     1194         ztmp(:,:) = zmsk(:,:)   ;   CALL lbc_lnk( zmsk, 'T', 1._wp ) 
     1195         DO jj = 1, nlcj 
     1196            DO ji = 1, nlci 
     1197                zmsk(ji,jj) = MAX( zmsk(ji,jj), ztmp(ji,jj) ) 
     1198            END DO 
     1199         END DO 
    12251200         ! 
    1226          IF(lwp)WRITE(numout,*) 'zgr_sco :   iter= ',jl, ' rmax= ', zrmax 
     1201         IF(lwp)WRITE(numout,*) 'zgr_sco :   iter= ',jl, ' rmax= ', zrmax, ' nb of pt= ', INT( SUM(zmsk(:,:) ) ) 
    12271202         ! 
    12281203         DO jj = 1, nlcj 
    12291204            DO ji = 1, nlci 
    1230                zenv(ji,jj) = MAX(zenv(ji,jj), ztmpi1(ji,jj), ztmpi2(ji,jj), ztmpj1(ji,jj), ztmpj2(ji,jj) ) 
      1205               iip1 = MIN( ji+1, nlci )     ! last  line (ji=nlci) 
      1206               ijp1 = MIN( jj+1, nlcj )     ! last  row  (jj=nlcj) 
      1207               iim1 = MAX( ji-1,  1  )      ! first line (ji=1) 
      1208               ijm1 = MAX( jj-1,  1  )      ! first row  (jj=1) 

     1209               IF( zmsk(ji,jj) == 1._wp ) THEN 
     1210                  ztmp(ji,jj) =   (                                                                                   & 
     1211             &      zenv(iim1,ijp1)*zmsk(iim1,ijp1) + zenv(ji,ijp1)*zmsk(ji,ijp1) + zenv(iip1,ijp1)*zmsk(iip1,ijp1)   & 
     1212             &    + zenv(iim1,jj  )*zmsk(iim1,jj  ) + zenv(ji,jj  )*    2._wp     + zenv(iip1,jj  )*zmsk(iip1,jj  )   & 
     1213             &    + zenv(iim1,ijm1)*zmsk(iim1,ijm1) + zenv(ji,ijm1)*zmsk(ji,ijm1) + zenv(iip1,ijm1)*zmsk(iip1,ijm1)   & 
     1214             &                    ) / (                                                                               & 
     1215             &                      zmsk(iim1,ijp1) +               zmsk(ji,ijp1) +                 zmsk(iip1,ijp1)   & 
     1216             &    +                 zmsk(iim1,jj  ) +                   2._wp     +                 zmsk(iip1,jj  )   & 
     1217             &    +                 zmsk(iim1,ijm1) +               zmsk(ji,ijm1) +                 zmsk(iip1,ijm1)   & 
     1218             &                        ) 
     1219               ENDIF 
    12311220            END DO 
    12321221         END DO 
    12331222         ! 
    1234          CALL lbc_lnk( zenv, 'T', 1._wp ) 
     1223         DO jj = 1, nlcj 
     1224            DO ji = 1, nlci 
     1225               IF( zmsk(ji,jj) == 1._wp )   zenv(ji,jj) = MAX( ztmp(ji,jj), bathy(ji,jj) ) 
     1226            END DO 
     1227         END DO 
     1228         ! 
     1229         ! Apply lateral boundary condition   CAUTION: keep the value when the lbc field is zero 
     1230         ztmp(:,:) = zenv(:,:)   ;   CALL lbc_lnk( zenv, 'T', 1._wp ) 
     1231         DO jj = 1, nlcj 
     1232            DO ji = 1, nlci 
     1233               IF( zenv(ji,jj) == 0._wp )   zenv(ji,jj) = ztmp(ji,jj) 
     1234            END DO 
     1235         END DO 
    12351236         !                                                  ! ================ ! 
    12361237      END DO                                                !     End loop     ! 
    12371238      !                                                     ! ================ ! 
    12381239      ! 
    1239 !     DO jj = 1, jpj 
    1240 !        DO ji = 1, jpi 
    1241 !           zenv(ji,jj) = MAX( zenv(ji,jj), rn_sbot_min ) ! set all points to avoid undefined scale values 
    1242 !        END DO 
    1243 !     END DO 
      1240      ! Fill ghost rows and columns with appropriate values to avoid undefined e3 values with some mpp decompositions 
     1241      DO ji = nlci+1, jpi  
     1242         zenv(ji,1:nlcj) = zenv(nlci,1:nlcj) 
     1243      END DO 
     1244      ! 
     1245      DO jj = nlcj+1, jpj 
     1246         zenv(:,jj) = zenv(:,nlcj) 
     1247      END DO 
    12441248      ! 
    12451249      ! Envelope bathymetry saved in hbatt 
    12461250      hbatt(:,:) = zenv(:,:)  
    1247  
    12481251      IF( MINVAL( gphit(:,:) ) * MAXVAL( gphit(:,:) ) <= 0._wp ) THEN 
    12491252         CALL ctl_warn( ' s-coordinates are tapered in vicinity of the Equator' ) 
    12501253         DO jj = 1, jpj 
    12511254            DO ji = 1, jpi 
    1252                ztaper = EXP( -(gphit(ji,jj)/8._wp)**2 ) 
     1255               ztaper = EXP( -(gphit(ji,jj)/8._wp)**2._wp ) 
    12531256               hbatt(ji,jj) = rn_sbot_max * ztaper + hbatt(ji,jj) * ( 1._wp - ztaper ) 
    12541257            END DO 
     
    13651368      fsde3w(:,:,:) = gdep3w(:,:,:) 
    13661369      ! 
    1367       where (e3t   (:,:,:).eq.0.0)  e3t(:,:,:) = 1.0 
    1368       where (e3u   (:,:,:).eq.0.0)  e3u(:,:,:) = 1.0 
    1369       where (e3v   (:,:,:).eq.0.0)  e3v(:,:,:) = 1.0 
    1370       where (e3f   (:,:,:).eq.0.0)  e3f(:,:,:) = 1.0 
    1371       where (e3w   (:,:,:).eq.0.0)  e3w(:,:,:) = 1.0 
    1372       where (e3uw  (:,:,:).eq.0.0)  e3uw(:,:,:) = 1.0 
    1373       where (e3vw  (:,:,:).eq.0.0)  e3vw(:,:,:) = 1.0 
    1374  
     1370      where (e3t   (:,:,:).eq.0.0)  e3t(:,:,:) = 1._wp 
     1371      where (e3u   (:,:,:).eq.0.0)  e3u(:,:,:) = 1._wp 
     1372      where (e3v   (:,:,:).eq.0.0)  e3v(:,:,:) = 1._wp 
     1373      where (e3f   (:,:,:).eq.0.0)  e3f(:,:,:) = 1._wp 
     1374      where (e3w   (:,:,:).eq.0.0)  e3w(:,:,:) = 1._wp 
     1375      where (e3uw  (:,:,:).eq.0.0)  e3uw(:,:,:) = 1._wp 
     1376      where (e3vw  (:,:,:).eq.0.0)  e3vw(:,:,:) = 1._wp 
     1377 
     1378#if defined key_agrif 
     1379      ! Ensure meaningful vertical scale factors in ghost lines/columns 
     1380      IF( .NOT. Agrif_Root() ) THEN 
     1381         !   
     1382         IF((nbondi == -1).OR.(nbondi == 2)) THEN 
     1383            e3u(1,:,:) = e3u(2,:,:) 
     1384         ENDIF 
     1385         ! 
     1386         IF((nbondi ==  1).OR.(nbondi == 2)) THEN 
     1387            e3u(nlci-1,:,:) = e3u(nlci-2,:,:) 
     1388         ENDIF 
     1389         ! 
     1390         IF((nbondj == -1).OR.(nbondj == 2)) THEN 
     1391            e3v(:,1,:) = e3v(:,2,:) 
     1392         ENDIF 
     1393         ! 
     1394         IF((nbondj ==  1).OR.(nbondj == 2)) THEN 
     1395            e3v(:,nlcj-1,:) = e3v(:,nlcj-2,:) 
     1396         ENDIF 
     1397         ! 
     1398      ENDIF 
     1399#endif 
    13751400 
    13761401      fsdept(:,:,:) = gdept (:,:,:) 
     
    14211446         WRITE(numout,"(10x,i4,4f9.2)") ( jk, fsdept(1,1,jk), fsdepw(1,1,jk),     & 
    14221447            &                                 fse3t (1,1,jk), fse3w (1,1,jk), jk=1,jpk ) 
    1423          DO jj = mj0(20), mj1(20) 
    1424             DO ji = mi0(20), mi1(20) 
     1448         iip1 = MIN(20, jpiglo-1)  ! for config with i smaller than 20 points 
     1449         ijp1 = MIN(20, jpjglo-1)  ! for config with j smaller than 20 points 
     1450         DO jj = mj0(ijp1), mj1(ijp1) 
     1451            DO ji = mi0(iip1), mi1(iip1) 
    14251452               WRITE(numout,*) 
    1426                WRITE(numout,*) ' domzgr: vertical coordinates : point (20,20,k)   bathy = ', bathy(ji,jj), hbatt(ji,jj) 
     1453               WRITE(numout,*) ' domzgr: vertical coordinates : point (',iip1,',',ijp1,',k)   bathy = ',  & 
     1454                  &                                              bathy(ji,jj), hbatt(ji,jj) 
    14271455               WRITE(numout,*) ' ~~~~~~  --------------------' 
    14281456               WRITE(numout,"(9x,' level   gdept    gdepw    gde3w     e3t      e3w  ')") 
     
    14311459            END DO 
    14321460         END DO 
    1433          DO jj = mj0(74), mj1(74) 
    1434             DO ji = mi0(100), mi1(100) 
     1461         iip1 = MIN(  74, jpiglo-1) 
     1462         ijp1 = MIN( 100, jpjglo-1) 
     1463         DO jj = mj0(ijp1), mj1(ijp1) 
     1464            DO ji = mi0(iip1), mi1(iip1) 
    14351465               WRITE(numout,*) 
    1436                WRITE(numout,*) ' domzgr: vertical coordinates : point (100,74,k)   bathy = ', bathy(ji,jj), hbatt(ji,jj) 
     1466               WRITE(numout,*) ' domzgr: vertical coordinates : point (',iip1,',',ijp1,',k)   bathy = ',  & 
     1467                  &                                              bathy(ji,jj), hbatt(ji,jj) 
    14371468               WRITE(numout,*) ' ~~~~~~  --------------------' 
    14381469               WRITE(numout,"(9x,' level   gdept    gdepw    gde3w     e3t      e3w  ')") 
     
    14911522      END DO 
    14921523      ! 
    1493       CALL wrk_dealloc( jpi, jpj,      zenv, ztmpi1, ztmpi2, ztmpj1, ztmpj2, zri, zrj, zhbat                           )      ! 
     1524      CALL wrk_dealloc( jpi, jpj,      zenv, ztmp, zmsk, zri, zrj, zhbat                           ) 
     1525      ! 
    14941526      IF( nn_timing == 1 )  CALL timing_stop('zgr_sco') 
    14951527      ! 
     
    17201752      ENDDO 
    17211753      ! 
    1722       CALL lbc_lnk(e3t ,'T',1.) ; CALL lbc_lnk(e3u ,'T',1.) 
    1723       CALL lbc_lnk(e3v ,'T',1.) ; CALL lbc_lnk(e3f ,'T',1.) 
    1724       CALL lbc_lnk(e3w ,'T',1.) 
    1725       CALL lbc_lnk(e3uw,'T',1.) ; CALL lbc_lnk(e3vw,'T',1.) 
    1726       ! 
    17271754      !                                               ! ============= 
    17281755 
     
    18211848      !!---------------------------------------------------------------------- 
    18221849      ! 
    1823       pf =   (   TANH( rn_theta * ( -(pk-0.5_wp) / REAL(jpkm1) + rn_thetb )  )   & 
     1850      pf =   (   TANH( rn_theta * ( -(pk-0.5_wp) / REAL(jpkm1,wp) + rn_thetb )  )   & 
    18241851         &     - TANH( rn_thetb * rn_theta                                )  )   & 
    18251852         & * (   COSH( rn_theta                           )                      & 
     
    18471874      ! 
    18481875      IF ( rn_theta == 0 ) then      ! uniform sigma 
    1849          pf1 = - ( pk1 - 0.5_wp ) / REAL( jpkm1 ) 
     1876         pf1 = - ( pk1 - 0.5_wp ) / REAL( jpkm1,wp ) 
    18501877      ELSE                        ! stretched sigma 
    1851          pf1 =   ( 1._wp - pbb ) * ( SINH( rn_theta*(-(pk1-0.5_wp)/REAL(jpkm1)) ) ) / SINH( rn_theta )              & 
    1852             &  + pbb * (  (TANH( rn_theta*( (-(pk1-0.5_wp)/REAL(jpkm1)) + 0.5_wp) ) - TANH( 0.5_wp * rn_theta )  )  & 
     1878         pf1 =   ( 1._wp - pbb ) * ( SINH( rn_theta*(-(pk1-0.5_wp)/REAL(jpkm1,wp)) ) ) / SINH( rn_theta )              & 
     1879            &  + pbb * (  (TANH( rn_theta*( (-(pk1-0.5_wp)/REAL(jpkm1,wp)) + 0.5_wp) ) - TANH( 0.5_wp * rn_theta )  )  & 
    18531880            &        / ( 2._wp * TANH( 0.5_wp * rn_theta ) )  ) 
    18541881      ENDIF 
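
For reference, the reworked smoothing earlier in this file's diff replaces the four directional envelope arrays by a single mask of points whose slope parameter r = |h(i+1)-h(i)| / (h(i+1)+h(i)) exceeds rn_rmax, and only the flagged points are averaged. A minimal 1-D sketch of the r-value/mask diagnostic, using made-up local names and data (not NEMO code):

   PROGRAM rvalue_sketch
      IMPLICIT NONE
      INTEGER, PARAMETER ::   jpi = 8
      REAL               ::   zenv(jpi), zr, zrmax, rn_rmax
      INTEGER            ::   imsk(jpi), ji
      zenv = (/ 10., 12., 15., 60., 58., 55., 52., 50. /)   ! dummy bathymetry with one steep step
      imsk = 0
      rn_rmax = 0.2
      zrmax   = 0.
      DO ji = 1, jpi-1
         zr    = ABS( zenv(ji+1) - zenv(ji) ) / ( zenv(ji+1) + zenv(ji) )
         zrmax = MAX( zrmax, zr )
         IF( zr > rn_rmax ) THEN          ! flag both ends of a too-steep face
            imsk(ji  ) = 1
            imsk(ji+1) = 1
         ENDIF
      END DO
      PRINT *, 'rmax = ', zrmax, '  nb of flagged pts = ', SUM( imsk )
   END PROGRAM rvalue_sketch
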
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/DYN/dynspg_flt.F90

    r3765 r4219  
    109109      INTEGER  ::   ji, jj, jk   ! dummy loop indices 
    110110      REAL(wp) ::   z2dt, z2dtg, zgcb, zbtd, ztdgu, ztdgv   ! local scalars 
    111       REAL(wp), POINTER, DIMENSION(:,:,:) ::  zub, zvb 
    112111      !!---------------------------------------------------------------------- 
    113112      ! 
    114113      IF( nn_timing == 1 )  CALL timing_start('dyn_spg_flt') 
    115114      ! 
    116       CALL wrk_alloc( jpi,jpj,jpk, zub, zvb ) 
    117115      ! 
    118116      IF( kt == nit000 ) THEN 
     
    213211         DO jk = 1, jpkm1 
    214212            DO ji = 1, jpij 
    215                spgu(ji,1) = spgu(ji,1) + fse3u(ji,1,jk) * ua(ji,1,jk) 
    216                spgv(ji,1) = spgv(ji,1) + fse3v(ji,1,jk) * va(ji,1,jk) 
     213               spgu(ji,1) = spgu(ji,1) + fse3u_a(ji,1,jk) * ua(ji,1,jk) 
     214               spgv(ji,1) = spgv(ji,1) + fse3v_a(ji,1,jk) * va(ji,1,jk) 
    217215            END DO 
    218216         END DO 
     
    221219            DO jj = 2, jpjm1 
    222220               DO ji = 2, jpim1 
    223                   spgu(ji,jj) = spgu(ji,jj) + fse3u(ji,jj,jk) * ua(ji,jj,jk) 
    224                   spgv(ji,jj) = spgv(ji,jj) + fse3v(ji,jj,jk) * va(ji,jj,jk) 
     221                  spgu(ji,jj) = spgu(ji,jj) + fse3u_a(ji,jj,jk) * ua(ji,jj,jk) 
     222                  spgv(ji,jj) = spgv(ji,jj) + fse3v_a(ji,jj,jk) * va(ji,jj,jk) 
    225223               END DO 
    226224            END DO 
     
    360358      IF( lrst_oce ) CALL flt_rst( kt, 'WRITE' ) 
    361359      ! 
    362       CALL wrk_dealloc( jpi,jpj,jpk, zub, zvb ) 
    363360      ! 
    364361      IF( nn_timing == 1 )  CALL timing_stop('dyn_spg_flt') 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/ICB/icb_oce.F90

    r3614 r4219  
    3737   USE par_oce   ! ocean parameters 
    3838   USE lib_mpp   ! MPP library 
    39    USE fldread   ! read input fields (FLD type) 
    4039 
    4140   IMPLICIT NONE 
     
    151150   REAL(wp), PUBLIC, ALLOCATABLE, SAVE, DIMENSION(:,:,:)   :: griddata                           !: work array for icbrst 
    152151 
    153    TYPE(FLD), PUBLIC, ALLOCATABLE     , DIMENSION(:)       ::   sf_icb   !: structure: file information, fields read 
    154  
    155152   !!---------------------------------------------------------------------- 
    156153   !! NEMO/OPA 3.3 , NEMO Consortium (2011) 
     
    168165      ! 
    169166      icb_alloc = 0 
    170       ALLOCATE( berg_grid                      ,                                               & 
    171          &      berg_grid%calving    (jpi,jpj) , berg_grid%calving_hflx (jpi,jpj)          ,   & 
     167      ALLOCATE( berg_grid%calving    (jpi,jpj) , berg_grid%calving_hflx (jpi,jpj)          ,   & 
    172168         &      berg_grid%stored_heat(jpi,jpj) , berg_grid%floating_melt(jpi,jpj)          ,   & 
    173169         &      berg_grid%maxclass   (jpi,jpj) , berg_grid%stored_ice   (jpi,jpj,nclasses) ,   & 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/ICB/icbini.F90

    r3785 r4219  
    3535   PUBLIC   icb_init  ! routine called in nemogcm.F90 module 
    3636 
    37    CHARACTER(len=100) ::   cn_dir = './'   ! Root directory for location of icb files 
    38    TYPE(FLD_N)        ::   sn_icb          ! information about the calving file to be read 
     37   CHARACTER(len=100)                                 ::   cn_dir = './'   !: Root directory for location of icb files 
     38   TYPE(FLD_N)                                        ::   sn_icb          !: information about the calving file to be read 
     39   TYPE(FLD), PUBLIC, ALLOCATABLE     , DIMENSION(:)  ::   sf_icb          !: structure: file information, fields read 
     40                                                                           !: used in icbini and icbstp 
    3941 
    4042   !!---------------------------------------------------------------------- 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/ICB/icbstp.F90

    r3614 r4219  
    2424   USE lib_mpp 
    2525   USE iom 
     26   USE fldread 
    2627   USE timing         ! timing 
    2728 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/IOM/iom.F90

    r3940 r4219  
    3131   USE sbc_oce, ONLY :   nn_fsbc         ! ocean space and time domain 
    3232   USE trc_oce, ONLY :   nn_dttrc        !  !: frequency of step on passive tracers 
     33   USE icb_oce, ONLY :   class_num       !  !: iceberg classes 
    3334   USE domngb          ! ocean space and time domain 
    3435   USE phycst          ! physical constants 
     
    99100      clname = "nemo" 
    100101      IF( TRIM(Agrif_CFixed()) /= '0' )   clname = TRIM(Agrif_CFixed())//"_"//TRIM(clname) 
     102# if defined key_mpp_mpi 
    101103      CALL xios_context_initialize(TRIM(clname), mpi_comm_opa) 
     104# else 
     105      CALL xios_context_initialize(TRIM(clname), 0) 
     106# endif 
    102107      CALL iom_swap 
    103108 
     
    124129      CALL iom_set_axis_attr( "depthw", gdepw_0 ) 
    125130# if defined key_floats 
    126       CALL iom_set_axis_attr( "nfloat", (ji, ji=1,nfloat) ) 
     131      CALL iom_set_axis_attr( "nfloat", (/ (REAL(ji,wp), ji=1,nfloat) /) ) 
    127132# endif 
     133      CALL iom_set_axis_attr( "icbcla", class_num ) 
    128134       
    129135      ! automatic definitions of some of the xml attributs 
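
The one-line fix to the "nfloat" axis above is a plain Fortran syntax point: an implied-DO list is only legal inside an array constructor, and the axis values are passed as reals, hence (/ (REAL(ji,wp), ji=1,nfloat) /) rather than (ji, ji=1,nfloat). A tiny stand-alone illustration (not NEMO code):

   PROGRAM axis_values_sketch
      IMPLICIT NONE
      INTEGER, PARAMETER ::   wp = KIND(1.d0), nfloat = 5
      INTEGER            ::   ji
      REAL(wp)           ::   zaxis(nfloat)
      zaxis = (/ ( REAL( ji, wp ), ji = 1, nfloat ) /)   ! one real value per float index
      PRINT '(5F6.1)', zaxis
   END PROGRAM axis_values_sketch
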
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/LBC/lbclnk.F90

    r3768 r4219  
    283283   END SUBROUTINE lbc_lnk_3d 
    284284 
    285    SUBROUTINE lbc_bdy_lnk_3d( pt3d, cd_type, psgn, ib_bdy ) 
    286       !!--------------------------------------------------------------------- 
    287       !!                  ***  ROUTINE lbc_bdy_lnk  *** 
    288       !! 
    289       !! ** Purpose :   wrapper rountine to 'lbc_lnk_3d'. This wrapper is used 
    290       !!                to maintain the same interface with regards to the mpp case 
    291       !! 
    292       !!---------------------------------------------------------------------- 
    293       CHARACTER(len=1)                , INTENT(in   )           ::   cd_type   ! nature of pt3d grid-points 
    294       REAL(wp), DIMENSION(jpi,jpj,jpk), INTENT(inout)           ::   pt3d      ! 3D array on which the lbc is applied 
    295       REAL(wp)                        , INTENT(in   )           ::   psgn      ! control of the sign  
    296       INTEGER                                                   ::   ib_bdy    ! BDY boundary set 
    297       !! 
    298       CALL lbc_lnk_3d( pt3d, cd_type, psgn) 
    299  
    300    END SUBROUTINE lbc_bdy_lnk_3d 
    301  
    302    SUBROUTINE lbc_bdy_lnk_2d( pt2d, cd_type, psgn, ib_bdy ) 
    303       !!--------------------------------------------------------------------- 
    304       !!                  ***  ROUTINE lbc_bdy_lnk  *** 
    305       !! 
    306       !! ** Purpose :   wrapper rountine to 'lbc_lnk_3d'. This wrapper is used 
    307       !!                to maintain the same interface with regards to the mpp case 
    308       !! 
    309       !!---------------------------------------------------------------------- 
    310       CHARACTER(len=1)                , INTENT(in   )           ::   cd_type   ! nature of pt3d grid-points 
    311       REAL(wp), DIMENSION(jpi,jpj),     INTENT(inout)           ::   pt2d      ! 3D array on which the lbc is applied 
    312       REAL(wp)                        , INTENT(in   )           ::   psgn      ! control of the sign  
    313       INTEGER                                                   ::   ib_bdy    ! BDY boundary set 
    314       !! 
    315       CALL lbc_lnk_2d( pt2d, cd_type, psgn) 
    316  
    317    END SUBROUTINE lbc_bdy_lnk_2d 
    318  
    319285   SUBROUTINE lbc_lnk_2d( pt2d, cd_type, psgn, cd_mpp, pval ) 
    320286      !!--------------------------------------------------------------------- 
     
    406372   END SUBROUTINE lbc_lnk_2d 
    407373 
     374#endif 
     375 
     376 
     377   SUBROUTINE lbc_bdy_lnk_3d( pt3d, cd_type, psgn, ib_bdy ) 
     378      !!--------------------------------------------------------------------- 
     379      !!                  ***  ROUTINE lbc_bdy_lnk  *** 
     380      !! 
      381      !! ** Purpose :   wrapper routine to 'lbc_lnk_3d'. This wrapper is used 
      382      !!                to maintain the same interface with regard to the mpp case 
      383      !! 
     384      !! 
     385      !!---------------------------------------------------------------------- 
     386      CHARACTER(len=1)                , INTENT(in   )           ::   cd_type   ! nature of pt3d grid-points 
     387      REAL(wp), DIMENSION(jpi,jpj,jpk), INTENT(inout)           ::   pt3d      ! 3D array on which the lbc is applied 
     388      REAL(wp)                        , INTENT(in   )           ::   psgn      ! control of the sign  
     389      INTEGER                                                   ::   ib_bdy    ! BDY boundary set 
     390      !! 
     391      CALL lbc_lnk_3d( pt3d, cd_type, psgn) 
     392 
     393   END SUBROUTINE lbc_bdy_lnk_3d 
     394 
     395   SUBROUTINE lbc_bdy_lnk_2d( pt2d, cd_type, psgn, ib_bdy ) 
     396      !!--------------------------------------------------------------------- 
     397      !!                  ***  ROUTINE lbc_bdy_lnk  *** 
     398      !! 
      399      !! ** Purpose :   wrapper routine to 'lbc_lnk_2d'. This wrapper is used 
      400      !!                to maintain the same interface with regard to the mpp case 
      401      !! 
     402      !! 
     403      !!---------------------------------------------------------------------- 
      404      CHARACTER(len=1)                , INTENT(in   )           ::   cd_type   ! nature of pt2d grid-points 
      405      REAL(wp), DIMENSION(jpi,jpj),     INTENT(inout)           ::   pt2d      ! 2D array on which the lbc is applied 
     406      REAL(wp)                        , INTENT(in   )           ::   psgn      ! control of the sign  
     407      INTEGER                                                   ::   ib_bdy    ! BDY boundary set 
     408      !! 
     409      CALL lbc_lnk_2d( pt2d, cd_type, psgn) 
     410 
     411   END SUBROUTINE lbc_bdy_lnk_2d 
     412 
     413 
    408414   SUBROUTINE lbc_lnk_2d_e( pt2d, cd_type, psgn, jpri, jprj ) 
    409415      !!--------------------------------------------------------------------- 
     
    430436   END SUBROUTINE lbc_lnk_2d_e 
    431437 
    432 # endif 
    433438#endif 
    434439 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/LBC/lib_mpp.F90

    r3918 r4219  
    21792179!!gm Remark : this is very time consumming!!! 
    21802180      !                                         ! ------------------------ ! 
    2181             IF( ijpt0 > ijpt1 .OR. iipt0 > iipt1 ) THEN 
     2181        IF(((nbondi .ne. 0) .AND. (ktype .eq. 2)) .OR. ((nbondj .ne. 0) .AND. (ktype .eq. 1))) THEN 
    21822182            ! there is nothing to be migrated 
    2183                lmigr = .FALSE. 
     2183              lmigr = .TRUE. 
    21842184            ELSE 
    2185               lmigr = .TRUE. 
     2185              lmigr = .FALSE. 
    21862186            ENDIF 
    21872187 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/LBC/mppini_2.h90

    r3818 r4219  
    122122      irestj = 1 + MOD( jpjglo - nrecj -1 , jpnj ) 
    123123 
     124#if defined key_nemocice_decomp 
     125      ! Change padding to be consistent with CICE 
     126      ilci(1:jpni-1      ,:) = jpi 
     127      ilci(jpni          ,:) = jpiglo - (jpni - 1) * (jpi - nreci) 
     128 
     129      ilcj(:,      1:jpnj-1) = jpj 
     130      ilcj(:,          jpnj) = jpjglo - (jpnj - 1) * (jpj - nrecj) 
     131#else 
    124132      ilci(1:iresti      ,:) = jpi 
    125133      ilci(iresti+1:jpni ,:) = jpi-1 
     
    127135      ilcj(:,      1:irestj) = jpj 
    128136      ilcj(:, irestj+1:jpnj) = jpj-1 
     137#endif 
    129138 
    130139      IF(lwp) WRITE(numout,*) 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/SBC/geo2ocean.F90

    r2715 r4219  
    187187         &      gsinf(jpi,jpj), gcosf(jpi,jpj), STAT=ierr ) 
    188188      IF(lk_mpp)   CALL mpp_sum( ierr ) 
    189       IF( ierr /= 0 )   CALL ctl_stop('STOP', 'angle_msh_geo: unable to allocate arrays' ) 
     189      IF( ierr /= 0 )   CALL ctl_stop('angle: unable to allocate arrays' ) 
    190190 
    191191      ! ============================= ! 
     
    361361            &      gsinlat(jpi,jpj,4) , gcoslat(jpi,jpj,4) , STAT=ierr ) 
    362362         IF( lk_mpp    )   CALL mpp_sum( ierr ) 
    363          IF( ierr /= 0 )   CALL ctl_stop('STOP', 'angle_msh_geo: unable to allocate arrays' ) 
     363         IF( ierr /= 0 )   CALL ctl_stop('geo2oce: unable to allocate arrays' ) 
    364364      ENDIF 
    365365 
     
    438438      !!---------------------------------------------------------------------- 
    439439 
    440       IF( ALLOCATED( gsinlon ) ) THEN 
     440      IF( .NOT. ALLOCATED( gsinlon ) ) THEN 
    441441         ALLOCATE( gsinlon(jpi,jpj,4) , gcoslon(jpi,jpj,4) ,   & 
    442442            &      gsinlat(jpi,jpj,4) , gcoslat(jpi,jpj,4) , STAT=ierr ) 
    443443         IF( lk_mpp    )   CALL mpp_sum( ierr ) 
    444          IF( ierr /= 0 )   CALL ctl_stop('STOP', 'angle_msh_geo: unable to allocate arrays' ) 
     444         IF( ierr /= 0 )   CALL ctl_stop('oce2geo: unable to allocate arrays' ) 
    445445      ENDIF 
    446446 
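
The oce2geo fix above reverses an inverted allocation test: with IF( ALLOCATED(...) ) the work arrays would never be allocated on first use. A minimal sketch of the corrected first-call guard, with hypothetical local names (not NEMO code):

   PROGRAM alloc_guard_sketch
      IMPLICIT NONE
      REAL, ALLOCATABLE ::   gsinlon_w(:,:)
      INTEGER           ::   ierr
      IF( .NOT. ALLOCATED( gsinlon_w ) ) THEN      ! allocate on the first call only
         ALLOCATE( gsinlon_w(10,10), STAT=ierr )
         IF( ierr /= 0 ) STOP 'unable to allocate arrays'
      ENDIF
      PRINT *, 'allocated: ', ALLOCATED( gsinlon_w )
   END PROGRAM alloc_guard_sketch
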
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/SBC/sbccpl.F90

    r3914 r4219  
    388388      ! 
    389389      IF( TRIM( sn_rcv_tau%cldes ) /= 'oce and ice' ) THEN        ! 'oce and ice' case ocean stress on ocean mesh used 
    390          srcv(jpr_itz1:jpr_itz2)%laction = .FALSE.    ! ice components not received (itx1 and ity1 used later) 
     390         srcv(jpr_itx1:jpr_itz2)%laction = .FALSE.    ! ice components not received 
    391391         srcv(jpr_itx1)%clgrid = 'U'                  ! ocean stress used after its transformation 
    392392         srcv(jpr_ity1)%clgrid = 'V'                  ! i.e. it is always at U- & V-points for i- & j-comp. resp. 
     
    407407      SELECT CASE( TRIM( sn_rcv_emp%cldes ) ) 
    408408      CASE( 'oce only'      )   ;   srcv(                                 jpr_oemp   )%laction = .TRUE.  
    409       CASE( 'conservative'  )   ;   srcv( (/jpr_rain, jpr_snow, jpr_ievp, jpr_tevp/) )%laction = .TRUE. 
     409      CASE( 'conservative'  ) 
     410         srcv( (/jpr_rain, jpr_snow, jpr_ievp, jpr_tevp/) )%laction = .TRUE. 
      411         IF ( k_ice <= 1 )  srcv(jpr_ievp)%laction = .FALSE. 
    410412      CASE( 'oce and ice'   )   ;   srcv( (/jpr_ievp, jpr_sbpr, jpr_semp, jpr_oemp/) )%laction = .TRUE. 
    411413      CASE default              ;   CALL ctl_stop( 'sbc_cpl_init: wrong definition of sn_rcv_emp%cldes' ) 
     
    465467         CALL ctl_stop( 'sbc_cpl_init: namsbc_cpl namelist mismatch between sn_rcv_qns%cldes and sn_rcv_dqnsdt%cldes' ) 
    466468      !                                                      ! ------------------------- ! 
    467       !                                                      !    Ice Qsr penetration    !    
    468       !                                                      ! ------------------------- ! 
    469       ! fraction of net shortwave radiation which is not absorbed in the thin surface layer  
    470       ! and penetrates inside the ice cover ( Maykut and Untersteiner, 1971 ; Elbert anbd Curry, 1993 ) 
    471       ! Coupled case: since cloud cover is not received from atmosphere  
    472       !               ===> defined as constant value -> definition done in sbc_cpl_init 
    473       fr1_i0(:,:) = 0.18 
    474       fr2_i0(:,:) = 0.82 
    475       !                                                      ! ------------------------- ! 
    476469      !                                                      !      10m wind module      !    
    477470      !                                                      ! ------------------------- ! 
     
    508501      ! Allocate taum part of frcv which is used even when not received as coupling field 
    509502      IF ( .NOT. srcv(jpr_taum)%laction ) ALLOCATE( frcv(jpr_taum)%z3(jpi,jpj,srcv(jn)%nct) ) 
     503      ! Allocate itx1 and ity1 as they are used in sbc_cpl_ice_tau even if srcv(jpr_itx1)%laction = .FALSE. 
     504      IF( k_ice /= 0 ) THEN 
     505         IF ( .NOT. srcv(jpr_itx1)%laction ) ALLOCATE( frcv(jpr_itx1)%z3(jpi,jpj,srcv(jn)%nct) ) 
     506         IF ( .NOT. srcv(jpr_ity1)%laction ) ALLOCATE( frcv(jpr_ity1)%z3(jpi,jpj,srcv(jn)%nct) ) 
     507      END IF 
    510508 
    511509      ! ================================ ! 
     
    13291327      END SELECT 
    13301328 
      1329      !    Ice Qsr penetration used (only?) in lim2 or lim3  
      1330      ! fraction of net shortwave radiation which is not absorbed in the thin surface layer  
      1331      ! and penetrates inside the ice cover ( Maykut and Untersteiner, 1971 ; Ebert and Curry, 1993 ) 
     1332      ! Coupled case: since cloud cover is not received from atmosphere  
     1333      !               ===> defined as constant value -> definition done in sbc_cpl_init 
     1334      fr1_i0(:,:) = 0.18 
     1335      fr2_i0(:,:) = 0.82 
     1336 
     1337 
    13311338      CALL wrk_dealloc( jpi,jpj, zcptn, ztmp, zicefr ) 
    13321339      ! 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/SBC/sbcmod.F90

    r3905 r4219  
    221221      ENDIF 
    222222      ! 
     223                          CALL sbc_ssm_init               ! Sea-surface mean fields initialisation 
     224      ! 
    223225      IF( ln_ssr      )   CALL sbc_ssr_init               ! Sea-Surface Restoring initialisation 
    224226      ! 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/SOL/solmat.F90

    r3609 r4219  
    3030   USE lbclnk          ! lateral boudary conditions 
    3131   USE lib_mpp         ! distributed memory computing 
     32   USE c1d               ! 1D vertical configuration 
    3233   USE in_out_manager  ! I/O manager 
    3334   USE timing          ! timing 
     
    271272       
    272273      ! SOR and PCG solvers 
     274      IF( lk_c1d ) CALL lbc_lnk( gcdmat, 'T', 1._wp ) ! 1D case bmask =/0  but gcdmat not define everywhere  
    273275      DO jj = 1, jpj 
    274276         DO ji = 1, jpi 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/TRA/eosbn2.F90

    r3625 r4219  
    675675 
    676676 
    677    FUNCTION tfreez( psal ) RESULT( ptf ) 
     677   FUNCTION tfreez( psal, pdep ) RESULT( ptf ) 
    678678      !!---------------------------------------------------------------------- 
    679679      !!                 ***  ROUTINE eos_init  *** 
     
    688688      !!---------------------------------------------------------------------- 
    689689      REAL(wp), DIMENSION(jpi,jpj), INTENT(in   ) ::   psal   ! salinity             [psu] 
     690      REAL(wp), DIMENSION(jpi,jpj), INTENT(in   ), OPTIONAL ::   pdep   ! depth      [decibars] 
    690691      ! Leave result array automatic rather than making explicitly allocated 
    691692      REAL(wp), DIMENSION(jpi,jpj)                ::   ptf    ! freezing temperature [Celcius] 
     
    694695      ptf(:,:) = ( - 0.0575_wp + 1.710523e-3_wp * SQRT( psal(:,:) )   & 
    695696         &                     - 2.154996e-4_wp *       psal(:,:)   ) * psal(:,:) 
     697      IF ( PRESENT( pdep ) ) THEN    
     698         ptf(:,:) = ptf(:,:) - 7.53e-4_wp * pdep(:,:) 
     699      ENDIF 
    696700      ! 
    697701   END FUNCTION tfreez 
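
The tfreez change above adds an OPTIONAL depth argument and, when present, a linear pressure correction of -7.53e-4 degrees per decibar. A scalar, self-contained sketch of the same interface (not NEMO code; the salinity polynomial is the one already used in the routine above):

   MODULE tfreez_sketch_mod
      IMPLICIT NONE
   CONTAINS
      REAL FUNCTION tfreez_sketch( psal, pdep )
         REAL, INTENT(in)           ::   psal   ! salinity [psu]
         REAL, INTENT(in), OPTIONAL ::   pdep   ! depth    [decibars]
         tfreez_sketch = ( - 0.0575 + 1.710523e-3 * SQRT( psal )   &
            &                       - 2.154996e-4 *       psal   ) * psal
         IF( PRESENT( pdep ) )   tfreez_sketch = tfreez_sketch - 7.53e-4 * pdep
      END FUNCTION tfreez_sketch
   END MODULE tfreez_sketch_mod

   PROGRAM demo_tfreez
      USE tfreez_sketch_mod
      IMPLICIT NONE
      PRINT *, 'freezing point at surface : ', tfreez_sketch( 34.5 )
      PRINT *, 'freezing point at 1000 db : ', tfreez_sketch( 34.5, 1000. )
   END PROGRAM demo_tfreez
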
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/OPA_SRC/step.F90

    r3769 r4219  
    271271      ! 
    272272#if defined key_iomput 
    273       IF( kstp == nitend   )   CALL xios_context_finalize() ! needed for XIOS+AGRIF 
     273      IF( kstp == nitend .OR. indic < 0 )   CALL xios_context_finalize() ! needed for XIOS+AGRIF 
    274274#endif 
    275275      ! 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/SAS_SRC/daymod.F90

    r3851 r4219  
    246246               nday_year = 1 
    247247               nsec_year = ndt05 
     248               IF( nsec1jan000 >= 2 * (2**30 - nsecd * nyear_len(1) / 2 ) ) THEN   ! test integer 4 max value 
     249                  CALL ctl_stop( 'The number of seconds between Jan. 1st 00h of nit000 year and Jan. 1st 00h ',   & 
     250                     &           'of the current year is exceeding the INTEGER 4 max VALUE: 2^31-1 -> 68.09 years in seconds', & 
     251                     & 'You must do a restart at higher frequency (or remove this STOP and recompile everything in I8)' ) 
     252               ENDIF 
    248253               nsec1jan000 = nsec1jan000 + nsecd * nyear_len(1) 
    249254               IF( nleapy == 1 )   CALL day_mth 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/TOP_SRC/PISCES/P4Z/p4zfechem.F90

    r3904 r4219  
    129129                  zoxy   = trn(ji,jj,jk,jpoxy) * ( rhop(ji,jj,jk) / 1.e3 ) 
    130130                  ! Fe2+ oxydation rate from Santana-Casiano et al. (2005) 
    131                   zkox   = 35.407 - 6.7109 * zph + 0.5342 * zph * zph - 5362.6 / ( tsn(ji,jj,1,jp_tem) + 273.15 )  & 
     131                  zkox   = 35.407 - 6.7109 * zph + 0.5342 * zph * zph - 5362.6 / ( tsn(ji,jj,jk,jp_tem) + 273.15 )  & 
    132132                    &    - 0.04406 * SQRT( tsn(ji,jj,jk,jp_sal) ) - 0.002847 * tsn(ji,jj,jk,jp_sal) 
    133133                  zkox   = ( 10.** zkox ) * spd 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/TOP_SRC/PISCES/P4Z/p4zsed.F90

    r3905 r4219  
    8282      IF( nn_timing == 1 )  CALL timing_start('p4z_sed') 
    8383      ! 
    84       IF( kt == nit000 .AND. jnt == 1 )  THEN 
     84      IF( kt == nittrc000 .AND. jnt == 1 )  THEN 
    8585         ryyss    = nyear_len(1) * rday    ! number of seconds per year and per month 
    8686         rmtss    = ryyss / raamo 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/TOP_SRC/PISCES/P4Z/p4zsms.F90

    r3882 r4219  
    7676      ENDIF 
    7777      ! 
    78       IF( ln_rsttr .AND. kt == nittrc000 )                         CALL p4z_rst( nittrc000, 'READ' )  !* read or initialize all required fields  
     78      IF( kt == nittrc000 ) THEN 
     79        ! 
     80        CALL p4z_che                              ! initialize the chemical constants 
     81        ! 
     82        IF( .NOT. ln_rsttr ) THEN  ;   CALL p4z_ph_ini   !  set PH at kt=nit000  
     83        ELSE                       ;   CALL p4z_rst( nittrc000, 'READ' )  !* read or initialize all required fields  
     84        ENDIF 
     85        ! 
     86      ENDIF 
     87 
    7988      IF( ln_pisdmp .AND. MOD( kt - nn_dttrc, nn_pisdmp ) == 0 )   CALL p4z_dmp( kt )      ! Relaxation of some tracers 
    8089      ! 
     
    238247   END SUBROUTINE p4z_sms_init 
    239248 
     249   SUBROUTINE p4z_ph_ini 
     250      !!--------------------------------------------------------------------- 
      251      !!                   ***  ROUTINE p4z_ph_ini  *** 
     252      !! 
     253      !!  ** Purpose : Initialization of chemical variables of the carbon cycle 
     254      !!--------------------------------------------------------------------- 
     255      INTEGER  ::  ji, jj, jk 
     256      REAL(wp) ::  zcaralk, zbicarb, zco3 
     257      REAL(wp) ::  ztmas, ztmas1 
     258      !!--------------------------------------------------------------------- 
     259 
     260      ! Set PH from  total alkalinity, borat (???), akb3 (???) and ak23 (???) 
     261      ! -------------------------------------------------------- 
     262      DO jk = 1, jpk 
     263         DO jj = 1, jpj 
     264            DO ji = 1, jpi 
     265               ztmas   = tmask(ji,jj,jk) 
     266               ztmas1  = 1. - tmask(ji,jj,jk) 
     267               zcaralk = trn(ji,jj,jk,jptal) - borat(ji,jj,jk) / (  1. + 1.E-8 / ( rtrn + akb3(ji,jj,jk) )  ) 
     268               zco3    = ( zcaralk - trn(ji,jj,jk,jpdic) ) * ztmas + 0.5e-3 * ztmas1 
     269               zbicarb = ( 2. * trn(ji,jj,jk,jpdic) - zcaralk ) 
     270               hi(ji,jj,jk) = ( ak23(ji,jj,jk) * zbicarb / zco3 ) * ztmas + 1.e-9 * ztmas1 
     271            END DO 
     272         END DO 
     273     END DO 
     274     ! 
     275   END SUBROUTINE p4z_ph_ini 
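In equation form, the first-guess pH set by p4z_ph_ini amounts to the following (a restatement of the loop above using the code's own variable names; the 1.e-8 acts as an assumed hydrogen-ion concentration in the borate term, and masked points fall back to the 1.e-9 default):

    zcaralk = trn(jptal) - borat / ( 1 + 1.e-8 / akb3 )      (carbonate alkalinity)
    zco3    = zcaralk - trn(jpdic)                           (carbonate ion)
    zbicarb = 2 * trn(jpdic) - zcaralk                       (bicarbonate ion)
    hi      = ak23 * zbicarb / zco3                          ([H+] via the second dissociation constant)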
     276 
    240277   SUBROUTINE p4z_rst( kt, cdrw ) 
    241278      !!--------------------------------------------------------------------- 
     
    266303         ELSE 
    267304!            hi(:,:,:) = 1.e-9  
    268             ! Set PH from  total alkalinity, borat (???), akb3 (???) and ak23 (???) 
    269             ! -------------------------------------------------------- 
    270             DO jk = 1, jpk 
    271                DO jj = 1, jpj 
    272                   DO ji = 1, jpi 
    273                      ztmas   = tmask(ji,jj,jk) 
    274                      ztmas1  = 1. - tmask(ji,jj,jk) 
    275                      zcaralk = trn(ji,jj,jk,jptal) - borat(ji,jj,jk) / (  1. + 1.E-8 / ( rtrn + akb3(ji,jj,jk) )  ) 
    276                      zco3    = ( zcaralk - trn(ji,jj,jk,jpdic) ) * ztmas + 0.5e-3 * ztmas1 
    277                      zbicarb = ( 2. * trn(ji,jj,jk,jpdic) - zcaralk ) 
    278                      hi(ji,jj,jk) = ( ak23(ji,jj,jk) * zbicarb / zco3 ) * ztmas + 1.e-9 * ztmas1 
    279                   END DO 
    280                END DO 
    281             END DO 
     305            CALL p4z_ph_ini 
    282306         ENDIF 
    283307         CALL iom_get( numrtr, jpdom_autoglo, 'Silicalim', xksi(:,:) ) 
     
    392416#endif 
    393417            &                    + trn(:,:,:,jpsfe)                     & 
    394             &                    + trn(:,:,:,jpzoo)                     & 
     418            &                    + trn(:,:,:,jpzoo) * ferat3            & 
    395419            &                    + trn(:,:,:,jpmes) * ferat3            ) * cvol(:,:,:)  ) 
    396420 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/NEMO/TOP_SRC/PISCES/trcini_pisces.F90

    r3757 r4219  
    122122      rdenita =   3._wp /  5._wp 
    123123      o2ut    = 131._wp / 122._wp 
    124  
    125       CALL p4z_che        ! initialize the chemical constants 
    126124 
    127125      ! Initialization of tracer concentration in case of  no restart  
     
    162160         xksi(:,:)    = 2.e-6 
    163161         xksimax(:,:) = xksi(:,:) 
    164  
    165          ! Initialization of chemical variables of the carbon cycle 
    166          ! -------------------------------------------------------- 
    167          DO jk = 1, jpk 
    168             DO jj = 1, jpj 
    169                DO ji = 1, jpi 
    170                   ztmas   = tmask(ji,jj,jk) 
    171                   ztmas1  = 1. - tmask(ji,jj,jk) 
    172                   zcaralk = trn(ji,jj,jk,jptal) - borat(ji,jj,jk) / (  1. + 1.E-8 / ( rtrn + akb3(ji,jj,jk) )  ) 
    173                   zco3    = ( zcaralk - trn(ji,jj,jk,jpdic) ) * ztmas + 0.5e-3 * ztmas1 
    174                   zbicarb = ( 2. * trn(ji,jj,jk,jpdic) - zcaralk ) 
    175                   hi(ji,jj,jk) = ( ak23(ji,jj,jk) * zbicarb / zco3 ) * ztmas + 1.e-9 * ztmas1 
    176                END DO 
    177             END DO 
    178          END DO 
    179          ! 
     162        ! 
    180163      END IF 
    181164 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/SETTE/all_functions.sh

    r3294 r4219  
    9696   if [ ${#} -lt ${minargcount} ] 
    9797   then 
    98       echo "not enought arguments for set_namelist" 
     98      echo "not enough arguments for set_namelist" 
    9999      echo "${usage}" 
    100100      exit 1 
     
    113113      echo "doing \"set_namelist $@\". " 
    114114      echo "variable: \"$2\" is empty" 
    115       echo "control that variable $2 is in \"${EXE_DIR}/$1\" " 
     115      echo "confirm that variable $2 is in \"${EXE_DIR}/$1\" " 
    116116      echo "exit" 
    117117      echo "error in executing script : set_namelist $@" >> ${SETTE_DIR}/output.sette 
     
    128128        echo "                " >> ${SETTE_DIR}/output.sette 
    129129} 
     130 
    130131 
    131132# function to tidy up after each test and populate the NEMO_VALIDATION store 
     
    216217    fi 
    217218} 
     219 
     220############################################################# 
     221# extra functions to manipulate settings in the iodef.xml file 
     222# 
     223# Examples: 
     224#   set_xio_file_type    iodef.xml one_file 
     225#   set_xio_using_server iodef.xml true 
     226#   set_xio_buffer_size  iodef.xml 50000000 
     227# 
     228############################################################# 
     229 
     230usage2=" Usage : set_xio_file_type input_iodef.xml one_file||multiple_file" 
     231usage3=" Usage : set_xio_using_server input_iodef.xml true||false" 
     232usage4=" Usage : set_xio_buffer_size input_iodef.xml int_buffer_size" 
     233 
     234set_xio_file_type () { 
     235        minargcount=2 
     236        if [ ${#} -lt ${minargcount} ] 
     237        then 
     238                echo "not enough arguments for set_xio_file_type" 
     239                echo "${usage2}" 
     240                exit 1 
     241        fi 
     242        if [ $2 != "one_file" ] && [ $2 != "multiple_file" ] 
     243        then 
     244                echo "unrecognised argument for set_xio_file_type" 
     245                echo "${usage2}" 
     246                echo $2 
     247                exit 1 
     248        fi 
     249        unset minargcount 
     250        if [  ! -f ${SETTE_DIR}/output.sette ] ; then 
     251                touch ${SETTE_DIR}/output.sette 
     252        fi 
     253 
     254        echo "executing script : set_xio_file_type $@" >> ${SETTE_DIR}/output.sette 
     255        echo "################" >> ${SETTE_DIR}/output.sette 
     256 
     257        VAR_NAME=$( grep "^.*<.*file_definition.*type.*=" ${EXE_DIR}/$1 | sed -e "s% *\!.*%%" ) 
     258        if [ ${#VAR_NAME} -eq 0 ] 
     259        then 
     260                echo "doing \"set_xio_file_type $@\". " 
     261                echo "xml_tag: file_definition with variable: type is empty" 
     262                echo "confirm that an appropriate file_definition is in \"${EXE_DIR}/$1\" " 
     263                echo "exit" 
     264                echo "error in executing script : set_xio_file_type $@" >> ${SETTE_DIR}/output.sette 
     265                echo "....." >> ${SETTE_DIR}/output.sette 
     266                exit 1 
     267        fi 
     268        if [ $2 == "one_file" ]  
     269        then 
     270           sed -e "s:multiple_file:one_file:" ${EXE_DIR}/$1 > ${EXE_DIR}/$1.tmp 
     271        else 
     272           sed -e "s:one_file:multiple_file:" ${EXE_DIR}/$1 > ${EXE_DIR}/$1.tmp 
     273        fi 
     274        mv ${EXE_DIR}/$1.tmp ${EXE_DIR}/$1 
     275 
     276        echo "finished script : set_xio_file_type $@" >> ${SETTE_DIR}/output.sette 
     277        echo "++++++++++++++++" >> ${SETTE_DIR}/output.sette 
     278        echo "                " >> ${SETTE_DIR}/output.sette 
     279} 
     280 
     281set_xio_using_server () { 
     282        minargcount=2 
     283        if [ ${#} -lt ${minargcount} ] 
     284        then 
     285                echo "not enough arguments for set_xio_using_server" 
     286                echo "${usage2}" 
     287                exit 1 
     288        fi 
     289        if [ $2 != "true" ] && [ $2 != "false" ] 
     290        then 
     291                echo "unrecognised argument for set_xio_using_server" 
     292                echo "${usage2}" 
     293                echo $2 
     294                exit 1 
     295        fi 
     296        unset minargcount 
     297        if [  ! -f ${SETTE_DIR}/output.sette ] ; then 
     298                touch ${SETTE_DIR}/output.sette 
     299        fi 
     300 
     301        echo "executing script : set_xio_using_server $@" >> ${SETTE_DIR}/output.sette 
     302        echo "################" >> ${SETTE_DIR}/output.sette 
     303 
     304        VAR_NAME=$( grep "^.*<.*variable id.*=.*using_server.*=.*boolean" ${EXE_DIR}/$1 | sed -e "s% *\!.*%%" ) 
     305        if [ ${#VAR_NAME} -eq 0 ] 
     306        then 
     307                echo "doing \"set_xio_using_server $@\". " 
     308                echo "xml_tag: "variable id=using_server" with variable: boolean is empty" 
     309                echo "confirm that an appropriate variable id is in \"${EXE_DIR}/$1\" " 
     310                echo "exit" 
     311                echo "error in executing script : set_xio_using_server $@" >> ${SETTE_DIR}/output.sette 
     312                echo "....." >> ${SETTE_DIR}/output.sette 
     313                exit 1 
     314        fi 
     315        if [ $2 == "false" ] 
     316        then 
     317           sed -e "/using_server/s:true:false:" ${EXE_DIR}/$1 > ${EXE_DIR}/$1.tmp 
     318           export USING_MPMD=no 
     319        else 
     320           sed -e "/using_server/s:false:true:" ${EXE_DIR}/$1 > ${EXE_DIR}/$1.tmp 
     321           export USING_MPMD=yes 
     322        fi 
     323        mv ${EXE_DIR}/$1.tmp ${EXE_DIR}/$1 
     324 
     325        echo "finished script : set_xio_using_server $@" >> ${SETTE_DIR}/output.sette 
     326        echo "++++++++++++++++" >> ${SETTE_DIR}/output.sette 
     327        echo "                " >> ${SETTE_DIR}/output.sette 
     328} 
     329 
     330set_xio_buffer_size () { 
     331        minargcount=2 
     332        if [ ${#} -lt ${minargcount} ] 
     333        then 
     334                echo "not enough arguments for set_xio_buffer_size" 
     335                echo "${usage4}" 
     336                exit 1 
     337        fi 
     338        unset minargcount 
     339        if [  ! -f ${SETTE_DIR}/output.sette ] ; then 
     340                touch ${SETTE_DIR}/output.sette 
     341        fi 
     342 
     343        echo "executing script : set_xio_buffer_size $@" >> ${SETTE_DIR}/output.sette 
     344        echo "################" >> ${SETTE_DIR}/output.sette 
     345 
     346        VAR_NAME=$( grep "^.*<.*variable id.*=.*buffer_size.*=.*integer" ${EXE_DIR}/$1 | sed -e "s% *\!.*%%" ) 
     347        if [ ${#VAR_NAME} -eq 0 ] 
     348        then 
     349                echo "doing \"set_xio_buffer_size $@\". " 
     350                echo "xml_tag: "variable id=buffer_size" with variable: integer is empty" 
     351                echo "confirm that an appropriate variable id is in \"${EXE_DIR}/$1\" " 
     352                echo "exit" 
     353                echo "error in executing script : set_xio_buffer_size $@" >> ${SETTE_DIR}/output.sette 
     354                echo "....." >> ${SETTE_DIR}/output.sette 
     355                exit 1 
     356        fi 
     357        sed -e "/buffer_size/s:>.*<:>$2<:" ${EXE_DIR}/$1 > ${EXE_DIR}/$1.tmp 
     358        mv ${EXE_DIR}/$1.tmp ${EXE_DIR}/$1 
     359 
     360        echo "finished script : set_xio_buffer_size $@" >> ${SETTE_DIR}/output.sette 
     361        echo "++++++++++++++++" >> ${SETTE_DIR}/output.sette 
     362        echo "                " >> ${SETTE_DIR}/output.sette 
     363} 
     364 
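For reference, the three helpers added above boil down to in-place sed edits of the iodef.xml copy held in ${EXE_DIR}; a minimal sketch of their effect (illustrative buffer size, stock iodef.xml assumed):

    # set_xio_file_type    iodef.xml one_file
    sed -e "s:multiple_file:one_file:"        iodef.xml > iodef.xml.tmp && mv iodef.xml.tmp iodef.xml
    # set_xio_using_server iodef.xml true     (also exports USING_MPMD=yes)
    sed -e "/using_server/s:false:true:"      iodef.xml > iodef.xml.tmp && mv iodef.xml.tmp iodef.xml
    # set_xio_buffer_size  iodef.xml 50000000
    sed -e "/buffer_size/s:>.*<:>50000000<:"  iodef.xml > iodef.xml.tmp && mv iodef.xml.tmp iodef.xml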
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/SETTE/iodef_sette.xml

    r3764 r4219  
    2121    --> 
    2222     
    23     <file_definition type="multiple_file" sync_freq="1d" min_digits="4"> 
     23    <file_definition type="multiple_file" name="@expname@_@freq@_@startdate@_@enddate@" sync_freq="1d" min_digits="4"> 
    2424     
    2525      <file_group id="1h" output_freq="1h"  output_level="10" enabled=".FALSE."/> <!-- 1h files --> 
     
    5454     
    5555   <axis_definition>   
    56       <axis id="deptht" long_name="Vertical T levels" unit="m"  /><!-- positive=".FALSE." --> 
    57       <axis id="depthu" long_name="Vertical U levels" unit="m"  /><!-- positive=".FALSE." --> 
    58       <axis id="depthv" long_name="Vertical V levels" unit="m"  /><!-- positive=".FALSE." --> 
    59       <axis id="depthw" long_name="Vertical W levels" unit="m"  /><!-- positive=".FALSE." --> 
     56      <axis id="deptht" long_name="Vertical T levels" unit="m" positive="down" /> 
     57      <axis id="depthu" long_name="Vertical U levels" unit="m" positive="down" /> 
     58      <axis id="depthv" long_name="Vertical V levels" unit="m" positive="down" /> 
     59      <axis id="depthw" long_name="Vertical W levels" unit="m" positive="down" /> 
    6060      <axis id="nfloat" long_name="Float number"      unit="-"  /> 
     61      <axis id="icbcla" long_name="Iceberg class"     unit="-"  /> 
    6162   </axis_definition>  
    6263     
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/SETTE/prepare_job.sh

    r3680 r4219  
    6868# 
    6969 
    70 usage=" Usage : ./prepare_job.sh INPUT_FILE_CONFIG_NAME NUMBER_PROC TEST_NAME MPI_FLAG JOB_FILE" 
    71 usage=" example : ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg 8 SHORT no/yes $JOB_FILE" 
    72  
    73  
    74 minargcount=5 
     70usage=" Usage : ./prepare_job.sh INPUT_FILE_CONFIG_NAME NUMBER_PROC TEST_NAME MPI_FLAG JOB_FILE NUM_XIO_SERVERS" 
     71usage=" example : ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg 8 SHORT no/yes $JOB_FILE 0" 
     72 
     73 
     74minargcount=6 
    7575        if [ ${#} -lt ${minargcount} ] 
    7676        then 
     
    9393MPI_FLAG=$4 
    9494JOB_FILE=$5 
     95NXIO_PROC=$6 
    9596 
    9697# export EXE_DIR. This directory is used to execute model  
     
    185186      case ${COMPILER} in  
    186187         ALTIX_NAUTILUS_MPT) 
    187                                 NB_REM=$( echo $NB_PROC | awk '{print $1 % 4}') 
     188                                NB_REM=$( echo $NB_PROC $NXIO_PROC | awk '{print ( $1 + $2 ) % 4}') 
    188189               if [ ${NB_REM} == 0 ] ; then 
    189190               # number of processes required is an integer multiple of 4 
    190191               # 
    191                NB_NODES=$( echo $NB_PROC | awk '{print $1 / 4}') 
     192               NB_NODES=$( echo $NB_PROC $NXIO_PROC | awk '{print ($1 + $2 ) / 4}') 
    192193            else 
    193194               # 
     
    195196               # round up the number of nodes required. 
    196197               # 
    197                NB_NODES=$( echo $NB_PROC | awk '{printf("%d",$1 / 4 + 1 )}') 
     198               NB_NODES=$( echo $NB_PROC $NXIO_PROC | awk '{printf("%d",($1 + $2 ) / 4 + 1 )}') 
    198199                  fi 
    199200            ;; 
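A worked example of the node-count arithmetic above (illustrative process counts; the ALTIX_NAUTILUS_MPT branch packs 4 processes per node and now counts the XIOS servers as well):

    NB_PROC=8 ; NXIO_PROC=2
    echo $NB_PROC $NXIO_PROC | awk '{print ( $1 + $2 ) % 4}'               # 2 -> not a multiple of 4
    echo $NB_PROC $NXIO_PROC | awk '{printf("%d\n", ($1 + $2) / 4 + 1 )}'  # 3 -> nodes, rounded up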
     
    229230# Pass settings into job file by using sed to edit predefined strings 
    230231# 
    231         cat ${SETTE_DIR}/job_batch_template | sed -e"s/NODES/${NB_NODES}/" -e"s/NPROCS/${NB_PROC}/" \ 
     232        cat ${SETTE_DIR}/job_batch_template | sed -e"s/NODES/${NB_NODES}/" \ 
     233             -e"s/NPROCS/${NB_PROC}/" \ 
     234             -e"s/NXIOPROCS/${NXIO_PROC}/" \ 
    232235             -e"s:DEF_SETTE_DIR:${SETTE_DIR}:" -e"s:DEF_INPUT_DIR:${INPUT_DIR}:" \ 
    233236             -e"s:DEF_EXE_DIR:${EXE_DIR}:" \ 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/SETTE/sette.sh

    r3708 r4219  
    22############################################################ 
    33# Author : Simona Flavoni for NEMO 
    4 # Contact : sflod@locean-ipsl.upmc.fr 
     4# Contact: sflod@locean-ipsl.upmc.fr 
     5# 2013   : A.C. Coward added options for testing with XIOS in dettached mode 
    56# 
    67# sette.sh   : principal script of SET TEsts for NEMO (SETTE) 
     
    1516#set -u 
    1617#set -e 
    17 #+ 
    18 # 
    19 # ================ 
    20 # sette.sh 
    21 # ================ 
    22 # 
    23 # ---------------------------------------------- 
    24 # Set of tests for NEMO 
    25 # ---------------------------------------------- 
    26 # 
    27 # SYNOPSIS 
    28 # ======== 
    29 # 
    30 # :: 
    31 # 
    32 #  $ ./sette.sh 
    33 # 
     18# =========== 
    3419# DESCRIPTION 
    3520# =========== 
     
    3722# Variables to be checked by user: 
    3823# 
    39 # COMPILER : name of compiler as defined in NEMOGCM/ARCH directory  
    40 # 
    41 # BATCH_COMMAND :  name of the command for batch submission 
    42 # 
    43 # INTERACT_FLAG :  flag to run in interactive mode "yes" 
    44 #                       to run in batch mode "no" 
    45 # 
    46 # MPIRUN_FLAG   :  flag to run in parallel (MPI) "yes" 
    47 #                       to run in sequential mode (NB_PROC = 1) "no" 
     24# COMPILER          : name of compiler as defined in NEMOGCM/ARCH directory  
     25# BATCH_COMMAND_PAR :  name of the command for submitting parallel batch jobs 
     26# BATCH_COMMAND_SEQ :  name of the command for submitting sequential batch jobs   
     27# INTERACT_FLAG     : flag to run in interactive mode "yes" 
     28#                           to run in batch mode "no" 
     29# MPIRUN_FLAG       : flag to run in parallel (MPI) "yes" 
     30#                           to run in sequential mode (NB_PROC = 1) "no" 
     31# USING_XIOS        : flag to control the activation of key_iomput 
     32#                      "yes" to compile using key_iomput and link to the external XIOS library 
     33#                      "no"  to compile without key_iomput and link to the old IOIPSL library 
     34# USING_MPMD        : flag to control the use of stand-alone IO servers 
     35#                     requires USING_XIOS="yes" 
     36#                      "yes" to run in MPMD (detached) mode with stand-alone IO servers 
     37#                      "no"  to run in SPMD (attached) mode without separate IO servers  
     38# NUM_XIOSERVERS    : number of stand-alone IO servers to employ 
     39#                     set to zero if USING_MPMD="no" 
    4840# 
    4941# Principal script is sette.sh, that calls  
    5042# 
    51 #  makenemo  
    52 # 
    53 #   creates the exectuable in ${CONFIG_NAME}/BLD/bin/nemo.exe  (and its link opa in ${CONFIG_NAME}/EXP00) 
      43#  makenemo  : to create successive executables in ${CONFIG_NAME}/BLD/bin/nemo.exe  
      44#              and links to opa in ${CONFIG_NAME}/EXP00 
    5445# 
    5546#  param.cfg : sets and loads following directories: 
    5647# 
    57 #   FORCING_DIR : is the directory for forcing files (tarfile) 
    58 # 
    59 #   INPUT_DIR : is the directory for input files storing  
    60 # 
    61 #   TMPDIR : is the temporary directory (if needed) 
     48#   FORCING_DIR         : is the directory for forcing files (tarfile) 
     49#   INPUT_DIR           : is the directory for input files storing  
     50#   TMPDIR              : is the temporary directory (if needed) 
     51#   NEMO_VALIDATION_DIR : is the validation directory 
     52# 
      53#   (NOTE: this file is the same for all configurations to be tested with sette) 
     54# 
     55#   all_functions.sh : loads functions used by sette (note: new functions can be added here) 
     56#   set_namelist     : function declared in all_functions that sets namelist parameters  
     57#   post_test_tidyup : creates validation storage directory and copies required output files  
     58#                      (solver.stat and ocean.output) in it after execution of test. 
     59# 
     60#  VALIDATION tree is: 
     61# 
     62#   NEMO_VALIDATION_DIR/WCONFIG_NAME/WCOMPILER_NAME/TEST_NAME/REVISION_NUMBER(or DATE) 
     63# 
     64#  prepare_exe_dir.sh : defines and creates directory where the test is executed 
     65#                       execution directory takes name of TEST_NAME defined for every test  
      66#                       in sette.sh. (each test is executed in its own directory) 
     67# 
     68#  prepare_job.sh     : to generate the script run_job.sh 
     69# 
     70#  fcm_job.sh         : run in batch (INTERACT_FLAG="no") or interactive (INTERACT_FLAG="yes") 
     71#                        see sette.sh and BATCH_TEMPLATE directory 
     72# 
      73#  NOTE: jobs requiring initial or forcing data need an input_CONFIG.cfg in which 
      74#        the paths to the input tar file can be found 
      75#  NOTE: if the job is not launched for any reason, the executable is ready in the 
      76#        ${EXE_DIR} directory 
      77#  NOTE: the changed namelists are left in the ${EXE_DIR} directory whereas the original 
      78#        namelists remain in ${NEW_CONF}/EXP00 
    6279#  
    63 #   NEMO_VALIDATION_DIR : is the validation directory 
    64 # 
    65 #   (NOTE: this file is the same for all configrations to be tested with sette) 
    66 # 
    67 # 
    68 #  all_functions.sh : loads functions used by sette (note: new functions can be added here) 
    69 # 
    70 #   set_namelist : function declared in all_functions that set namelist parameters for tests 
    71 # 
    72 #   post_test_tidyup : creates validation storage directory and copy needed output files (solver.stat and ocean.output) in it after execution of test. 
    73 # 
    74 #   Tree of VALIDATION is: 
    75 # 
    76 #   NEMO_VALIDATION_DIR/WCONFIG_NAME/WCOMPILER_NAME/TEST_NAME/REVISION_NUMBER(or DATE) 
    77 # 
    78 # 
    79 #  prepare_exe_dir.sh : defines and creates directory where the test is executed 
    80 # 
    81 #       execution directory takes name of TEST_NAME defined in every test in sette.sh 
    82 # 
    83 #       ( each test in executed in its own directory ) 
    84 # 
    85 # 
    86 #  prepare_job.sh 
    87 # 
    88 #  to generate the script run_job.sh 
    89 # 
    90 #  fcm_job.sh  
    91 # 
    92 #   run in batch (INTERACT_FLAG="no") or interactive (INTERACT_FLAG="yes") see sette.sh and BATCH_TEMPLATE directory 
    93 # 
    94 #   (note this job needs to have an input_CONFIG.cfg in which can be found input tar file) 
    95 #  
    96 #  NOTE: if job is not launched for some problems you have executable ready in ${EXE_DIR} directory 
    97 # 
    98 #  NOTE: the changed namelists are leaved in ${EXE_DIR} directory whereas original namelist remains in ${NEW_CONF}/EXP00 
    99 #  
    100 #  in ${SETTE_DIR} is created output.sette with the echo of executed commands 
    101 # 
    102 #  if sette.sh is stopped in output.sette there is written the last command executed by sette.sh 
    103 # 
    104 #  if you run: ./sette.sh 2>&1 | tee out.sette 
    105 # 
    106 #  in ${SETTE_DIR} out.sette is redirected standard error & standard output 
    107 # 
    108 # 
    109 # EXAMPLES 
    110 # ======== 
    111 # 
    112 # :: 
    113 # 
    114 #  $ ./sette.sh  
    115 # 
    116 # 
    117 # TODO 
    118 # ==== 
    119 # 
    120 # option debug 
    121 # 
    122 # EVOLUTIONS 
    123 # ========== 
    124 # 
    125 # $Id$ 
    126 # 
    127 #   * creation 
    128 # 
    129 #- 
    130 # 
    131 #- 
     80#  NOTE: a log file, output.sette, is created in ${SETTE_DIR} with the echoes of  
     81#        executed commands 
     82# 
      83#  NOTE: if sette.sh is stopped, the last command executed by sette.sh is 
      84#        written in output.sette 
     85# 
     86# example use: ./sette.sh  
     87######################################################################################### 
     88# 
    13289# Compiler among those in NEMOGCM/ARCH 
    13390COMPILER=PW6_VARGAS 
     
    13693export INTERACT_FLAG="no" 
    13794export MPIRUN_FLAG="yes" 
    138 # IF YOU DON'T WANT TO USE XIOS : (this is a list of keys to be delete) 
    139 export KEY_XIOS="key_iomput" 
    140 # IF YOU WANT TO USE XIOS : 
    141 #export KEY_XIOS="" 
    142  
     95export USING_XIOS="yes" 
     96# 
     97export DEL_KEYS="key_iomput" 
     98if [ ${USING_XIOS} == "yes" ]  
     99 then  
     100   export DEL_KEYS="" 
     101fi 
     102# 
     103# Settings which control the use of stand alone servers (only relevant if using xios) 
     104# 
     105export USING_MPMD="no" 
     106export NUM_XIOSERVERS=4 
     107export JOB_PREFIX=batch-mpmd 
     108# 
     109if [ ${USING_MPMD} == "no" ]  
     110 then 
     111   export NUM_XIOSERVERS=0 
     112   export JOB_PREFIX=batch 
     113fi 
     114# 
     115# 
      116if [ ${USING_MPMD} == "yes" ] && [ ${USING_XIOS} == "no" ] 
     117 then 
     118   echo "Incompatible choices. MPMD mode requires the XIOS server" 
     119   exit 
     120fi 
     121# 
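For example, to exercise the detached (MPMD) IO-server path instead of the defaults above, the same logic would be driven by settings along these lines (a sketch; the server count is illustrative):

    export USING_XIOS="yes"        # keep key_iomput (DEL_KEYS stays empty)
    export USING_MPMD="yes"        # run stand-alone XIOS servers
    export NUM_XIOSERVERS=4        # passed through to prepare_job.sh
    export JOB_PREFIX=batch-mpmd   # selects BATCH_TEMPLATE/batch-mpmd-${COMPILER}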
    143122 
    144123# Directory to run the tests 
     
    152131# Copy job_batch_COMPILER file for specific compiler into job_batch_template 
    153132cd ${SETTE_DIR} 
    154 cp BATCH_TEMPLATE/batch-${COMPILER} job_batch_template || exit 
     133cp BATCH_TEMPLATE/${JOB_PREFIX}-${COMPILER} job_batch_template || exit 
    155134 
    156135for config in 1 2 3 4 5 6 7 8 9 
     
    162141    export TEST_NAME="LONG" 
    163142    cd ${SETTE_DIR} 
    164     . ../CONFIG/makenemo -m ${CMP_NAM} -n GYRE_LONG -r GYRE -j 8 add_key "key_mpp_mpi" del_key ${KEY_XIOS} 
     143    . ../CONFIG/makenemo -m ${CMP_NAM} -n GYRE_LONG -r GYRE -j 8 add_key "key_mpp_mpi" del_key ${DEL_KEYS} 
    165144    cd ${SETTE_DIR} 
    166145    . param.cfg 
     
    169148    JOB_FILE=${EXE_DIR}/run_job.sh 
    170149    NPROC=4 
    171     \rm ${JOB_FILE} 
     150    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    172151    cd ${EXE_DIR} 
    173152    set_namelist namelist cn_exp \"GYRE_LONG\" 
     
    180159    set_namelist namelist jpnj 2 
    181160    set_namelist namelist jpnij 4 
    182     cd ${SETTE_DIR} 
    183     . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     161    if [ ${USING_MPMD} == "yes" ] ; then 
     162       set_xio_using_server iodef.xml true 
     163    else 
     164       set_xio_using_server iodef.xml false 
     165    fi 
     166    cd ${SETTE_DIR} 
     167    . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    184168 
    185169    cd ${SETTE_DIR} 
     
    199183    set_namelist namelist jpnij 4 
    200184    set_namelist namelist cn_ocerst_in \"GYRE_LONG_00000060_restart\" 
     185    if [ ${USING_MPMD} == "yes" ] ; then 
     186       set_xio_using_server iodef.xml true 
     187    else 
     188       set_xio_using_server iodef.xml false 
     189    fi 
    201190    for (( i=1; i<=$NPROC; i++)) ; do 
    202191        L_NPROC=$(( $i - 1 )) 
     
    204193        ln -sf ../LONG/GYRE_LONG_00000060_restart_${L_NPROC}.nc . 
    205194    done 
    206     cd ${SETTE_DIR} 
    207     . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     195    if [ ${USING_MPMD} == "yes" ] ; then 
     196       set_xio_using_server iodef.xml true 
     197    else 
     198       set_xio_using_server iodef.xml false 
     199    fi 
     200    cd ${SETTE_DIR} 
     201    . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    208202    cd ${SETTE_DIR} 
    209203    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    214208    export TEST_NAME="REPRO_1_4" 
    215209    cd ${SETTE_DIR} 
    216     . ../CONFIG/makenemo -m ${CMP_NAM} -n GYRE_4 -r GYRE -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${KEY_XIOS} 
     210    . ../CONFIG/makenemo -m ${CMP_NAM} -n GYRE_4 -r GYRE -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${DEL_KEYS} 
    217211    cd ${SETTE_DIR} 
    218212    . param.cfg 
     
    221215    JOB_FILE=${EXE_DIR}/run_job.sh 
    222216    NPROC=4 
    223     \rm ${JOB_FILE} 
     217    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    224218    cd ${EXE_DIR} 
    225219    set_namelist namelist cn_exp \"GYRE_14\" 
     
    234228    set_namelist namelist jpnj 4 
    235229    set_namelist namelist jpnij 4 
    236     cd ${SETTE_DIR} 
    237     . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     230    if [ ${USING_MPMD} == "yes" ] ; then 
     231       set_xio_using_server iodef.xml true 
     232    else 
     233       set_xio_using_server iodef.xml false 
     234    fi 
     235    cd ${SETTE_DIR} 
     236    . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    238237    cd ${SETTE_DIR} 
    239238    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    244243    JOB_FILE=${EXE_DIR}/run_job.sh 
    245244    NPROC=4 
    246     \rm $JOB_FILE 
     245    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    247246    cd ${EXE_DIR} 
    248247    set_namelist namelist cn_exp \"GYRE_22\" 
     
    256255    set_namelist namelist jpnj 2 
    257256    set_namelist namelist jpnij 4 
    258     cd ${SETTE_DIR} 
    259     . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     257    if [ ${USING_MPMD} == "yes" ] ; then 
     258       set_xio_using_server iodef.xml true 
     259    else 
     260       set_xio_using_server iodef.xml false 
     261    fi 
     262    cd ${SETTE_DIR} 
     263    . ./prepare_job.sh input_GYRE.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    260264    cd ${SETTE_DIR} 
    261265    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    268272    export TEST_NAME="LONG" 
    269273    cd ${SETTE_DIR} 
    270     . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2LIMPIS_LONG -r ORCA2_LIM_PISCES -j 8 add_key "key_mpp_mpi" del_key ${KEY_XIOS} 
     274    . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2LIMPIS_LONG -r ORCA2_LIM_PISCES -j 8 add_key "key_mpp_mpi" del_key ${DEL_KEYS} 
    271275    cd ${SETTE_DIR} 
    272276    . param.cfg 
     
    275279    JOB_FILE=${EXE_DIR}/run_job.sh 
    276280    NPROC=4 
    277     \rm ${JOB_FILE} 
     281    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    278282    cd ${EXE_DIR} 
    279283    set_namelist namelist cn_exp \"O2LP_LONG\" 
     
    298302    set_namelist namelist_pisces ln_ironsed .false. 
    299303    set_namelist namelist_pisces ln_hydrofe .false. 
    300     cd ${SETTE_DIR} 
    301     . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     304    if [ ${USING_MPMD} == "yes" ] ; then 
     305       set_xio_using_server iodef.xml true 
     306    else 
     307       set_xio_using_server iodef.xml false 
     308    fi 
     309    cd ${SETTE_DIR} 
     310    . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    302311     
    303312    cd ${SETTE_DIR} 
     
    341350        ln -sf ../LONG/O2LP_LONG_00000075_restart_ice_${L_NPROC}.nc . 
    342351    done 
    343     cd ${SETTE_DIR} 
    344     . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     352    if [ ${USING_MPMD} == "yes" ] ; then 
     353       set_xio_using_server iodef.xml true 
     354    else 
     355       set_xio_using_server iodef.xml false 
     356    fi 
     357    cd ${SETTE_DIR} 
     358    . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    345359    cd ${SETTE_DIR} 
    346360    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    351365    export TEST_NAME="REPRO_4_4" 
    352366    cd ${SETTE_DIR} 
    353     . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2LIMPIS_16 -r ORCA2_LIM_PISCES -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${KEY_XIOS} 
     367    . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2LIMPIS_16 -r ORCA2_LIM_PISCES -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${DEL_KEYS} 
    354368    cd ${SETTE_DIR} 
    355369    . param.cfg 
     
    358372    JOB_FILE=${EXE_DIR}/run_job.sh 
    359373    NPROC=16 
    360     \rm $JOB_FILE 
     374    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    361375    cd ${EXE_DIR} 
    362376    set_namelist namelist nn_it000 1 
     
    383397    # put ln_pisdmp to false : no restoring to global mean value 
    384398    set_namelist namelist_pisces ln_pisdmp .false. 
    385     cd ${SETTE_DIR} 
    386     . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     399    if [ ${USING_MPMD} == "yes" ] ; then 
     400       set_xio_using_server iodef.xml true 
     401    else 
     402       set_xio_using_server iodef.xml false 
     403    fi 
     404    cd ${SETTE_DIR} 
     405    . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    387406    cd ${SETTE_DIR} 
    388407    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    393412    JOB_FILE=${EXE_DIR}/run_job.sh 
    394413    NPROC=16 
    395     \rm $JOB_FILE 
     414    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    396415    cd ${EXE_DIR} 
    397416    set_namelist namelist nn_it000 1 
     
    417436    # put ln_pisdmp to false : no restoring to global mean value 
    418437    set_namelist namelist_pisces ln_pisdmp .false. 
    419     cd ${SETTE_DIR} 
    420     . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     438    if [ ${USING_MPMD} == "yes" ] ; then 
     439       set_xio_using_server iodef.xml true 
     440    else 
     441       set_xio_using_server iodef.xml false 
     442    fi 
     443    cd ${SETTE_DIR} 
     444    . ./prepare_job.sh input_ORCA2_LIM_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    421445    cd ${SETTE_DIR} 
    422446    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    428452    export TEST_NAME="LONG" 
    429453    cd ${SETTE_DIR} 
    430     . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2OFFPIS_LONG -r ORCA2_OFF_PISCES -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${KEY_XIOS} 
     454    . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2OFFPIS_LONG -r ORCA2_OFF_PISCES -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${DEL_KEYS} 
    431455    cd ${SETTE_DIR} 
    432456    . param.cfg 
     
    435459    JOB_FILE=${EXE_DIR}/run_job.sh 
    436460    NPROC=4 
    437     \rm $JOB_FILE 
     461    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    438462    cd ${EXE_DIR} 
    439463    set_namelist namelist cn_exp \"OFFP_LONG\" 
     
    459483    # put ln_pisdmp to false : no restoring to global mean value 
    460484    set_namelist namelist_pisces ln_pisdmp .false. 
    461     cd ${SETTE_DIR} 
    462     . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     485    if [ ${USING_MPMD} == "yes" ] ; then 
     486       set_xio_using_server iodef.xml true 
     487    else 
     488       set_xio_using_server iodef.xml false 
     489    fi 
     490    cd ${SETTE_DIR} 
     491    . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    463492     
    464493    cd ${SETTE_DIR} 
     
    495524    # put ln_pisdmp to false : no restoring to global mean value 
    496525    set_namelist namelist_pisces ln_pisdmp .false. 
    497     cd ${SETTE_DIR} 
    498     . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME}  ${MPIRUN_FLAG} ${JOB_FILE} 
     526    if [ ${USING_MPMD} == "yes" ] ; then 
     527       set_xio_using_server iodef.xml true 
     528    else 
     529       set_xio_using_server iodef.xml false 
     530    fi 
     531    cd ${SETTE_DIR} 
     532    . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME}  ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    499533    cd ${SETTE_DIR} 
    500534    . ./fcm_job.sh $NPROC  ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    505539    export TEST_NAME="REPRO_4_4" 
    506540    cd ${SETTE_DIR} 
    507     . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2OFFPIS_16 -r ORCA2_OFF_PISCES -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${KEY_XIOS} 
     541    . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2OFFPIS_16 -r ORCA2_OFF_PISCES -j 8 add_key "key_mpp_mpi key_mpp_rep" del_key ${DEL_KEYS} 
    508542    cd ${SETTE_DIR} 
    509543    . param.cfg 
     
    512546    JOB_FILE=${EXE_DIR}/run_job.sh 
    513547    NPROC=16 
    514     \rm $JOB_FILE 
     548    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    515549    cd ${EXE_DIR} 
    516550    set_namelist namelist nn_it000 1 
     
    535569    # put ln_pisdmp to false : no restoring to global mean value 
    536570    set_namelist namelist_pisces ln_pisdmp .false. 
    537     cd ${SETTE_DIR} 
    538     . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     571    if [ ${USING_MPMD} == "yes" ] ; then 
     572       set_xio_using_server iodef.xml true 
     573    else 
     574       set_xio_using_server iodef.xml false 
     575    fi 
     576    cd ${SETTE_DIR} 
     577    . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    539578    cd ${SETTE_DIR} 
    540579    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    545584    JOB_FILE=${EXE_DIR}/run_job.sh 
    546585    NPROC=16 
    547     \rm $JOB_FILE 
     586    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    548587    cd ${EXE_DIR} 
    549588    set_namelist namelist nn_it000 1 
     
    568607    # put ln_pisdmp to false : no restoring to global mean value 
    569608    set_namelist namelist_pisces ln_pisdmp .false.  
    570     cd ${SETTE_DIR} 
    571     . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     609    if [ ${USING_MPMD} == "yes" ] ; then 
     610       set_xio_using_server iodef.xml true 
     611    else 
     612       set_xio_using_server iodef.xml false 
     613    fi 
     614    cd ${SETTE_DIR} 
     615    . ./prepare_job.sh input_ORCA2_OFF_PISCES.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    572616    cd ${SETTE_DIR} 
    573617    . ./fcm_job.sh $NPROC  ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    579623    export TEST_NAME="LONG" 
    580624    cd ${SETTE_DIR} 
    581     . ../CONFIG/makenemo -m ${CMP_NAM} -n AMM12_LONG -r AMM12 -j 8 add_key "key_tide" del_key ${KEY_XIOS} 
     625    . ../CONFIG/makenemo -m ${CMP_NAM} -n AMM12_LONG -r AMM12 -j 8 add_key "key_tide" del_key ${DEL_KEYS} 
    582626    cd ${SETTE_DIR} 
    583627    . param.cfg 
     
    586630    JOB_FILE=${EXE_DIR}/run_job.sh 
    587631    NPROC=32 
    588     \rm $JOB_FILE 
     632    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    589633    cd ${EXE_DIR} 
    590634    set_namelist namelist nn_it000 1 
     
    600644    set_namelist namelist jpnj 4 
    601645    set_namelist namelist jpnij 32 
    602     cd ${SETTE_DIR} 
    603     . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     646    if [ ${USING_MPMD} == "yes" ] ; then 
     647       set_xio_using_server iodef.xml true 
     648    else 
     649       set_xio_using_server iodef.xml false 
     650    fi 
     651    cd ${SETTE_DIR} 
     652    . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    604653 
    605654    cd ${SETTE_DIR} 
     
    625674        ln -sf ../LONG/AMM12_00000006_restart_${L_NPROC}.nc . 
    626675    done 
    627     cd ${SETTE_DIR} 
    628     . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     676    if [ ${USING_MPMD} == "yes" ] ; then 
     677       set_xio_using_server iodef.xml true 
     678    else 
     679       set_xio_using_server iodef.xml false 
     680    fi 
     681    cd ${SETTE_DIR} 
     682    . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    629683    cd ${SETTE_DIR} 
    630684    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    635689    export TEST_NAME="REPRO_8_4" 
    636690    cd ${SETTE_DIR} 
    637     . ../CONFIG/makenemo -m ${CMP_NAM} -n AMM12_32 -r AMM12 -j 8 add_key "key_mpp_rep key_tide" del_key ${KEY_XIOS} 
     691    . ../CONFIG/makenemo -m ${CMP_NAM} -n AMM12_32 -r AMM12 -j 8 add_key "key_mpp_rep key_tide" del_key ${DEL_KEYS} 
    638692    cd ${SETTE_DIR} 
    639693    . param.cfg 
     
    642696    JOB_FILE=${EXE_DIR}/run_job.sh 
    643697    NPROC=32 
    644     \rm ${JOB_FILE} 
     698    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    645699    cd ${EXE_DIR} 
    646700    set_namelist namelist nn_it000 1 
     
    655709    set_namelist namelist jpnj 4 
    656710    set_namelist namelist jpnij 32 
    657     cd ${SETTE_DIR} 
    658     . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     711    if [ ${USING_MPMD} == "yes" ] ; then 
     712       set_xio_using_server iodef.xml true 
     713    else 
     714       set_xio_using_server iodef.xml false 
     715    fi 
     716    cd ${SETTE_DIR} 
     717    . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    659718    cd ${SETTE_DIR} 
    660719    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    675734    set_namelist namelist jpnj 8 
    676735    set_namelist namelist jpnij 32 
    677     cd ${SETTE_DIR} 
    678     . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     736    if [ ${USING_MPMD} == "yes" ] ; then 
     737       set_xio_using_server iodef.xml true 
     738    else 
     739       set_xio_using_server iodef.xml false 
     740    fi 
     741    cd ${SETTE_DIR} 
     742    . ./prepare_job.sh input_AMM12.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    679743    cd ${SETTE_DIR} 
    680744    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
     
    686750    export TEST_NAME="SHORT" 
    687751    cd ${SETTE_DIR} 
    688     . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2AGUL_1_2 -r ORCA2_LIM -j 8 add_key "key_mpp_mpi key_mpp_rep key_agrif" del_key "key_zdftmx" del_key ${KEY_XIOS} 
     752    . ../CONFIG/makenemo -m ${CMP_NAM} -n ORCA2AGUL_1_2 -r ORCA2_LIM -j 8 add_key "key_mpp_mpi key_mpp_rep key_agrif" del_key "key_zdftmx" del_key ${DEL_KEYS} 
    689753    cd ${SETTE_DIR} 
    690754    . param.cfg 
     
    693757    JOB_FILE=${EXE_DIR}/run_job.sh 
    694758    NPROC=2 
    695     \rm ${JOB_FILE} 
     759    if [ -f ${JOB_FILE} ] ; then \rm ${JOB_FILE} ; fi 
    696760    cd ${EXE_DIR} 
    697761    set_namelist namelist nn_it000 1 
     
    706770    set_namelist 1_namelist ln_ctl .false. 
    707771    set_namelist 1_namelist ln_clobber .true. 
    708     cd ${SETTE_DIR} 
    709     . ./prepare_job.sh input_ORCA2_LIM_AGRIF.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} 
     772    if [ ${USING_MPMD} == "yes" ] ; then 
     773       set_xio_using_server iodef.xml true 
     774    else 
     775       set_xio_using_server iodef.xml false 
     776    fi 
     777    cd ${SETTE_DIR} 
     778    . ./prepare_job.sh input_ORCA2_LIM_AGRIF.cfg $NPROC ${TEST_NAME} ${MPIRUN_FLAG} ${JOB_FILE} ${NUM_XIOSERVERS} 
    710779    cd ${SETTE_DIR} 
    711780    . ./fcm_job.sh $NPROC ${JOB_FILE} ${INTERACT_FLAG} ${MPIRUN_FLAG} 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/TOOLS/COMPILE/Fcheck_archfile.sh

    r3925 r4219  
    4040# :: 
    4141# 
    42 #  $ ./Fcheck_archfile.sh ARCHFILE COMPILER 
     42#  $ ./Fcheck_archfile.sh ARCHFILE CPPFILE COMPILER 
    4343# 
    4444# 
     
    9494   else 
    9595       if [ -f ${COMPIL_DIR}/$1 ]; then 
    96       # has the cpp keys file been changed since we copied the arch file in ${COMPIL_DIR}? 
    97       mycpp=$( ls -l ${COMPIL_DIR}/$2 | sed -e "s/.* -> //" ) 
    98       if [ "$mycpp" != "$( cat ${COMPIL_DIR}/cpp.history )" ]; then 
    99           echo $mycpp > ${COMPIL_DIR}/cpp.history 
    100           cpeval ${myarch} ${COMPIL_DIR}/$1 
     96      if [ "$2" != "nocpp" ]  
     97      then 
     98          # has the cpp keys file been changed since we copied the arch file in ${COMPIL_DIR}? 
     99          mycpp=$( ls -l ${COMPIL_DIR}/$2 | sed -e "s/.* -> //" ) 
     100          if [ "$mycpp" != "$( cat ${COMPIL_DIR}/cpp.history )" ]; then 
     101         echo $mycpp > ${COMPIL_DIR}/cpp.history 
     102         cpeval ${myarch} ${COMPIL_DIR}/$1 
     103          fi 
     104          # has the cpp keys file been updated since we copied the arch file in ${COMPIL_DIR}? 
     105          mycpp=$( find -L ${COMPIL_DIR} -cnewer ${COMPIL_DIR}/$1 -name $2 -print ) 
     106          [ ${#mycpp} -ne 0 ] && cpeval ${myarch} ${COMPIL_DIR}/$1 
    101107      fi 
    102       # has the cpp keys file been updated since we copied the arch file in ${COMPIL_DIR}? 
    103       mycpp=$( find -L ${COMPIL_DIR} -cnewer ${COMPIL_DIR}/$1 -name $2 -print ) 
    104       [ ${#mycpp} -ne 0 ] && cpeval ${myarch} ${COMPIL_DIR}/$1 
    105108      # has myarch file been updated since we copied it in ${COMPIL_DIR}? 
    106109      myarchdir=$( dirname ${myarch} ) 
     
    134137    if [ "$myarch" == "$( cat ${COMPIL_DIR}/arch.history )" ]; then  
    135138   if [ -f ${COMPIL_DIR}/$1 ]; then 
    136        # has the cpp keys file been changed since we copied the arch file in ${COMPIL_DIR}? 
    137        mycpp=$( ls -l ${COMPIL_DIR}/$2 | sed -e "s/.* -> //" ) 
    138        if [ "$mycpp" != "$( cat ${COMPIL_DIR}/cpp.history )" ]; then 
    139       echo $mycpp > ${COMPIL_DIR}/cpp.history 
    140       cpeval ${myarch} ${COMPIL_DIR}/$1 
     139       if [ "$2" != "nocpp" ]  
     140       then 
     141      # has the cpp keys file been changed since we copied the arch file in ${COMPIL_DIR}? 
     142      mycpp=$( ls -l ${COMPIL_DIR}/$2 | sed -e "s/.* -> //" ) 
     143      if [ "$mycpp" != "$( cat ${COMPIL_DIR}/cpp.history )" ]; then 
     144          echo $mycpp > ${COMPIL_DIR}/cpp.history 
     145          cpeval ${myarch} ${COMPIL_DIR}/$1 
     146      fi 
     147      # has the cpp keys file been updated since we copied the arch file in ${COMPIL_DIR}? 
     148      mycpp=$( find -L ${COMPIL_DIR} -cnewer ${COMPIL_DIR}/$1 -name $2 -print ) 
     149      [ ${#mycpp} -ne 0 ] && cpeval ${myarch} ${COMPIL_DIR}/$1 
    141150       fi 
    142        # has the cpp keys file been updated since we copied the arch file in ${COMPIL_DIR}? 
    143        mycpp=$( find -L ${COMPIL_DIR} -cnewer ${COMPIL_DIR}/$1 -name $2 -print ) 
    144        [ ${#mycpp} -ne 0 ] && cpeval ${myarch} ${COMPIL_DIR}/$1 
    145151       # has myarch file been updated since we copied it in ${COMPIL_DIR}? 
    146152       myarch=$( find -L ${MAIN_DIR}/ARCH -cnewer ${COMPIL_DIR}/$1 -name arch-${3}.fcm -print ) 
     
    150156   fi 
    151157    else 
    152    ls -l ${COMPIL_DIR}/$2 | sed -e "s/.* -> //" > ${COMPIL_DIR}/cpp.history 
     158   if [ "$2" != "nocpp" ]  
     159   then 
     160       ls -l ${COMPIL_DIR}/$2 | sed -e "s/.* -> //" > ${COMPIL_DIR}/cpp.history 
     161   fi 
    153162   echo ${myarch} > ${COMPIL_DIR}/arch.history 
    154163   cpeval ${myarch} ${COMPIL_DIR}/$1 
     
    157166 
    158167#- do we need xios library? 
    159 use_iom=$( sed -e "s/#.*$//" ${COMPIL_DIR}/$2 | grep -c key_iomput ) 
     168if [ "$2" != "nocpp" ]  
     169then 
     170    use_iom=$( sed -e "s/#.*$//" ${COMPIL_DIR}/$2 | grep -c key_iomput ) 
     171else 
     172    use_iom=0 
     173fi 
    160174have_lxios=$( sed -e "s/#.*$//" ${COMPIL_DIR}/$1 | grep -c "\-lxios" ) 
    161175if [[ ( $use_iom -eq 0 ) && ( $have_lxios -ge 1 ) ]] 
     
    166180 
    167181#- do we need oasis libraries? 
    168 use_oasis=$( sed -e "s/#.*$//" ${COMPIL_DIR}/$2 | grep -c key_oasis3 ) 
     182if [ "$2" != "nocpp" ]  
     183then 
     184    use_oasis=$( sed -e "s/#.*$//" ${COMPIL_DIR}/$2 | grep -c key_oasis3 ) 
     185else 
     186    use_oasis=0 
     187fi 
    169188for liboa in psmile.MPI1 mct mpeu scrip mpp_io 
    170189do 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/TOOLS/MISCELLANEOUS/chk_iomput.sh

    r2404 r4219  
    3535       echo ' --insrc              only print all variable definitions found in the source code' 
    3636       echo 'Examples' 
    37        echo '      chk_iomput.sh' 
    38        echo '      chk_iomput.sh --help' 
    39        echo '      chk_iomput.sh ../../CONFIG/ORCA2_LIM/EXP00/iodef.xml "../../NEMO/OPA_SRC/ ../../NEMO/LIM_SRC_2/"' 
     37       echo '      ./chk_iomput.sh' 
     38       echo '      ./chk_iomput.sh --help' 
     39       echo '      ./chk_iomput.sh ../../CONFIG/ORCA2_LIM/EXP00/iodef.xml "../../NEMO/OPA_SRC/ ../../NEMO/LIM_SRC_2/"' 
    4040       echo 
    4141       exit ;; 
     
    5959#------------------------------------------------ 
    6060# 
    61 [ $inxml -eq 1 ] && grep "< *field * id *=" $xmlfile 
     61external=$( grep -c "<field_definition  *\([^ ].* \)*src=" $xmlfile ) 
     62if [ $external -eq 1 ] 
     63then 
     64    xmlfield_def=$( grep "<field_definition  *\([^ ].* \)*src=" $xmlfile | sed -e 's/.*src="\([^"]*\)".*/\1/' ) 
     65    xmlfield_def=$( dirname $xmlfile )/$xmlfield_def    
     66else 
     67    xmlfield_def=$xmlfile 
     68fi 
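The src= handling added here lets the field ids live in a separate file. For illustration (hypothetical file names), an iodef.xml containing a line such as

    <field_definition src="./field_def.xml"/>

now makes the script read the field ids from ./field_def.xml (resolved relative to the directory of the iodef.xml), whereas a file that defines its fields inline is still scanned directly as before.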
     69[ $inxml -eq 1 ] && grep "< *field  *\([^ ].* \)*id *=" $xmlfield_def 
    6270[ $insrc -eq 1 ] && find $srcdir -name "*.[Ffh]90" -exec grep -iH "^[^\!]*call  *iom_put *(" {} \; 
    6371[ $(( $insrc + $inxml )) -ge 1 ] && exit 
     
    7179# list of variables used in "CALL iom_put" 
    7280# 
    73 varlistsrc=$( find $srcdir -name "*.[Ffh]90" -exec grep -i  "^[^\!]*call  *iom_put *(" {} \; | sed -e "s/.*iom_put *( *[\"\']\([^\"\']*\)[\"\'] *,.*/\1/" | sort -d ) 
     81badvarsrc=$( find $srcdir -name "*.[Ffh]90" -exec grep -i  "^[^\!]*call  *iom_put *(" {} \; | sed -e "s/.*iom_put *( *[\"\']\([^\"\']*\)[\"\'] *,.*/\1/" | grep -ic iom_put ) 
     82if [ $badvarsrc -ne 0 ] 
     83then 
     84    echo "The following call to iom_put cannot be checked" 
     85    echo 
     86    find $srcdir -name "*.[Ffh]90" -exec grep -i  "^[^\!]*call  *iom_put *(" {} \; | sed -e "s/.*iom_put *( *[\"\']\([^\"\']*\)[\"\'] *,.*/\1/" | grep -i iom_put | sort -d  
     87    echo 
     88fi 
     89varlistsrc=$( find $srcdir -name "*.[Ffh]90" -exec grep -i  "^[^\!]*call  *iom_put *(" {} \; | sed -e "s/.*iom_put *( *[\"\']\([^\"\']*\)[\"\'] *,.*/\1/" | grep -vi iom_put | sort -d ) 
    7490# 
    7591# list of variables defined in the xml file 
    7692# 
    77 varlistxml=$( grep "< *field * id *=" $xmlfile  | sed -e "s/^.*< *field * id *= *[\"\']\([^\"\']*\)[\"\'].*/\1/" | sort -d ) 
     93varlistxml=$( grep "< *field  *\([^ ].* \)*id *=" $xmlfield_def  | sed -e "s/^.*< *field .*id *= *[\"\']\([^\"\']*\)[\"\'].*/\1/" | sort -d ) 
    7894# 
    7995# list of variables to be outputed in the xml file 
    8096# 
    81 varlistout=$( grep "< *field * ref *=" $xmlfile  | sed -e "s/^.*< *field * ref *= *[\"\']\([^\"\']*\)[\"\'].*/\1/" | sort -d ) 
     97varlistout=$( grep "< *field  *\([^ ].* \)*field_ref *=" $xmlfile  | sed -e "s/^.*< *field .*field_ref *= *[\"\']\([^\"\']*\)[\"\'].*/\1/" | sort -d ) 
    8298# 
    8399echo "--------------------------------------------------" 
    84100echo  check if all iom_put found in $srcdir 
    85 echo  have a corresponding variable definition in $xmlfile 
     101echo  have a corresponding variable definition in $xmlfield_def 
    86102echo "--------------------------------------------------" 
    87103for var in $varlistsrc 
     
    90106    if [ $tst -ne 1 ]  
    91107    then 
    92    echo "problem with $var: $tst lines corresponding to its definition in $xmlfile, but defined in the code in" 
     108   echo "problem with $var: $tst lines corresponding to its definition in $xmlfield_def, but defined in the code in" 
    93109   for f in $srclist 
    94110   do 
  • branches/2013/dev_r3948_NOC_FK/NEMOGCM/TOOLS/maketools

    r3294 r4219  
    146146 
    147147#- When used for the first time, choose a compiler --- 
    148 . ${COMPIL_DIR}/Fcheck_archfile.sh arch_tools.fcm ${CMP_NAM} || exit 
     148. ${COMPIL_DIR}/Fcheck_archfile.sh arch_tools.fcm nocpp ${CMP_NAM} || exit 
    149149 
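Usage sketch for the two calling modes of the updated Fcheck_archfile.sh (file names other than arch_tools.fcm are illustrative; "nocpp" is the literal keyword tested for in the script):

    # NEMO build: a cpp keys file is supplied, so the arch file is refreshed whenever the keys change
    ./Fcheck_archfile.sh arch_nemo.fcm cpp_MY_CONFIG.fcm PW6_VARGAS
    # tools build (as in maketools above): no cpp keys, so the key_iomput/key_oasis3 library checks are skipped
    ./Fcheck_archfile.sh arch_tools.fcm nocpp PW6_VARGAS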
    150150#- Choose a default tool if needed --- 