ticket/0974 – NEMO


Author : S. Masson

ticket : #974

Branch : https://forge.ipsl.jussieu.fr/nemo/browser/branches/2012/dev_r3406_LOCEAN4_XIOS


Description

This is brand new code, so the old one (XMLF90 and XML_IOSERVER) can be removed.
XIOS was imported from http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk (rev 348) and added under EXTERNAL with the following commands:

#!/bin/bash

prjname=nemo
userlog=rblod
forge=forge.ipsl.jussieu.fr

rootsrc=${userlog}@${forge}/ipsl/forge/projets/${prjname}/svn

#==========================
# create XIOS vendor
#==========================
#
vendor=XIOS
#
# get it:
svn export http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/trunk XIOS_SRC
# stupid untar
cd XIOS_SRC
for tarname in tools/archive/*.tar.gz; do tar -xzf "$tarname"; done
rm -rf tools/archive/*
rm -rf tools/FCM
cd -

# import it in vendors
svn import XIOS_SRC svn+ssh://${rootsrc}/vendors/${vendor}/current -m "importing initial ${vendor} vendor drop"
revname=r_348
svn copy svn+ssh://${rootsrc}/vendors/${vendor}/current svn+ssh://${rootsrc}/vendors/${vendor}/${revname} -m "tagging ${vendor} ${revname}"
rm -rf XIOS_SRC
#

#==========================
# import in external
#==========================
#svn copy  svn+ssh://rblod@forge.ipsl.jussieu.fr/ipsl/forge/projets/nemo/svn/vendors/XIOS_r348 svn+ssh://rblod@forge.ipsl.jussieu.fr/ipsl/forge/projets/nemo/svn/branches/2012/dev_r3406_LOCEAN4_XIOS/NEMOGCM/EXTERNAL/XIOS -m "Import XIOS in dev_r3406_LOCEAN4_XIOS"

NEMO code changes

Very few changes: iom.F90 is completely rewritten; nemogcm.F90 for initialisation and finalisation; diawri.F90 for new variables; step.F90 for iom_setkt; and sbcmod.F90 (its iom_setkt calls are commented out).

svn diff OPA_SRC/step.F90 OPA_SRC/nemogcm.F90 OPA_SRC/SBC/sbcmod.F90 OPA_SRC/DIA/diawri.F90

Index: OPA_SRC/step.F90
===================================================================
--- OPA_SRC/step.F90	(revision 3410)
+++ OPA_SRC/step.F90	(working copy)
@@ -88,7 +88,7 @@
 #endif   
                              indic = 0                ! reset to no error condition
       IF( kstp /= nit000 )   CALL day( kstp )         ! Calendar (day was already called at nit000 in day_init)
-                             CALL iom_setkt( kstp )   ! say to iom that we are at time step kstp
+                             CALL iom_setkt( kstp - nit000 + 1 )   ! say to iom that we are at time step kstp
 
       !>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
       ! Update data, open boundaries, surface boundary condition (including sea-ice)
Index: OPA_SRC/nemogcm.F90
===================================================================
--- OPA_SRC/nemogcm.F90	(revision 3410)
+++ OPA_SRC/nemogcm.F90	(working copy)
@@ -72,7 +72,7 @@
 #endif
    USE lib_mpp         ! distributed memory computing
 #if defined key_iomput
-   USE mod_ioclient
+   USE xios
 #endif
 
    IMPLICIT NONE
@@ -183,8 +183,13 @@
       CALL nemo_closefile
 #if defined key_oasis3 || defined key_oasis4
       CALL cpl_prism_finalize           ! end coupling and mpp communications with OASIS
-#else
+# if defined key_iomput
+      IF( Agrif_Root() ) THEN
+         CALL xios_finalize             ! end mpp communications
+      ENDIF
+# else
       IF( lk_mpp )   CALL mppstop       ! end mpp communications
+# endif
 #endif
       !
    END SUBROUTINE nemo_gcm
@@ -218,9 +223,11 @@
 #if defined key_iomput
       IF( Agrif_Root() ) THEN
 # if defined key_oasis3 || defined key_oasis4
-         CALL cpl_prism_init( ilocal_comm )                 ! nemo local communicator given by oasis
+         CALL cpl_prism_init( ilocal_comm )      ! nemo local communicator given by oasis
+         CALL xios_initialize( "oceanx",local_comm=ilocal_comm )
+# else
+         CALL  xios_initialize( "nemo",return_comm=ilocal_comm )
 # endif
-         CALL  init_ioclient( ilocal_comm )                 ! exchange io_server nemo local communicator with the io_server
       ENDIF
       narea = mynode( cltxt, numnam, nstop, ilocal_comm )   ! Nodes selection
 #else
Index: OPA_SRC/SBC/sbcmod.F90
===================================================================
--- OPA_SRC/SBC/sbcmod.F90	(revision 3410)
+++ OPA_SRC/SBC/sbcmod.F90	(working copy)
@@ -240,7 +240,7 @@
       !                                            !        forcing field computation         !
       !                                            ! ---------------------------------------- !
 
-      CALL iom_setkt( kt + nn_fsbc - 1 )                 ! in sbc, iom_put is called every nn_fsbc time step
+!      CALL iom_setkt( kt + nn_fsbc - 1 )                 ! in sbc, iom_put is called every nn_fsbc time step
       !
       IF( ln_apr_dyn ) CALL sbc_apr( kt )                ! atmospheric pressure provided at kt+0.5*nn_fsbc
                                                          ! (caution called before sbc_ssm)
@@ -343,7 +343,7 @@
          IF( nn_ice > 0 )   CALL iom_put( "ice_cover", fr_i )   ! ice fraction 
       ENDIF
       !
-      CALL iom_setkt( kt )           ! iom_put outside of sbc is called at every time step
+!      CALL iom_setkt( kt )           ! iom_put outside of sbc is called at every time step
       !
       CALL iom_put( "utau", utau )   ! i-wind stress   (stress can be updated at 
       CALL iom_put( "vtau", vtau )   ! j-wind stress    each time step in sea-ice)
Index: OPA_SRC/DIA/diawri.F90
===================================================================
--- OPA_SRC/DIA/diawri.F90	(revision 3410)
+++ OPA_SRC/DIA/diawri.F90	(working copy)
@@ -144,7 +144,9 @@
       CALL iom_put( "sss"    , tsn(:,:,1,jp_sal)                     )    ! sea surface salinity
       CALL iom_put( "sss2"   , tsn(:,:,1,jp_sal) * tsn(:,:,1,jp_sal) )    ! square of sea surface salinity
       CALL iom_put( "uoce"   , un                                    )    ! i-current      
+      CALL iom_put( "suoce"  , un(:,:,1)                             )    ! surface i-current      
       CALL iom_put( "voce"   , vn                                    )    ! j-current
+      CALL iom_put( "svoce"  , vn(:,:,1)                             )    ! surface j-current
       
       CALL iom_put( "avt"    , avt                                   )    ! T vert. eddy diff. coef.
       CALL iom_put( "avm"    , avmu                                  )    ! T vert. eddy visc. coef.

Compilation

The new code is located under EXTERNAL/XIOS. The old one has not been removed yet but is now unused.

  • netCDF has to be version 4, built with HDF5 support (HDF5 must be installed as well)
  • variables for XIOS have been added to the arch files; for now, only ARCH/arch-X64_CURIE.fcm. I did not try to share options between NEMO and XIOS, so there is some duplication
  • XIOS is compiled to produce a library under XIOS/lib and a binary under XIOS/bin, so makenemo has a few modifications: just 3 lines added before the NEMO compilation. For this part the -j option is forced to 1 (see the sketch at the end of this section)
  • TOOLS/COMPILE/bldxag.cfg has been modified to include the correct path to XIOS
  • the clean option does not clean XIOS, since rebuilding it takes a while

Note that the branch cannot be compiled without key_mpp_mpi, and the OASIS case is not handled correctly at compilation. I am not sure this branch can be compiled with AGRIF or without key_iomput.
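As an illustration only (these are not commands taken from the branch; ORCA2_LIM is just an example configuration, and the exact XIOS-related variable names in the arch file may differ), a build on Curie might look like this:

cd NEMOGCM/CONFIG
# arch-X64_CURIE.fcm is assumed to carry the XIOS include and library paths
# alongside the usual NEMO compiler settings (duplicated for now, as noted above)
./makenemo -m X64_CURIE -n ORCA2_LIM -j 8
# makenemo first builds XIOS (with -j forced to 1) under EXTERNAL/XIOS,
# producing XIOS/lib and XIOS/bin, then compiles NEMO against them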

Execution

  • the file xmlio_server.def is not needed anymore; this information is now included at the end of iodef.xml
  • iodef.xml can be split into several files using include files; here we define domain.xml and field.xml, which should not need to be changed unless we add new variables or a new zoom (a rough sketch follows this list)
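As a rough illustration only (element and attribute names such as the src include attribute and the using_server variable follow later XIOS conventions and may not match the imported revision exactly), the top-level iodef.xml could look like this:

<?xml version="1.0"?>
<simulation>
   <context id="nemo">
      <!-- definitions kept in the separate include files -->
      <field_definition  src="./field.xml"  />
      <domain_definition src="./domain.xml" />
      <!-- file definitions selecting what is actually written go here -->
   </context>
   <!-- server settings formerly held in xmlio_server.def -->
   <context id="xios">
      <variable_definition>
         <variable id="using_server" type="boolean">false</variable>
      </variable_definition>
   </context>
</simulation>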

Testing

Testing could consider (where appropriate) other configurations in addition to NVTK.

NVTK tested: YES/NO
Other model configurations: YES/NO
Processor configurations tested: [ Enter processor configs tested here ]
If adding new functionality, please confirm that the new code does not change results when it is switched off and works when it is switched on: YES/NO/NA

(Answering UNSURE is likely to generate further questions from reviewers.)

Please add further summary details here:

  • Processor configurations tested
  • etc.

Bit Comparability

Does this change preserve answers in your tested standard configurations (to the last bit)? YES/NO
Does this change bit-compare across various processor configurations (1xM, Nx1 and MxN are recommended)? YES/NO
Is this change expected to preserve answers in all possible model configurations? YES/NO
Is this change expected to preserve all diagnostics? YES/NO
(Preserving answers in model runs does not necessarily imply preserved diagnostics.)

If you answered NO to any of the above, please provide further details:

  • Which routine(s) are causing the difference?
  • Why are the changes not protected by a logical switch or a new section-version?
  • What is needed to achieve regression with the previous model release (e.g. a regression branch, hand-edits, etc.)? If this is not possible, explain why not.
  • What do you expect to see occur in the test harness jobs?
  • Which diagnostics have you altered and why have they changed?

Please add details here.

System Changes

Does your change alter namelists? YES/NO
Does your change require a change in compiler options? YES/NO

If any of these apply, please document the changes required here.


Resources

Please summarize any changes in runtime or memory use caused by this change.


IPR issues

Has the code been wholly (100%) produced by NEMO development staff working exclusively on NEMO? YES/NO

If No:

  • Identify the collaboration agreement details.
  • Ensure the code routine headers are in accordance with the agreement (copyright/redistribution, etc.).

Add further details here if required.