- Timestamp: 2016-11-06T17:31:33+01:00
- File: 1 edited
branches/2016/dev_r6409_SIMPLIF_2_usrdef/NEMOGCM/NEMO/SAO_SRC/nemogcm.F90
r6982 → r7200  ('-' = removed line, '+' = added line, unprefixed = unchanged context, '…' = unchanged lines elided)

   !! Ocean system   : NEMO GCM (ocean dynamics, on-line tracers, biochemistry and sea-ice)
   !!======================================================================
 - !! History : OPA  ! 1990-10  (C. Levy, G. Madec)  Original code
 - !!           7.0  ! 1991-11  (M. Imbard, C. Levy, G. Madec)
 - !!           7.1  ! 1993-03  (M. Imbard, C. Levy, G. Madec, O. Marti, M. Guyon, A. Lazar,
 - !!                            P. Delecluse, C. Perigaud, G. Caniaux, B. Colot, C. Maes) release 7.1
 - !!            -   ! 1992-06  (L. Terray)  coupling implementation
 - !!            -   ! 1993-11  (M.A. Filiberti)  IGLOO sea-ice
 - !!           8.0  ! 1996-03  (M. Imbard, C. Levy, G. Madec, O. Marti, M. Guyon, A. Lazar,
 - !!                            P. Delecluse, L. Terray, M.A. Filiberti, J. Vialar, A.M. Treguier, M. Levy) release 8.0
 - !!           8.1  ! 1997-06  (M. Imbard, G. Madec)
 - !!           8.2  ! 1999-11  (M. Imbard, H. Goosse)  LIM sea-ice model
 - !!                ! 1999-12  (V. Thierry, A-M. Treguier, M. Imbard, M-A. Foujols)  OPEN-MP
 - !!                ! 2000-07  (J-M Molines, M. Imbard)  Open Boundary Conditions (CLIPPER)
 - !!   NEMO   1.0  ! 2002-08  (G. Madec)  F90: Free form and modules
 - !!            -   ! 2004-06  (R. Redler, NEC CCRLE, Germany)  add OASIS[3/4] coupled interfaces
 - !!            -   ! 2004-08  (C. Talandier)  New trends organization
 - !!            -   ! 2005-06  (C. Ethe)  Add the 1D configuration possibility
 - !!            -   ! 2005-11  (V. Garnier)  Surface pressure gradient organization
 - !!            -   ! 2006-03  (L. Debreu, C. Mazauric)  Agrif implementation
 - !!            -   ! 2006-04  (G. Madec, R. Benshila)  Step reorganization
 - !!            -   ! 2007-07  (J. Chanut, A. Sellar)  Unstructured open boundaries (BDY)
 - !!           3.2  ! 2009-08  (S. Masson)  open/write in the listing file in mpp
 - !!           3.3  ! 2010-05  (K. Mogensen, A. Weaver, M. Martin, D. Lea)  Assimilation interface
 - !!            -   ! 2010-10  (C. Ethe, G. Madec)  reorganisation of initialisation phase
 - !!           3.3.1! 2011-01  (A. R. Porter, STFC Daresbury)  dynamical allocation
 - !!           3.4  ! 2011-10  (A. C. Coward, NOCS & J. Donners, PRACE)  add nemo_northcomms
 - !!            -   ! 2011-11  (C. Harris)  decomposition changes for running with CICE
 - !!           3.6  ! 2012-05  (C. Calone, J. Simeon, G. Madec, C. Ethe)  Add grid coarsening
 - !!            -   ! 2013-06  (I. Epicoco, S. Mocavero, CMCC)  nemo_northcomms: setup avoiding MPI communication
 - !!            -   ! 2014-12  (G. Madec)  remove KPP scheme and cross-land advection (cla)
 + !! History : 3.6  ! 2015-12  (A. Ryan)  Original code (from OPA_SRC/)
 + !!           4.0  ! 2016-11  (G. Madec, S. Flavoni)  domain configuration / user defined interface
   !!----------------------------------------------------------------------

   !!----------------------------------------------------------------------
 - !!   nemo_gcm
 - !!   nemo_init
 - !!   nemo_ctl
 - !!   nemo_closefile
 - !!   nemo_alloc
 - !!   nemo_partition
 - !!   factorise
 + !!   nemo_gcm      : solve ocean dynamics, tracer, biogeochemistry and/or sea-ice
 + !!   nemo_init     : initialization of the NEMO system
 + !!   nemo_ctl      : initialisation of the contol print
 + !!   nemo_closefile: close remaining open files
 + !!   nemo_alloc    : dynamical allocation
 + !!   nemo_partition: calculate MPP domain decomposition
 + !!   factorise     : calculate the factors of the no. of MPI processes
   !!----------------------------------------------------------------------
   USE step_oce       ! module used in the ocean time stepping module (step.F90)
 - USE domcfg         ! domain configuration              (dom_cfg routine)
 - USE mppini         ! shared/distributed memory setting (mpp_init routine)
 - USE domain         ! domain initialization             (dom_init routine)
 + USE domain         ! domain initialization    (dom_init & dom_cfg routines)
 + USE istate         ! initial state setting         (istate_init routine)
 + USE phycst         ! physical constant                 (par_cst routine)
 + USE step           ! NEMO time-stepping                    (stp routine)
 + USE cpl_oasis3     ! OASIS3 coupling
 + USE diaobs         ! Observation diagnostics      (dia_obs_init routine)
   #if defined key_nemocice_decomp
   USE ice_domain_size, only: nx_global, ny_global
   #endif
 - USE istate         ! initial state setting         (istate_init routine)
 - USE phycst         ! physical constant                 (par_cst routine)
 - USE diaobs         ! Observation diagnostics      (dia_obs_init routine)
 - USE lib_fortran    ! Fortran utilities (allows no signed zero when 'key_nosignedzero' defined)
 - USE step           ! NEMO time-stepping                    (stp routine)
 - USE cpl_oasis3     ! OASIS3 coupling
 - USE lib_mpp        ! distributed memory computing
 +                    ! Stand Alone Observation operator modules
 + USE sao_data
 + USE sao_intp
 + !
 + USE lib_mpp        ! distributed memory computing
 + USE mppini         ! shared/distributed memory setting (mpp_init routine)
 + USE lbcnfd , ONLY : isendto, nsndto, nfsloop, nfeloop   ! Setup of north fold exchanges
 + USE lib_fortran    ! Fortran utilities (allows no signed zero when 'key_nosignedzero' defined)
   #if defined key_iomput
   USE xios           ! xIOserver
   #endif
 - USE lbcnfd, ONLY: isendto, nsndto, nfsloop, nfeloop   ! Setup of north fold exchanges
 -
 - ! Stand Alone Observation operator modules
 - USE sao_data
 - USE sao_intp

   IMPLICIT NONE
   …
   !! 3. Cycle through match ups
   !! 4. Write results to file
 - !!
   !!----------------------------------------------------------------------
 - !! Initialise NEMO
 - CALL nemo_init
 - !! Initialise Stand Alone Observation operator data
 - CALL sao_data_init
 - !! Initialise obs_oper
 - CALL dia_obs_init
 - !! Interpolate to observation space
 - CALL sao_interp
 - !! Pipe to output files
 - CALL dia_obs_wri
 - !! Reset the obs_oper between
 - CALL dia_obs_dealloc
 - !! Safely stop MPI
 - IF(lk_mpp) CALL mppstop      ! end mpp communications
 + !
 + CALL nemo_init         ! Initialise NEMO
 + !
 + CALL sao_data_init     ! Initialise Stand Alone Observation operator data
 + !
 + CALL dia_obs_init      ! Initialise obs_operator
 + !
 + CALL sao_interp        ! Interpolate to observation space
 + !
 + CALL dia_obs_wri       ! Pipe to output files
 + !
 + CALL dia_obs_dealloc   ! Reset the obs_oper between
 + !
 + IF(lk_mpp) CALL mppstop   ! Safely stop MPI (end mpp communications)
 + !
   END SUBROUTINE nemo_gcm
   …
   !! ** Purpose :   initialization of the NEMO GCM
   !!----------------------------------------------------------------------
 - INTEGER ::   ji            ! dummy loop indices
 - INTEGER ::   ilocal_comm   ! local integer
 - INTEGER ::   ios
 - !!gm CHARACTER(len=80), DIMENSION(16) ::   cltxt
 - CHARACTER(len=80), DIMENSION(-10:16) ::   cltxt
 - !
 - NAMELIST/namctl/ ln_ctl, nn_print, nn_ictls, nn_ictle,     &
 -    &             nn_isplt, nn_jsplt, nn_jctls, nn_jctle,   &
 + INTEGER ::   ji                 ! dummy loop indices
 + INTEGER ::   ios, ilocal_comm   ! local integer
 + CHARACTER(len=120), DIMENSION(30) ::   cltxt, cltxt2, clnam
 + !
 + NAMELIST/namctl/ ln_ctl  , nn_print, nn_ictls, nn_ictle,   &
 +    &             nn_isplt, nn_jsplt, nn_jctls, nn_jctle,   &
      &             nn_timing, nn_diacfl
 - NAMELIST/namcfg/ ln_read_cfg, ln_write_cfg, cp_cfg, jp_cfg, ln_use_jattr
 + NAMELIST/namcfg/ ln_read_cfg, cn_domcfg, ln_write_cfg, cn_domcfg_out, ln_use_jattr
   !!----------------------------------------------------------------------
   !
 - cltxt = ''
 + cltxt  = ''
 + cltxt2 = ''
 + clnam  = ''
   cxios_context = 'nemo'
   !
   …
   CALL ctl_opn( numnam_cfg, 'namelist_cfg', 'OLD', 'FORMATTED', 'SEQUENTIAL', -1, 6, .FALSE. )
   !
 - REWIND( numnam_ref )   ! Namelist namctl in reference namelist : Control prints & Benchmark
 + REWIND( numnam_ref )   ! Namelist namctl in reference namelist : Control prints
   READ  ( numnam_ref, namctl, IOSTAT = ios, ERR = 901 )
   901 IF( ios /= 0 )   CALL ctl_nam ( ios , 'namctl in reference namelist', .TRUE. )
 -
 - REWIND( numnam_cfg )   ! Namelist namctl in confguration namelist : Control prints & Benchmark
 + !
 + REWIND( numnam_cfg )   ! Namelist namctl in confguration namelist
   READ  ( numnam_cfg, namctl, IOSTAT = ios, ERR = 902 )
   902 IF( ios /= 0 )   CALL ctl_nam ( ios , 'namctl in configuration namelist', .TRUE. )
 -
 - !
 - REWIND( numnam_ref )   ! Namelist namcfg in reference namelist : Control prints & Benchmark
 + !
 + REWIND( numnam_ref )   ! Namelist namcfg in reference namelist : Control prints
   READ  ( numnam_ref, namcfg, IOSTAT = ios, ERR = 903 )
   903 IF( ios /= 0 )   CALL ctl_nam ( ios , 'namcfg in reference namelist', .TRUE. )
   …
   904 IF( ios /= 0 )   CALL ctl_nam ( ios , 'namcfg in configuration namelist', .TRUE. )

 - !!gm WRITE(clbug(3),*) ' after namelist namcfg read  nstop', nstop
 -
   !                             !--------------------------!
   !                             !  Set global domain size  !   (control print return in cltxt2)
   !                             !--------------------------!
 - IF( ln_read_cfg ) THEN        ! Read sizes in configuration "mesh_mask" file
 -    CALL iom_open( 'domain_cfg', inum )
 -    CALL iom_get( inum, 'jpiglo', ziglo )    ;   jpiglo = INT( ziglo )
 -    CALL iom_get( inum, 'jpjglo', zjglo )    ;   jpjglo = INT( zjglo )
 -    CALL iom_get( inum, 'jpkglo', zkglo )    ;   jpkglo = INT( zkglo )
 -    CALL iom_get( inum, 'jperio', zperio )   ;   jperio = INT( zperio )
 -    CALL iom_close( inum )
 -    WRITE(cltxt2(1),*)
 -    WRITE(cltxt2(2),*) 'domain_cfg : domain size read in "domain_cfg" file : jp(i,j,k)glo = ', jpiglo, jpjglo, jpkglo
 -    WRITE(cltxt2(3),*) '~~~~~~~~~~    lateral boudary type of the global domain jperio= ', jperio
 -    !
 + IF( ln_read_cfg ) THEN        ! Read sizes in domain configuration file
 +    CALL domain_cfg ( cltxt2, clnam, cn_cfg, nn_cfg, jpiglo, jpjglo, jpkglo, jperio )
 +    !
   ELSE                          ! user-defined namelist
 -    CALL usr_def_nam( cltxt2, clnam, jpiglo, jpjglo, jpkglo, jperio )
 - ENDIF
 - jpk = jpkglo
 +    CALL usr_def_nam( cltxt2, clnam, cn_cfg, nn_cfg, jpiglo, jpjglo, jpkglo, jperio )
 + ENDIF
 + !
 + jpk = jpkglo
   !
   #if defined key_agrif
   …
   ENDIF
   #endif
 +
   narea = narea + 1      ! mynode return the rank of proc (0 --> jpnij -1 )
   …
   IF( .NOT.ln_read_cfg ) THEN
      DO ji = 1, SIZE(clnam)
 -       IF( TRIM(clnam (ji)) /= '' )   WRITE(numond, * ) clnam(ji)! namusr_def print
 +       IF( TRIM(clnam(ji)) /= '' )    WRITE(numond, * ) clnam(ji)   ! namusr_def print
      END DO
   ENDIF
   …
   IF( jpni < 1 .OR. jpnj < 1 ) THEN
   #if defined key_mpp_mpi
 -    IF( Agrif_Root() )   CALL nemo_partition(mppsize)
 +    IF( Agrif_Root() )   CALL nemo_partition( mppsize )
   #else
      jpni = 1
   …
   ENDIF

 - ! Calculate domain dimensions given calculated jpni and jpnj
 - ! This used to be done in par_oce.F90 when they were parameters rather
 - ! than variables
 - IF( Agrif_Root() ) THEN
 + IF( Agrif_Root() ) THEN       ! AGRIF mother: specific setting from jpni and jpnj
   #if defined key_nemocice_decomp
 -    jpi = ( nx_global+2-2*jpreci + (jpni-1) ) / jpni + 2*jpreci ! first dim.
 -    jpj = ( ny_global+2-2*jprecj + (jpnj-1) ) / jpnj + 2*jprecj ! second dim.
 +    jpi = ( nx_global+2-2*jpreci + (jpni-1) ) / jpni + 2*jpreci   ! first  dim.
 +    jpj = ( ny_global+2-2*jprecj + (jpnj-1) ) / jpnj + 2*jprecj   ! second dim.
   #else
 -    jpi = ( jpiglo     -2*jpreci + (jpni-1) ) / jpni + 2*jpreci ! first dim.
 -    jpj = ( jpjglo     -2*jprecj + (jpnj-1) ) / jpnj + 2*jprecj ! second dim.
 +    jpi = ( jpiglo     -2*jpreci + (jpni-1) ) / jpni + 2*jpreci   ! first  dim.
 +    jpj = ( jpjglo     -2*jprecj + (jpnj-1) ) / jpnj + 2*jprecj   ! second dim.
   #endif
   ENDIF
   …
   CALL ctl_opn( numout, 'ocean.output', 'REPLACE', 'FORMATTED', 'SEQUENTIAL', -1, 6, .FALSE., narea )
   !
 -
 - DO ji = 1, SIZE(clbug)
 -    IF( TRIM(clbug(ji)) /= '' )   WRITE(numout,*) clbug(ji)   ! bug print
 - END DO
 - WRITE(numout,*)
 -
   WRITE(numout,*)
   WRITE(numout,*) '   CNRS - NERC - Met OFFICE - MERCATOR-ocean - INGV - CMCC'
   WRITE(numout,*) '                       NEMO team'
   WRITE(numout,*) '            Stand Alone Observation operator'
 - WRITE(numout,*) '                version 1.0  (2015) '
 + WRITE(numout,*) '            NEMO version 3.7  (2015) '
   WRITE(numout,*)
   WRITE(numout,*)
   DO ji = 1, SIZE(cltxt)
 -    IF( TRIM(cltxt (ji)) /= '' )   WRITE(numout,*) cltxt(ji)! control print of mynode
 +    IF( TRIM(cltxt (ji)) /= '' )   WRITE(numout,*) cltxt(ji)   ! control print of mynode
   END DO
 - WRITE(numout,cform_aaa)   ! Flag AAAAAAA
 - !
 - ENDIF
 -
 - ! Now we know the dimensions of the grid and numout has been set we can
 - ! allocate arrays
 + WRITE(numout,*)
 + WRITE(numout,*)
 + DO ji = 1, SIZE(cltxt2)
 +    IF( TRIM(cltxt2(ji)) /= '' )   WRITE(numout,*) cltxt2(ji)   ! control print of domain size
 + END DO
 + !
 + WRITE(numout,cform_aaa)   ! Flag AAAAAAA
 + !
 + ENDIF
 +
 + ! Now we know the dimensions of the grid and numout has been set: we can allocate arrays
   CALL nemo_alloc()
   …
   CALL phy_cst         ! Physical constants
   CALL eos_init        ! Equation of state
 - CALL dom_cfg         ! Domain configuration
   CALL dom_init        ! Domain
   …
   IF(lwp) THEN         ! control print
      WRITE(numout,*)
 -    WRITE(numout,*) 'nemo_ctl: Control prints & Benchmark'
 +    WRITE(numout,*) 'nemo_ctl: Control prints'
      WRITE(numout,*) '~~~~~~~ '
      WRITE(numout,*) '   Namelist namctl'
   …
      WRITE(numout,*) '~~~~~~~ '
      WRITE(numout,*) '   Namelist namcfg'
 -    WRITE(numout,*) '      read configuration definition files         ln_read_cfg   = ', ln_read_cfg
 -    WRITE(numout,*) '      configuration name                          cp_cfg        = ', TRIM(cp_cfg)
 -    WRITE(numout,*) '      configuration resolution                    jp_cfg        = ', jp_cfg
 -    WRITE(numout,*) '      use file attribute if exists as i/p j-start ln_use_jattr  = ', ln_use_jattr
 +    WRITE(numout,*) '      read domain configuration file              ln_read_cfg   = ', ln_read_cfg
 +    WRITE(numout,*) '         filename to be read                      cn_domcfg     = ', TRIM(cn_domcfg)
 +    WRITE(numout,*) '      write configuration definition file         ln_write_cfg  = ', ln_write_cfg
 +    WRITE(numout,*) '         filename to be written                   cn_domcfg_out = ', TRIM(cn_domcfg_out)
 +    WRITE(numout,*) '      use file attribute if exists as i/p j-start ln_use_jattr  = ', ln_use_jattr
   ENDIF
   …
   IF( numdct_heat /= -1 )   CLOSE( numdct_heat )   ! heat transports
   IF( numdct_salt /= -1 )   CLOSE( numdct_salt )   ! salt transports
 -
   !
   numout = 6   ! redefine numout in case it is used after this point...
   …
   !! ** Method  :
   !!----------------------------------------------------------------------
 - INTEGER, INTENT(in) :: num_pes! The number of MPI processes we have
 + INTEGER, INTENT(in) ::   num_pes   ! The number of MPI processes we have
   !
   INTEGER, PARAMETER :: nfactmax = 20
   …
   INTEGER :: ifac, jl, inu
   INTEGER, PARAMETER :: ntest = 14
 - INTEGER :: ilfax(ntest)
 + INTEGER, DIMENSION(ntest) ::   ilfax
 + !!----------------------------------------------------------------------
   !
   ! lfax contains the set of allowed factors.
 - data (ilfax(jl),jl=1,ntest) / 16384, 8192, 4096, 2048, 1024, 512, 256,   &
 -    &                            128,   64,   32,   16,    8,   4,   2  /
 - !!----------------------------------------------------------------------
 -
 + ilfax(:) = (/( 2**jl, jl = ntest, 1, -1 )/)
 + !
   ! Clear the error flag and initialise output vars
 - kerr = 0
 - kfax = 1
 + kerr  = 0
 + kfax  = 1
   knfax = 0
   !
   ! Find the factors of n.
   IF( kn == 1 )   GOTO 20
   …
   ! l points to the allowed factor list.
   ! ifac holds the current factor.
   !
   inu   = kn
   knfax = 0
   !
   DO jl = ntest, 1, -1
      !
   …
      !
   END DO
   !
   20 CONTINUE   ! Label 20 is the exit point from the factor search loop.
   !
   …

   SUBROUTINE nemo_northcomms
 - !!======================================================================
 + !!----------------------------------------------------------------------
   !!                     ***  ROUTINE  nemo_northcomms  ***
 - !! nemo_northcomms    :  Setup for north fold exchanges with explicit
 - !!                       point-to-point messaging
 - !!=====================================================================
 - !!----------------------------------------------------------------------
 - !!
 - !! ** Purpose :   Initialization of the northern neighbours lists.
 + !! ** Purpose :   Setup for north fold exchanges with explicit
 + !!                point-to-point messaging
 + !!
 + !! ** Method  :   Initialization of the northern neighbours lists.
   !!----------------------------------------------------------------------
   !!    1.0  ! 2011-10  (A. C. Coward, NOCS & J. Donners, PRACE)
   !!    2.0  ! 2013-06  Setup avoiding MPI communication (I. Epicoco, S. Mocavero, CMCC)
   !!----------------------------------------------------------------------
 -
   INTEGER ::   sxM, dxM, sxT, dxT, jn
   INTEGER ::   njmppmax
   !!----------------------------------------------------------------------
 + !
   njmppmax = MAXVAL( njmppt )
   !
   !initializes the north-fold communication variables
   isendto(:) = 0
 - nsndto = 0
 + nsndto     = 0
 + !
   !if I am a process in the north
   IF ( njmpp == njmppmax ) THEN
   …
      IF ((sxM .gt. sxT) .AND. (sxM .lt. dxT)) THEN
         nsndto = nsndto + 1
         isendto(nsndto) = jn
      ELSEIF ((sxM .le. sxT) .AND. (dxM .ge. dxT)) THEN
         nsndto = nsndto + 1
         isendto(nsndto) = jn
      ELSEIF ((dxM .lt. dxT) .AND. (sxT .lt. dxM)) THEN
         nsndto = nsndto + 1
 -       isendto(nsndto) = jn
 -    END IF
 +       isendto(nsndto) = jn
 +    ENDIF
   END DO
   nfsloop = 1
   …
   l_north_nogather = .TRUE.
   END SUBROUTINE nemo_northcomms
 +
   #else
   SUBROUTINE nemo_northcomms   ! Dummy routine
   …
   END MODULE nemogcm
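The jpi/jpj expressions in the changeset compute each MPI subdomain's extent as a ceiling division of the halo-free global dimension, with the halo points added back on. A minimal sketch of that arithmetic (plain Python for illustration, not NEMO code; `nglo`, `nproc` and `halo` stand in for `jpiglo`, `jpni` and `jpreci`):

```python
def local_dim(nglo, nproc, halo):
    """Local subdomain extent, mirroring
       jpi = ( jpiglo - 2*jpreci + (jpni-1) ) / jpni + 2*jpreci.
    Adding (nproc - 1) before the integer division rounds the
    interior share up, then the two halo points are restored."""
    interior = nglo - 2 * halo                    # global points excluding halos
    return (interior + nproc - 1) // nproc + 2 * halo

# e.g. a 362-point dimension with 1-point halos split over 4 processes:
# interior = 360, ceil(360/4) = 90, plus 2 halo points gives 92
print(local_dim(362, 4, 1))
```

The rounding up means the last subdomain may hold fewer interior points than the others; every process still allocates the same (maximal) local size.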
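The `factorise` hunk replaces a DATA statement with the array constructor `ilfax(:) = (/(2**jl, jl=ntest,1,-1)/)`, i.e. the allowed factors 16384 down to 2, which the routine then divides out of the MPI process count. A hedged Python sketch of the same idea (illustrative only — the real routine returns the factors in `kfax` and signals failure through `kerr`):

```python
def factorise(n, ntest=14):
    """Divide descending powers of two out of n, in the spirit of
    NEMO's factorise.  Returns (factors, remainder); a remainder
    other than 1 plays the role of the kerr error flag."""
    allowed = [2 ** jl for jl in range(ntest, 0, -1)]  # 16384 .. 2
    factors, rem = [], n
    for fac in allowed:
        while rem % fac == 0:       # peel off this factor as often as it divides
            factors.append(fac)
            rem //= fac
        if rem == 1:                # fully factorised, stop early
            break
    return factors, rem
```

With only powers of two in the allowed list, any odd part of `n` survives as the remainder, which is exactly the case the Fortran routine flags as an error.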
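The three IF/ELSEIF branches in `nemo_northcomms` together test whether the local north-fold segment [sxM, dxM] overlaps a candidate neighbour's segment [sxT, dxT]: left edge inside, full containment, or right edge inside. A small sketch of that predicate (Python, for illustration; the function name is ours, not NEMO's):

```python
def north_fold_overlap(sxM, dxM, sxT, dxT):
    """True when [sxM, dxM] overlaps [sxT, dxT], mirroring the three
    branches that increment nsndto and record isendto(nsndto)."""
    return (sxT < sxM < dxT                   # sxM .gt. sxT .AND. sxM .lt. dxT
            or (sxM <= sxT and dxM >= dxT)    # neighbour segment fully contained
            or (sxT < dxM < dxT))             # dxM .lt. dxT .AND. sxT .lt. dxM
```

Each neighbour rank passing this test is appended to `isendto`, so the north-fold exchange only messages processes whose segments actually intersect, avoiding the MPI gather the 2013 change was written to eliminate.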