Changes between Version 247 and Version 248 of DevelopmentActivities/ORCHIDEE-DOFOCO


Timestamp:
2020-01-22T21:09:00+01:00 (4 years ago)
Author:
luyssaert
Comment:

--

Legend:

Unmodified
Added
Removed
Modified
  • DevelopmentActivities/ORCHIDEE-DOFOCO

    v247 v248  
    636636|| LCC – Duveiller et al 2018 || ??? || 
    637637 
    638 ==== Settings for spin-up and re-parameterization ==== 
    639 As a spin-up is costly in both time and computer resources, we need a strategy to avoid wasting these resources. Thus, the spin-up will, like the parameterization, be done across scales, moving from pixel to global scale. The spin-up will be done in parallel with the parameterization. Often the problems with the spin-up have a technical character and show up for the pixels with extreme climate conditions. Before launching a longitudinal, regional or global spin-up, we should agree on the model version to use, because structural changes to the code will necessitate re-running the spin-up. The model version to use will most likely be the version that is ready once the parameterization at the 1-pixel level is satisfactory and no further changes to the code need to be added. 
    640 * Identify the variables that are targeted by the spin-up (such as NEP, heterotrophic respiration, decomposition, etc.). The spin-up will reveal whether parameters affecting these variables need to be tuned. The spin-up itself is also an interesting test case that could be loosely compared against data. 
    641 * We could compare the spin-ups to maps of soil carbon stocks to check the order of magnitude. Soil carbon maps should only be formally compared with the control run (spin-up + transient simulation) for the year 2000 because that run includes the simulated effects of N-deposition, management, litter raking and land cover change. The simpler configurations of the spin-up do not account for these processes or do not account for the right sequence of processes. 
    642  
    643 == Testing guidelines before committing new code to ORCHIDEE-CN-CAN on svn == 
    644  
    645 In some rare cases, after bug fixes or the implementation of new code, problems with reproducibility or 1+1=2 might be introduced unintentionally. Often these are related to incorrect variable dimensions in different subroutines, memory issues or the lack of variables in the restart files. Such issues are easier to catch sooner rather than later. Thus, to minimize the time spent on debugging reproducibility and 1+1=2 issues, the following simple tests are suggested/required before each commit of substantial code changes*: 
    646   
    647 === 1+1=2 === 
    648 If you do not run these tests globally, make sure to use impose_veg=y. 
    649 The standard F2 run.def settings have been tested and satisfy 1+1=2 from revision r6272 onwards. Thus, please always run the test with the standard settings. If you use other run.def settings during your developments, run the same tests for those settings as well. More recent tests have shown that 1+1=2 holds for LCC with r6279 at the global scale. 
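As a quick illustration, a minimal pre-flight check before a non-global test could look like the sketch below; the run.def keyword name and the location of run.def are assumptions, so adapt them to your own experiment directory.

{{{
# Minimal sketch: confirm that vegetation is imposed before a non-global
# 1+1=2 test (assumes the run.def keyword is IMPOSE_VEG and that run.def
# sits in the current experiment directory).
grep -i "IMPOSE_VEG" run.def
# expected for a non-global test:
# IMPOSE_VEG = y
}}}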
    650  
    651 '''The standard test''' 
    652  
    653 1) 1Y vs. 12*1M, i.e. do two simulations for a full year: one with a period length of 1 year, the other with a period length of 1 month. Afterwards, compare their final restart files from both stomate and sechiba. 
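A minimal sketch of how the two runs and their final restart files could be organised is given below; the directory layout and file names are purely illustrative, and it assumes a libIGCM-style setup where the period length is set in config.card.

{{{
# Run A: one period of 1 year            (config.card: PeriodLength= 1Y)
# Run B: twelve periods of 1 month over the same year (PeriodLength= 1M)
# Keep the final restart files of both runs for the comparison:
cp RUN_A/RESTART/stomate_rest.nc stomate_rest_1Y.nc
cp RUN_B/RESTART/stomate_rest.nc stomate_rest_12M.nc
cp RUN_A/RESTART/sechiba_rest.nc sechiba_rest_1Y.nc
cp RUN_B/RESTART/sechiba_rest.nc sechiba_rest_12M.nc
}}}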
    654  
    655 Most issues should be caught with (1). In case of problems, it will make the debugging easier if you can track down the onset of the difference between the restart files (i.e. start of year, onset of growing season, end of year, etc.). Thus, continue with tests like 
    656  
    657 2) 1D+1D=2D (compare the final restart files) 
    658  
    659 3) 1M+1M=2M (compare the final restart files) 
    660  
    661 === How to compare netcdf files === 
    662 The comparison is easiest if the same variables are contained in the two netcdf files and the variables are in the same order. The differ100.sh script by Josefine Ghattas nicely does this. Moreover, it uses cdo diffv to compare the files. However, 5-dimensional variables are ignored by the cdo diffv command, thus not all variables in the restart files can be compared by differ100.sh. 
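For example, a quick manual comparison with cdo could look like the sketch below; the file names are illustrative, and variables with more than 4 dimensions will be skipped as noted above.

{{{
# Compare the final restart files of the 1Y and 12*1M runs variable by
# variable; cdo diffv prints the records that differ.
cdo diffv stomate_rest_1Y.nc stomate_rest_12M.nc
cdo diffv sechiba_rest_1Y.nc sechiba_rest_12M.nc
}}}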
    663  
    664 === How to check for differences between two netcdf files that have variables with more than 4 dimensions === 
    665 The matlab function nccmp is able to compare all variables contained within two netcdf files. The original version can be found here: https://fr.mathworks.com/matlabcentral/fileexchange/47857-comparing-two-netcdf-files. 
    666 I have made some small modifications such that the information produced by the script is written to a file instead of being printed to the screen. The updated version can be found here on IRENE:/ccc/work/cont003/dofoco/dofoco/SCRIPTS/debug/nccmp.m or here on obelix:/home/data03/dofoco/SCRIPTS_obelix/debug/nccmp.m. 
    667  
    668 Sadly, matlab is not available on obelix, but it is on IRENE. To open matlab on IRENE, type ''matlab'' or, if you wish to run it from the terminal, type ''matlab -nodesktop''. 
    669  
    670 Next run the function by typing: 
    671  
    672 {{{ 
    673 NCCMP(ncfile1,ncfile2,tolerance,forceCompare) 
    674 }}} 
    675  
    676 ''Tolerance'' specifies whether you allow some variation in the variables between the two files. We want identical files, so put [] here. 
    677  
    678 ''forceCompare'' can be set to true or false.  
    679  
    680 - True - writes all occurrences of differences in a variable (specifically, gives all the indices) to the file all_diff.txt. 
    681  
    682 - False - only writes whether there are differences in a variable, and the first occurrence of such differences, to the file first_diff.txt. 
    683  
    684 For a global simulation the True option can produce a large file and the information might be hard to process if there are many differences between the compared restart files. In addition, the True option makes the script much slower. However, for small simulations the True option is very useful. 
    685  
    686 I recommend that you use the re-ordered files from the differ100.sh script as inputs to nccmp. 
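Putting the pieces together, a call from the IRENE command line could look like the sketch below; the restart file names are illustrative, and it assumes the function is called with the lower-case name of the file nccmp.m and that the IRENE path given above is still valid.

{{{
# Compare two restart files: tolerance=[] requests identical values,
# forceCompare=false writes only the first difference per variable to first_diff.txt.
matlab -nodesktop -nosplash -r "addpath('/ccc/work/cont003/dofoco/dofoco/SCRIPTS/debug'); nccmp('stomate_rest_1Y.nc','stomate_rest_12M.nc',[],false); exit"
}}}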
    687  
    688 '''Debugging suggestions:''' 
    689 - If possible, limit the spatial scale (to maximize speed). 
    690 - Track down the onset of the deviation between the restart files. 
    691 - Track down the problem. Hopefully, the differences in the restart files will give you a clue about which variable to start the investigation from. The best approach depends on the source of the problem (a memory issue, a variable missing from the restart file, etc.). For a memory issue a debugger could be the best choice. For variables missing from the restart file it is best to run two identical runs with different period lengths – either manually or with Totalview – while tracking down which variables are causing the differences. 
    692 - Once you have fixed the problem, verify that the fix is also valid at the global scale (i.e. run the global tests again if you chose to zoom in on a smaller region). 
    693  
    694  
    695  
    696  