
Improvements of the Driver of ORCHIDEE

These improvements concern first the specific features of the driver (its capacity to read forcing files), not its performance, which will hopefully be improved by the use of the IOServer. The focus is mainly on the use of forcing files at a 6- or 3-hourly time resolution (such as re-analyses), for which the driver interpolates the meteorological fields to compute half-hourly fields.

Reminder

The driver of ORCHIDEE is expected to read forcing files with a very specific temporal structure. All time information is defined in UTC. The fields read at a certain time step t0 in the NetCDF file correspond to:

  • the instantaneous value at t0+dt for scalar variables: Temperature, Wind Speed, Air Pressure and Specific Humidity.
  • the mean value between t0 and t0+dt for fluxes: incoming Long- and Short-Wave Radiation and Precipitation.

Where dt is the time step of the forcing file.
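
To make this convention concrete, here is a minimal Python sketch (illustrative only, not driver code; the start date and the 6-hourly dt are assumptions) that maps a 1-based record index itau to the time at which an instantaneous variable is valid and to the averaging interval of a flux:

    from datetime import datetime, timedelta

    dt = timedelta(hours=6)               # time step of the forcing file (assumed)
    t_first = datetime(2006, 1, 1, 0, 0)  # assumed time stamp of the first record (0 UTC)

    def scalar_valid_time(itau):
        """Instantaneous variables (Tair, Wind, PSurf, Qair) read at record itau
        are valid at the end of the interval, i.e. time stamp + dt."""
        return t_first + itau * dt

    def flux_interval(itau):
        """Fluxes (SWdown, LWdown, precipitation) read at record itau are means
        over the dt preceding scalar_valid_time(itau)."""
        return (t_first + (itau - 1) * dt, t_first + itau * dt)

    print(scalar_valid_time(1))  # 2006-01-01 06:00 -> first Tair value
    print(flux_interval(1))      # (2006-01-01 00:00, 2006-01-01 06:00) -> first mean LWdown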

AD (26/5/16): Note that, in the above case, the time at which we read the forcing is t0+dt.

NVui (09/06/16): In my opinion, the time at which we read the forcing is t0, not t0+dt, because the instantaneous value at t0+dt is needed to interpolate the field from the t0+dt' time step, while the mean value between t0 and t0+dt is needed from the t0 time step.

I would also like to recall that in the current driver, the information stored in the time variable (or timestp variable) is not used for defining the time stamp. Consequently, what is written here about the time stamp and the time at which the data is relevant (instantaneous, +dt, ...) is only implicit. The time or timestp variable in the NetCDF file is only used for defining the time step (dt) of the forcing.

Let's introduce an index itau of the forcing records: itau=1 is the first record in the nc file for all variables, etc.

Imagine we are currently reading the record itau_n, coming just after the record itau_nm1. This is the notation of dim2driver/readdim2, and we have itau_n=itau_nm1+1. The "time stamps" corresponding to these two records are separated by dt (the dt between two records, not dt_sechiba=dt': dt=split*dt'). To make the link with the above notation of time, the time stamp of the record itau_n is t0+dt, and the one of itau_nm1 is t0.

Now we need to remember where we are in the code, and what we need the forcing for. When we read itau_n, with values corresponding to t0+dt, sechiba hasn't yet run between t0 and t0+dt. It has run (or been initialized) up to t0, with forcing controlled by the forcing records itau_nm1 and before. So when we read the record itau_n corresponding to the time stamp t0+dt, we know the state of sechiba at t0, the forcing values at t0 (from itau_nm1), and we want to correctly define the forcing variables over each dt' between t0 and t0+dt (dt=split*dt').

For me, these are general principles which need to hold whatever the forcing, if we decide to read together all the variables that have the same time stamp. The fact that fluxes are averages over a period of length dt after or before the time stamp can be dealt with in the above general framework (but the latter is an opinion).

NVui (09/06/16): I agree with the above comment
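
As an aside, the relation dt = split * dt' can be made explicit with a small sketch (illustrative Python, not the Fortran code; placing t' at the end of each dt' sub-interval is an assumption):

    def model_substeps(t0, dt, split):
        """Return the 'split' model times t' in ]t0, t0+dt] that are forced
        from the pair of records (itau_nm1, itau_n)."""
        dt_prime = dt / split                    # dt' = dt_sechiba
        return [t0 + k * dt_prime for k in range(1, split + 1)]

    # Example: 3-hourly forcing split into half-hourly model steps (split = 6).
    print(model_substeps(0.0, 3.0, 6))           # [0.5, 1.0, 1.5, 2.0, 2.5, 3.0] hours after t0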

Interpolation

  • Instantaneous fields and the Long-Wave Radiation are then linearly interpolated for each model time step t' (typically 1/2 hour) based on the values available at t0 and t0+dt, with t0 < t' < t0+dt. Should we not change that for Long-Wave Radiation so that we have one consistent procedure for scalars and another for fluxes?
  • The Precipitation field is downscaled to the model's temporal resolution using a step distribution function with one parameter (nb_spread, the number of half-hourly time steps over which the precipitation is spread); see the sketch after this list. nb_spread can vary from 1 (all precipitation over a forcing time period (3 or 6 hours) is put on the first model time step) to SPLIT_DT (the number of integration time steps within a forcing time period; in that case, the precipitation is distributed uniformly). By default, nb_spread is set to 1. This default value is not appropriate for all climates and it would make sense to have typical values for various regions of the world (see the thesis of T. d'Orgeval).
  • The Short-Wave Radiation is interpolated using a distribution function that corresponds to the solar angle distribution over a forcing time period, conserving the average flux over the [t0,t0+dt] interval. Under low incidence angles this interpolation at times gives strange values; this needs to be checked and improved.
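
The precipitation downscaling can be sketched as follows (a simplified illustration of the nb_spread mechanism described above, not the readdim2.f90 code; function and variable names are invented):

    def spread_precip(p_mean, split, nb_spread):
        """Put the whole forcing-period precipitation on the first nb_spread
        model steps and zero on the remaining ones, so that the mean over
        the forcing period is conserved."""
        assert 1 <= nb_spread <= split
        rate = p_mean * split / nb_spread
        steps = [rate] * nb_spread + [0.0] * (split - nb_spread)
        assert abs(sum(steps) / split - p_mean) < 1e-12   # conservation check
        return steps

    print(spread_precip(2.0, 12, 1))    # all the rain falls on the first half hour
    print(spread_precip(2.0, 12, 12))   # uniform distribution over the 6 hours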

AD (26/5/2016): I agree on the above general guidelines for interpolation, but:

  • LW should not be interpolated linearly: the mean value over the appropriate interval ([t0,t0+dt] in dim2driver) is the value read at t0+dt; if we linearly interpolate LW between t0 and t0+dt, the mean value becomes (LW(t0)+LW(t0+dt))/2, which is usually different from LW(t0+dt). So if we do a linear interpolation, we change the amount of energy received by the surface compared to what is defined in the forcing file.

    NVui (09/06/16): I agree with this.

  • Apparently, the default behaviour in dim2driver is to linearly interpolate LW (using the same interpolation scheme as for the instantaneous variables like Tair). This results from inter_lin=F by default, which induces netrad_cons=F and leads to the use of a linear interpolation, in contradiction with inter_lin=F. This needs to be double-checked by someone, and reported in a ticket if confirmed.

    NVui (09/06/16): Yes, I confirm. The flag 'inter_lin' only controls the flag 'netrad_cons'. So I think it is useless. The same holds for the boolean 'no_inter'.

  • The simplest conservative interpolation is to use a uniform function, but it creates "steps". If we wanted to smooth this and remain conservative, we could use a function to constrain the time variations, as is done with the zenith angle for SW. Based on the physics of LW and the data we have, we could define T4_mean as the mean of Tair^4 over the 6 (split) dt' in the interpolation period. Then LW(t0+n*dt') = LW(itau_n) * Tair(t0+n*dt')^4 / T4_mean. I think this is conservative, but it needs to be checked (a sketch illustrating this is given after Jan's reply below).

    NVui (09/06/16): I could possibly test this interpolation at FLUXNET stations.

JP (15/06/16) : Yes, in principle an energy-conservative method would be better for LWdown. At the time I developed the new methods for reading and processing the forcing, I compared a linear interpolation of LWdown with maintaining it constant over its period of validity, and the difference was negligible. But the "constant interpolation" can easily be implemented in the new driver and help would be appreciated.
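
A minimal sketch of the Tair^4-weighted distribution proposed above (illustrative Python, not driver code; it assumes the half-hourly Tair has already been interpolated, and the Tair values are invented) shows that the period mean is conserved by construction:

    def lw_t4_interp(lw_mean, tair_substeps):
        """Distribute the forcing-period mean LWdown over the model sub-steps
        proportionally to Tair^4, so that the period mean equals lw_mean."""
        t4 = [t**4 for t in tair_substeps]
        t4_mean = sum(t4) / len(t4)
        return [lw_mean * x / t4_mean for x in t4]

    tair = [278.0 + 0.5 * k for k in range(12)]   # half-hourly Tair over 6 h (K), invented
    lw = lw_t4_interp(310.0, tair)                # 310 W/m2 read at itau_n
    print(sum(lw) / len(lw))                      # ~310.0 -> mean is conserved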

  • Should we understand that the default spred_prec is 1 in orchideedriver? We recently made a collective choice to take spred_prec=split/2; has it been overlooked in orchideedriver?
  • For the SW, what are the low incidence angle problems? It must be kept in mind that there is no particular reason why the diurnal cycle of SW should be either very smooth or well centred at noon. We do not deal with extra-terrestrial radiation, but with incident SW at the surface, and the noise induced by clouds can be very large.

    NVui (09/06/16): Even if clouds may induce noise, I don't think they can explain the strange values under low incidence angles that Jan reports. The SWdown interpolated by orchideedriver at 6pm has systematic spikes, with values of the same order as those at noon. This is not realistic. In my opinion, there is a bug in the interpolation scheme.

JP (15/06/16) : I looked into the solar interpolation in detail, as it is through this variable that I found that dim2driver was not working correctly and decided to re-write this code. It is not a bug but just the complexity of distributing a given amount of energy proportionally to the zenith angle with the condition that at night SWdown should be zero. If, over a 3h interval, you have 100 W to distribute but the sun is only 30 minutes above the horizon, you have a problem: 600 W! I would be happy to test any new and original solution to this problem.
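
A simplified illustration of this principle (not the actual driver code; the weights and numbers are invented) shows how the constraint produces the large values Jan mentions when the sun is above the horizon for only a short part of the interval:

    def distribute_sw(sw_mean, cos_zenith):
        """Distribute the forcing-period mean SWdown over the sub-steps
        proportionally to max(cos(zenith), 0), keeping the period mean unchanged."""
        w = [max(c, 0.0) for c in cos_zenith]
        w_mean = sum(w) / len(w)
        if w_mean == 0.0:                  # sun below the horizon for the whole interval
            return [0.0] * len(w)
        return [sw_mean * x / w_mean for x in w]

    # 3 h interval (6 half-hourly steps) with the sun up during only one sub-step:
    cosz = [0.0, 0.0, 0.0, 0.0, 0.0, 0.17]
    print(distribute_sw(100.0, cosz))      # last step gets 600 W/m2, the period mean stays 100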

Jan's understanding of the working of dim2driver (Jan 26/05/2016)

It is important to understand that readdim2.f90 is written in time steps and not physical time. There are 2 time-step counters in the driver:

  • itau_n : the index of the last record read from the time axis of the forcing file in readdim2.f90. itau_nm1 is the previously read value and is used together with itau_n for the interpolation. Note that itau_nm1 = itau_n - 1.
  • itauin : the time-step counter of ORCHIDEE; it takes 'split' values between itau_nm1 and itau_n.

The date at each of these time steps is given by t(itau_n), or t(itauin) for ORCHIDEE, and we can write t(itau_nm1)=t0 and t(itau_n)=t0+dt.

readdim2.f90 then interpolates using the values at itau_nm1 and itau_n to obtain the forcing at itauin, which is the current time step of ORCHIDEE. Once itauin has stepped through split values, the forcing values of itau_n are copied to itau_nm1 and the new values for itau_n are read from the netCDF file. Because of this time-step approach, the physical time chosen at itau_n=1 is very important.

The time-stepping is done independently of the physical time. Thus it is the choice of the physical time at itau_n=1 which determines at which dates the values of the forcing at itau_nm1 and itau_n are valid. This poses no problem for instantaneous variables in the forcing file, as then the indexing space (itau_n) and the physical time are identical.

The issue becomes more complicated when we consider fluxes which are averaged over an interval. The only valid assumption is that the date corresponding to index itau_n is within the time interval over which the average was performed. In readdim2.f90 there is no information on whether the date of itau_n is at the start, the end or anywhere else within the averaging period.

The graphic below probably explains the above description better.

The upper panel of the figure is meant to illustrate that the interpolation for fluxes in all generality (here for fluxes valid over the forcing time step before the instantaneous values) will not produce an interpolated value at the same time as the scalar field. readdim2.f90 implicitly assumes that the interval of validity of the fluxes at index itau_n is between itau_n-1/2 and itau_n+1/2; otherwise the flux and scalar interpolations give results at different physical times.

The following explanations will probably help understand the meaning of the interpolated values in physical time. Let us suppose itauin is exactly in the middle of the interval [itau_nm1, itau_n]. From the upper panel of the figure we can deduce the following:

  • Tair(itauin) = (Tair(itau_nm1)+Tair(itau_n))/2 will be valid at t(itauin) = (t(itau_nm1)+t(itau_n))/2; thus the interpolated value is exactly in the middle of the physical time interval.
  • LWdown(itauin) = (LWdown(itau_nm1)+LWdown(itau_n))/2 will be valid at t(itauin) = (t(itau_n-2)+t(itau_n))/2 = t(itau_nm1). This is thus different from the time at which Tair(itauin) is valid, by exactly half a forcing time step.

The lower panel shows that this error is avoided only if the fluxes are averages centred on the time at which the scalars are valid. Thus the old driver is only correct if the fluxes at time step itau_n are valid exactly over the time interval t(itau_n-1/2) to t(itau_n+1/2).
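
A small numeric sketch of this half-step mismatch (illustrative Python; it assumes, as in the upper panel, that the flux at itau_n is the average over the dt preceding its time stamp, with dt = 6 h):

    dt = 6.0                               # forcing time step (hours)
    t_nm1, t_n = 12.0, 18.0                # time stamps of itau_nm1 and itau_n

    # Instantaneous scalar: valid exactly at its time stamp, so the linear
    # interpolation at mid-interval is representative of the mid-interval time.
    t_tair = 0.5 * (t_nm1 + t_n)           # 15.0

    # Flux averaged over the dt preceding its time stamp: its centre of
    # validity is the middle of that averaging interval.
    centre_n = t_n - 0.5 * dt              # 15.0
    centre_nm1 = t_nm1 - 0.5 * dt          # 9.0
    t_lw = 0.5 * (centre_nm1 + centre_n)   # 12.0 = t(itau_nm1)

    print(t_tair, t_lw, t_tair - t_lw)     # 15.0 12.0 3.0 -> exactly dt/2 apart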

AD(27/05/2016): I disagree with the above statement that motivates the development of a new interpolation method: "The time-stepping is done independently of the physical time. [...] The issue becomes more complicated when we consider fluxes which are averaged over an interval. The only valid assumption is that the date corresponding to index itau_n is within the time interval over which the average was performed. In readdim2.f90 there is no information on whether the date of itau_n is at the start, the end or anywhere else within the averaging period."

First, the time is defined without ambiguity from the initial time and the number of records that were read. Second, when we read a flux at record itau_n, we know perfectly well to which interval it corresponds. For WFDEI for instance, we know from http://www.eu-watch.org/gfx_content/documents/README-WFDEI%20(v2016).pdf that the record at itau_n holds the average flux over the 3-hourly period that precedes the "time stamp" of itau_n ([t0,t0+dt] according to the above notations). It does not mean that LW(itau_n) is only "valid" at one instant within [t0,t0+dt], as suggested above. This is perfectly clear, perfectly coherent with the first panel of the above graphic, and tractable by dim2driver with the right values of inter_lin and netrad_cons.

It further means that, whichever interpolation we perform, in the WFDEI case the average of the interpolated flux over the 3h that precede t0+dt must equal the value read in record itau_n. It thus implies that any interpolation that does not preserve the mean over [t0,t0+dt] is inappropriate for such a flux.

Looking at LWdown at the bottom of the 2nd panel above, the mean over [itau_nm1,itau_n]=[t0,t0+dt] is not equal to the value read at itau_n in either of these two cases: (a) uniform values over the green and brown intervals, (b) linear interpolation between the centre of the green and the centre of the brown intervals. Maybe another solution is coded, but then it would be nice to explain it.

NVui (09/06/16): I don't perfectly understand what has been written above, but I don't agree when Jan concludes that the old driver is only correct if the fluxes at time step itau_n are exactly valid over the time interval t(itau_n-1/2) to t(itau_n+1/2). Developing the rationale on LWdown is not appropriate, as we all agreed that LWdown should not be linearly interpolated, in order to preserve the mean value. When we look at the code of readdim2, the interpolation of swdown is only based on the value 'swdown_n' (see http://forge.ipsl.jussieu.fr/orchidee/browser/trunk/ORCHIDEE/src_driver/readdim2.f90#L1603) and not on the value of swdown_nm1. For me, this clearly means that the value stored at 'itau_n' (swdown_n) is valid over the time interval from itau_nm1 (t0) to itau_n (t0+dt).

Add an explicit time information within the forcing files

As described above, the driver of ORCHIDEE is relatively rigid because it makes strict assumptions on the temporal specification of the forcing files. At first, we do not expect to develop a more flexible driver, but we would like to add explicit time information to the forcing files in order to avoid misuse. We suggest adding a time_bounds attribute to the time variables stored in the NetCDF forcing file, according to the CF convention: http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.6/cf-conventions.html

Each file will contain at least 2 time axes, one for the scalar variables and one for the flux variables. Each of the time axes will have its "time_bounds" variable in order to describe the interval to which each value applies. A time_op attribute (average, point) could also be added for each variable. The time specification of a forcing file (based on the time_bounds and time_op attributes) should be read by the driver, in order to check for consistency.
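
A hedged sketch of how such attributes could be added with the Python netCDF4 package (not an official ORCHIDEE tool; the file name, variable names and the mean(end) assumption are illustrative):

    import netCDF4

    with netCDF4.Dataset("forcing.nc", "a") as nc:
        time = nc.variables["time"]                 # time axis of the flux variables (assumed name)
        if "nbnds" not in nc.dimensions:
            nc.createDimension("nbnds", 2)
        tb = nc.createVariable("time_bounds", time.dtype, (time.dimensions[0], "nbnds"))
        dt = time[1] - time[0]                      # assumes a regular time axis
        tb[:, 0] = time[:] - dt                     # averaging interval precedes the stamp,
        tb[:, 1] = time[:]                          # i.e. "time: mean(end)"
        time.bounds = "time_bounds"
        for v in ("SWdown", "LWdown", "Rainf", "Snowf"):
            nc.variables[v].cell_methods = "time: mean(end)"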

Questions that remain:

  • How to implement in detail such a solution without perturbing too much the day-to-day usage of ORCHIDEE.
  • It would be most easily managed if we built a library of forcing files, available to all at IPSL, which contains only forcing data sets with all the required information.
  • Should we try to support old and incomplete forcing files, and for how long?
  • The improved driver should use all the extra time information in the forcing files but this might require some changes to IOIPSL.

NVui (09/06/16): This functionality of the driver developed by Jan is a key feature. It has the advantage of providing self-documenting datasets, where the temporal specificities are explicitly set as an attribute of each data field. In addition, the capacity of the driver to read and use this information enables the direct use of the original datasets as they are produced by data centres, without any modification (i.e. removing one time step for some or all variables). This modification was often needed when using the current driver.

Restart the last forcing time step read

As the driver is built now, the first time step read (for the Temperature for instance) is valid for t+dt. This means that for a yearly file at a 6-hourly time resolution, the first temperature read is for "1-JAN 06:00".
For a simulation starting from scratch (no restart), the driver uses the first value for the second day ("2-JAN 00:00") in order to interpolate with the first one read (and to calculate half-hourly values between "1-JAN 00:00" and "1-JAN 06:00"). At the end of a simulation, the last forcing time step read is not written to the restart file of the driver.
When we perform monthly runs using a yearly forcing file, the driver manages to use the appropriate time step in the forcing file, but when we perform yearly runs using a yearly file, we proceed as for a run starting from scratch (whereas we could use the last values read during the previous run).
We suggest writing the last time step read for the meteorological variables to the restart file in order to make this possible.

Once the improved driver knows the time interval over which the forcing is valid, we can test whether the restart date is correct and dump the next restart at the end of the time interval.

Remove the reading of the watchout files

In ORCHIDEE, there is the possibility to write the meteorological fields used during the simulation to an output file. This feature is useful when doing a coupled simulation (ORCHIDEE coupled to LMDz) in order to be able to re-use this 'forcing' in off-line mode. These output files are named WATCHOUT files. They are not based on the same template as the standard forcing files read by ORCHIDEE.
Consequently, there are specific reading routines for these WATCHOUT files.
We suggest to:

  • either generate the WATCHOUT files in ORCHIDEE in the same format as a standard forcing file,
  • or develop a specific tool that converts the WATCHOUT files into the same format as a standard forcing file. This tool would be launched as a pre-processing step.

Comparison of the old and new drivers

ORCHIDEE now has 2 drivers which are quite different in their design. As discussed above, the temporal interpolation is a challenge, so it is worth verifying how well it is done by both drivers and how important the differences are. This effort is undertaken here.

The issue is particularly critical with global forcings, which are available at 3- or 6-hour intervals, as any shift in the date attributed to the data points impacts the interpolation in time of the variables driving ORCHIDEE. This issue is largely attenuated for site forcing where atmospheric conditions are available at a frequency higher than hourly. Thus, the tests performed here are carried out over the 5 global forcing data sets which are currently available to the community:

  • CRU-NCEP : The NCEP re-analysis corrected by Nicolas Viovy using CRU data. Version 5.4, which is used here, only has data at 6-hourly intervals.
  • GSWP3 : The forcing prepared by the University of Tokyo for the international GSWP exercise, based on the C20C re-analysis of ECMWF.
  • PGF : The Princeton forcing, which combines the NCEP re-analysis, CRU and GPCC data.
  • WFD : Based on ERA-40 and corrected in the same way as WFDEI (this is the original WATCH data set).
  • WFDEI : Based on ERA-I and corrected by the UKMO using the independent CRU information (an evolution of the WATCH data set).

Modelling set-up

The ORCHIDEE trunk version of May 2016 is used for all the tests. All forcings are applied to the model at a 900 s time step, and the model is configured to output through XIOS at this same time interval. This ensures that any post-processing done by XIOS on the forcing variables for the ORCHIDEE outputs is negligible compared to the assumptions on the time axis in the forcing.

From the forcing files a small region (lon=-10.3:5.3 lat=35.3:45.3) was extracted to reduce computing time and ensure that the original data set could easily be compared to the model output. The year 2006 was used, except for WFD where it was replaced by 1996 as this forcing ends in 2001.

Orchideedriver needs some information on the time representativeness of the data for its interpolations. The CF convention proposes to put this information into the attribute "cell_methods". This information is not used by dim2driver.

CRU-NCEP, WFD and WFDEI already came with the attributes describing the time averaging of the fluxes. Two cases are distinguished for the fluxes, while the scalars are instantaneous:

  • Fluxes WFDEI : "time: mean(end)" (SWdown LWdown Rainf Snowf)
  • Fluxes CRU-NCEP & WFD : "time: mean(start)" (SWdown LWdown Rainf Snowf)
  • Scalar : "time: instantaneous" (Tair Qair PSurf Wind)

For GSWP and PGF this information was added after extracting the region of interest, with the following commands:

  • GSWP Fluxes :
	for v in $FLUX ; do
	    ncatted -h -a cell_methods,$v,o,c,"time: mean(center)" ES_GSWP3_${i}.nc
	done

  • GSWP Scalar :
	for v in $SCALAR ; do
	    ncatted -h -a cell_methods,$v,o,c,"time: instantaneous" ES_GSWP3_${i}.nc
	done
  • PGF Fluxes :
	for v in $FLUX ; do
	    ncatted -h -a cell_methods,$v,o,c,"time: mean(start)" ES_PGF_${i}.nc
	done
  • PGF Scalar :
	for v in $SCALAR ; do
	    ncatted -h -a cell_methods,$v,o,c,"time: instantaneous" ES_PGF_${i}.nc
	done

Furthermore, for the fluxes in GSWP we tested all three possible hypotheses:

  • GSWPe : "time: mean(end)"
  • GSWPc : "time: mean(center)"
  • GSWPs : "time: mean(start)"

Hyungjun Kim later confirmed that the "mean(end)" assumption is the correct one for GSWP3. For PGF this setting was deduced from the metadata in the file and should be verified with the authors.

Note: During these tests it was noted that the option "TIME_SKIP" in the run.def does not work correctly. It should not be used.

AD (09/06/2016): For me, the "cell_methods" that are used above are not all consistent with the documentation of the forcings. See http://www.eu-watch.org/gfx_content/documents/README-WFDEI%20(v2016).pdf for WFD and WFDEI. For the fluxes, the above analysis seems incorrect for WFD; and for GSWP, has mean(center) or mean(end) been used? A personal synthesis is available at https://forge.ipsl.jussieu.fr/orchidee/attachment/wiki/Branches/Driver_Improvements/forcing_driver.xls.

JP (15/06/2016) : I made a mistake above for WFD and CRU-NCEP. It should have been "mean(start)" for the fluxes. Corrected now.

AD (19/06/2016): I think that CRU-NCEP should use "mean(end)", like WFDEI and GSWP. By the way, there is only one line for the GSWP forcing; with which cell_method has it been produced?

Note also that dim2driver needs a forcing that has its first record at 0 UTC, in addition to instantaneous "scalars" and fluxes as mean(end), to work correctly. This is why dim2driver shows a weird diurnal cycle with WFDEI on https://forge.ipsl.jussieu.fr/orchidee/wiki/Branches/Driver_Improvements/ForcingVariable. The original WFDEI starts at 3 UTC, and with dim2driver you need to use WFDEIv1, in which a lag of 3 hours has been introduced by Ben Poulter and Fabienne Maignan. But I suppose that Jan has used dim2driver with the original WFDEI forcing, which would explain why WFDEI is not well phased in Jan's tests.

For WFD and GSWP3, we lack information on the time of the first record and on the selected cell_method to conclude from the graphics. And orchideedriver too needs to account for the time corresponding to the first record. Is such flexibility permitted?

CRU-NCEP is well adapted to dim2driver, and behaves satisfactorily with it, while the SW shows weird peaks at sunset with orchideedriver. This probably comes from the fact that it tries to redistribute too much SW over too short a period of daytime, because of the dt/2 shift introduced in the new driver (see the above figure with the green and brownish intervals). On https://forge.ipsl.jussieu.fr/orchidee/wiki/Branches/Driver_Improvements/ForcingVariable, you can also see that SW starts to increase after night time at a later time with orchideedriver than with dim2driver, which is again consistent with a dt/2 shift. The "scalar" variables show synchronous variations with both drivers, but it seems that the wind speed is a bit higher with orchideedriver (???). Finally, the LW is not similarly phased, probably because dim2driver uses the same linear interpolation as for the scalars, while orchideedriver does it between the "interval centres" (at t0+dt/2 and t0+3dt/2, if I understand the above graphics correctly). But again, I believe that linear interpolation should not be used on LW.

Analysis

Analysis can be tricky as the graphical processing needs to interpret the time dimension correctly so that the curves are well positioned on the time axis. The difference in representativeness of instantaneous and averaged values also needs to be illustrated in the graphics. The following choices were made for the original forcing data:

  • Instantaneous values are represented by symbols as, in all rigour, they are only valid at the given point in time.
  • Mean fluxes are represented by horizontal lines over the period of validity of the value. This is the simplest assumption which can be made for average values.

The model is always represented by continuous lines.
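
A minimal matplotlib sketch of these plotting conventions (not the analysis script linked below; the data values are invented):

    import numpy as np
    import matplotlib.pyplot as plt

    t_forc = np.arange(6, 49, 6)                     # 6-hourly forcing time stamps (hours)
    tair = 280 + 5 * np.sin(np.pi * t_forc / 24)     # instantaneous scalar (invented)
    lwdown = 300 + 20 * np.sin(np.pi * t_forc / 24)  # flux, mean over the preceding 6 h (invented)
    t_mod = np.arange(6, 48.1, 0.25)                 # 900 s model output times
    tair_mod = np.interp(t_mod, t_forc, tair)        # stand-in for the model output

    fig, ax = plt.subplots()
    ax.plot(t_forc, tair, "o", label="Tair forcing (instantaneous: symbols)")
    for t, f in zip(t_forc, lwdown):                 # mean flux: horizontal line over its validity period
        ax.hlines(f, t - 6, t, colors="tab:red")
    ax.plot(t_mod, tair_mod, "-", label="model (continuous line)")
    ax.set_xlabel("time (hours)")
    ax.legend()
    plt.show()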

To allow for the verification of the results presented here, the python code used to produce the graphics is provided here.

Results

The results for all forcing files are presented for a selected point within the region and two diurnal cycles in June (20 and 21 June). Summer was chosen because of its more marked diurnal cycle. It did not rain at this point during the days examined.

The temporal evolution of two scalar variables (Tair and Qair) is compared, as well as two fluxes (LWdown and SWdown). The potential evaporation computed by ORCHIDEE is also shown, as this is the most direct implication of the forcing on the model after precipitation. It allows one to judge how strongly the errors made in the interpretation of the forcing affect the functioning of the model. This is illustrated by the mean difference in PET for the month of June.

NVui (09/06/16): Based on the temporal specification of the forcing files expected by the current driver, only the CRU-NCEP dataset can be used directly by it, without any modification. In this dataset, instantaneous variables are given at t+dt and the fluxes are averages between t and t+dt (mean(end)). The other forcings have different temporal specifications that will produce inconsistent interpolations for all or some of the meteorological fields with dim2driver.

NVui (09/06/16): Looking at the interpolations done by orchideedriver, I think there is a problem with the interpolation of SWdown. As already mentioned above, there are systematic spikes at 6pm which cannot be explained by clouds or other 'noise' sources alone. In addition, there are time periods (between midnight and 6am for instance with CRU-NCEP) where the forcing indicates non-zero radiation, while the interpolated values are 0 over the entire period. This does not preserve the mean value.

JP (15/06/16) : Both drivers use the same code for the temporal interpolation of SWdown. The difference is only in the way they phase the 3-hourly forcing with the computed solar angle. This has important consequences at the beginning and end of the day. Perhaps one solution would be to allow some solar radiation (diffuse!) also when the sun is below the horizon.

I'd also like to understand why there is a difference in the wind speed 'downscaling' from 10m to 2m between the drivers (look at the comparison for CRU-NCEP where the 2 interpolated time series are in phase but with different values). Is it due to the displacement height? In the current ORCHIDEE, the displacement height is set to 0.75*veget_height but we often find in the literature a value of 0.66*veget_height. Is this latter value used in orchideedriver?

JP (15/06/16) : Here also both drivers use exactly the same code, and I find that they produce about the same values of wind. So I do not understand your question.

Conclusions

For all the low-frequency forcings (less frequent than hourly), dim2driver does not provide the correct forcing to ORCHIDEE. This is due to the fact that instantaneous values for scalars and averages for fluxes are provided in the forcing files. Some more detailed conclusions are apparent from these tests.

  • Two categories of forcing emerge because of their different conventions as to the placement of the mean values: ERA based (WFDEI, GSWP = mean(end)) or NCEP based (CRU-NCEP, PGF = mean(start)).

WFD is a little particular here.

  • The attribute "cell_methods" is needed in the files so that the drivers can perform correct time interpolations.
  • The phasing of the diurnal cycle of the atmospheric variables is important in defining potential evaporation.
  • The uncertainty on the potential evaporation is probably not as important as the one on precipitation, which was not tested here.
  • As demonstrated by the figures below, the correct treatment of the diurnal cycle in orchideedriver results in much more comparable potential evaporations in ORCHIDEE.

Or, in other words, dim2driver introduces differences which are not physical. The PET obtained with orchideedriver is on the left and with dim2driver on the right.

  • dim2driver generally underestimates potential evaporation.
  • The interpolation of SWdown should be revised as it produces discontinuous fluxes to the model.
  • Should a forcing have its fluxes centred on the time interval of averaging, perhaps dim2driver would behave correctly!
  • Note that the map of PGF is probably shifted by half a grid spacing in longitude and latitude.

Discussions between Nicolas Vuichard and Jan Polcher (16/6/16)

During a discussion we came to the conclusion that there must be a configuration of dim2driver and orchideedriver which yields the same results for the CRU-NCEP forcing. We believed this to be the case as CRU-NCEP was designed for dim2driver. It is the only forcing which contains different time axes for scalars and fluxes as well as the attributes needed by orchideedriver.

Tests were performed and some additions made to orchideedriver to try and achieve this. The following steps were needed and are now committed on the SVN server under revision 3567:

1) Dawn and dusk issues in orchideedriver : The spikes in SWdown in the new driver are an indication that the solar angle becomes zero too quickly. The formulation used in readdim2.f90 for generating the diurnal cycle could not be understood, so the solution was to implement a dawn_angle in orchideedriver. This is the solar angle below which diffuse radiation is dominant and light still reaches the ground although the sun is setting. For angles between zero and this dawn_angle, the solar angle is maintained at dawn_angle (a small sketch of this clamping is given after point 3 below).

2) Maintaining LWdown : A nearest-neighbour interpolation was implemented in orchideedriver. It is activated with LWDOWN_cons=y. In dim2driver the same result is obtained with INTER_LIN = y and NETRAD_CONS = y.

3) The vegetation is not started in the same way in both drivers. It was decided to launch both simulations with the same restart file. This surprising behaviour of the initialisation of the model is probably a bug in slowproc.f90.
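
A small sketch of the dawn_angle clamping described in point 1 (illustrative Python, not the orchideedriver code; the threshold value and the use of cos(zenith) as the weight are assumptions):

    import math

    def effective_cos_zenith(cos_zenith, dawn_angle_deg=5.0):
        """Weight used to distribute SWdown: zero when the sun is below the
        horizon, but never smaller than the value at dawn_angle while the sun
        is up, so the energy is not squeezed into a vanishingly short sunlit period."""
        floor = math.sin(math.radians(dawn_angle_deg))   # cos(zenith) at an elevation of dawn_angle
        if cos_zenith <= 0.0:
            return 0.0                                   # night: SWdown stays zero
        return max(cos_zenith, floor)

    print(effective_cos_zenith(0.01))   # low sun -> held at the dawn_angle value (~0.087)
    print(effective_cos_zenith(-0.1))   # night -> 0.0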

The results are presented below. The green line cannot be seen as it is perfectly overlaid by the blue one!

AD(22/06/2016): This is very encouraging, thank you for all this work. Yet some little questions remain. When looking at the plot in the above link, which seems to correspond to a grid point touching the one at which the above image was obtained, I see different wind speeds, RadT and EPot for the two CRU-NCEP simulations. The difference in wind could explain the other differences, but why is the wind different?

Update from the Earth2Observe Project

In the framework of the Earth2Observe project, three models (JULES, ISBA and ORCHIDEE) have compared the way they reconstruct the diurnal cycle from the 3-hourly E2OFD. Please note that JULES has only provided hourly values and thus there is a small shift at strong changes.

A few things can be noted in https://forge.ipsl.jussieu.fr/orchidee/attachment/wiki/Branches/Driver_Improvements/Graphics_ALL_E2OFD.pdf :

  • The SWdown is quite different in all 3 models. Only ORCHIDEE tries to generate a diurnal cycle using the solar zenith angle.
  • The scalars are interpolated in exactly the same way.
  • There is a time shift in the radiation fluxes produced by ISBA related to the difficulties exposed above.

The results of the driver inter-comparison done for the Earth2Observe project are now available here: https://forge.ipsl.jussieu.fr/orchidee/attachment/wiki/Branches/Driver_Improvements/Analysis-SubDiurnal_v3.pdf. It is interesting to see that the new driver of ORCHIDEE behaves as designed and reproduces a slightly more realistic diurnal cycle than the two other models.

After some discussion with our colleagues from JULES, it appears their simulation could have been better configured for the E2OFD data. Their driver also foresees the various difficulties with temporal disaggregation: http://jules-lsm.github.io/vn4.9/input/temporal-interpolation.html.
