Custom Query (116 matches)

Results (73 - 75 of 116)

Ticket #85, resolution fixed: Interpolation does not work (owner ymipsl, reporter ssenesi)
Description

A simple test of interpolation to a rectilinear domain fails at the close_context stage. See the attached files: s.xml defines one interpolated field and one single output file. When the interpolated field is enabled, XIOS crashes with the attached stack trace. In my application, almost all definitions are done through the API. The internal grid is an unstructured one.

Ticket #86, resolution fixed: Allowing durations expressed as strings with the Fortran API (owner ymipsl, reporter ssenesi)
Description

Model developers would appreciate being able to call:

call xios_set_attr('field', freq_op='6h')

i.e. to benefit, through the API, from the same features for expressing durations as in the XML file (without having to manage XIOS types).

I did not find a function for converting a string to a duration, which could help in this respect.
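To illustrate what such a conversion could look like, here is a minimal sketch of a hypothetical helper (not part of the XIOS API) that parses a duration string such as "6h" into seconds. The unit suffixes are assumed to mirror the ones accepted in XML attributes (y, mo, d, h, mi, s), with simplified fixed-length months and years:

```cpp
#include <cctype>
#include <stdexcept>
#include <string>

// Hypothetical helper: convert a duration string such as "6h" or "30mi"
// into seconds. XIOS itself does not expose a documented public function
// for this; the suffixes below are assumptions based on the XML syntax.
long long duration_to_seconds(const std::string& text) {
    std::size_t pos = 0;
    // Parse the leading numeric part.
    while (pos < text.size() &&
           std::isdigit(static_cast<unsigned char>(text[pos]))) ++pos;
    if (pos == 0) throw std::invalid_argument("no numeric part in: " + text);
    long long value = std::stoll(text.substr(0, pos));
    std::string unit = text.substr(pos);
    if (unit == "s")  return value;
    if (unit == "mi") return value * 60;
    if (unit == "h")  return value * 3600;
    if (unit == "d")  return value * 86400;
    if (unit == "mo") return value * 86400 * 30;   // simplified month
    if (unit == "y")  return value * 86400 * 365;  // simplified year
    throw std::invalid_argument("unknown unit in: " + text);
}
```

With this, `duration_to_seconds("6h")` yields 21600, which the Fortran wrapper could then map onto an xios_duration value.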

Ticket #90, resolution fixed: MPI dead lock in XIOS (owner ymipsl, reporter mcastril)
Description

We are experiencing a recurring issue with XIOS 1.0. It appeared when using NEMO 3.6 stable with more than 2600 cores, and it seemed to be solved by using the Intel 16 compiler and IMPI 5. However, after updating to the NEMO 3.6 current stable, the problem appears when using 1920 or more cores. I do not really see how the NEMO revision change could affect this, but there it is.

The problem is just in this line of client.cpp:

MPI_Send(buff,buffer.count(),MPI_CHAR,serverLeader,1,CXios::globalComm) ;

Meanwhile, server.cpp calls MPI_Iprobe continuously in order to receive all the MPI_Send messages.

What we have observed is that with a high number of cores, around 80-100 of them get stuck in the MPI_Send, causing the run to hang and never complete. The fact that with a certain number of cores the issue appears 80% of the time, but not always, made us think it could be related to the IMPI implementation.
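A plausible mechanism behind such hangs is that for large messages MPI implementations switch from the eager to the rendezvous protocol, so MPI_Send blocks until the matching receive is posted. A standalone sketch (not XIOS code; message size and ranks are arbitrary) of the usual mitigation, replacing the blocking send with MPI_Isend plus an MPI_Test polling loop so the sender stays responsive:

```cpp
#include <mpi.h>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank = 0, size = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    // Large enough that most implementations use the rendezvous
    // protocol, where a blocking MPI_Send would wait for the receiver.
    const int n = 1 << 20;
    std::vector<char> buf(n, 'x');

    if (size >= 2) {
        if (rank == 0) {
            MPI_Request req;
            MPI_Isend(buf.data(), n, MPI_CHAR, 1, 1, MPI_COMM_WORLD, &req);
            int done = 0;
            while (!done) {
                // Unlike MPI_Send, the sender can make progress here
                // (e.g. probe for other traffic) while waiting.
                MPI_Test(&req, &done, MPI_STATUS_IGNORE);
            }
        } else if (rank == 1) {
            MPI_Recv(buf.data(), n, MPI_CHAR, 0, 1, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        }
    }
    MPI_Finalize();
    return 0;
}
```

Run with two or more ranks (e.g. `mpirun -np 2`). Whether this pattern fits the client.cpp call site quoted above depends on the surrounding XIOS protocol, so it is offered only as an illustration of the blocking-send hazard.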
