!===============================================
! compiler information
!===============================================
!
! All the keys in this section should be filled here or put in a compiler rc
! file. See the other machine rc files for examples.
!
! So, either include a file:

#include ${my.svn.dir}/rc/compiler-ifort-15.0.1-theia.rc

!===============================================
! libraries for HDF4 support (mandatory)
!===============================================

! Z library (used for compression in HDF)
compiler.lib.z.fflags    :
compiler.lib.z.libs      : -lz

! JPEG library (used for compression in HDF)
JPEG_HOME                :
compiler.lib.jpeg.fflags :
compiler.lib.jpeg.libs   : -ljpeg

! SZ library (used for compression in HDF)
SZIP_HOME                :
compiler.lib.sz.fflags   : -I${SZIP}/include
compiler.lib.sz.libs     : -L${SZIP}/lib -lsz

! HDF4 library (without netcdf interface)
HDF4_HOME                :
compiler.lib.hdf4.fflags : ${HDF4_INCLUDE_OPTS}
compiler.lib.hdf4.libs   : ${HDF4_LINK_OPTS}

!===============================================
! libraries for NetCDF support (mandatory)
!===============================================
!
! Use one netcdf library, optionally with hdf5 if needed. To do so, add the
! corresponding 'with_*' to the 'my.df.define' key in your main rc file.

! NetCDF3 library (with_netcdf)
NETCDF_HOME                :
compiler.lib.netcdf.fflags :
compiler.lib.netcdf.libs   :

! NetCDF4 library (with_netcdf4)
NETCDF4_HOME                :
compiler.lib.netcdf4.fflags :
compiler.lib.netcdf4.libs   :

! NetCDF4 library with parallel IO features (with_netcdf4_par)
NETCDF4_MPI_HOME                : ${NETCDF}
compiler.lib.netcdf4_par.fflags : -I${NETCDF}/include
compiler.lib.netcdf4_par.libs   : -L${NETCDF}/lib -lnetcdff -lnetcdf

! * Since NetCDF4 is built on top of HDF5, you may have to link/include the
! * HDF5 library when building the code:

! HDF5 library (with_hdf5)
HDF5_HOME                :
compiler.lib.hdf5.fflags :
compiler.lib.hdf5.libs   :

!
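! As an illustration, a filled-in serial HDF5 block might look like the
! commented lines below. The install prefix is hypothetical; adjust it to
! your system (-lhdf5_hl -lhdf5 are the standard HDF5 high-level and core
! library names):
!
!HDF5_HOME                : /opt/hdf5/1.8.14
!compiler.lib.hdf5.fflags : -I${HDF5_HOME}/include
!compiler.lib.hdf5.libs   : -L${HDF5_HOME}/lib -lhdf5_hl -lhdf5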
! HDF5 library with parallel features (with_hdf5_par)
HDF5_MPI_HOME                : ${HDF5}
compiler.lib.hdf5_par.fflags : ${HDF5_INCLUDE_OPTS}
compiler.lib.hdf5_par.libs   : ${HDF5_LINK_OPTS}

!===============================================
! Other libraries
!===============================================

! MPI library
MPI_HOME                :
compiler.lib.mpi.fflags :
compiler.lib.mpi.libs   :

! GRIB-API library (optional). It is needed only if you read meteo files from
! the ECMWF MARS system, where they are in GRIB format.
compiler.lib.grib_api.fflags :
compiler.lib.grib_api.libs   :

! A UDUNITS library is optional. It can be used when reading meteo files in
! netCDF format to check that unit conversions are correct. It is known to
! slow down the code significantly. The two main versions are supported.

! Udunits v1.x library (add with_udunits1 to my.tmm.define in your main rc file)
UDUNITS_HOME                 :
compiler.lib.udunits1.fflags :
compiler.lib.udunits1.libs   :

! Udunits v2.x library (add with_udunits2 to my.tmm.define in your main rc file)
compiler.lib.udunits2.fflags :
compiler.lib.udunits2.libs   :

! LAPACK library (optional). If available, add with_lapack to my.tm5.define
! in your main rc file. See the User Manual for further details.
LAPACK_HOME                :
compiler.lib.lapack.fflags :
compiler.lib.lapack.libs   : -mkl=sequential

!===============================================================
! SETTINGS FOR BATCH SCHEDULER (qsub,bsub,loadleveler,slurm,pbs)
!===============================================================
!
! Settings in this section can be in a dedicated rc file, or just left here.
! So, either include such a file:

#include ${my.svn.dir}/rc/queue-slurm-theia.rc

!===============================================
! make
!===============================================
!
! Specify the make command. The setup script will insert the 'build.jobs'
! value specified in the expert.rc (probably 8), but this can be overwritten
! at the command line (argument to the 'setup_tm5' script).
!
maker : gmake -j %{build.jobs}

!JOB-COMPILE ! If you want to submit the compilation to a compute node, uncomment and set
!JOB-COMPILE ! the necessary keys. See the user manual for further details.
!JOB-COMPILE !
!JOB-COMPILE ! - if F (default), pycasso calls the ${maker} command in the foreground.
!JOB-COMPILE ! - if T, then compilation is submitted according to the "submit.to" key
!JOB-COMPILE !   (the default of which is set in expert.rc, and can be overwritten at
!JOB-COMPILE !   the CLI): either "queue", or foreground.
!JOB-COMPILE !
!JOB-COMPILE ! "my.queue.make" should be set in the main rc file, so the user can
!JOB-COMPILE ! switch it on/off from there.
!JOB-COMPILE !
!JOB-COMPILE build.make.submit : ${my.queue.make}
!JOB-COMPILE
!JOB-COMPILE ! list of options for the job manager, and their value
!JOB-COMPILE queue..options.build : option1 option2 option3...
!JOB-COMPILE queue..option.build. :
!JOB-COMPILE queue..option.build. :
!JOB-COMPILE queue..option.build. :

!===============================================
! MPI runner
!===============================================
!
! command for running MPI parallel programs
mpirun.command : srun
mpirun.args    :

! name of command file; if empty, then executable and arguments are added to
! the command line
mpirun.cmdfile :

! name of host file
mpirun.hostfile :

!===============================================
! debugger
!===============================================
!
! type: totalview | idb | kdbg
debugger : totalview

! command for debugger:
! o KDE debugger around gdb (Linux systems using gfortran)
!debugger.command : kdbg
! o Intel debugger (for systems with Intel compiler)
!debugger.command : idb -gui
! o TotalView (IBM)
debugger.command : totalview -searchPath=${build.sourcedir}

!===============================================
! model data
!===============================================

! the user scratch directory
my.scratch : ${CARBONTRACKER}

! *Permanent* archives to search for meteo files
!
! (Note that the location of the meteo files at runtime is set in the main rc)
my.meteo.search : ${CARBONTRACKER}/METEO/tm5-nc

! base path to other input data files distributed with the model
my.data.dir :

! extra install tasks
my.install.tasks :
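! As an illustration, the data path is usually a local mirror of the input
! files distributed with the model; the commented line below shows the shape
! of such a setting (the path is hypothetical; adjust it to your system):
!
!my.data.dir : ${HOME}/TM5/input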