
TOPIC: Hirlam + openMP

Hirlam + openMP 8 years 2 months ago #444

I'm trying to run HIRLAM with OpenMP on a 4-core machine, and for this purpose I have compiled HIRLAM with OpenMPI.
Now I have run into the following problem:
As far as I understand, not all of HIRLAM's scripts work well with OpenMP; they do work well with MPI. Because of this I edited submission.db and set the options nprocx and nprocy to 1 by default, but for the Forecast case I set nprocx=2 and nprocy=2.
The question is how to set the environment variable LAUNCH correctly: if I set it to 4, it conflicts with nprocx*nprocy in checkoptions, but if I set it to 1, then all processes run on one core.
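
To illustrate what I mean (assuming checkoptions compares the task count implied by LAUNCH against nprocx*nprocy; that is my reading of it, not a quote from the script):

# Assumed check: the tasks requested via LAUNCH must equal nprocx*nprocy
# Default cases:  nprocx=1, nprocy=1  ->  product = 1, so LAUNCH=4 fails the check
# Forecast case:  nprocx=2, nprocy=2  ->  product = 4, so LAUNCH=4 is consistent
nprocx=2; nprocy=2
ntasks=$(( nprocx * nprocy ))
echo "LAUNCH would need to request $ntasks tasks here"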

Re:Hirlam + openMP 8 years 2 months ago #445

Eoin Whelan
Good morning

I am not sure if this is the best way to solve your problem, but here is how I have tailored my job submissions:

In the Met Éireann branch (operational) I use a locally defined submission.ichec with environment variables set in Env_domain (both in scripts) for all PBS batch submissions.

I define submission.ichec as the submission script in config-sh/config.ICHEC

See:
hirlam.org/trac/browser/branches/metie_h...2/scripts/Env_domain
hirlam.org/trac/browser/branches/metie_h...pts/submission.ichec
and
hirlam.org/trac/browser/branches/metie_h...nfig-sh/config.ICHEC
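
As a rough sketch of the idea (the variable names below are placeholders, not the real ones; see the linked files for the actual settings): config.ICHEC points the system at the locally defined submission script, and Env_domain exports the decomposition used by the PBS batch jobs.

# Hypothetical sketch only; the real names live in config-sh/config.ICHEC and scripts/Env_domain.
# config.ICHEC: select the locally defined submission script
SUBMISSION_SCRIPT=submission.ichec

# Env_domain: export the decomposition for the PBS batch jobs
export NPROCX=2
export NPROCY=2
export NPROC=$(( NPROCX * NPROCY ))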

Hopefully this helps.

Re:Hirlam + openMP 8 years 2 months ago #452

Hi Suleiman.
OpenMP and MPI are completely different ways of writing and running parallel programs and I do not think HIRLAM can make any use of OpenMP.
When running on a local machine, in the script submission.db I change nprocx and nprocy only for the "default" case. Something like:

# default nprocx, nprocy, nproc_hgs:
$nprocx = 4;
$nprocy = 2;
$nproc_hgs = 0;

This seems to be the safest place. We do not use the LAUNCH variable; it is more convenient to write the mpirun commands directly into the scripts Prog, Postpp and 3DVAR, where the parallel programs are actually launched.
In my case, in script Prog:

mpirun -r ssh -np $nproc $HL_LIB/src/$HIRLAM_CONFIG/bin/hlprog.x $OPTIONS || exit 1
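
Here $nproc is assumed to be the size of the decomposition above, i.e. nprocx * nprocy (with nproc_hgs = 0 it adds nothing); this derivation is an assumption, the script quoted above does not show it:

# Assumption: MPI task count = nprocx * nprocy (nproc_hgs is 0 in the defaults above)
nprocx=4; nprocy=2
nproc=$(( nprocx * nprocy ))   # 8 tasks for the 4x2 decomposition
mpirun -r ssh -np $nproc $HL_LIB/src/$HIRLAM_CONFIG/bin/hlprog.x $OPTIONS || exit 1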

Hope this helps to clarify things a bit. I'm not saying it's the best way of doing it.

Re:Hirlam + openMP 8 years 2 months ago #453

Xiaohua Yang
Andres Luhamaa wrote:
Hi Suleiman.
OpenMP and MPI are completely different ways of writing and running parallel programs and I do not think HIRLAM can make any use of OpenMP.

I can confirm that the OpenMP option in the HIRLAM forecast model has not been actively tested in recent versions of HIRLAM. Some of the sources in grdy may not even compile in their current shape.

On the other hand, the spectral HIRLAM (hirvda in 4DVAR) has been tested on some platforms.

Re:Hirlam + openMP 8 years 2 months ago #455

Thanks! Actually I did much the same: I added a new variable LAUNCHFC and replaced LAUNCH with LAUNCHFC in Prog and 3DVAR; in the other scripts I left LAUNCH as it was.
So now it works fine.
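
Roughly, the idea looks like this (the values and definitions below are just for illustration, not copied from my actual scripts; mpirun usage as in the Prog example above):

# Hypothetical illustration: LAUNCHFC carries the task count for the MPI-parallel steps,
# while the other scripts keep the original LAUNCH.
LAUNCH=1       # serial steps stay on one core, as before
LAUNCHFC=4     # Forecast / 3DVAR: matches nprocx*nprocy = 2*2

# In Prog and 3DVAR the parallel binary is then started with LAUNCHFC, e.g.:
mpirun -np $LAUNCHFC $HL_LIB/src/$HIRLAM_CONFIG/bin/hlprog.x $OPTIONS || exit 1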

Thanks to all for helping me.