PMP Molscat

PMP Molscat is a parallel version of Sheldon Green and Jeremy Hutson's Molscat quantum inelastic scattering program, version 14. It is useful for Molscat calculations that involve many propagations at different values of JTOT and M. It uses MPI message passing, which is available on most modern clusters. It also provides a utility, SMERGE, that makes it possible to split a large Molscat calculation across several independent machines, with no message-passing harness at all, and then combine the resulting S-matrix files. The modifications to Molscat were made by George C. McBane of Grand Valley State University.

If you are unfamiliar with ordinary Molscat, you need to learn how to use it first; see http://www.giss.nasa.gov/tools/molscat/ .

If all of the following conditions are true:

 - your calculation requires propagations at many different values of JTOT (and perhaps M),
 - you have access to a cluster with MPI, or at least to several machines that can run independent jobs, and
 - you are already comfortable with ordinary Molscat,

then you may find PMP Molscat useful. You might want to browse the manual, or scan through a poster I presented at the 2007 Dynamics of Molecular Collisions meeting, to help you decide whether to try the program.
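
To give a concrete picture of the kind of job that benefits, here is a minimal sketch of an ordinary Molscat &INPUT namelist that requests propagations over a wide range of JTOT. The variable names (LABEL, URED, JTOTL, JTOTU, JSTEP, ENERGY, ISAVEU) are the standard Molscat version 14 ones, but all values are placeholders, the &BASIS and potential sections are omitted, and the inline comments are annotation only and may need to be stripped from a real input file. PMP Molscat farms the individual JTOT/M/energy propagations of such a job out to separate MPI processes.

     &INPUT
       LABEL  = 'example: many independent partial-wave propagations',
       URED   = 2.0,                          ! reduced mass (placeholder)
       JTOTL  = 0,  JTOTU = 200,  JSTEP = 1,  ! 201 values of JTOT, each an independent propagation
       ENERGY = 100.0, 300.0,                 ! total energies (placeholders)
       ISAVEU = 10,                           ! save S matrices on this unit for SMERGE and postprocessing
      /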

Recent improvements by P. Valiron

Pierre Valiron, of the Laboratoire d'Astrophysique de Grenoble, has recently made useful improvements to PMP Molscat. For most users, the most important ones are

Sadly, Valiron died in 2008. Eventually his PMP and mine will be merged, and these improvements will become part of the standard distribution. The web page that hosted his version is no longer available; should you want a copy, please contact G. McBane directly. The only disadvantage of Valiron's version is a somewhat more complicated setup: it is not really practical to manage compilation and linking "by hand". He provides a Makefile, with several examples, that the user must customize to suit the local environment; it works well in Unix (and probably Mac) environments but will require substantial modification under Windows.

Getting started

To run PMP Molscat you need one of the following files:

For other operating systems, download whichever package you can unpack more easily; the only difference between them is the end-of-line conventions used in the Fortran files.

Release notes

April 25, 2008. (1) A bug in SMERGE that caused trouble when more than 99 temporary files were required has been fixed; thanks to Brian Stewart for the bug report. (2) SMERGE now gives a useful error message when the array size for storage of integer S-matrix data is inadequate, and the comments in sdata.fi have been improved to make the storage requirements clearer. (3) Namelist variables jtotmin and jtotmax have been added to permit exclusion of some JTOT values during assembly of the merged S-matrix file.
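
As an illustration, a merge restricted to a window of JTOT values might be requested with something like the sketch below. Only jtotmin and jtotmax are taken from this release note; the namelist group name (&SMERGE) is an assumption, and the variables naming the input and output files are omitted, so consult the comments in the SMERGE source for the real input format.

     &SMERGE                  ! group name assumed for illustration only
       jtotmin = 0,           ! lowest JTOT copied into the merged S-matrix file
       jtotmax = 120,         ! JTOT values above this are excluded from the merge
      /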

April 20, 2005. Modified to give a useful error message if the maximum number of temporary files is exceeded. Increased the default amount of S-matrix storage space available in the distribution version. Thanks to T.-G. Lee of Oak Ridge National Lab for the bug report.

April 2, 2005. Modified SMERGE so it can combine S-matrix files with different ENERGY arrays. The new capabilities are equivalent to IRSTRT=3 of the serial program, and allow easier "filling in" of S-matrices from abnormally terminated runs.

March 18-19, 2005. Fixed a bug introduced in the SMERGE version of March 11, and also fixed an old bug that produced corrupted output files when input data files had more than 800 channels.

March 11, 2005. (1) The main calculation now sorts the task list by the number of channels, improving the load balancing somewhat. (2) The SMERGE program now handles empty ISAVEU files (like the one produced by pmpdyn.f!) smoothly. (3) If more than one S-matrix for a single JTOT/M/INRG combination appears in the input files, SMERGE now discards all but the first, so that "overlapping" runs that duplicate a few JTOT/M/INRG combinations can be merged without causing problems in the postprocessor programs.

Feb. 20, 2005. A dynamic dispatch version (pmpdyn.f) is now available. An 'autoname' variable has been added to SMERGE to generate the input filenames ISAVEU.0000, ISAVEU.0001, etc. automatically.
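
A similarly hedged sketch of how this might look in a SMERGE input follows; the group name and the logical type of autoname are assumptions, and only the generated filenames are taken from this note.

     &SMERGE                  ! group name assumed, as above
       autoname = .true.,     ! assumed logical switch; input files are then ISAVEU.0000, ISAVEU.0001, ...
      /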

Please send bug reports and questions to George McBane at mcbaneg@gvsu.edu.


George McBane (mcbaneg@gvsu.edu)
Last modified: Wed Oct 12 17:25:42 -0400 2011