Commit 837d6a3f authored by Pavan Balaji

[svn-r10804] Missed out some mpich2 --> mpich renames.

No reviewer.
parent ced32d4e
......@@ -27,7 +27,7 @@ on bug fixes and new releases.
8. Fault Tolerance
9. Environment Variables
10. Developer Builds
11. Installing MPICH2 on windows
11. Installing MPICH on windows
12. Multiple Fortran compiler support
......@@ -213,7 +213,7 @@ shared memory), Hydra process management) of MPICH up and running.
nothing is specified, ':1' is assumed.
More details on interacting with Hydra can be found at
http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager
http://wiki.mpich.org/mpich/index.php/Using_the_Hydra_Process_Manager
If you have completed all of the above steps, you have successfully
installed MPICH and run an MPI example.
......@@ -231,7 +231,7 @@ environments as well as our partner institutes. If you have problems
with the installation or usage of MPICH, please follow these steps:
1. First see the Frequently Asked Questions (FAQ) page at
http://wiki.mcs.anl.gov/mpich2/index.php/Frequently_Asked_Questions to
http://wiki.mpich.org/mpich/index.php/Frequently_Asked_Questions to
see if the problem you are facing has a simple solution. Many common
problems and their solutions are listed here.
......@@ -275,7 +275,7 @@ application or benchmark and send that along in your bug report.
4. If you have found a bug in MPICH, we request that you report it at
our bug tracking system:
(https://trac.mcs.anl.gov/projects/mpich2/newticket). Even if you
(https://trac.mpich.org/projects/mpich/newticket). Even if you
believe you have found a bug, we recommend sending an email to
mpich-discuss@mcs.anl.gov first.
......@@ -517,7 +517,7 @@ hydra
Hydra is the default process management framework that uses existing
daemons on nodes (e.g., ssh, pbs, slurm, sge) to start MPI
processes. More information on Hydra can be found at
http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager
http://wiki.mpich.org/mpich/index.php/Using_the_Hydra_Process_Manager
mpd
---
......@@ -531,7 +531,7 @@ smpd
SMPD is a process manager for interoperability between Microsoft
Windows and UNIX, where some processes are running on Windows and
others are running on a variant of UNIX. For more information, please
see mpich2-%VERSION%/src/pm/smpd/README.
see mpich-%VERSION%/src/pm/smpd/README.
gforker
-------
......@@ -825,7 +825,7 @@ where <N> is the checkpoint number you want to restart from.
These instructions can also be found on the MPICH wiki:
http://wiki.mcs.anl.gov/mpich2/index.php/Checkpointing
http://wiki.mpich.org/mpich/index.php/Checkpointing
-------------------------------------------------------------------------
......@@ -859,14 +859,14 @@ For MPICH developers who want to directly work on the svn, there are
a few additional steps involved (people using the release tarballs do
not have to follow these steps). Details about these steps can be
found here:
http://wiki.mcs.anl.gov/mpich2/index.php/Getting_And_Building_MPICH
http://wiki.mpich.org/mpich/index.php/Getting_And_Building_MPICH
-------------------------------------------------------------------------
11. Installing MPICH2 on Windows
11. Installing MPICH on Windows
================================
Here are the instructions for setting up MPICH2 on a Windows machine:
Here are the instructions for setting up MPICH on a Windows machine:
(a) Install:
Microsoft Developer Studio 2003 or later
......@@ -875,16 +875,16 @@ Here are the instructions for setting up MPICH2 on a Windows machine:
choose the dos file format option
install perl and svn
(b) Checkout mpich2:
(b) Checkout mpich:
Bring up a command prompt.
(replace "yourname" with your MCS login name):
svn co https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk mpich2
svn co https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk mpich
(c) Generate *.h.in
Bring up a cygwin bash shell.
cd mpich2
cd mpich
./autogen.sh
exit
......@@ -892,38 +892,38 @@ Here are the instructions for setting up MPICH2 on a Windows machine:
(e) Open Developer Studio
open mpich2\mpich2.sln
build the ch3sockDebug mpich2 solution
build the ch3sockDebug mpich2s project
build the ch3sockRelease mpich2 solution
build the ch3sockRelease mpich2s project
build the Debug mpich2 solution
build the Release mpich2 solution
build the fortDebug mpich2 solution
build the fortRelease mpich2 solution
build the gfortDebug mpich2 solution
build the gfortRelease mpich2 solution
build the sfortDebug mpich2 solution
build the sfortRelease mpich2 solution
open mpich\mpich.sln
build the ch3sockDebug mpich solution
build the ch3sockDebug mpichs project
build the ch3sockRelease mpich solution
build the ch3sockRelease mpichs project
build the Debug mpich solution
build the Release mpich solution
build the fortDebug mpich solution
build the fortRelease mpich solution
build the gfortDebug mpich solution
build the gfortRelease mpich solution
build the sfortDebug mpich solution
build the sfortRelease mpich solution
(f) Open a command prompt
cd to mpich2\maint
cd to mpich\maint
execute "makegcclibs.bat"
(g) Open another Developer Studio instance
open mpich2\examples\examples.sln
open mpich\examples\examples.sln
build the Release target of the cpi project
(h) Return to Developer Studio with the mpich2 solution
(h) Return to Developer Studio with the mpich solution
set the version numbers in the Installer project
build the Installer mpich2 solution
build the Installer mpich solution
(i) Test and distribute mpich2\maint\ReleaseMSI\mpich2.msi
(i) Test and distribute mpich\maint\ReleaseMSI\mpich.msi
mpich2.msi can be renamed, eg mpich2-1.1.msi
mpich.msi can be renamed, e.g. mpich-1.1.msi
(j) To install the launcher:
......@@ -932,10 +932,10 @@ Here are the instructions for setting up MPICH2 on a Windows machine:
(k) Compile and run an MPI application:
Compile an mpi application. Use mpi.h from mpich2\src\include\win32
and mpi.lib in mpich2\lib
Compile an mpi application. Use mpi.h from mpich\src\include\win32
and mpi.lib in mpich\lib
Place your executable along with the mpich2 dlls somewhere accessable
Place your executable along with the mpich dlls somewhere accessible
to all the machines.
Execute a job by running something like: mpiexec -n 3 myapp.exe
......
......@@ -51,7 +51,7 @@
\jcompress\viewkind4\viewscale100\nolnhtadjtbl\rsidroot3749324 \fet0\sectd \linex0\sectdefaultcl\sftnbj {\*\pnseclvl1\pnucrm\pnstart1\pnindent720\pnhang {\pntxta .}}{\*\pnseclvl2\pnucltr\pnstart1\pnindent720\pnhang {\pntxta .}}{\*\pnseclvl3
\pndec\pnstart1\pnindent720\pnhang {\pntxta .}}{\*\pnseclvl4\pnlcltr\pnstart1\pnindent720\pnhang {\pntxta )}}{\*\pnseclvl5\pndec\pnstart1\pnindent720\pnhang {\pntxtb (}{\pntxta )}}{\*\pnseclvl6\pnlcltr\pnstart1\pnindent720\pnhang {\pntxtb (}{\pntxta )}}
{\*\pnseclvl7\pnlcrm\pnstart1\pnindent720\pnhang {\pntxtb (}{\pntxta )}}{\*\pnseclvl8\pnlcltr\pnstart1\pnindent720\pnhang {\pntxtb (}{\pntxta )}}{\*\pnseclvl9\pnlcrm\pnstart1\pnindent720\pnhang {\pntxtb (}{\pntxta )}}\pard\plain
\ql \li0\ri0\nowidctlpar\faauto\outlinelevel0\rin0\lin0\itap0\pararsid289831 \fs24\lang1033\langfe1033\cgrid\langnp1033\langfenp1033 {\f2\fs20\insrsid5849808 MPICH2 for Microsoft Windows
\ql \li0\ri0\nowidctlpar\faauto\outlinelevel0\rin0\lin0\itap0\pararsid289831 \fs24\lang1033\langfe1033\cgrid\langnp1033\langfenp1033 {\f2\fs20\insrsid5849808 MPICH for Microsoft Windows
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\rin0\lin0\itap0 {\f2\fs20\insrsid5849808 Email bugs and error reports to:
\par mpich-discuss@mcs.anl.gov
\par
......@@ -67,16 +67,16 @@
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\outlinelevel0\rin0\lin0\itap0\pararsid289831 {\f2\fs20\insrsid5849808 THE INSTALLER:
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\rin0\lin0\itap0 {\f2\fs20\insrsid5849808
\par
\par You must install MPICH2 on all machines that you want to run MPI programs on. }{\f2\fs20\insrsid3749324 Run the installer}{\f2\fs20\insrsid5849808 on each machine individually.
\par You must install MPICH on all machines that you want to run MPI programs on. }{\f2\fs20\insrsid3749324 Run the installer}{\f2\fs20\insrsid5849808 on each machine individually.
\par
\par The installer creates the following mpich2 directory structure on your machine:
\par mpich2\\bin
\par mpich2\\include
\par mpich2\\lib
\par The installer creates the following mpich directory structure on your machine:
\par mpich\\bin
\par mpich\\include
\par mpich\\lib
\par
\par The include and lib directories contain the libraries needed to compile MPI programs. The mpich2 dlls are copied to the Windows\\
system32 directory. The bin directory contains smpd.exe which is the MPICH2 process manager used to launch MPI programs. mpiexec.exe}{\f2\fs20\insrsid9724047 ,}{\f2\fs20\insrsid5849808 also found in the bin directory}{\f2\fs20\insrsid9724047 ,}{
\f2\fs20\insrsid5849808 is used to start MPICH2 jobs.
\par The include and lib directories contain the libraries needed to compile MPI programs. The mpich dlls are copied to the Windows\\
system32 directory. The bin directory contains smpd.exe which is the MPICH process manager used to launch MPI programs. mpiexec.exe}{\f2\fs20\insrsid9724047 ,}{\f2\fs20\insrsid5849808 also found in the bin directory}{\f2\fs20\insrsid9724047 ,}{
\f2\fs20\insrsid5849808 is used to start MPICH jobs.
\par
\par
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\outlinelevel0\rin0\lin0\itap0\pararsid289831 {\f2\fs20\insrsid5849808 COMPILING:
......@@ -85,25 +85,25 @@ system32 directory. The bin directory contains smpd.exe which is the MPICH2 pro
\par Compiling an MPI program:
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 1)\tab}}\pard \ql \fi-360\li720\ri0\nowidctlpar\jclisttab\tx720\faauto\ls3\rin0\lin720\itap0\pararsid10776424 {\f2\fs20\insrsid5849808
Create a project for Visual Studio 2003, or Intel Fortran 8.0
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 2)\tab}Add mpich2\\include to the include path
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 3)\tab}Add mpich2\\lib to the library path
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 2)\tab}Add mpich\\include to the include path
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 3)\tab}Add mpich\\lib to the library path
\par {\listtext\pard\plain\f2\fs20\insrsid289831 \hich\af2\dbch\af0\loch\f2 4)\tab}}{\f2\fs20\insrsid289831 For C applications a}{\f2\fs20\insrsid5849808 dd mpi.lib to your target link command.
\par {\listtext\pard\plain\f2\fs20\insrsid289831 \hich\af2\dbch\af0\loch\f2 5)\tab}}{\f2\fs20\insrsid289831 For }{\f2\fs20\insrsid5849808 Fortran applications add }{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid5849808 mpich2.lib to the link command.
\par {\listtext\pard\plain\f2\fs20\insrsid289831 \hich\af2\dbch\af0\loch\f2 5)\tab}}{\f2\fs20\insrsid289831 For }{\f2\fs20\insrsid5849808 Fortran applications add }{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid5849808 mpich.lib to the link command.
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 6)\tab}Compile
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 7)\tab}Place your application and all the dlls it depends on in a shared location or copy them to all the nodes.
\par {\listtext\pard\plain\f2\fs20\insrsid5849808 \hich\af2\dbch\af0\loch\f2 8)\tab}Run the application using mpiexec
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\rin0\lin0\itap0 {\f2\fs20\insrsid5849808
\par }{\f2\fs20\insrsid9307247 For Visual Fortran 6 use }{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpich2s.lib:
\par }{\f2\fs20\insrsid9307247 For Visual Fortran 6 use }{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpichs.lib:
\par {\listtext\pard\plain\f3\fs20\insrsid9724047 \loch\af3\dbch\af0\hich\f3 \'b7\tab}}\pard \ql \fi-360\li720\ri0\nowidctlpar\jclisttab\tx720\faauto\ls1\rin0\lin720\itap0\pararsid9307247 {\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247
mpich2.lib contains all caps cdecl: MPI_INIT
\par {\listtext\pard\plain\f3\fs20\insrsid9724047 \loch\af3\dbch\af0\hich\f3 \'b7\tab}}{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpich2s.lib contains all caps stdcall: MPI_INIT@4
\par {\listtext\pard\plain\f3\fs20\insrsid9724047 \loch\af3\dbch\af0\hich\f3 \'b7\tab}}{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpich2g.lib or }{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpich2g.a contain lowercase cdecl: mpi_init__
mpich.lib contains all caps cdecl: MPI_INIT
\par {\listtext\pard\plain\f3\fs20\insrsid9724047 \loch\af3\dbch\af0\hich\f3 \'b7\tab}}{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpichs.lib contains all caps stdcall: MPI_INIT@4
\par {\listtext\pard\plain\f3\fs20\insrsid9724047 \loch\af3\dbch\af0\hich\f3 \'b7\tab}}{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpichg.lib or }{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid9307247 mpichg.a contain lowercase cdecl: mpi_init__
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\outlinelevel0\rin0\lin0\itap0\pararsid289831 {\f2\fs20\insrsid9307247 For gcc/g77
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 1)\tab}}\pard \ql \fi-360\li720\ri0\nowidctlpar\jclisttab\tx720\faauto\ls2\rin0\lin720\itap0\pararsid9307247 {\f2\fs20\insrsid9307247 create a makefile
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 2)\tab}add \endash I\'85mpich2\\include
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 3)\tab}add \endash L\'85mpich2\\lib
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 2)\tab}add \endash I\'85mpich\\include
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 3)\tab}add \endash L\'85mpich\\lib
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 4)\tab}}\pard \ql \fi-360\li720\ri0\nowidctlpar\jclisttab\tx720\faauto\ls2\rin0\lin720\itap0\pararsid9724047 {\f2\fs20\insrsid9307247 add \endash lmpi}{\f2\fs20\insrsid10776424
(for g77: -l}{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid10776424 mpich2g)}{\f2\fs20\insrsid12541554
(for g77: -l}{\f2\fs20\insrsid9724047 f}{\f2\fs20\insrsid10776424 mpichg)}{\f2\fs20\insrsid12541554
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 5)\tab}}\pard \ql \fi-360\li720\ri0\nowidctlpar\jclisttab\tx720\faauto\ls2\rin0\lin720\itap0\pararsid9307247 {\f2\fs20\insrsid9307247 add the rules for your source files
\par {\listtext\pard\plain\f2\fs20\insrsid9307247 \hich\af2\dbch\af0\loch\f2 6)\tab}same as 6,7,8 above
\par }\pard \ql \li0\ri0\nowidctlpar\faauto\rin0\lin0\itap0\pararsid9307247 {\f2\fs20\insrsid9307247
......@@ -122,15 +122,15 @@ mpich2.lib contains all caps cdecl: MPI_INIT
\par
\par If you want to have worker nodes that do not have any tools on them, you can simply copy smpd.exe to each node and execute "smpd.exe -install".
\par
\par smpd.exe is the only application required on each machine to launch MPICH2 jobs}{\f2\fs20\insrsid9724047 . }{\f2\fs20\insrsid5849808 }{\f2\fs20\insrsid9724047 B}{\f2\fs20\insrsid5849808 ut}{\f2\fs20\insrsid9724047 ,}{\f2\fs20\insrsid5849808
MPICH2 applications require the mpich2 dlls. This requirement can be satisfied by copying the mpich2 dlls to the windows\\
system32 directory on each node. Then any mpich2 application can run on those systems. This is what the installer does. If you don't want to copy the mpich2 dlls to e
\par smpd.exe is the only application required on each machine to launch MPICH jobs}{\f2\fs20\insrsid9724047 . }{\f2\fs20\insrsid5849808 }{\f2\fs20\insrsid9724047 B}{\f2\fs20\insrsid5849808 ut}{\f2\fs20\insrsid9724047 ,}{\f2\fs20\insrsid5849808
MPICH applications require the mpich dlls. This requirement can be satisfied by copying the mpich dlls to the windows\\
system32 directory on each node. Then any mpich application can run on those systems. This is what the installer does. If you don't want to copy the mpich dlls to e
ach machine, then you need to place the dlls in the same location as the executable you are going to launch.
\par
\par For example, if you have a directory called \\\\myserver\\mysharedfolder and you have myapp.exe and }{\f2\fs20\insrsid3147644 *}{\f2\fs20\insrsid5849808 mpich2}{\f2\fs20\insrsid3147644 *}{\f2\fs20\insrsid5849808 .dll in that directory then you can execu
\par For example, if you have a directory called \\\\myserver\\mysharedfolder and you have myapp.exe and }{\f2\fs20\insrsid3147644 *}{\f2\fs20\insrsid5849808 mpich}{\f2\fs20\insrsid3147644 *}{\f2\fs20\insrsid5849808 .dll in that directory then you can execu
te this command: "mpiexec -n 4 \\\\myserver\\mysharedfolder\\myapp.exe"
\par
\par Note: There are several mpich2 dlls and depending on your build target (Fortran, C, Debug or Release) you will need the corresponding dll}{\f2\fs20\insrsid9724047 s}{\f2\fs20\insrsid5849808 in the application directory.
\par Note: There are several mpich dlls and depending on your build target (Fortran, C, Debug or Release) you will need the corresponding dll}{\f2\fs20\insrsid9724047 s}{\f2\fs20\insrsid5849808 in the application directory.
\par
\par }{\f2\fs20\insrsid10776424
\par }}
\ No newline at end of file
......@@ -52,14 +52,14 @@
reasons. A workaround is to use GNU Make instead. See the following
ticket for more information:
https://trac.mcs.anl.gov/projects/mpich2/ticket/1122
https://trac.mpich.org/projects/mpich/ticket/1122
* Build fails with Intel compiler suite 13.0, because of weak symbol
issues in the compiler. A workaround is to disable weak symbol
support by passing --disable-weak-symbols to configure. See the
following ticket for more information:
https://trac.mcs.anl.gov/projects/mpich2/ticket/1659
https://trac.mpich.org/projects/mpich/ticket/1659
### Process Managers
......
......@@ -106,7 +106,7 @@ less than v1.6. Please check your SVN client version (with
If you do have a modern SVN client and believe that you have reached
this error case for some other reason, please file a ticket at:
https://trac.mcs.anl.gov/projects/mpich2/newticket
https://trac.mpich.org/projects/mpich/newticket
EOT
exit 1
......@@ -632,7 +632,7 @@ fi
if [ "$do_smpdversion" = yes ] ; then
echo_n "Creating src/pm/smpd/smpd_version.h... "
smpdVersion=${MPICH2_VERSION}
smpdVersion=${MPICH_VERSION}
cat >src/pm/smpd/smpd_version.h <<EOF
/* -*- Mode: C; c-basic-offset:4 ; -*- */
/*
......
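The hunk above renames `MPICH2_VERSION` to `MPICH_VERSION` inside a configure-time step that writes `smpd_version.h` from a shell heredoc. A minimal standalone sketch of that generate-a-header pattern (the variable value, directory, and macro name here are illustrative, not the exact MPICH build logic):

```shell
# Substitute a shell variable into a generated C header via a heredoc.
# MPICH_VERSION and the header contents below are stand-ins for illustration.
MPICH_VERSION="3.0"
mkdir -p build
cat > build/smpd_version.h <<EOF
/* Generated header -- do not edit. */
#define SMPD_VERSION "${MPICH_VERSION}"
EOF
grep SMPD_VERSION build/smpd_version.h
```

Because the heredoc delimiter `EOF` is unquoted, `${MPICH_VERSION}` is expanded when the header is written, which is exactly why the rename in this hunk matters: an unset variable would silently produce an empty version string.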
......@@ -116,7 +116,7 @@ AC_INIT([MPICH],
MPICH_VERSION_m4,
[mpich-discuss@mcs.anl.gov],
[mpich],
[http://www.mcs.anl.gov/research/projects/mpich2/])
[http://www.mpich.org/])
if test "x$prefix" != "xNONE" && test -d "$prefix"; then
if test "x`(cd \"$prefix\"; echo \"$PWD\")`" = "x`(cd \"$srcdir\"; echo \"$PWD\")`" ||\
......@@ -5180,7 +5180,7 @@ void f1(void *a) { return; }],
;;
win|windows)
with_thread_package=win
MPICH2_THREAD_PACKAGE=win
MPICH_THREAD_PACKAGE=win
MPIU_THREAD_PACKAGE_NAME=MPIU_THREAD_PACKAGE_WIN
AC_MSG_ERROR([The 'win' thread package is not supported via autoconf builds at this time.])
;;
......
......@@ -662,7 +662,7 @@ Currently three process managers are distributed with MPICH
\item[hydra] This is the default process manager that natively uses the
existing daemons on the system such as ssh, slurm, pbs.
\item[smpd] This one can be used for both Linux and Windows. It is the
only process manager that supports the Windows version of MPICH2.
only process manager that supports the Windows version of MPICH.
\item[gforker] This is a simple process manager that creates all
processes on a single machine. It is useful both for debugging and
for running on shared memory multiprocessors.
......@@ -782,7 +782,7 @@ Other MPI test suites are available from
\url{http://www.mcs.anl.gov/mpi/mpi-test/tsuite.html}. As part of the MPICH
development, we run the MPICH1, MPICH, C++, and Intel test suites every night
and post the results on
\url{http://www.mcs.anl.gov/research/projects/mpich2/nightly/old/}.
\url{http://www.mpich.org/static/cron/tests/}.
Other tests are run on an occasional basis.
% \subsection{Using the Intel Test Suite}
......@@ -881,10 +881,10 @@ this case.
\label{sec:winbin}
The Windows binary distribution uses the Microsoft Installer. Download and
execute \texttt{mpich2-1.x.xxx.msi} to install the binary distribution. The default
installation path is \texttt{C:$\backslash$Program Files$\backslash$MPICH2}.
execute \texttt{mpich-xxx.msi} to install the binary distribution. The default
installation path is \texttt{C:$\backslash$Program Files$\backslash$MPICH}.
You must have administrator privileges to install
\texttt{mpich2-1.x.xxx.msi}. The installer
\texttt{mpich-xxx.msi}. The installer
installs a Windows service to launch MPICH applications and only administrators
may install services. This process manager is called smpd.exe. Access to
the process manager is passphrase protected. The installer asks for this
......@@ -896,11 +896,11 @@ Under the installation directory are three sub-directories: \texttt{include},
directories contain the header files and libraries necessary to compile MPI
applications. The \texttt{bin} directory contains the process manager,
\texttt{smpd.exe}, and the MPI job launcher, \texttt{mpiexec.exe}. The
dlls that implement MPICH2 are copied to the Windows system32 directory.
dlls that implement MPICH are copied to the Windows system32 directory.
You can install MPICH in unattended mode by executing
\begin{verbatim}
msiexec /q /I mpich2-1.x.xxx.msi
msiexec /q /I mpich-1.x.xxx.msi
\end{verbatim}
The smpd process manager for Windows runs as a service that can launch jobs
......@@ -925,13 +925,13 @@ run on each node individualy by a domain administrator.
\subsection{Source distribution}
\label{sec:winsrc}
In order to build MPICH2 from the source distribution under Windows,
In order to build MPICH from the source distribution under Windows,
you must have MS Developer Studio .NET 2003 or later, perl and optionally
Intel Fortran 8 or later.
\begin{itemize}
\item
Download \texttt{mpich2-1.x.y.tar.gz} and unzip it.
Download \texttt{mpich-x.y.z.tar.gz} and unzip it.
\item
Bring up a Visual Studio Command prompt with the compiler environment
variables set.
......@@ -941,31 +941,31 @@ Run \texttt{winconfigure.wsf}. If you don't have a Fortran compiler add the
projects and dependencies. Execute \texttt{winconfigure.wsf /?} to see all
available options.
\item
open \texttt{mpich2$\backslash$mpich2.sln}
open \texttt{mpich$\backslash$mpich.sln}
%\item
% build the \texttt{ch3sockDebug mpich2} solution
% build the \texttt{ch3sockDebug mpich} solution
%\item
% build the \texttt{ch3sockDebug mpich2s} project
% build the \texttt{ch3sockDebug mpichs} project
\item
build the ch3sockRelease mpich2 solution
build the ch3sockRelease mpich solution
\item
build the ch3sockRelease mpich2s project
build the ch3sockRelease mpichs project
%\item
% build the Debug mpich2 solution
% build the Debug mpich solution
\item
build the Release mpich2 solution
build the Release mpich solution
%\item
% build the fortDebug mpich2 solution
% build the fortDebug mpich solution
\item
build the fortRelease mpich2 solution
build the fortRelease mpich solution
%\item
% build the gfortDebug mpich2 solution
% build the gfortDebug mpich solution
\item
build the gfortRelease mpich2 solution
build the gfortRelease mpich solution
%\item
% build the sfortDebug mpich2 solution
% build the sfortDebug mpich solution
\item
build the sfortRelease mpich2 solution
build the sfortRelease mpich solution
\item
build the channel of your choice. The options are \texttt{sock} and
\texttt{nemesis}.
......@@ -979,19 +979,19 @@ compete for individual processors.
\subsection{\texttt{cygwin}}
\label{sec:cygwin}
MPICH2 can also be built under \texttt{cygwin} using the source
MPICH can also be built under \texttt{cygwin} using the source
distribution and the Unix commands described in previous sections. This
will not build the same libraries as described in this section. It will
build a ``Unix'' distribution that runs under \texttt{cygwin}.
It is best to use paths that do not contain embedded spaces when
building MPICH2 under \texttt{cygwin}. For example, consider using
building MPICH under \texttt{cygwin}. For example, consider using
\begin{verbatim}
c:\programFiles\mpich2%MPICH2_VERSION%
c:\programFiles\mpich%MPICH_VERSION%
\end{verbatim}
instead of
\begin{verbatim}
c:\Program Files\mpich2%MPICH2_VERSION%
c:\Program Files\mpich%MPICH_VERSION%
\end{verbatim}
It may also help to have \texttt{cygwin} installed into a path with no
embedded blanks.
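The advice above (prefer `c:\programFiles\mpich...` over `c:\Program Files\mpich...`) comes down to shell and Makefile word splitting. A small sketch, with made-up directory names, of how an unquoted path containing a space breaks into two words:

```shell
# An unquoted path with an embedded space is split into two words by the
# shell; the space-free variant stays intact. Paths are illustrative only.
mkdir -p "c_drive/Program Files" c_drive/programFiles
p1="c_drive/Program Files"
p2="c_drive/programFiles"
set -- $p1; echo "$# words"   # -> 2 words
set -- $p2; echo "$# words"   # -> 1 words
```

Build systems that pass paths unquoted through `make` variables hit exactly this, which is why a blank-free install prefix avoids a class of cygwin build failures.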
......@@ -1009,7 +1009,7 @@ Account Control.
\item
You can install MPICH from the administrator command prompt by executing
\begin{verbatim}
msiexec /I mpich2-1.x.xxx.msi
msiexec /I mpich-1.x.xxx.msi
\end{verbatim}
\end{enumerate}
......
%
% This is a latex file that generates a reference manual for
% the wire protocol between an mpich2 process and a smpd manager.
% the wire protocol between an mpich process and a smpd manager.
%
\documentclass[dvipdfm,11pt]{article}
\usepackage[dvipdfm]{hyperref} % Upgraded url package
......@@ -32,16 +32,16 @@ Argonne National Laboratory}
\section{Introduction}
When a user builds MPICH2 they have the option to choose the SMPD process
manager to launch and manage processes in MPICH2 jobs. MPICH2 provides
an implementation of smpd and mpiexec to launch MPICH2 jobs. MPICH2
When a user builds MPICH they have the option to choose the SMPD process
manager to launch and manage processes in MPICH jobs. MPICH provides
an implementation of smpd and mpiexec to launch MPICH jobs. MPICH
applications communicate with the process manager using the PMI interface.
The PMI library for smpd provides an implementation of PMI for communicating
with SMPD process managers. This document describes the environment and wire
protocol between the MPICH2 application and the SMPD manager.
protocol between the MPICH application and the SMPD manager.
If a process manager implementor replicates the environment and protocol
described in this document, they would be able to launch and manage MPICH2
described in this document, they would be able to launch and manage MPICH
jobs compiled for SMPD.
An SMPD manager communicates with its child process through environment
......@@ -49,7 +49,7 @@ variables and a socket. This document describes the environment and the
wire protocol on that socket.
\section{SMPD manager topology}
This section describes how SMPD is organized in MPICH2. An implementation
This section describes how SMPD is organized in MPICH. An implementation
of a process manager that uses the protocol described in this document is
not required to use this topology. It is provided for reference.
......@@ -71,8 +71,8 @@ provides the context id to the child when it launches a process.
\end{figure}
\section{Child process environment}
SMPD managers launch and manage child processes in an MPICH2 job.
MPICH2 processes compiled with the SMPD PMI library expect the following
SMPD managers launch and manage child processes in an MPICH job.
MPICH processes compiled with the SMPD PMI library expect the following
environment variables to be set:
\begin{description}
......
......@@ -291,7 +291,7 @@ For the time being we will document these separately.
MPICH provides a number of process management systems. Hydra is the
default process manager in MPICH. More details on Hydra and its
extensions to mpiexec can be found at
\url{http://wiki.mcs.anl.gov/mpich2/index.php/Using\_the\_Hydra\_Process\_Manager}
\url{http://wiki.mpich.org/mpich/index.php/Using\_the\_Hydra\_Process\_Manager}
\subsection{Extensions for SMPD Process Management Environment}
......@@ -587,7 +587,7 @@ Hydra process manager. Currently only the BLCR checkpointing library
is supported. BLCR needs to be installed separately. Below we
describe how to enable the feature in MPICH and how to use it. This
information can also be found on the MPICH Wiki:
\url{http://wiki.mcs.anl.gov/mpich2/index.php/Checkpointing}
\url{http://wiki.mpich.org/mpich/index.php/Checkpointing}
\subsection{Configuring for checkpointing}
\label{sec:conf-checkp}
......@@ -681,20 +681,20 @@ be found in the \texttt{mpich/test/mpi} source directory and can be
run with the command \texttt{make testing}. This test suite should
work with any MPI implementation, not just MPICH.
\section{MPICH2 under Windows}
\section{MPICH under Windows}
\label{sec:windows}
\subsection{Directories}
\label{sec:windir}
The default installation of MPICH2 is in
\texttt{C:$\backslash$Program Files$\backslash$MPICH2}. Under the installation
The default installation of MPICH is in
\texttt{C:$\backslash$Program Files$\backslash$MPICH}. Under the installation
directory are three sub-directories: \texttt{include}, \texttt{bin}, and
\texttt{lib}. The \texttt{include} and \texttt{lib} directories contain
the header files and libraries necessary to compile MPI applications.
The \texttt{bin} directory contains the process manager, \texttt{smpd.exe},
and the MPI job launcher, \texttt{mpiexec.exe}. The dlls that implement
MPICH2 are copied to the Windows system32 directory.
MPICH are copied to the Windows system32 directory.
\subsection{Compiling}
\label{sec:wincompile}
......@@ -707,20 +707,20 @@ create user applications. \texttt{gcc} and \texttt{g77} for \texttt{cygwin} can
For MS Developer Studio users: Create a project and add
\begin{verbatim}
C:\Program Files\MPICH2\include
C:\Program Files\MPICH\include
\end{verbatim}
to the include path and
\begin{verbatim}
C:\Program Files\MPICH2\lib
C:\Program Files\MPICH\lib
\end{verbatim}
to
the library path. Add \texttt{mpi.lib} and \texttt{cxx.lib} to the
link command. Add \texttt{cxxd.lib} to the Debug target link instead of
\texttt{cxx.lib}.
Intel Fortran 8 users should add \texttt{fmpich2.lib} to the link command.
Intel Fortran 8 users should add \texttt{fmpich.lib} to the link command.
Cygwin users should use \texttt{libmpich2.a} \texttt{libfmpich2g.a}.
Cygwin users should use \texttt{libmpich.a} \texttt{libfmpichg.a}.
\subsection{Running}
\label{sec:winrun}
......@@ -735,7 +735,7 @@ for a description of the options to \texttt{mpiexec}.
\section{Frequently Asked Questions}
The frequently asked questions are maintained online
here:\url{http://wiki.mcs.anl.gov/mpich2/index.php/Frequently_Asked_Questions}
here:\url{http://wiki.mpich.org/mpich/index.php/Frequently_Asked_Questions}
\bibliographystyle{plain}
\bibliography{user}
......
ALL: all-redirect
SHELL = @SHELL@
srcdir = @srcdir@
MPICH2_VERSION = @MPICH2_VERSION@
MPICH_VERSION = @MPICH_VERSION@
.SUFFIXES: .pdf .dvi .tex
......@@ -16,9 +16,9 @@ BIBTEX = BIBINPUTS=".:$(srcdir):" ; export BIBINPUTS ; bibtex
DVIPDFM = TEXINPUTS=".:$(srcdir):" ; export TEXINPUTS ; dvipdfm
LATEX2HTML = latex2html
# Update the %MPICH2_VERSION% with current version string.
# Update the %MPICH_VERSION% with current version string.
windev.tex: windev.tex.vin
sed -e "s/%MPICH2_VERSION%/${MPICH2_VERSION}/g" $? > $@
sed -e "s/%MPICH_VERSION%/${MPICH_VERSION}/g" $? > $@
windev.dvi: windev.tex
-$(LATEX) windev.tex
......
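The Makefile rule in this hunk expands the `%MPICH_VERSION%` placeholder in the `.vin` template with `sed`. A minimal standalone sketch of that substitution (the file contents and version string are illustrative, not the real `windev.tex.vin`):

```shell
# Reproduce the Makefile's placeholder substitution:
#   sed -e "s/%MPICH_VERSION%/${MPICH_VERSION}/g" template > output
MPICH_VERSION="1.5"   # hypothetical version string
printf 'Building MPICH %%MPICH_VERSION%% for Windows\n' > windev.tex.vin
sed -e "s/%MPICH_VERSION%/${MPICH_VERSION}/g" windev.tex.vin > windev.tex
cat windev.tex   # -> Building MPICH 1.5 for Windows
```

The same rename applies here as in configure: both the placeholder in the template and the `make` variable feeding the `sed` command had to move from `MPICH2_VERSION` to `MPICH_VERSION` together, or the substitution would silently stop matching.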
......@@ -5,8 +5,8 @@ GOTO AFTERHELP
REM
REM Usage:
REM getdotin
REM 1) check out mpich2 from cvs
REM the environment variable USERNAME is used to checkout mpich2
REM 1) check out mpich from cvs
REM the environment variable USERNAME is used to checkout mpich
REM If this variable is not set or is set to the wrong mcs user:
REM set USERNAME=mymcsusername
REM
......@@ -21,18 +21,18 @@ echo cd /sandbox/%USERNAME% > sshcmds.txt
echo mkdir dotintmp >> sshcmds.txt
echo cd dotintmp >> sshcmds.txt
if "%1" == "" GOTO EXPORT_HEAD
echo cvs -d /home/MPI/cvsMaster export -r %1 mpich2allWithMPE >> sshcmds.txt
echo cvs -d /home/MPI/cvsMaster export -r %1 mpichallWithMPE >> sshcmds.txt
GOTO AFTER_EXPORT_HEAD
:EXPORT_HEAD
echo cvs -d /home/MPI/cvsMaster export -r HEAD mpich2allWithMPE >> sshcmds.txt
echo cvs -d /home/MPI/cvsMaster export -r HEAD mpichallWithMPE >> sshcmds.txt
:AFTER_EXPORT_HEAD
echo cd mpich2 >> sshcmds.txt
echo cd mpich >> sshcmds.txt
echo autogen.sh >> sshcmds.txt
echo tar cvf dotin.tar `find . -name "*.h.in"` >> sshcmds.txt
echo gzip dotin.tar >> sshcmds.txt
echo exit >> sshcmds.txt
ssh -l %USERNAME% %CVS_HOST% < sshcmds.txt
scp %USERNAME%@%CVS_HOST%:/sandbox/%USERNAME%/dotintmp/mpich2/dotin.tar.gz .
scp %USERNAME%@%CVS_HOST%:/sandbox/%USERNAME%/dotintmp/mpich/dotin.tar.gz .
ssh -l %USERNAME% %CVS_HOST% rm -rf /sandbox/%USERNAME%/dotintmp
del sshcmds.txt
tar xvfz dotin.tar.gz
......
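The batch script above ends by running `autogen.sh` remotely and bundling every generated `*.h.in` file into `dotin.tar.gz` for transfer back to the Windows box. A portable shell sketch of just that collection step, using stand-in file names rather than the real MPICH tree:

```shell
# Collect all *.h.in files under a checkout into a gzipped tarball,
# mirroring the batch script's find/tar/gzip step. Names are illustrative.
mkdir -p mpich/src
touch mpich/src/mpi.h.in mpich/src/mpichconf.h.in   # stand-in generated files
( cd mpich && tar cf dotin.tar $(find . -name "*.h.in") && gzip -f dotin.tar )
ls mpich/dotin.tar.gz
```

The round trip exists because the Windows build cannot run `autogen.sh` itself; the `.h.in` files must be produced on a UNIX host and shipped over.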
......@@ -32,7 +32,7 @@
Name="VCCustomBuildTool"/>