Machines Presently Supported by PETSc 2.0

http://www.mcs.anl.gov/petsc

This file lists the machines and compilers that we currently support for PETSc 2.0. Our main limitation is easy access to the architecture: since PETSc relies on certain system details, porting to a new machine is not always trivial.

We will try to maintain PETSc 2.0 support for any major machine class, but only if we have easy access to the machine. In addition, most machines and some implementations of MPI have bugs; since PETSc is a sophisticated system, it finds bugs in features that many other users do not exercise. We cannot write PETSc to work around bugs in system software. Thus, true system bugs, MPI implementation bugs, etc. must be reported to the appropriate vendors for fixing; we will help you determine whether a particular problem is a true bug and suggest work-arounds.


Details about machines currently supported by PETSc 2.0

This system is rather old now; most users have since upgraded to Sun Solaris. Since the old bundled Sun compiler was not ANSI-C, we develop using the GNU compilers. The Sun ANSI-C compilers require several modifications to the base makefiles; these are indicated in the file bmake/sun4/base.
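As an illustration, a typical PETSc 2.0 build invocation for this system might look like the following. This is a hedged sketch: PETSC_ARCH and BOPT are the standard PETSc 2.0 make variables, and it assumes the GNU compilers have already been selected in bmake/sun4/base as described above.

```shell
# Hedged sketch: build PETSc 2.0 for SunOS 4 in debugging mode,
# assuming the compiler choices are set in bmake/sun4/base.
make PETSC_ARCH=sun4 BOPT=g all
```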

Known Problems: Automated debugger attachment (such as with the option -start_in_debugger) may not work on SGI machines with the gdb debugger; it does work with dbx.
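For reference, automatic debugger attachment is requested on the command line when launching the program. A hedged example follows; the program name ex1 and the process count are hypothetical, and the exact option syntax should be checked against your PETSc version. Per the note above, on SGI machines choose dbx rather than gdb.

```shell
# Hedged example: start each process under dbx at launch
# (ex1 and -np 2 are illustrative placeholders).
mpirun -np 2 ex1 -start_in_debugger dbx
```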

Known Problems:

1) Trouble compiling the Fortran interface on some versions of the Origin 2000; see the file troubleshooting.html for a work-around.

2) We do not know how to trap floating point exceptions; this does not seem to be supported by the OS yet.

3) See the known problems for PETSC_ARCH=IRIX above.

4) The optimizing compiler on the Power Challenge has been known to generate incorrect code, so always develop with BOPT=g and switch to BOPT=O only when you are sure the code is running correctly.

Significant performance degradation seems to result when all processors of a Power Challenge system are used to run MPI programs. You should use only p-1 processors on a p-processor system; this has nothing to do with PETSc.

Fine

The most common problem is actually not PETSc itself but problems with the MPI implementation.

Fine

Known Problems:


We will never port to some older machines, including: