Comparing trunk/README (file contents):
Revision 1054 by gezelter, Tue May 16 02:06:37 2006 UTC vs.
Revision 1055 by gezelter, Tue Sep 26 21:55:29 2006 UTC

# Line 7 | Line 7 | Embedded Atom Method (EAM) or Sutton-Chen (SC) potentials
7   (point dipoles, sticky atoms), as well as transition metals under the
8   Embedded Atom Method (EAM) or Sutton-Chen (SC) potentials.
9  
10 < Simulations are started in OOPSE using two files:
10 > Simulations are started in OOPSE using a single Molecular Dynamics (.md)
11 > file.  These files must start with the <OOPSE> tag and must
12 > have two sections:
13  
14 <  1) a C-based meta-data (.md) file, and
14 >  1) a C-based <MetaData> section, and
15  
16 <  2) a modified XYZ format for initial coordinate and velocity information.
16 >  2) a <Snapshot> block for initial coordinate and velocity information.
17  
18 < Detailed descriptions of the structures of these two files are
18 > Detailed descriptions of the structures of these files are
19   available in the "doc" directory.  Sample simulations are
20   available in the "samples" directory.
21  
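    For orientation, a minimal sketch of this layout is shown below.
    Only the three tags named above (<OOPSE>, <MetaData>, <Snapshot>)
    come from this README; the section bodies are placeholder comments,
    not working input:

       <OOPSE>
         <MetaData>
           // C-based meta-data: molecule definitions, force field
           // selection, integrator parameters (see the "doc" directory)
         </MetaData>
         <Snapshot>
           // initial coordinate and velocity information for the
           // objects in the simulation
         </Snapshot>
       </OOPSE>
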
# Line 22 | Line 24 | What you need to compile and use OOPSE:
24   1) Good C, C++, and Fortran95 compilers.  We've built and tested OOPSE
25      on the following architecture & compiler combinations:
26  
27 <    Architecture                CC     CXX     F90    Notes
28 <    -------------------------   ----   -----   -----  ----------------------
29 <    ix86-pc-linux-gnu           icc    icpc    ifort  (Intel versions 7-9)
30 <    i386-apple-darwin8.6.1      icc    icpc    ifort  (Intel version 9.1)
31 <    powerpc-apple-darwin8.6.0   gcc    g++     xlf95  (GNU v.4 / IBM XL v. 8.1)
32 <    x86_64-unknown-linux-gnu    pgcc   pgCC    pgf95  (Portland Group v. 6.0)
33 <    sparc-sun-solaris2.10       cc     CC      f95    (Sun ONE Studio 10)
27 >    Architecture                CC     CXX     F90     Notes
28 >    -------------------------   ----   -----   -----   ----------------------
29 >    ix86-pc-linux-gnu           icc    icpc    ifort   (Intel versions 7-9)
30 >    i386-apple-darwin8.7.1      icc    icpc    ifort   (Intel version 9.1)
31 >    powerpc-apple-darwin8.7.0   gcc    g++     xlf95   (GNU v.4 / IBM XL v. 8.1)
32 >    x86_64-unknown-linux-gnu    gcc    g++     pathf95 (GNU v.4 / Pathscale 2.5)
33 >    sparc-sun-solaris2.10       cc     CC      f95     (Sun ONE Studio 10)
34      
35 <    We've successfully compiled OOPSE with the Pathscale c, c++, and
36 <    Fortran95 compilers on the  x86_64-unknown-linux-gnu architecture,
35 <    but a bug in the exception handling on these compilers causes
36 <    OOPSE to abort (rather than providing a useful error message) when
37 <    an error is found in the meta-data file.
35 >    We've also successfully compiled OOPSE with the Portland Group C, C++,
36 >    and Fortran95 compilers on the x86_64-unknown-linux-gnu architecture.
37    
38      OOPSE uses features of the Fortran 95 language. The Fortran
39      portions of our code will not compile if your compiler does not
# Line 42 | Line 41 | What you need to compile and use OOPSE:
41      compilers do support these features. None of the Fortran 77
42      compilers can be used to compile OOPSE.
43  
44 <    Compilers that are known to fail on OOPSE: g77, Gfortran, Older
45 <    Portland Group compilers (pgf77, pgf90).
44 >    Fortran compilers that are known to fail on OOPSE: g77, gfortran, and
45 >    older Portland Group compilers (pgf77, pgf90).
46  
47 <    Compilers that are known to work on OOPSE: Intel's ifort,
47 >    Fortran compilers that are known to work on OOPSE: Intel's ifort,
48      Pathscale's pathf95, IBM's xlf95, Portland's pgf95 (version 6 or
49      higher), Sun's f95. There may be others that work also.
50  
# Line 62 | Line 61 | What you need to compile and use OOPSE:
61   4) MPI is optional for the single-processor version of OOPSE,
62      but is required if you want OOPSE to run in parallel.
63  
64 <    We like MPICH-1.2.*.  Other implementations might work, but we
65 <    haven't tried.  You can get MPICH here:
67 <    http://www-unix.mcs.anl.gov/mpi/mpich/
64 >    We like Open MPI.  Other implementations might work, but we
65 >    haven't tried them.  You can get Open MPI here: http://www.open-mpi.org/
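
    If OOPSE has been built against an MPI implementation, a parallel
    run would be launched through mpirun in the usual way.  The line
    below is only a sketch: the executable name (oopse), the process
    count, and the input file name are illustrative assumptions, not
    taken from this README:

       mpirun -np 4 oopse mysim.md    # binary and file names assumed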
66  
67   INSTRUCTIONS
68  
69   1) Get, build, and test the required pieces above.
70 < 2) ./configure  (or ./configure --with-mpi=/usr/local/mpich)
70 > 2) ./configure  (or ./configure --with-mpi=/usr/local/openmpi)
71   3) make
72   4) make install
73  
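   Putting the pieces together, a complete build might look like the
   following.  This is a sketch under two assumptions: that the
   configure script honors the usual CC/CXX/F90 environment variables
   for choosing compilers from the table above, and that Open MPI is
   installed under /usr/local/openmpi:

      CC=icc CXX=icpc F90=ifort \
      ./configure --with-mpi=/usr/local/openmpi   # compilers/path assumed
      make
      make install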

Diff Legend

- Removed lines
+ Added lines
< Changed lines (revision 1054)
> Changed lines (revision 1055)