Topic: Installation of POOFEM and OOFEM with PETSc

Hello,
I tried to install parallel OOFEM on my Mac. I built it with PETSc, OpenMPI (version 1.6.4), METIS and ParMETIS. My problem was that every time I switched on the OPENMP flag in the CMake configuration, I got errors on the #pragma directives in engngm.C that are used when OpenMP is activated. However, when I switched it off, the installation was successful and it passed all the parallel tests. Do you believe that my OOFEM build will have problems running structures other than the partests?
I also wanted to ask whether I could use this installation for old OOFEM problems. If I rename an input file to oofem.in.0, add lstype 3 and smtype 7 to the nonlinearstatic record (to use the PETSc solvers), and pass -np 1 to mpirun for a single process, is it going to work? I am asking because I wanted to use the PETSc solvers in my simple OOFEM problems, and when I tried to install OOFEM and PETSc without MPI, ccmake kept asking for the MPI directory. Should I instead work with other libraries, such as SPOOLES or IML, for serial problems?
Finally, I would like to ask about partitioning of meshes. I found a program called oofem2part in the tools directory. Is it OK to use it for partitioning my meshes, or is it a trial version that may contain bugs? Is there perhaps another program I could use to generate a mesh and partition it, in order to run it with parallel OOFEM using the ParMETIS libraries?

Many Thanks

Re: Installation of POOFEM and OOFEM with PETSc

Hi demexen,

First, OpenMP is just threaded assembly (which can be used in conjunction with MPI as well if you want). Not all compilers support OpenMP, though; for example, Clang does not.


There is a flag "-p" to denote that a problem is run in parallel. If you don't add that flag, you can still run sequential problems without having to use mpirun.
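For example, a sketch of the two invocation styles (assuming an input file named oofem.in, as in the command quoted later in this thread, and the per-rank naming oofem.in.0 mentioned in the question):

```sh
# Sequential run: no -p flag, no mpirun needed
oofem -f oofem.in

# Parallel run: -p plus mpirun; each process reads its own partition,
# named oofem.in.0, oofem.in.1, ... (rank number appended)
mpirun -np 2 oofem -p -f oofem.in
```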


I would not recommend IML, and directly using SPOOLES isn't really actively tested by anyone, since it's easily supported through PETSc.
Yes, PETSc requires MPI, but you can still run it sequentially as mentioned above, e.g.:

oofem -f oofem.in -pc_type lu -ksp_type preonly -pc_factor_mat_solver_package spooles

There are no trial versions; oofem2part should work, but it can certainly still have bugs.
There probably are other mesh generators that will partition the mesh for you as well (but I'm not very familiar with them).