[Dune] How to use parallel UGGrid?

giese at mathematik.hu-berlin.de
Fri Jun 18 11:02:21 CEST 2010


Dear Dune-Developers,

I have a question regarding the use of a parallel UGGrid in combination
with the GmshReader. I tried a simple example in DUNE: the programme
reads a grid with the GmshReader and writes the distributed grid out
with the VTKWriter. The crucial part of the code looks as follows:

    .
    .
    .

    typedef Dune::UGGrid<3> GridType;
    GridType grid(400);

    // read a gmsh file
    Dune::GmshReader<GridType>::read(grid, "cube.msh");

    // refine grid
    grid.globalRefine(level);

    if(!Dune::MPIHelper::isFake)
      grid.loadBalance();

    // get a grid view
    typedef GridType::LeafGridView GV;
    const GV& gv = grid.leafView();

    // plot celldata
    std::vector<int> a(gv.size(0), 1);

    // output
    Dune::VTKWriter<GV> vtkwriter(gv);
    vtkwriter.addCellData(a, "celldata");
    vtkwriter.write("TestGrid", Dune::VTKOptions::ascii);

    .
    .
    .

This code produces the error message shown below. UGGrid does seem to
start up in parallel — it was configured with "--enable-parallel" — but
somewhere along the way something goes wrong. What do I have to change?
Do I have to add some code? Maybe you have a simple example that works
in parallel? I would be very grateful if you could help me!
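One guess on my side: does the coarse grid perhaps have to be created on
rank 0 only, with loadBalance() distributing it afterwards? I have not
verified this — the rank-0 guard below is purely my assumption, and the
names are the same as in my code above:

```cpp
// Sketch (assumption, not verified): read the Gmsh file on the master
// process only, so that only rank 0 inserts the coarse grid; the other
// ranks start empty and receive their share from loadBalance().
#include <dune/common/mpihelper.hh>
#include <dune/grid/uggrid.hh>
#include <dune/grid/io/file/gmshreader.hh>

typedef Dune::UGGrid<3> GridType;

GridType* makeDistributedGrid(Dune::MPIHelper& helper)
{
  GridType* grid = new GridType(400);

  // Only the master process reads the mesh file.
  if (helper.rank() == 0)
    Dune::GmshReader<GridType>::read(*grid, "cube.msh");

  // Distribute the coarse grid over all processes.
  grid->loadBalance();
  return grid;
}
```

Would that be the intended usage, or is reading on all processes (as in
my code above) supposed to work as well?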

Best regards,
Wolfgang Giese

---Error Message---
The programme "gridtest", started with three processes, produces the
following error message:

parallel run on 3 process(es)
DimX=3, DimY=1, DimZ=1
Reading 3d Gmsh grid...
Reading 3d Gmsh grid...
Reading 3d Gmsh grid...
version 2 Gmsh file detected
file contains 14 nodes
number of real vertices = 14
number of boundary elements = 24
number of elements = 24
number of real vertices = 14
number of boundary elements = 24
number of elements = 24
file contains 48 elements
number of real vertices = 14
number of boundary elements = 24
number of elements = 24
[node18:11171] *** Process received signal ***
[node18:11171] Signal: Segmentation fault (11)
[node18:11171] Signal code: Address not mapped (1)
[node18:11171] Failing at address: 0x7359e3588
[node18:11170] *** Process received signal ***
[node18:11170] Signal: Segmentation fault (11)
[node18:11170] Signal code: Address not mapped (1)
[node18:11170] Failing at address: 0x1655035c8
[node18:11170] [ 0] /lib64/libpthread.so.0 [0x2b402c690a90]
[node18:11170] [ 1] ./gridtest(_ZN4Dune6UGGridILi3EE12globalRefineEi+0xdb)
[0x60c78b]
[node18:11170] [ 2] ./gridtest(main+0x233) [0x597213]
[node18:11170] [ 3] /lib64/libc.so.6(__libc_start_main+0xe6) [0x2b402c8bd586]
[node18:11170] [ 4] ./gridtest [0x596b79]
[node18:11170] *** End of error message ***
[node18:11171] [ 0] /lib64/libpthread.so.0 [0x2ac5e672ca90]
[node18:11171] [ 1] ./gridtest(_ZN4Dune6UGGridILi3EE12globalRefineEi+0xdb)
[0x60c78b]
[node18:11171] [ 2] ./gridtest(main+0x233) [0x597213]
[node18:11171] [ 3] /lib64/libc.so.6(__libc_start_main+0xe6) [0x2ac5e6959586]
[node18:11171] [ 4] ./gridtest [0x596b79]
[node18:11171] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 11171 on node node18 exited on
signal 11 (Segmentation fault).
--------------------------------------------------------------------------