[Dune] error running code with MPI
Vogelbacher Florian
Florian.Vogelbacher at psi.ch
Fri Sep 3 11:01:28 CEST 2010
Dear Dune,
the reason for the new installation (see my last mail on the mailing list
about the dune-grid-howto build problem) was a problem running our code
with MPI on our machine.
The code runs in parallel on a Mac, but not on my machine. Has anyone
seen something similar? Could this be an ALUGrid problem?
Best wishes,
Florian
1.) the system:
gcc (GCC) 4.3.1
Linux felsim01 2.6.18-194.8.1.el5 #1 SMP Thu Jul 1 16:05:53 EDT 2010
x86_64 x86_64 x86_64 GNU/Linux
mpirun (Open MPI) 1.3.3
2.) the error:
[...]
Created parallel ALUSimplexGrid<3,3> from macro grid file
'../../../grids/spherical-capacitor.dgf'.
process[1] : #[total leaf element]=0
2010-Sep-03 10:11:07.563938 ::: hades3delectrostatic.cc: 320 :::
PRODUCTION #[total leaf element]=7863
2010-Sep-03 10:11:07.564055 ::: hades3delectrostatic.cc: 323 :::
PRODUCTION #[process in this run]=2
process[0] : #[total leaf element]=7863
2010-Sep-03 10:11:07.564110 :::
../globaldatainterface/globaldatainterface.hh: 75 ::: PRODUCTION [
Get element tags from macro grid ...
2010-Sep-03 10:11:07.567824 :::
../globaldatainterface/globaldatainterface.hh: 134 ::: PRODUCTION all
element tags from macro grid are now available on cpu 0. ]
2010-Sep-03 10:11:07.567973 ::: hades3delectrostatic.cc: 353 :::
PRODUCTION [parallel - #[global refinement]=0
2010-Sep-03 10:11:07.568019 ::: hades3delectrostatic.cc: 374 :::
PRODUCTION [parallel - initiate load balance
2010-Sep-03 10:11:07.568065 ::: hades3delectrostatic:
parallel/gitter_pll_sti.h:732: void
ALUGridSpace::LinkedObject::Identifier::read(__gnu_cxx::__normal_iterator<const
int*, std::vector<int, std::allocator<int> > >&, const
__gnu_cxx::__normal_iterator<const int*, std::vector<int,
std::allocator<int> > >&): Assertion `pos != end' failed.
[felsim01:02544] *** Process received signal ***
hades3delectrostatic: parallel/gitter_pll_sti.h:732: void
ALUGridSpace::LinkedObject::Identifier::read(__gnu_cxx::__normal_iterator<const
int*, std::vector<int, std::allocator<int> > >&, const
__gnu_cxx::__normal_iterator<const int*, std::vector<int,
std::allocator<int> > >&): Assertion `pos != end' failed.
[felsim01:02544] Signal: Aborted (6)
[felsim01:02544] Signal code: (-6)
[felsim01:02543] *** Process received signal ***
[felsim01:02543] Signal: Aborted (6)
[felsim01:02543] Signal code: (-6)
[felsim01:02543] [ 0] /lib64/libpthread.so.0 [0x35f8a0de60]
[felsim01:02543] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x35f7e30045]
[felsim01:02543] [ 2] /lib64/libc.so.6(abort+0x110) [0x35f7e31ae0]
[felsim01:02543] [ 3] /lib64/libc.so.6(__assert_fail+0xf6) [0x35f7e29756]
[felsim01:02543] [ 4]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace8identifyINS_6Gitter5hedgeEEEvNS_14AccessIteratorIT_E6HandleERSt6vectorISt4pairISt4listIS6_SaIS6_EESB_ESaISC_EERKNS_13MpAccessLocalE+0x1645)
[0x898de5]
[felsim01:02543] [ 5]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace9GitterPll14MacroGitterPll14identificationERNS_13MpAccessLocalE+0x386)
[0x82bd16]
[felsim01:02543] [ 6]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace9GitterPll22notifyMacroGridChangesEv+0x75)
[0x80eba5]
[felsim01:02543] [ 7]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace9GitterPll29loadBalancerGridChangesNotifyEv+0x448)
[0x8129a8]
[felsim01:02543] [ 8]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace13GitterDunePll15duneLoadBalanceEv+0xd)
[0x80c25d]
[felsim01:02543] [ 9]
../../../hades3delectrostatic/hades3delectrostatic(main+0x199e) [0x6369ae]
[felsim01:02543] [10] /lib64/libc.so.6(__libc_start_main+0xf4)
[0x35f7e1d8a4]
[felsim01:02543] [11]
../../../hades3delectrostatic/hades3delectrostatic(_ZNSt8ios_base4InitD1Ev+0x51)
[0x632b39]
[felsim01:02543] *** End of error message ***
[felsim01:02544] [ 0] /lib64/libpthread.so.0 [0x35f8a0de60]
[felsim01:02544] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x35f7e30045]
[felsim01:02544] [ 2] /lib64/libc.so.6(abort+0x110) [0x35f7e31ae0]
[felsim01:02544] [ 3] /lib64/libc.so.6(__assert_fail+0xf6) [0x35f7e29756]
[felsim01:02544] [ 4]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace8identifyINS_6Gitter5hedgeEEEvNS_14AccessIteratorIT_E6HandleERSt6vectorISt4pairISt4listIS6_SaIS6_EESB_ESaISC_EERKNS_13MpAccessLocalE+0x1645)
[0x898de5]
[felsim01:02544] [ 5]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace9GitterPll14MacroGitterPll14identificationERNS_13MpAccessLocalE+0x386)
[0x82bd16]
[felsim01:02544] [ 6]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace9GitterPll22notifyMacroGridChangesEv+0x75)
[0x80eba5]
[felsim01:02544] [ 7]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace9GitterPll29loadBalancerGridChangesNotifyEv+0x448)
[0x8129a8]
[felsim01:02544] [ 8]
../../../hades3delectrostatic/hades3delectrostatic(_ZN12ALUGridSpace13GitterDunePll15duneLoadBalanceEv+0xd)
[0x80c25d]
[felsim01:02544] [ 9]
../../../hades3delectrostatic/hades3delectrostatic(main+0x199e) [0x6369ae]
[felsim01:02544] [10] /lib64/libc.so.6(__libc_start_main+0xf4)
[0x35f7e1d8a4]
[felsim01:02544] [11]
../../../hades3delectrostatic/hades3delectrostatic(_ZNSt8ios_base4InitD1Ev+0x51)
[0x632b39]
[felsim01:02544] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 2544 on node felsim01 exited
on signal 6 (Aborted).
--------------------------------------------------------------------------
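3.) for reference, a minimal sketch of the setup that triggers the error
(simplified and reconstructed from the log above, not the actual
hades3delectrostatic source; header and type names assume the Dune
2.0-era ALUGrid/DGF bindings):

#include <iostream>

#include <dune/common/mpihelper.hh>
#include <dune/grid/alugrid.hh>
#include <dune/grid/io/file/dgfparser/dgfparser.hh>
#include <dune/grid/io/file/dgfparser/dgfalu.hh>

int main(int argc, char** argv)
{
  // initialise MPI via Dune's helper (wraps MPI_Init / MPI_Finalize)
  Dune::MPIHelper& mpiHelper = Dune::MPIHelper::instance(argc, argv);

  // read the macro grid from the DGF file; only rank 0 holds elements
  // at this point, the other ranks start empty (cf. the
  // "process[1] : #[total leaf element]=0" line in the log)
  typedef Dune::ALUSimplexGrid<3,3> GridType;
  Dune::GridPtr<GridType> gridPtr("../../../grids/spherical-capacitor.dgf");
  GridType& grid = *gridPtr;

  // distribute the macro grid over the processes; the assertion above
  // fires inside this call chain (duneLoadBalance -> identification
  // -> identify, see the backtrace)
  grid.loadBalance();

  std::cout << "process[" << mpiHelper.rank()
            << "] : #[total leaf element]=" << grid.size(0) << std::endl;

  return 0;
}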
--
---------------------------
Paul Scherrer Institut
Florian Vogelbacher
WBCA/004
5232 Villigen PSI
Switzerland
Mail: Florian.Vogelbacher at psi.ch
Phone: +41 (0) 563105019
Web: www.psi.ch
---------------------------