[dune-fem] Petsc test fails in parallel case
Tobias Malkmus
tomalk at mathematik.uni-freiburg.de
Thu Jul 31 13:54:55 CEST 2014
A further hint on this: if compiled with debug flags, an assertion fails, namely:
void Dune::Fem::PetscVector<DFSpace>::init() [with DFSpace =
Dune::Fem::LagrangeDiscreteFunctionSpace<Dune::Fem::FunctionSpace<double,
double, 3, 1>, Dune::Fem::AdaptiveLeafGridPart<Dune::ALUGrid<3, 3,
(Dune::ALUGridElementType)0u, (Dune::ALUGridRefinementType)1u> >, 1>]:
Assertion `int( ghostArrayBuilder.size() ) ==
dofMapping().numSlaveBlocks()' failed
It only fails in parallel, and only for grid dimension 3 in combination
with the LagrangeDiscreteFunctionSpace.
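
For reference, a rough reproducer sketch (the type parameters are taken
from the assertion message above; the header names and the
PetscDiscreteFunction usage are my assumptions and may differ between the
1.4 release and trunk):

  // constructing a PETSc discrete function over a 3d Lagrange space runs
  // PetscVector::init() and should trigger the assertion in parallel
  #include <dune/grid/alugrid.hh>
  #include <dune/grid/io/file/dgfparser/dgfalu.hh>       // assumed DGF header
  #include <dune/fem/misc/mpimanager.hh>
  #include <dune/fem/gridpart/adaptiveleafgridpart.hh>
  #include <dune/fem/space/lagrangespace.hh>             // assumed header name
  #include <dune/fem/function/petscdiscretefunction.hh>  // assumed header name

  int main( int argc, char **argv )
  {
    Dune::Fem::MPIManager::initialize( argc, argv );  // also sets up PETSc

    // grid type as it appears in the assertion message
    typedef Dune::ALUGrid< 3, 3, Dune::simplex, Dune::nonconforming > GridType;
    Dune::GridPtr< GridType > gridPtr( "unitcube3.dgf" );  // any 3d macro grid
    gridPtr->loadBalance();

    typedef Dune::Fem::AdaptiveLeafGridPart< GridType > GridPartType;
    typedef Dune::Fem::FunctionSpace< double, double, 3, 1 > FunctionSpaceType;
    typedef Dune::Fem::LagrangeDiscreteFunctionSpace< FunctionSpaceType, GridPartType, 1 >
      SpaceType;
    typedef Dune::Fem::PetscDiscreteFunction< SpaceType > DiscreteFunctionType;

    GridPartType gridPart( *gridPtr );
    SpaceType space( gridPart );
    DiscreteFunctionType uh( "uh", space );  // PetscVector< DFSpace >::init() runs here
    return 0;
  }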
Best Tobias
On 07/31/2014 01:45 PM, Sacconi, Andrea wrote:
> Hi all,
>
> I ran the l2projection_petsc test in the trunk of dune-fem, and it
> fails with both YaspGrid and ALUGrid. I ran my little example with
> the trunk as well, and it fails likewise. The error seems to occur
> in the creation of the PETSc vector for the discrete function:
>
> Exception
> [ErrorHandler:../../../dune/fem/misc/petsc/petsccommon.hh:80]:
> PETSc Error in the PETSc function 'VecScatterCreate_PtoS' at
> /usr/local_machine/petsc-3.5.0-source/src/vec/vec/utils/vpscat.c:2339:
> 'Petsc has generated inconsistent data'. Error message: 'ith 10
> block entry 177 not owned by any process, upper bound 125'
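
The message suggests that one of the index sets handed to VecScatterCreate
contains a global index (177) beyond the global size of the source vector
(upper bound 125), i.e. the ghost/slave index mapping is broken before
PETSc is ever involved. A minimal plain-PETSc sketch that provokes the same
class of error, with sizes made up to mirror the report (this is not the
dune-fem code path):

  // scatter from a parallel vector with 125 global entries using an index
  // set that references global index 177 -> "not owned by any process"
  #include <petscvec.h>

  int main( int argc, char **argv )
  {
    PetscInitialize( &argc, &argv, NULL, NULL );

    Vec x, y;
    VecCreateMPI( PETSC_COMM_WORLD, PETSC_DECIDE, 125, &x );  // 125 entries globally
    VecCreateSeq( PETSC_COMM_SELF, 1, &y );                   // local target vector

    PetscInt idx[ 1 ] = { 177 };  // out of range: valid global indices are 0..124
    IS ix;
    ISCreateGeneral( PETSC_COMM_SELF, 1, idx, PETSC_COPY_VALUES, &ix );

    VecScatter ctx;
    // fails with 'Petsc has generated inconsistent data' /
    // '... not owned by any process, upper bound 125'
    VecScatterCreate( x, ix, y, NULL, &ctx );

    PetscFinalize();
    return 0;
  }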
>
> Should you need any further info, just let me know. Best, Andrea
> __________________________________________________________
>
> Andrea Sacconi
> PhD student, Applied Mathematics
> AMMP Section, Department of Mathematics, Imperial College London
> London SW7 2AZ, UK
> a.sacconi11 at imperial.ac.uk
>
> ________________________________________
> From: Tobias Malkmus [tomalk at mathematik.uni-freiburg.de]
> Sent: 31 July 2014 11:43
> To: Sacconi, Andrea; Dedner, Andreas; dune-fem at dune-project.org
> Subject: Re: [dune-fem] Petsc test fails in parallel case
>
>
> Hi Andrea
>
> So now I agree that there is a PETSc error. For the release and
> your small test example I was able to reproduce the error you
> showed me in one of the mails. I will have a look into it.
>
> Best Tobias
>
> On 07/31/2014 11:54 AM, Sacconi, Andrea wrote:
>> Hi all,
>>
>> thanks for your answers!
>>
>> I think instead that there is a problem with PETSc somewhere. I
>> attach a very simple, silly example, which just initialises
>> PETSc and creates a grid and two discrete functions over it.
>> This test fails _IN_PARALLEL_ at the point where the first
>> discrete function is created. If I replace the PETSc discrete
>> function with either an ISTL block vector function or an adaptive
>> discrete function, everything works fine.
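
(The swap Andrea describes amounts to changing a single typedef; the class
names are the dune-fem ones, headers omitted:)

  // only the PETSc variant fails in parallel
  typedef Dune::Fem::PetscDiscreteFunction< SpaceType > DiscreteFunctionType;
  // these two alternatives work:
  // typedef Dune::Fem::ISTLBlockVectorDiscreteFunction< SpaceType > DiscreteFunctionType;
  // typedef Dune::Fem::AdaptiveDiscreteFunction< SpaceType > DiscreteFunctionType;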
>>
>> I cut out the part of the test where I actually assemble a mass
>> matrix, test the CG solver from PETSc, and compare its performance
>> to the CG inverse operator from dune-fem and the CG from ISTL.
>> Again, PETSc doesn't proceed (same error as above), while both the
>> dune-fem and ISTL classes work perfectly fine.
>>
>> I forgot to mention, I'm using the stable release 1.4.0 and not
>> the trunk.
>>
>> Cheers, Andrea
>> __________________________________________________________
>>
>> Andrea Sacconi
>> PhD student, Applied Mathematics
>> AMMP Section, Department of Mathematics, Imperial College London
>> London SW7 2AZ, UK
>> a.sacconi11 at imperial.ac.uk
>>
>> ________________________________________
>> From: dune-fem-bounces+a.sacconi11=imperial.ac.uk at dune-project.org
>> [dune-fem-bounces+a.sacconi11=imperial.ac.uk at dune-project.org]
>> on behalf of Tobias Malkmus [tomalk at mathematik.uni-freiburg.de]
>> Sent: 31 July 2014 10:02
>> To: Dedner, Andreas; dune-fem at dune-project.org
>> Subject: Re: [dune-fem] Petsc test fails in parallel case
>>
>> Hi Andreas
>>
>> I tested it with YaspGrid<2> and the test fails due to some
>> communication errors within YaspGrid. Since the problem also
>> exists for the l2projection_istl test, I assume that this is a
>> bug which is not related to PETSc.
>>
>> Best Tobias
>>
>> On 07/31/2014 10:35 AM, Dedner, Andreas wrote:
>>> Hi Tobias. Thanks for taking a look. Could you repeat the
>>> test with YaspGrid<2>? That was the default in the test
>>> within the release version. Thanks, Andreas
>>
>>> -------- Original message --------
>>> From: Tobias Malkmus
>>> Date: 31/07/2014 09:14 (GMT+00:00)
>>> To: dune-fem at dune-project.org
>>> Subject: Re: [dune-fem] Petsc test fails in parallel case
>>
>>> Hi Andrea and Andreas
>>
>>> I tested the l2projection test for PETSc (now in
>>> dune/fem/test/l2projection_petsc) and the solver example
>>> within the howto with the newest PETSc version (3.5).
>>
>>> I cannot see any problems in parallel/serial using
>>> ALUGRID_SIMPLEX as the grid type and OpenMPI 1.8.1 in the
>>> master branch.
>>
>>> For the release branch the 'old' test fails at a different
>>> place. I will look into it. The solver example still works fine.
>>
>>> As Andreas already said, PETSc tries to catch all errors and
>>> reports them to Dune. dune-fem then reports them as a PETSc
>>> error, although the underlying cause might be a segfault or
>>> another system error.
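
(The wrapping pattern is roughly the following sketch, not the actual
petsccommon.hh code: every PETSc call is checked and a nonzero error code
becomes a C++ exception, so even a caught SIGSEGV surfaces as a 'PETSc
error'.)

  #include <petscsys.h>
  #include <stdexcept>
  #include <string>

  // check the code returned by a PETSc call; rethrow as a C++ exception
  inline void checkPetscError( PetscErrorCode code, const std::string &call )
  {
    if( code != 0 )
    {
      const char *text = 0;
      PetscErrorMessage( code, &text, 0 );  // PETSc's generic message for this code
      throw std::runtime_error( "PETSc Error in '" + call + "': "
                                + ( text ? text : "unknown error" ) );
    }
  }

  // usage: checkPetscError( VecAssemblyBegin( vec ), "VecAssemblyBegin" );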
>>
>>> In your case, Andrea, PETSc reports a segmentation fault:
>>>
>>> PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>>> Violation, probably memory access out of range
>>
>>> First, try to compile this test with CXXFLAGS="-Wall -g"
>>> (i.e. without -DNDEBUG) so that all asserts are enabled.
>>> Please report the error messages from this run.
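
(Background: assert() compiles to a no-op once NDEBUG is defined, which
optimized builds typically do; the names below are made up to mirror the
assertion from PetscVector::init():)

  #include <cassert>

  // with -DNDEBUG this check vanishes and the inconsistent ghost setup is
  // handed on to PETSc, which then dies much later inside VecScatterCreate
  void checkGhostSetup( int ghostArraySize, int numSlaveBlocks )
  {
    assert( ghostArraySize == numSlaveBlocks );
  }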
>>
>>> Could you please also check whether the solver example in
>>> the howto shows the same behavior?
>>
>>
>>> Best Tobias
>>
>>> On 07/30/2014 10:18 PM, Andreas Dedner wrote:
>>>> Hi Andrea.
>>
>>>> That test has apparently been removed from dune-fem - at
>>>> least it is not there in the master. So it might be old....
>>
>>>> I had a short look. It seems to me that if you only compile
>>>> l2projection, then it does not use PETSc. You need
>>>> PETSCLINEAROPERATOR == 1, which is defined if you compile
>>>> l2projection_petscmatrix, and you need usepetsc: true in the
>>>> parameter file.
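
(Schematically, the selection is a compile-time switch plus a run-time
parameter; the typedef contents are left out here, and the parameter read
uses the standard dune-fem parameter class:)

  // compile-time: only the l2projection_petscmatrix target is built with
  // PETSCLINEAROPERATOR defined to 1
  #if defined(PETSCLINEAROPERATOR) && PETSCLINEAROPERATOR == 1
    // ... PETSc matrix / solver typedefs selected here ...
  #else
    // ... default (non-PETSc) typedefs selected here ...
  #endif

  #include <dune/fem/io/parameter.hh>

  // run-time: the parameter file must additionally contain 'usepetsc: true'
  bool petscRequested()
  {
    return Dune::Fem::Parameter::getValue< bool >( "usepetsc", false );
  }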
>>
>>>> So I don't really see why you are getting a PETSc error with
>>>> l2projection, since it should not be using PETSc... But
>>>> perhaps I missed something. Have you tried the example in
>>>> the dune-fem-howto? I am sure that it worked with the
>>>> older version of PETSc.
>>
>>>> I tested it myself and I also get an error. In my case an
>>>> assert is triggered. Are you defining NDEBUG? If so, that
>>>> might explain the problem. I have also noticed that sometimes,
>>>> when there is a segmentation fault somewhere, a PETSc error is
>>>> produced although it has nothing to do with PETSc. So perhaps
>>>> that is what is happening here.
>>
>>>> The assert, by the way, is in codimindexset.hh, with some
>>>> invalid index being used. That is going to need a bit more
>>>> investigation, I'm afraid. Could you please also test the
>>>> example_solver from the dune-fem-howto - I just don't want to
>>>> fix some test in dune-fem which is not used anymore
>>>> anyway...
>>
>>>> Best Andreas
>>
>>
>>>> On 30/07/14 18:50, Sacconi, Andrea wrote:
>>>>> Hi all,
>>>>>
>>>>> I would like to ask you another question about PETSc and
>>>>> its usage in dune-fem. Recently, I installed the latest
>>>>> release of this library (3.5.0), and thanks to a patch from
>>>>> Tobias Malkmus, it now compiles correctly with
>>>>> dune-fem-1.4.0.
>>>>>
>>>>> I ran the demo test contained in
>>>>> dune/fem/petsc/tests/l2projection, and I am sorry to say
>>>>> that it failed in the parallel case. You can find attached
>>>>> the output of the parallel run with 4 processors.
>>>>>
>>>>> Some additional info:
>>>>>
>>>>> gcc: 4.8.2
>>>>> OpenMPI: 1.6.5
>>>>> ParMETIS: 4.0.3
>>>>> PETSc: 3.5.0
>>>>>
>>>>> I don't know what I'm doing wrong, since I ran the demo
>>>>> tests in PETSc and they all passed without errors. If you
>>>>> need any further information, just let me know.
>>>>>
>>>>> Cheers, Andrea
>>>>> __________________________________________________________
>>>>>
>>>>> Andrea Sacconi
>>>>> PhD student, Applied Mathematics
>>>>> AMMP Section, Department of Mathematics, Imperial College London
>>>>> London SW7 2AZ, UK
>>>>> a.sacconi11 at imperial.ac.uk
>>>>>
>>
>
--
Tobias Malkmus <tomalk at mathematik.uni-freiburg.de>
Mathematisches Institut, Abt. für Angewandte Mathematik
Universität Freiburg
Hermann-Herder-Str. 10, 79104 Freiburg
Tel: +49 761 203 5627