[dune-fem] Petsc test fails in parallel case

Sacconi, Andrea a.sacconi11 at imperial.ac.uk
Thu Jul 31 11:54:00 CEST 2014


Hi all,

thanks for your answers!

I still think there is a problem with Petsc somewhere.
I attach a very simple example that merely initialises Petsc, creates a grid, and creates two discrete functions over it. This test fails _IN_PARALLEL_, at the point where the first discrete function is created.
If I replace the Petsc discrete function with either an ISTL block vector function or an Adaptive Discrete Function, everything works fine.
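
For reference, here is a rough sketch of what such a test looks like. The actual attachment (testPetsc.cc) was scrubbed by the archive, so everything below (header paths, type names, the grid file) is reconstructed from memory of the dune-fem 1.4 API and may need adjusting:

// Hypothetical reconstruction of the failing test; header and class
// names follow the dune-fem 1.4 API from memory and may differ in detail.
#include <config.h>

#include <dune/grid/io/file/dgfparser/dgfparser.hh>
#include <dune/fem/misc/mpimanager.hh>
#include <dune/fem/gridpart/leafgridpart.hh>
#include <dune/fem/space/lagrangespace.hh>
#include <dune/fem/function/petscdiscretefunction.hh>
#include <dune/fem/function/adaptivefunction.hh>

typedef Dune::GridSelector::GridType GridType;
typedef Dune::Fem::LeafGridPart< GridType > GridPartType;
typedef Dune::Fem::FunctionSpace< double, double, GridType::dimensionworld, 1 > FunctionSpaceType;
typedef Dune::Fem::LagrangeDiscreteFunctionSpace< FunctionSpaceType, GridPartType, 1 > DiscreteSpaceType;

int main ( int argc, char **argv )
{
  // initialises MPI and (when dune-fem was configured with petsc) PetscInitialize
  Dune::Fem::MPIManager::initialize( argc, argv );

  Dune::GridPtr< GridType > gridPtr( "unitcube.dgf" ); // hypothetical grid file
  gridPtr->loadBalance();

  GridPartType gridPart( *gridPtr );
  DiscreteSpaceType space( gridPart );

  // in parallel the reported crash happens here, when the first petsc
  // discrete function (and with it the petsc Vec) is created
  Dune::Fem::PetscDiscreteFunction< DiscreteSpaceType > uPetsc( "uPetsc", space );

  // the non-petsc counterpart is created without problems
  Dune::Fem::AdaptiveDiscreteFunction< DiscreteSpaceType > uAdaptive( "uAdaptive", space );

  return 0;
}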

I cut the part of the test where I assemble a mass matrix, test the CG solver from Petsc, and compare its performance to the CG inverse operator from dune-fem and the CG solver from ISTL. Again, Petsc does not proceed (same error as above), while both the dune-fem and ISTL classes work perfectly fine.
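
The removed part looked roughly like the following sketch; the inverse-operator name is my best recollection of the dune-fem 1.4 solver headers, not a quote from the attachment, and the assembled mass operator is passed in generically:

// Sketch of the dune-fem CG comparison described above; solver class and
// header names are recalled from the 1.4 API and may differ in detail.
// The petsc and ISTL variants would be called analogously.
#include <dune/fem/solver/cginverseoperator.hh>

template< class MassOperator, class DiscreteFunction >
void solveWithFemCG( const MassOperator &massOp,
                     const DiscreteFunction &rhs,
                     DiscreteFunction &solution )
{
  const double reduction = 1e-10;  // relative residual reduction
  const double absLimit  = 1e-10;  // absolute residual limit
  const int    maxIter   = 1000;   // iteration cap

  Dune::Fem::CGInverseOperator< DiscreteFunction >
    invOp( massOp, reduction, absLimit, maxIter );
  invOp( rhs, solution );          // solve massOp( solution ) = rhs
}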

I forgot to mention, I'm using the stable release 1.4.0 and not the trunk.

Cheers,
Andrea
__________________________________________________________

Andrea Sacconi
PhD student, Applied Mathematics
AMMP Section, Department of Mathematics, Imperial College London,
London SW7 2AZ, UK
a.sacconi11 at imperial.ac.uk

________________________________________
From: dune-fem-bounces+a.sacconi11=imperial.ac.uk at dune-project.org [dune-fem-bounces+a.sacconi11=imperial.ac.uk at dune-project.org] on behalf of Tobias Malkmus [tomalk at mathematik.uni-freiburg.de]
Sent: 31 July 2014 10:02
To: Dedner, Andreas; dune-fem at dune-project.org
Subject: Re: [dune-fem] Petsc test fails in parallel case


Hi Andreas

I tested it with YaspGrid<2> and the test fails due to some
communication errors within Yasp. Since the problem also exists for the
l2projection_istl test, I assume this is a bug that is not related
to Petsc.

Best Tobias

On 07/31/2014 10:35 AM, Dedner, Andreas wrote:
> Hi Tobias.
> Thanks for taking a look. Could you repeat the test with YaspGrid<2>? That was the default in the test within the release version.
> Thanks
> Andreas
>
>
> Sent from Samsung Mobile
>
>
> -------- Original message --------
> From: Tobias Malkmus
> Date:31/07/2014 09:14 (GMT+00:00)
> To: dune-fem at dune-project.org
> Subject: Re: [dune-fem] Petsc test fails in parallel case
>
> Hi Andrea and Andreas
>
> I tested the l2projection test for petsc (now in
> dune/fem/test/l2projection_petsc) and the solver example within the
> howto with the newest petsc version (3.5).
>
> I cannot see any problems in parallel or serial using ALUGRID_SIMPLEX
> as the grid type and OpenMPI 1.8.1 in the master branch.
>
> For the release branch the 'old' test fails at a different place. I will
> look into it. The solver example still works fine.
>
> As Andreas already said, petsc tries to catch all errors and reports
> them to dune. Dune (fem) reports them as a Petsc error, although it
> might be a segfault or another system error.
>
> In your case, Andrea, Petsc reports a segmentation fault:
> PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
>
> First, try to compile this test with CXXFLAGS="-Wall -g" (and without
> -DNDEBUG) to enable all the asserts. Please report the error messages
> from this test.
>
> Could you please further check whether the solver example in the howto
> shows the same behaviour?
>
>
> Best Tobias
>
> On 07/30/2014 10:18 PM, Andreas Dedner wrote:
>> Hi Andrea.
>
>> That test has apparently been removed from dune-fem  - at least it
>> is not there in the master. So it might be old....
>
>> I had a short look. It seems to me that if you only compile
>> l2projection then it does not use petsc. You need
>> PETSCLINEAROPERATOR == 1, which is defined if you compile
>> l2projection_petscmatrix and have usepetsc: true in the parameter
>> file.
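
To illustrate the switch being described here: the petsc path is only active when the macro is set at compile time and petsc is requested at run time. The guard below is an assumption; only the macro name PETSCLINEAROPERATOR and the usepetsc parameter key are taken from the message:

// Minimal sketch of the compile-time / run-time switch described above;
// the exact guard in the test sources is an assumption. Only the macro
// name PETSCLINEAROPERATOR and the "usepetsc" key are from the message.
#include <iostream>

int main ()
{
#if defined( PETSCLINEAROPERATOR ) && PETSCLINEAROPERATOR == 1
  // compiled via the l2projection_petscmatrix target: petsc-backed
  // matrix/solver types would be selected here, additionally gated at
  // run time by "usepetsc: true" in the parameter file
  std::cout << "petsc linear operator enabled" << std::endl;
#else
  // plain l2projection build: petsc is never touched
  std::cout << "petsc linear operator disabled" << std::endl;
#endif
  return 0;
}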
>
>> So I don't really see why you are getting a petsc error with
>> l2projection, since it should not be using petsc.... But perhaps I
>> missed something. Have you tried the example in the dune-fem-howto?
>> I am sure that worked with the older version of petsc.
>
>> I tested it myself and I also get an error. In my case an assert
>> is triggered. Are you defining NDEBUG? If so, that might explain
>> the problem. I have also noticed that sometimes, when there is a
>> segmentation fault somewhere, a petsc error is produced although
>> it has nothing to do with petsc. So perhaps that is what is
>> happening here.
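
The NDEBUG point is easy to reproduce in isolation: assert() compiles to a no-op when NDEBUG is defined, so an invalid index that a debug build would catch with a clear message can instead surface later as a segfault, which petsc's signal handler then reports as a petsc error. A minimal standalone illustration (not dune-fem code):

// Standalone illustration of the NDEBUG effect described above: the same
// invalid index aborts with a clear message in a debug build, but slips
// through under -DNDEBUG and crashes (or corrupts memory) later.
#include <cassert>
#include <vector>

int main ()
{
  std::vector< int > data( 10, 0 );
  const int index = -1;               // stand-in for the invalid index

  assert( index >= 0 && index < int( data.size() ) ); // compiled out with -DNDEBUG

  return data[ index ];               // out-of-range access -> possible SIGSEGV
}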
>
>> The assert, btw., is in codimindexset.hh, with some invalid index
>> being used. That is going to need a bit more investigation, I'm
>> afraid. Could you please also test the example_solver from the
>> dune-fem-howto - I just don't want to fix some test in dune-fem
>> which is not used anymore anyway....
>
>> Best Andreas
>
>
>> On 30/07/14 18:50, Sacconi, Andrea wrote:
>>> Hi all,
>>>
>>> I would like to ask you another question about Petsc and its
>>> usage in dune-fem. Recently, I installed the latest release of
>>> this library (3.5.0), and thanks to a patch from Tobias Malkmus,
>>> it now compiles correctly with dune-fem-1.4.0.
>>>
>>> I ran the demo test contained in
>>> dune/fem/petsc/tests/l2projection, and I am sorry to say that it
>>> failed in the parallel case. You can find attached the output of
>>> the parallel run with 4 processors.
>>>
>>> Some additional info:
>>>
>>> - gcc 4.8.2
>>> - OpenMPI 1.6.5
>>> - ParMETIS 4.0.3
>>> - PETSc 3.5.0
>>>
>>> I don't know what I'm doing wrong, since I ran the demo tests in
>>> Petsc and they all passed without errors. If you need any further
>>> information, just let me know.
>>>
>>> Cheers, Andrea
>>> __________________________________________________________
>>>
>>> Andrea Sacconi
>>> PhD student, Applied Mathematics
>>> AMMP Section, Department of Mathematics, Imperial College London,
>>> London SW7 2AZ, UK
>>> a.sacconi11 at imperial.ac.uk
>>>

--
Tobias Malkmus                 <tomalk at mathematik.uni-freiburg.de>

Mathematisches Institut               Tel: +49 761 203 5627
Abt. für Angewandte Mathematik        Universität Freiburg
Hermann-Herder-Str. 10
79104 Freiburg


_______________________________________________
dune-fem mailing list
dune-fem at dune-project.org
http://lists.dune-project.org/mailman/listinfo/dune-fem
-------------- next part --------------
A non-text attachment was scrubbed...
Name: testPetsc.cc
Type: text/x-c++src
Size: 4073 bytes
Desc: testPetsc.cc
URL: <https://lists.dune-project.org/pipermail/dune-fem/attachments/20140731/1a4ff793/attachment.cc>

