[dune-fem] Petsc test fails in parallel case

Tobias Malkmus tomalk at mathematik.uni-freiburg.de
Thu Jul 31 11:02:09 CEST 2014


Hi Andreas

I tested it with YaspGrid<2> and the test fails due to some
communication errors within Yasp. Since the problem also exists for
the l2projection_istl test, I assume this is a bug that is not related
to PETSc.
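
For reference, this is roughly how the dune-fem tests obtain their
grid: the grid implementation is picked at configure time (e.g.
GRIDTYPE=YASPGRID GRIDDIM=2) and the macro grid is read from a DGF
file. A minimal sketch; the DGF file name here is made up and the
grid-specific DGF headers come in via the GRIDTYPE mechanism:

  #include <dune/grid/io/file/dgfparser/dgfparser.hh>

  // GridType resolves to Dune::YaspGrid< 2 > with the selection above.
  typedef Dune::GridSelector::GridType GridType;

  // Dune::MPIHelper::instance( argc, argv ) must have been called first.
  Dune::GridPtr< GridType > gridPtr( "unitsquare.dgf" );
  GridType &grid = *gridPtr;
  grid.loadBalance();  // distribute the macro grid among the MPI ranks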

Best Tobias

On 07/31/2014 10:35 AM, Dedner, Andreas wrote:
> Hi Tobias.
> Thanks for taking a look. Could you repeat the test with YaspGrid<2>? That was the default grid in the test within the release version.
> Thanks
> Andreas
> 
> 
> Sent from Samsung Mobile
> 
> 
> -------- Original message --------
> From: Tobias Malkmus
> Date: 31/07/2014 09:14 (GMT+00:00)
> To: dune-fem at dune-project.org
> Subject: Re: [dune-fem] Petsc test fails in parallel case
> 
> Hi Andrea and Andreas
> 
> I tested the l2projection test for PETSc (now in
> dune/fem/test/l2projection_petsc) and the solver example within the
> howto with the newest PETSc version (3.5).
>
> I cannot see any problems in parallel or serial using ALUGRID_SIMPLEX
> as the grid type and OpenMPI 1.8.1 on the master branch.
> 
> For the release branch, the 'old' test fails in a different place; I
> will look into it. The solver example still works fine.
> 
> As Andreas already said, PETSc tries to catch all errors and reports
> them to Dune. dune-fem then reports them as a PETSc error, although
> the underlying problem might be a segfault or another system error.
> 
> In your case, Andrea, PETSc reports a segmentation fault:
> PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
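>
> One way to pin down where the crash really originates is to disable
> PETSc's signal handler after initialization, so a segfault stops the
> program at the faulting instruction (e.g. under a debugger) instead
> of being re-reported as a PETSc error. A minimal sketch using plain
> PETSc calls; the run-time option -no_signal_handler has the same
> effect:
>
>   #include <petscsys.h>
>
>   int main( int argc, char **argv )
>   {
>     PetscInitialize( &argc, &argv, NULL, NULL );
>     // Drop PETSc's SIGSEGV/SIGFPE handler; a genuine memory error now
>     // crashes immediately instead of showing up as
>     // "PETSC ERROR: Caught signal number 11 SEGV".
>     PetscPopSignalHandler();
>     /* ... run the failing test code here ... */
>     PetscFinalize();
>     return 0;
>   }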
> 
> First, try to compile this test with CXXFLAGS="-Wall -g" and without
> -DNDEBUG, so that all the asserts are active. Please report the error
> messages from this run.
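>
> The background: assert() is compiled out whenever NDEBUG is defined,
> so with NDEBUG an invalid index passes silently and the crash only
> surfaces later. A tiny illustration:
>
>   #include <cassert>
>   #include <vector>
>
>   // Built without -DNDEBUG, the assert fires right at the bad access;
>   // built with -DNDEBUG, it is compiled out and the bad access shows
>   // up later as a segfault somewhere else entirely.
>   double value( const std::vector< double > &v, std::size_t i )
>   {
>     assert( i < v.size() );
>     return v[ i ];
>   }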
> 
> Could you please also check whether the solver example in the howto
> shows the same behavior?
> 
> 
> Best Tobias
> 
> On 07/30/2014 10:18 PM, Andreas Dedner wrote:
>> Hi Andrea.
> 
>> That test has apparently been removed from dune-fem - at least it
>> is not there in the master branch. So it might be old....
> 
>> I had a short look. It seems to me that if you only compile
>> l2projection, then it does not use PETSc at all. You need
>> PETSCLINEAROPERATOR == 1, which is defined if you compile
>> l2projection_petscmatrix, and you need usepetsc: true in the
>> parameter file.
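>>
>> In other words, the PETSc path sits behind a preprocessor guard,
>> roughly like this (a sketch only; the typedef names are placeholders,
>> the macro name is the one from this thread):
>>
>>   // Only l2projection_petscmatrix defines PETSCLINEAROPERATOR == 1,
>>   // so only that target compiles in the PETSc backend.
>>   #if PETSCLINEAROPERATOR == 1
>>   typedef PetscBackendType   LinearOperatorType;  // placeholder
>>   #else
>>   typedef DefaultBackendType LinearOperatorType;  // placeholder
>>   #endif
>>
>>   // In addition, the parameter file needs the line
>>   //   usepetsc: true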
> 
>> So I don't really see why you are getting a PETSc error with
>> l2projection, since it should not be using PETSc.... But perhaps I
>> missed something. Have you tried the example in the dune-fem-howto?
>> I am sure that one worked with the older version of PETSc.
> 
>> I tested it myself and I also get an error. In my case an assert
>> is triggered. Are you defining NDEBUG? If so, that might explain
>> the problem. I have also noticed that when there is a segmentation
>> fault somewhere, a PETSc error is sometimes produced although the
>> fault has nothing to do with PETSc. So perhaps that is what is
>> happening here.
> 
>> The assert, btw., is in codimindexset.hh, with some invalid index
>> being used; that will need a bit more investigation, I'm afraid
>> (see the sketch below). Could you please also test the
>> example_solver from the dune-fem-howto - I just don't want to fix a
>> test in dune-fem which is not used anymore anyway....
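>>
>> The failing check is of this general shape (illustrative only, not
>> the literal code from codimindexset.hh):
>>
>>   // The index set hands out consecutive indices per codimension; the
>>   // assert catches an entity whose index was never assigned, e.g. a
>>   // ghost/overlap entity missed during grid setup.
>>   assert( ( index >= 0 ) && ( index < size ) );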
> 
>> Best Andreas
> 
> 
>> On 30/07/14 18:50, Sacconi, Andrea wrote:
>>> Hi all,
>>>
>>> I would like to ask you another question about PETSc and its
>>> usage in dune-fem. Recently, I installed the latest release of
>>> this library (3.5.0), and thanks to a patch from Tobias Malkmus,
>>> it now compiles correctly with dune-fem-1.4.0.
>>>
>>> I ran the demo test contained in
>>> dune/fem/petsc/tests/l2projection, and I am sorry to say that it
>>> failed in the parallel case. You can find attached the output of
>>> the parallel run with 4 processors.
>>>
>>> Some additional info:
>>>
>>> gcc 4.8.2
>>> OpenMPI 1.6.5
>>> ParMETIS 4.0.3
>>> PETSc 3.5.0
>>>
>>> I don't know what I'm doing wrong, since I ran the demo tests in
>>> PETSc and they all passed without errors. If you need any further
>>> information, just let me know.
>>>
>>> Cheers, Andrea
>>> __________________________________________________________
>>>
>>> Andrea Sacconi PhD student, Applied Mathematics AMMP Section,
>>> Department of Mathematics, Imperial College London, London SW7
>>> 2AZ, UK a.sacconi11 at imperial.ac.uk

-- 
Tobias Malkmus                 <tomalk at mathematik.uni-freiburg.de>

Mathematisches Institut               Tel: +49 761 203 5627
Abt. für Angewandte Mathematik        Universität Freiburg
Hermann-Herder-Str. 10
79104 Freiburg
