[dune-fem] Petsc test fails in parallel case

Andreas Dedner a.s.dedner at warwick.ac.uk
Thu Jul 31 10:08:34 CEST 2014


So it is not a missing index in the codimindexset as I had in the release?
Andreas

On 31/07/14 10:02, Tobias Malkmus wrote:
> Hi Andreas
>
> I tested it with YaspGrid<2> and the test fails due to some
> communication errors within Yasp. Since the problem also exists for the
> l2projection_istl test, I assume this is a bug that is not related
> to Petsc.
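>
> (For reference, a minimal sketch of what that grid setup amounts to; the
> typedef below is only my assumption about how the test selects its grid,
> not the actual test code:
>
>    #include <dune/grid/yaspgrid.hh>
>
>    // structured, parallel 2d grid on which the communication error shows up
>    typedef Dune::YaspGrid< 2 > GridType;
>
> so nothing Petsc-specific is involved in setting up the grid.)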
>
> Best Tobias
>
> On 07/31/2014 10:35 AM, Dedner, Andreas wrote:
> > Hi Tobias.
> > Thanks for taking a look. Could you repeat the test with yasp<2>,
> > because that was the default in the test within the release version?
> > Thanks
> > Andreas
>
>
> > Sent from Samsung Mobile
>
>
> > -------- Original message --------
> > From: Tobias Malkmus
> > Date: 31/07/2014 09:14 (GMT+00:00)
> > To: dune-fem at dune-project.org
> > Subject: Re: [dune-fem] Petsc test fails in parallel case
>
> > Hi Andrea and Andreas
>
> > I tested the l2projection test for petsc (now in
> > dune/fem/test/l2projection_petsc) and the solver example within the
> > howto with the newest petsc version (3.5).
>
> > I cannot see any problems in parallel/serial using ALUGRID_SIMPLEX
> > as the grid type and OpenMPI 1.8.1 in the master branch.
>
> > For the release branch the 'old' test fails at another place. I will
> > look into it. The solver example still works fine.
>
> > As Andreas already said, petsc tries to catch all errors and reports
> > them to dune. Dune (fem) then reports them as a Petsc error, although
> > the underlying problem might be a segfault or another system error.
>
> > In your case, Andrea, Petsc reports a segmentation fault:
> > PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> > probably memory access out of range
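>
> > (Background on why a segfault shows up like this: PetscInitialize
> > installs PETSc's own signal handler, which catches the SEGV and prints
> > it as a "PETSC ERROR". If you want the signal to reach the OS or a
> > debugger directly, you can remove that handler again; a minimal sketch,
> > not something the test currently does:
> >
> >    #include <petscsys.h>
> >
> >    // call once after PETSc has been initialised (dune-fem does that
> >    // internally), so a genuine SEGV is no longer rewrapped by PETSc
> >    static void disablePetscSignalHandler()
> >    {
> >      PetscPopSignalHandler();
> >    }
> >
> > Passing -no_signal_handler on the command line has the same effect.)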
>
> > First, try to compile this test with CXXFLAGS="-Wall -g" and make sure
> > -DNDEBUG is not set, so that all the asserts are enabled. Please report
> > the error messages from this test.
>
> > Could you please further check whether the solver example in the howto
> > shows the same behaviour?
>
>
> > Best Tobias
>
> > On 07/30/2014 10:18 PM, Andreas Dedner wrote:
> >> Hi Andrea.
>
> >> That test has apparently been removed from dune-fem - at least it
> >> is not there in the master branch. So it might be old....
>
> >> I had a short look. It seems to me that if you only compile
> >> l2projection, then it does not use petsc. You need
> >> PETSCLINEAROPERATOR == 1, and that is only defined if you compile
> >> l2projection_petscmatrix and have usepetsc: true in the parameter
> >> file.
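>
> >> (Roughly, the switch looks like this; the macro and the parameter key
> >> are the ones named above, everything else in the snippet is only an
> >> assumed sketch, not the actual test source:
> >>
> >>    // compile-time switch: only the l2projection_petscmatrix target is
> >>    // built with PETSCLINEAROPERATOR defined to 1
> >>    #if defined(PETSCLINEAROPERATOR) && PETSCLINEAROPERATOR == 1
> >>    static const bool builtWithPetscOperator = true;   // PETSc matrix path
> >>    #else
> >>    static const bool builtWithPetscOperator = false;  // default path
> >>    #endif
> >>
> >> and on top of that the parameter file needs the run-time setting
> >> "usepetsc: true", otherwise the PETSc path is not taken even in the
> >> _petscmatrix build.)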
>
> >> So I don't really see why you are getting a petsc error with
> >> l2projection, since it should not be using petsc.... But perhaps I
> >> missed something. Have you tried the example in the dune-fem-howto?
> >> I am sure that worked with the older version of petsc.
>
> >> I tested it myself and I also get an error. In my case an assert
> >> is triggered. Are you defining NDEBUG? If so, that might explain the
> >> problem: I have noticed that when there is a segmentation fault
> >> somewhere, a petsc error is sometimes produced although it has nothing
> >> to do with petsc. So perhaps that is what is happening here.
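>
> >> (The NDEBUG connection in short: assert() compiles to nothing when
> >> NDEBUG is defined, so a broken index sails past the check and only
> >> shows up later as a segfault, which PETSc then reports. A tiny
> >> illustration, with a made-up index check standing in for the real one:
> >>
> >>    #include <cassert>
> >>    #include <vector>
> >>
> >>    int main ()
> >>    {
> >>      std::vector< int > data( 10, 0 );
> >>      const int i = 42;                            // invalid index
> >>      // fires in a debug build, compiled away with -DNDEBUG
> >>      assert( i >= 0 && i < int( data.size() ) );
> >>      // with NDEBUG defined this becomes the out-of-range access
> >>      return data[ i ];
> >>    }
> >>
> >> So with NDEBUG set you only ever see the later crash.)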
>
> >> The assert, btw, is in codimindexset.hh, with some invalid index
> >> being used. That is going to need a bit more investigation, I'm
> >> afraid. Could you please also test the example_solver from the
> >> dune-fem-howto - I just don't want to fix a test in dune-fem
> >> which is not used anymore anyway....
>
> >> Best Andreas
>
>
> >> On 30/07/14 18:50, Sacconi, Andrea wrote:
> >>> Hi all,
> >>>
> >>> I would like to ask you another question about Petsc and its
> >>> usage in dune-fem. Recently, I installed the latest release of
> >>> this library (3.5.0), and thanks to a patch from Tobias Malkmus,
> >>> it now compiles correctly with dune-fem-1.4.0.
> >>>
> >>> I ran the demo test contained in
> >>> dune/fem/petsc/tests/l2projection, and I am sorry to say that it
> >>> failed in the parallel case. You can find attached the output of
> >>> the parallel run with 4 processors.
> >>>
> >>> Some additional info:
> >>>
> >>> - gcc 4.8.2
> >>> - OpenMPI 1.6.5
> >>> - ParMETIS 4.0.3
> >>> - Petsc 3.5.0
> >>>
> >>> I don't know what I'm doing wrong, since I ran the demo tests in
> >>> Petsc and they all passed without errors. If you need any further
> >>> information, just let me know.
> >>>
> >>> Cheers, Andrea
> >>> __________________________________________________________
> >>>
> >>> Andrea Sacconi PhD student, Applied Mathematics AMMP Section,
> >>> Department of Mathematics, Imperial College London, London SW7
> >>> 2AZ, UK a.sacconi11 at imperial.ac.uk
> >>>
> >>>
>
>
>
>
>
>
>
>
>
>




