[dune-pdelab] P2 in parallel - small correction

Jö Fahlke jorrit at jorrit.de
Thu Dec 15 16:39:38 CET 2011


On Thu, 15 Dec 2011, 13:11:35 +0100, Volker Schauer wrote:
> On Thursday, 15.12.2011, at 10:27 +0100, Volker Schauer wrote:
> > Maybe I should reformulate my question and give some additional
> > information about the way I am setting up the problem.
> > 
> > I use the P2 elements in a nonoverlapping setting, provided by ALUGrid.
> > In principle I use the solver
> > 
> > Dune::PDELab::ISTLBackend_NOVLP_CG_SSORk<PGOS>
> > 
> > to solve the problem, where I find that the solver reports convergence,
> > but the residual I test is not zero.
> > 
> > As input to the solver, I give a "consistent vector" which has the same
> > value on processor boundary nodes on all processes but arbitrary values
> 
> The ghost dofs do not actually have arbitrary values; they are set to
> zero by
> 
> Dune::PDELab::set_constrained_dofs(cc,0.0,rhs_poiss);
> 
> where the constraints container cc also contains the ghost nodes.

I'm assuming you're using PDELab's GridOperator(Space) to assemble the matrix
(or compute the Jacobian on the fly).  Did you pass the template parameter
nonoverlapping_mode=true?
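
Roughly, I have something like the untested sketch below in mind.  All
type names (GFS, LOP, MBE, CC) are placeholders for your own types, and
the exact template signature depends on your PDELab version, so please
check it against your headers -- the only point here is the trailing
nonoverlapping_mode flag:

  // Untested sketch, placeholder names only.
  typedef Dune::PDELab::GridOperator<
    GFS, GFS, LOP,                // trial space, test space, local operator
    MBE, double, double, double,  // matrix backend and field types
    CC, CC,                       // constraints containers (incl. ghost dofs)
    true                          // nonoverlapping_mode = true
    > GO;

  // Clear the constrained/ghost dofs in the right-hand side, as you do:
  Dune::PDELab::set_constrained_dofs(cc, 0.0, rhs_poiss);

  // Nonoverlapping CG with SSOR(k) preconditioner, as in your mail
  // (constructor arguments omitted, they differ between versions):
  Dune::PDELab::ISTLBackend_NOVLP_CG_SSORk<PGOS> ls(/* gfs, maxiter, ... */);

If that flag is missing, the assembler handles the process borders as in
the overlapping case, which could explain your symptoms: the solver
reports convergence while the residual computed by hand does not vanish.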

Bye,
Jö.

> 
> 
> Sorry for cluttering your mailbox needlessly
> 
> Volker
> 
> > on the ghost nodes (is that correct, or should these values also be the
> > same as their interior counterparts?) and I also expect to get such a
> > consistent vector in the solution.
> > 
> > So my question is whether the above assumptions are correct, or whether
> > there might be a reason in PDELab why the P2 elements don't work, e.g.
> > due to missing/incorrect communication.
> > 
> > I hope this explains my problem sufficiently
> > 
> > Thanks
> > 
> > Volker
> > 
> > On Wednesday, 14.12.2011, at 12:22 +0100, schauer wrote:
> > > Dear dune-pdelab,
> > > 
> > > does the parallel implementation of 3-dimensional P2 elements work fine
> > > in PDELab?
> > > 
> > > I am solving the 3D Poisson problem in parallel using ALUGrid. With P1
> > > elements in parallel everything works fine and gives me the same
> > > solution as on a single process. When switching to P2 (where I only
> > > change the grid function space) the solver still tells me that the
> > > residual is zero, but when I actually calculate it by hand it is quite
> > > large. This is often the case when something goes wrong on the
> > > boundary/ghost nodes.
> > > 
> > > Does anybody have experience with 3D P2 elements?
> > > 
> > > Thanks for your answers
> > > 
> > > Volker
> > > 
> > > 
> > 
> 
> -- 
> 
> _/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/_/
> 
> Volker Schauer (Dipl.Phys)
> 
> Institute of Applied Mechanics (Civil Engineering), Chair 1
> University of Stuttgart
> Pfaffenwaldring 7
> D-70569 Stuttgart
> GERMANY
> 
> Phone ++49 (0)711-68560044
> email: schauer at mechbau.uni-stuttgart.de
> 
> 
> 

-- 
Jorrit (Jö) Fahlke, Interdisciplinary Center for Scientific Computing,
Heidelberg University, Im Neuenheimer Feld 368, D-69120 Heidelberg
Tel: +49 6221 54 8890 Fax: +49 6221 54 8884

Of all the things I've lost, I miss my mind the most.
-- Ozzy Osbourne