[dune-pdelab] P2 in parallel
Bernd Flemisch
bernd at iws.uni-stuttgart.de
Mon Jan 9 18:02:44 CET 2012
OK, actually it is quite obvious. The backends derived from
ISTLBackend_NOVLP_BASE_PREC employ parallel preconditioners which use
the class VertexExchanger to sum up the matrix entries corresponding to
border vertices (I am partly responsible for this). Edge (and face)
degrees of freedom are not treated there, and that seems to mess up the
result.
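For the curious: the exchanger is essentially a grid data handle. A
strongly simplified sketch of the idea (not the actual pdelab code; it
only sums the diagonal block and assumes matrix row index == leaf
vertex index, as for P1):

  #include <dune/grid/common/datahandleif.hh>

  template<class GV, class M>
  class VertexDiagExchanger
    : public Dune::CommDataHandleIF<VertexDiagExchanger<GV,M>,
                                    typename M::block_type>
  {
  public:
    VertexDiagExchanger (const GV& gv, M& m) : gv_(gv), m_(m) {}

    // here lies the problem: only the vertex codimension is reported,
    // so data on edges and faces is never communicated
    bool contains (int dim, int codim) const { return codim == dim; }
    bool fixedsize (int dim, int codim) const { return true; }

    template<class Entity>
    std::size_t size (const Entity& e) const { return 1; }

    template<class Buffer, class Entity>
    void gather (Buffer& buf, const Entity& e) const
    {
      int i = gv_.indexSet().index(e);
      buf.write(m_[i][i]);         // send my diagonal block
    }

    template<class Buffer, class Entity>
    void scatter (Buffer& buf, const Entity& e, std::size_t n)
    {
      typename M::block_type b;
      buf.read(b);
      int i = gv_.indexSet().index(e);
      m_[i][i] += b;               // sum the neighbor's contribution
    }

  private:
    const GV& gv_;
    M& m_;
  };

  // VertexDiagExchanger<GV,M> dh(gv, m);
  // gv.communicate(dh, Dune::InteriorBorder_InteriorBorder_Interface,
  //                Dune::ForwardCommunication);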
So the immediate solution is to use non-preconditioned parallel solvers.
Of course this is only feasible as long as your systems don't get too big.
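For a typical setup this just means swapping the backend type, e.g.
(untested sketch; GFS/gfs denote your grid function space):

  // unpreconditioned nonoverlapping CG instead of NOVLP_BCGS_SSORk
  typedef Dune::PDELab::ISTLBackend_NOVLP_CG_NOPREC<GFS> LS;
  LS ls(gfs, 5000, 2);   // max iterations, verbosity
  // then pass ls to the StationaryLinearProblemSolver / Newton as before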
I will have a look at extending the VertexExchanger to also deal with
edges and faces.
Kind regards
Bernd
On 01/09/2012 05:24 PM, Bernd Flemisch wrote:
> Hey Volker,
>
> the choice of the solver backend has a dramatic impact. If I choose
> ISTLBackend_NOVLP_CG_NOPREC instead of ISTLBackend_NOVLP_BCGS_SSORk,
> the solution looks correct. I will dig a bit deeper and try to find
> the reason.
>
> Kind regards
> Bernd
>
> On 12/23/2011 11:07 AM, Volker Schauer wrote:
>> Thanks Bernd,
>>
>> this helps in the sense that the problem probably lies in pdelab and
>> not in wrong assumptions on my side.
>>
>> Then have a good Christmas time
>>
>> Volker
>>
>> -------- Forwarded Message --------
>> From: Bernd Flemisch<bernd at iws.uni-stuttgart.de>
>> To: dune-pdelab at dune-project.org
>> Subject: Re: [dune-pdelab] P2 in parallel
>> Date: Thu, 22 Dec 2011 17:06:00 +0100
>>
>> Hey Volker, PDELab,
>>
>> sorry, I don't have time to look into your code at the moment. But I had
>> a look at
>> dune-pdelab-howto/src/convection-diffusion/nonoverlappingsinglephaseflow.cc
>>
>> and adapted it to P2. I think it runs into the same problems as you do
>> with your code. A sequential run works fine, while a parallel run on two
>> processes produces completely wrong results. And the problem seems to
>> come from the edge degrees of freedom.
>>
>> Maybe a PDELab developer has a chance to look at it; a patch against
>> r491 is attached. I can also have a closer look in the first week of
>> January.
>>
>> Kind regards
>> Bernd
>>
>>
>> On 12/22/2011 02:40 PM, Volker Schauer wrote:
>>> OK, although I don't want to bother anybody, and even though
>>> Christmas is coming soon, I will ask now anyway.
>>> Are some basic assumptions wrong in the code I sent to the mailing
>>> list last Friday? Does it give vanishing residuals for you as well,
>>> or do you just agree that there might be a problem with the P2
>>> elements in parallel?
>>> Since others also seem to use pdelab in parallel and don't face this
>>> problem, it might also just be a wrong application of the methods,
>>> but where?
>>> So please give at least some comments, because, given the maddeningly
>>> deterministic behaviour of these modern computers, I feel I will come
>>> back next year and the problem won't have gone.
>>>
>>> Many thanks
>>>
>>> Volker
>>>
>>>
>>>
>>> -------- Forwarded Message --------
>>> From: Volker Schauer<schauer at mechbau.uni-stuttgart.de>
>>> To: dune-pdelab at dune-project.org
>>> Subject: Re: [dune-pdelab] P2 in parallel - small correction
>>> Date: Fri, 16 Dec 2011 15:22:46 +0100
>>>
>>> So here is a test case for the Poisson problem. The solver
>>> definition consists of the ingredients of the class
>>>
>>> Dune::PDELab::ISTLBackend_NOVLP_CG_SSORk<PGOS>.
>>>
>>> I extracted these since I use the Poisson problem repeatedly and the
>>> preconditioner modifies the matrix, so that the matrix cannot be
>>> reused in that case (a fact that cost me a lot of time to find out!).
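>>>
>>> Schematically, the extracted pieces look like this (sketch from
>>> memory, signatures not double-checked; GFS, M, V are my grid function
>>> space, matrix and vector types, m the assembled matrix, helper the
>>> parallel ISTL helper; Richardson is just a stand-in for the parallel
>>> SSOR preconditioner):
>>>
>>>   Dune::PDELab::NonoverlappingOperator<GFS,M,V,V> pop(gfs, m);
>>>   Dune::PDELab::NonoverlappingScalarProduct<GFS,V> novlpSP(gfs, helper);
>>>   Dune::Richardson<V,V> prec(1.0);
>>>   Dune::CGSolver<V> solver(pop, novlpSP, prec, 1e-10, 5000, 1);
>>>   // the real preconditioner modifies the matrix, so keep a copy:
>>>   // M mcopy(m); and hand mcopy to the preconditioner instead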
>>>
>>> If one now changes the P_order to 1, everything works fine on one as
>>> well as on two processes; for P_order = 2 on two processes, however,
>>> the residual does not vanish any more.
>>>
>>>
>>> @ Bernd: you were probably right, but when I read your mail I had
>>> almost completed the setup. Thanks, Bernd.
>>>
>>>
>>> Volker
>>>
>>>
>>> -------- Forwarded Message --------
>>> From: Bernd Flemisch<bernd at iws.uni-stuttgart.de>
>>> Cc: dune-pdelab<dune-pdelab at dune-project.org>
>>> Subject: Re: [dune-pdelab] P2 in parallel - small correction
>>> Date: Fri, 16 Dec 2011 13:05:53 +0100
>>>
>>> Hey Volker,
>>>
>>> would it not be easier to adapt the nonoverlappingsinglephaseflow
>>> example from the pdelab howto to get a test case? There
>>> (src/convection-diffusion/nonoverlappingsinglephaseflow.cc), I think
>>> you
>>> only have to set k equal to 2 in the Pk3DLocalFiniteElementMap in lines
>>> 314 or 376.
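>>>
>>> From memory, the change would look roughly like this (untested; GV,
>>> DF, RF are the grid view and field typedefs used in that file):
>>>
>>>   typedef Dune::PDELab::Pk3DLocalFiniteElementMap<GV,DF,RF,2> FEM;
>>>   FEM fem(gv);   // k=2 as the last template parameter instead of 1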
>>>
>>> Kind regards
>>> Bernd
>>>
>>> On 12/16/2011 12:33 PM, Volker Schauer wrote:
>>>> Hi Markus
>>>>
>>>> The norm of the residual (calculated with novlpSP.dot), which on a
>>>> single process goes down to values less than 10^-10, now gives me a
>>>> value of the order of >= 36000, and changing the tolerance of the
>>>> solver basically does not affect this.
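>>>>
>>>> For reference, the norm is computed roughly like this (schematic;
>>>> gos is my grid operator space and u the current solution, so the
>>>> exact residual call may differ in your setup):
>>>>
>>>>   V r(gfs, 0.0);                  // residual vector
>>>>   gos.residual(u, r);             // assemble r(u)
>>>>   double nrm = std::sqrt(novlpSP.dot(r, r));  // consistent parallel
>>>>                                               // norm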
>>>>
>>>> I did not look at the residual in greater detail, so maybe I should
>>>> check whether it is large only on boundary nodes, for example? But
>>>> how would I check that? I do not really know/understand how the
>>>> indices in grid vectors like the residual are assigned to the
>>>> entities of the grid, with that strange offset in Dune, so I do not
>>>> know which of my indices belongs to a boundary node. (If somebody
>>>> could give a "short" explanation here, I think this would be
>>>> valuable on its own, also for other users.)
>>>> I can iterate over the entities of e.g. codim 3 and check that an
>>>> entity's partition type is border, but how do I then find which
>>>> index in the grid vector is assigned to that entity?
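>>>>
>>>> The part I can write down is only this (untested sketch, for a 3D
>>>> grid view gv, so vertices have codim = dim = 3):
>>>>
>>>>   const int dim = GV::dimension;
>>>>   typedef typename GV::template Codim<dim>::Iterator VIt;
>>>>   for (VIt it = gv.template begin<dim>();
>>>>        it != gv.template end<dim>(); ++it)
>>>>     if (it->partitionType() == Dune::BorderEntity)
>>>>     {
>>>>       int i = gv.indexSet().index(*it);  // leaf index of the vertex,
>>>>       // but which residual entry belongs to it is exactly my question
>>>>     }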
>>>>
>>>>
>>>> So I will try to extract the Poisson problem and send it to the
>>>> mailing
>>>> list.
>>>>
>>>> Thanks for your help (all pdelab members) - I really appreciate that
>>>>
>>>> Volker
>>>>
>>>>
>>>>
>>>> -------- Forwarded Message --------
>>>> From: Markus Blatt<markus at dr-blatt.de>
>>>> Reply-to: Markus Blatt<markus at dr-blatt.de>
>>>> To: Volker Schauer<schauer at mechbau.uni-stuttgart.de>
>>>> Cc: dune-pdelab<dune-pdelab at dune-project.org>
>>>> Subject: Re: [dune-pdelab] P2 in parallel - small correction
>>>> Date: Fri, 16 Dec 2011 11:46:27 +0100
>>>>
>>>> Hi,
>>>>
On Fri, Dec 16, 2011 at 11:34:36AM +0100, Volker Schauer wrote:
>>>>> It would be helpful if someone could confirm that they have already
>>>>> used the communication from PDELab to set up the Poisson or any
>>>>> other PDE successfully with P2 elements (I suppose this was tested
>>>>> in PDELab when the parallel use was implemented?). Would it help
>>>>> you or anybody else if I extract the Poisson problem (which is
>>>>> embedded in a larger code doing some more stuff) and send it to the
>>>>> mailing list?
>>>>>
>>>> Yes, that would help in two ways: giving us an easy test case and
>>>> checking whether there is a mistake in the usage of PDELab.
>>>>
>>>> You were saying that your residual is not zero. What do you mean by
>>>> zero? Our solvers only achieve a relative residual reduction, i.e.
>>>> they stop once ||r_k|| <= eps * ||r_0|| rather than when the
>>>> residual is small in absolute terms, so this might happen if your
>>>> initial residual is large.
>>>>
>>>> Have you looked at the residual? If so, where does it deviate from
>>>> the expected values?
>>>>
>>>> Cheers,
>>>>
>>>> Markus
>>>>
>>>>
>>>
>>
>
>
--
_____________________________________________________________________
Bernd Flemisch phone: +49 711 685 69162
IWS, Universität Stuttgart fax: +49 711 685 60430
Pfaffenwaldring 61 email: bernd at iws.uni-stuttgart.de
D-70569 Stuttgart url: www.hydrosys.uni-stuttgart.de
_____________________________________________________________________