[dune-pdelab] Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR with ALUGrid

Eike Mueller E.Mueller at bath.ac.uk
Thu Mar 1 14:47:56 CET 2012


Hi Christian and Rebecca,

I've now tried using the FV discretisation with ALUGrid as you suggested, 
and it works, at least as long as I use the parallel overlapping 
CG_AMG_SSOR ISTL backend. In this case I don't have to make any changes 
to the code, not even replace OverlappingConformingDirichletConstraints 
with P0GhostConstraints. As a test case, I solved a PDE with known 
solution in a unit cube, implemented it with ALUGrid, and compared the 
results to the same implementation with YaspGrid. The convergence rate 
is better with YaspGrid, but this might also be down to the fact that 
with YaspGrid I partition the horizontal direction into 2x2 = 4 
identical squares, whereas with ALUGrid the horizontal partitioning is 
done by METIS and the four partitions have different shapes.
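
In case it is useful to anybody, the core of the setup looks roughly 
like this. It is a sketch written from memory against the Dune 2.1 API; 
the backend class name in particular is from memory, so please check it 
against ovlpistlsolverbackend.hh in your version:

  // leaf grid view of the ALUGrid object 'grid'
  typedef Grid::LeafGridView GV;
  GV gv = grid.leafView();

  // P0 elements = one unknown per cell, i.e. the FV discretisation
  typedef Dune::PDELab::P0LocalFiniteElementMap<double,double,3> FEM;
  FEM fem(Dune::GeometryType(Dune::GeometryType::cube,3));

  // I could keep the overlapping constraints class unchanged
  typedef Dune::PDELab::OverlappingConformingDirichletConstraints CON;
  typedef Dune::PDELab::ISTLVectorBackend<1> VBE;
  typedef Dune::PDELab::GridFunctionSpace<GV,FEM,CON,VBE> GFS;
  GFS gfs(gv,fem);

  // the parallel overlapping CG + AMG(SSOR) backend mentioned above
  // (exact name and constructor arguments cited from memory)
  typedef Dune::PDELab::ISTLBackend_CG_AMG_SSOR<GFS> LS;
  LS ls(gfs,5000,1);  // max iterations, verbosity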

It does not work with the overlapping ISTL backend for CG 
preconditioned with SSOR only, but I think that is because I cannot 
use the FV discretisation there (it does not work with YaspGrid 
either, but there I can use FEM instead).

Rebecca, which solver did you use? Is it possible that you only have to 
use P0GhostConstraints with one of the overlapping preconditioners, 
but not with AMG?

For us, AMG has turned out to be the most robust preconditioner 
anyway (possibly with ILU0 as the smoother), so I think we now have 
everything we need to run on the sphere.
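
In case anyone wants to try the ILU0 variant: my understanding is that 
one can also build the AMG by hand at the ISTL level and plug in a 
different smoother type. The following is a rough sequential sketch 
with standard dune-istl classes (the parallel version additionally 
needs the parallel operator, scalar product and communication object), 
so treat it as an illustration rather than tested code:

  #include <dune/istl/bcrsmatrix.hh>
  #include <dune/istl/operators.hh>
  #include <dune/istl/preconditioners.hh>
  #include <dune/istl/solvers.hh>
  #include <dune/istl/paamg/amg.hh>

  typedef Dune::BCRSMatrix<Dune::FieldMatrix<double,1,1> > Matrix;
  typedef Dune::BlockVector<Dune::FieldVector<double,1> > Vector;
  typedef Dune::MatrixAdapter<Matrix,Vector,Vector> Operator;
  typedef Dune::SeqILU0<Matrix,Vector,Vector> Smoother; // ILU0, not SSOR
  typedef Dune::Amg::AMG<Operator,Vector,Smoother> AMG;

  // A, x, b: assembled matrix, solution and right-hand side vectors
  Operator op(A);
  Dune::Amg::SmootherTraits<Smoother>::Arguments smootherArgs;
  smootherArgs.iterations = 1;
  typedef Dune::Amg::CoarsenCriterion<
    Dune::Amg::SymmetricCriterion<Matrix,Dune::Amg::FirstDiagonal> >
    Criterion;
  Criterion criterion(15,2000);       // max. levels, coarsen target
  AMG amg(op,criterion,smootherArgs); // AMG with ILU0 smoothing

  Dune::CGSolver<Vector> cg(op,amg,1e-8,5000,1);
  Dune::InverseOperatorResult res;
  cg.apply(x,b,res);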

Thanks again for your help,

Eike

Christian Engwer wrote:
> Hi Eike,
> 
>> that's very interesting. So far I've only been using the overlapping
>> solvers on YaspGrid, but I need to move to ALUGrid for my spherical
>> grid, and as far as I know I can only use the non-overlapping
>> solvers there as ALUGrid does not have any overlap.
> 
> in principle yes, but it depends a little bit on your
> discretization. If you are using finite volumes, for example, you
> only have couplings to the face-neighbors, and these are all
> available even though ALUGrid doesn't provide a full overlap. In
> this case you should be able to use an overlapping solver.
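> 
> To make this concrete, the shape of a cell-centered FV local
> operator in PDELab is roughly the following -- the only off-element
> term is alpha_skeleton, i.e. couplings across element faces (just a
> structural sketch, not a complete operator):
> 
>   class FVDiffusion
>     : public Dune::PDELab::FullSkeletonPattern,
>       public Dune::PDELab::FullVolumePattern,
>       public Dune::PDELab::LocalOperatorDefaultFlags
>   {
>   public:
>     enum { doPatternSkeleton = true };
>     enum { doAlphaSkeleton = true };  // face-neighbor coupling only
> 
>     template<typename IG, typename LFSU, typename X,
>              typename LFSV, typename R>
>     void alpha_skeleton (const IG& ig,
>                          const LFSU& lfsu_s, const X& x_s,
>                          const LFSV& lfsv_s,
>                          const LFSU& lfsu_n, const X& x_n,
>                          const LFSV& lfsv_n,
>                          R& r_s, R& r_n) const
>     {
>       // two-point flux between the two cell averages goes here
>     }
>   };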
> 
>> I'll try to set up a simple test case that reproduces the problem.
>> I'm using version 2.1 of Dune, so that might explain why it's not
>> working for me at the moment.
>>
>> Eike
>>
>> Rebecca Neumann wrote:
>>> Dear Eike,
>>> yes, a test setup would be nice. (Do you use the dune trunk or the
>>> 2.1 version? I think so far only the istl trunk supports
>>> NOVLP_AMG on ALUGrid.)
>>> Solving a simple Poisson problem with ALUGrid and
>>> NOVLP_BCGS_AMG_SSOR with Q1 works fine.
>>> But the NOVLP AMG is still very new and not really well tested,
>>> and for ALUGrid the convergence rates are still not as good as we
>>> would expect.
>>> Hopefully I will have time to explore this further in April.
>>> Best Regards,
>>> Rebecca
>>>
>>> On 27.01.2012 12:10, Markus Blatt wrote:
>>>> On Fri, Jan 27, 2012 at 10:17:23AM +0000, Eike Mueller wrote:
>>>>> Dear dune-/pdelab-list,
>>>>>
>>>>> I'm trying to use the non-overlapping ISTL backend
>>>>> Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR for solving a problem on
>>>>> my spherical shell grid, which I implemented with ALUGrid. This
>>>>> works fine in serial, but if I try to run in parallel I get the
>>>>> runtime error given below [1]. I have also tried the same backend on
>>>>> a non-overlapping YaspGrid (just a simple cube), where it works fine
>>>>> both serially and in parallel. In addition, the ISTL backend
>>>>> Dune::PDELab::ISTLBackend_NOVLP_CG_SSORk works without problems in
>>>>> parallel with my ALUGrid implementation.
>>>>>
>>>>> Does anyone have any idea what might be going wrong here?
>>>> This means that some ghost nodes still carry the unaggregated flag even
>>>> though the information about the aggregation was communicated.
>>>>
>>>> Which probably means that the communication interface was not set up
>>>> correctly.
>>>>
>>>> Please provide a test case and post a bug in the pdelab bug
>>>> tracker at http://users.dune-project.org/projects/dune-pdelab/issues
>>>>
>>>> Thanks!
>>>>
>>>> Markus