[dune-pdelab] Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR with ALUGrid

Eike Mueller E.Mueller at bath.ac.uk
Mon Jan 30 17:21:00 CET 2012


Hi Christian,

thanks, I will try to do that. For my scaling tests so far, I have only 
used the overlapping solvers with YaspGrid (both with FV and FEM).
The spherical grid, for which I need to use ALUGrid, is non-orthogonal 
(the angle between grid lines can be as small as 60 degrees), so using 
only the face neighbours with FV would lead to large discretisation 
errors, which is why I'd rather use FEM, where I don't have this problem.

Markus & Olaf, which solver/discretisation did you use for the extreme 
scaling tests on JuGene?

Eike

Christian Engwer wrote:
> Hi Eike,
> 
>> that's very interesting. So far I've only been using the overlapping
>> solvers on YaspGrid, but I need to move to ALUGrid for my spherical
>> grid, and as far as I know I can only use the non-overlapping
>> solvers there as ALUGrid does not have any overlap.
> 
> In principle yes, but it depends a little bit on your
> discretization. If you are using finite-volume for example, you only
> have couplings to the face-neighbors and these are all available,
> although ALUGrid doesn't provide a full overlap. In this case you
> should be able to use an overlapping solver.
> 
>> I'll try to set up a simple test case that reproduces the problem.
>> I'm using version 2.1 of Dune, so that might explain why it's not
>> working for me at the moment.
>>
>> Eike
>>
>> Rebecca Neumann wrote:
>>> Dear Eike,
>>> yes, a test setup would be nice. (Do you use the dune trunk or the
>>> 2.1 version? I think until now only the istl trunk supports
>>> NOVLP_AMG on ALUGrid.)
>>> Solving a simple poisson problem with ALUGrid and
>>> NOVLP_BCGS_AMG_SSOR with Q1 works fine.
>>> But the NOVLP AMG is still very new and not really well tested,
>>> and for ALUGrid the convergence rates are still not as good as we
>>> would expect.
>>> Hopefully I will have time to explore this further in April.
>>> Best Regards,
>>> Rebecca
>>>
>>> On 27.01.2012 12:10, Markus Blatt wrote:
>>>> On Fri, Jan 27, 2012 at 10:17:23AM +0000, Eike Mueller wrote:
>>>>> Dear dune-/pdelab-list,
>>>>>
>>>>> I'm trying to use the non-overlapping ISTL backend
>>>>> Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR for solving a problem on
>>>>> my spherical shell grid, which I implemented with ALUGrid. This
>>>>> works fine in serial, but if I try to run in parallel I get the
>>>>> runtime error given below [1]. I have also tried the same backend on
>>>>> a non-overlapping YaspGrid (just a simple cube), where it works fine
>>>>> both serially and in parallel. In addition, the ISTL backend
>>>>> Dune::PDELab::ISTLBackend_NOVLP_CG_SSORk works without problems in
>>>>> parallel with my ALUGrid implementation.
>>>>>
>>>>> Does anyone have any idea what might be going wrong here?
>>>> This means that some ghost nodes still carry the unaggregated flag even
>>>> though the information about the aggregation was communicated.
>>>>
>>>> This probably means that the communication interface was not set up
>>>> correctly.
>>>>
>>>> Please provide a test case and post a bug in the pdelab bug
>>>> tracker at http://users.dune-project.org/projects/dune-pdelab/issues
>>>>
>>>> Thanks!
>>>>
>>>> Markus
>>> _______________________________________________
>>> dune-pdelab mailing list
>>> dune-pdelab at dune-project.org
>>> http://lists.dune-project.org/mailman/listinfo/dune-pdelab
>>>
>>
>>
> 




