[dune-pdelab] Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR with ALUGrid

Eike Mueller E.Mueller at bath.ac.uk
Mon Jan 30 17:44:35 CET 2012


Hi Rebecca,

You are right: it works if I use the trunk (latest versions checked out 
this morning). I have attached a simple test case for solving a Poisson 
equation on a spherical shell, including the .alu files for the grid.
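
For reference, the solver setup in the test case looks roughly like this 
(only a sketch: GO, GFS, V and the variable names stand in for the types 
in the attached driver, and the exact template parameters and constructor 
signatures may differ between trunk revisions):

  // sketch of the linear solver setup (placeholder type names, trunk API)
  typedef Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR<GO> LS;
  LS ls(gfs, 5000, 2);                      // trial space, max iterations, verbosity
  typedef Dune::PDELab::StationaryLinearProblemSolver<GO,LS,V> SLP;
  SLP slp(go, ls, u, 1e-6);                 // grid operator, backend, solution, reduction
  slp.apply();                              // assemble and solve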

The convergence rates I get for a 6 x 16 x 8 grid on four processes are

=== CGSolver
  Iter          Defect            Rate
     0         0.274785
     1         0.116961         0.425645
     2        0.0176198         0.150647
     3       0.00206652         0.117284
     4      0.000276307         0.133706
     5      5.84861e-05         0.211671
     6      1.08342e-05         0.185244
     7      1.97369e-06         0.182172
=== rate=0.184155, T=0.088986, TIT=0.0127123, IT=7
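
(The printed rate appears to be the geometric-mean defect reduction per 
iteration, i.e. (final defect / initial defect)^(1/IT): here 
(1.97369e-06 / 0.274785)^(1/7) ≈ 0.184155, matching the reported value.)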

and if I globalRefine() the grid once, I get

=== CGSolver
  Iter          Defect            Rate
     0         0.101624
     1        0.0820958         0.807841
     2        0.0390633         0.475826
     3       0.00580812         0.148685
     4       0.00171588         0.295428
     5      0.000456948         0.266306
     6      0.000132406         0.289761
     7      3.85196e-05         0.290922
     8      9.92988e-06         0.257787
     9      2.22157e-06         0.223726
    10      7.17117e-07         0.322797
=== rate=0.305393, T=0.896864, TIT=0.0896864, IT=10
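
For reference, the only difference between the two runs is one uniform 
refinement call before assembling:

  grid.globalRefine(1);  // refine every element once on each process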

Is that comparable with what you get?

Thanks,

Eike



Rebecca Neumann wrote:
> Dear Eike,
> yes, a test setup would be nice. (Do you use the Dune trunk or the 2.1 
> release? I think so far only the ISTL trunk supports NOVLP_AMG on 
> ALUGrid.)
> Solving a simple Poisson problem with ALUGrid and NOVLP_BCGS_AMG_SSOR 
> with Q1 elements works fine.
> But the NOVLP AMG is still very new and not yet well tested, and for 
> ALUGrid the convergence rates are still not as good as we would expect.
> Hopefully I will have time to explore this further in April.
> Best Regards,
> Rebecca
> 
> On 27.01.2012 12:10, Markus Blatt wrote:
>> On Fri, Jan 27, 2012 at 10:17:23AM +0000, Eike Mueller wrote:
>>> Dear dune-/pdelab-list,
>>>
>>> I'm trying to use the non-overlapping ISTL backend
>>> Dune::PDELab::ISTLBackend_NOVLP_CG_AMG_SSOR for solving a problem on
>>> my spherical shell grid, which I implemented with ALUGrid. This
>>> works fine in serial, but if I try to run in parallel I get the
>>> runtime error given below [1]. I have also tried the same backend on
>>> a non-overlapping YaspGrid (just a simple cube), where it works fine
>>> both serially and in parallel. In addition, the ISTL backend
>>> Dune::PDELab::ISTLBackend_NOVLP_CG_SSORk works without problems in
>>> parallel with my ALUGrid implementation.
>>>
>>> Does anyone have any idea what might be going wrong here?
>>
>> This means that some ghost nodes still carry the unaggregated flag even
>> though the information about the aggregation was communicated.
>>
>> That probably means that the communication interface was not set up
>> correctly.
>>
>> Please provide a test case and post a bug in the pdelab bug
>> tracker at http://users.dune-project.org/projects/dune-pdelab/issues
>>
>> Thanks!
>>
>> Markus

-------------- next part --------------
Attachment: ALU_NOVLP_CGAMGSSOR_Testcase.tgz (application/x-compressed-tar, 351149 bytes)
URL: <https://lists.dune-project.org/pipermail/dune-pdelab/attachments/20120130/5d3035a8/attachment.bin>

