[Dune] dune parallel matrix format, example

Feng Xing snakexf at gmail.com
Sat Jul 23 08:50:16 CEST 2016


Hello,

I found the problem: I had declared the communicator as nonoverlapping. Thank you!

Best regards,
Feng Xing
Postdoc INRIA France


> On 22 Jul 2016, at 16:46, Feng Xing <snakexf at gmail.com> wrote:
> 
> Dear Markus,
> 
> Thank you very much for the help; it really helped me make progress towards a small test of my own.
> However, I would like to ask for a little more help with one remaining difficulty. Thank you very much.
> 
> I am considering a 1D Poisson problem with 2 MPI processes, each owning 4 nodes (for example, nodes 0,1,2,3 on rank 0, and so on).
> I follow the example "parallelamgtest.cc" in dune/istl, which solves a 2D Poisson problem with the parallel AMG method.
> However, I get 'Dune::ISTLError' errors.
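[Editorial note, not in the original mail: the tridiagonal rows printed below follow from the standard centred second-order finite-difference stencil for -u'' = f, assuming a uniform mesh with the h^2 factor scaled out.]

```latex
% Centred second-order stencil for -u''(x) = f(x) on a uniform mesh of size h:
%   \frac{-u_{i-1} + 2u_i - u_{i+1}}{h^2} = f_i
% Multiplying through by h^2 gives the unscaled row pattern (-1, 2, -1)
% seen in the local matrices printed below.
\[
  -u_{i-1} + 2\,u_i - u_{i+1} = h^2 f_i
\]
```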
> 
> The matrix and the index set seem correct, and the sparsity pattern is symmetric.
> I think the error perhaps comes from "AMG amg(op, criterion, smootherArgs, comm);"
> I have also copied my AMG configuration into this mail.
> 
> Best regards,
> Feng Xing
> Postdoc INRIA France 
> 
> MPI Proc 0:
> 	the matrix is 
> row    0   2.00e+00  -1.00e+00          .          .          .
> row    1  -1.00e+00   2.00e+00  -1.00e+00          .          .
> row    2          .  -1.00e+00   2.00e+00  -1.00e+00          .
> row    3          .          .  -1.00e+00   2.00e+00  -1.00e+00
> row    4          .          .          .   0.00e+00   1.00e+00
> 	the index is
> {global=0, local={local=0, attr=1, public=1}}
> {global=1, local={local=1, attr=1, public=1}}
> {global=2, local={local=2, attr=1, public=1}}
> {global=3, local={local=3, attr=1, public=1}}
> {global=4, local={local=4, attr=3, public=1}}
> 
> MPI Proc 1: 
> 	the matrix is:
> row    0   1.00e+00   0.00e+00          .          .          .
> row    1  -1.00e+00   2.00e+00  -1.00e+00          .          .
> row    2          .  -1.00e+00   2.00e+00  -1.00e+00          .
> row    3          .          .  -1.00e+00   2.00e+00  -1.00e+00
> row    4          .          .          .  -1.00e+00   2.00e+00
> 	the index is:
> {global=3, local={local=0, attr=3, public=1}}
> {global=4, local={local=1, attr=1, public=1}}
> {global=5, local={local=2, attr=1, public=1}}
> {global=6, local={local=3, attr=1, public=1}}
> {global=7, local={local=4, attr=1, public=1}}
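[Editorial note, not part of the original mail: the two index sets above can be sanity-checked for consistency before being handed to Dune. Every global index should have exactly one owning rank, and every copy entry should be owned on some rank. The attribute encoding below is an assumption that matches the listings above and, as far as I know, dune-istl's OwnerOverlapCopyAttributeSet convention (owner=1, overlap=2, copy=3). A minimal plain-C++ sketch, independent of Dune:]

```cpp
#include <map>
#include <vector>

// Assumed attribute encoding, matching the printed index sets.
enum Attribute { owner = 1, overlap = 2, copy = 3 };

struct IndexEntry { int global; int local; int attr; };

// Returns true if every global index has exactly one owning rank and
// every 'copy' entry is owned by some rank.
bool consistent(const std::vector<std::vector<IndexEntry>>& ranks) {
    std::map<int, int> owners;                // global index -> #owners
    for (const auto& rank : ranks)
        for (const auto& e : rank)
            if (e.attr == owner) ++owners[e.global];
    for (const auto& kv : owners)
        if (kv.second != 1) return false;     // multiply-owned index
    for (const auto& rank : ranks)
        for (const auto& e : rank)
            if (e.attr == copy && owners.count(e.global) == 0)
                return false;                 // copy without any owner
    return true;
}
```

[With the two sets printed above (rank 0 owning globals 0-3 plus a copy of 4; rank 1 owning 4-7 plus a copy of 3), the check passes.]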
> 
> AMG Configuration
>   typedef Dune::Amg::CoarsenCriterion<Dune::Amg::SymmetricCriterion<BMat,Dune::Amg::FirstDiagonal> > Criterion;
>   typedef Dune::SeqSSOR<BMat,BVec,BVec> Smoother;
>   typedef Dune::BlockPreconditioner<BVec,BVec,Comm,Smoother> ParSmoother;
>   typedef Dune::Amg::SmootherTraits<ParSmoother>::Arguments SmootherArgs;
> 
>   SmootherArgs smootherArgs;
>   smootherArgs.iterations = 1;
> 
>   Criterion criterion;  
>   criterion.setNoPreSmoothSteps(1); 
>   criterion.setNoPostSmoothSteps(1);
>   criterion.setGamma(1); // V-cycle=1, W-cycle=2
>   criterion.setAdditive(false);
>   criterion.setMaxLevel(2);
>   criterion.setCoarsenTarget(4);
> 
>   typedef Dune::Amg::AMG<Operator,BVec,ParSmoother,Comm> AMG;
>   AMG amg(op, criterion, smootherArgs, comm);
> 
> 
> 
>> On 18 Jul 2016, at 10:23, Markus Blatt <markus at dr-blatt.de> wrote:
>> 
>> On Mon, Jul 18, 2016 at 10:19:56AM +0200, Markus Blatt wrote:
>>> 
>>> Now you have your complete parallel index set with a global index and
>>> a marker for each local index. Let l(i) be the function that
>>> determines the local index of the global (PETSc) index i; then your
>>> local matrix can be set up by inserting an entry l(i),l(j) for each
>>> PETSc sparse matrix entry (i,j). For all column indices g not in {n_i,
>>> ..., n_{i+1}-1}, you insert a row l(g) with a 1 on the diagonal and
>>> zeros elsewhere. Please make sure that for a symmetric sparse matrix
>>> the sparsity pattern is symmetric, too.
>>> 
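[Editorial note: the recipe above can be sketched in plain C++, without Dune types. This is a toy illustration under the assumption of the 1D Poisson matrix from this thread (8 global nodes split 4/4). Locals are numbered in increasing global order, which reproduces the local orderings printed earlier in the thread.]

```cpp
#include <map>
#include <set>
#include <utility>
#include <vector>

using Mat = std::map<std::pair<int,int>, double>;   // (row, col) -> value

// Assemble the local matrix for the owned global row range [lo, hi):
// copy each owned entry (i, j) to (l(i), l(j)); every ghost row l(g)
// gets a 1 on the diagonal and explicit zeros wherever the symmetric
// sparsity pattern requires an entry.
Mat assembleLocal(const Mat& global, int lo, int hi, std::map<int,int>& l) {
    std::set<int> needed;                     // owned rows + referenced ghosts
    for (int i = lo; i < hi; ++i) needed.insert(i);
    for (const auto& entry : global)
        if (entry.first.first >= lo && entry.first.first < hi)
            needed.insert(entry.first.second);
    int next = 0;
    for (int g : needed) l[g] = next++;       // locals in increasing global order
    Mat local;
    for (const auto& entry : global)          // owned entries, copied as-is
        if (entry.first.first >= lo && entry.first.first < hi)
            local[{l[entry.first.first], l[entry.first.second]}] = entry.second;
    for (const auto& gl : l) {
        int g = gl.first, lg = gl.second;
        if (g >= lo && g < hi) continue;      // owned row, not a ghost
        local[{lg, lg}] = 1.0;                // identity on the ghost row
        std::vector<int> cols;
        for (const auto& entry : local)       // each entry (r, lg) demands
            if (entry.first.second == lg && entry.first.first != lg)
                cols.push_back(entry.first.first);  // a mirror entry (lg, r)
        for (int r : cols)
            local[{lg, r}] += 0.0;            // explicit zero keeps the
    }                                         // sparsity pattern symmetric
    return local;
}
```

[Splitting the 8-node tridiagonal matrix as rows 0-3 / 4-7 reproduces the two local matrices printed in the thread, including the ghost rows (0 1) on rank 0 and (1 0) on rank 1.]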
>> 
>> Small addition: now you can set up a parallel linear operator by
>> passing the local sparse matrix and the communication object to
>> OverlappingSchwarzOperator, or even write your own.
>> 
>> Markus
>> 
>> -- 
>> Dr. Markus Blatt - HPC-Simulation-Software & Services http://www.dr-blatt.de
>> Hans-Bunte-Str. 8-10, 69123 Heidelberg, Germany
>> Tel.: +49 (0) 160 97590858
>> _______________________________________________
>> Dune mailing list
>> Dune at dune-project.org
>> http://lists.dune-project.org/mailman/listinfo/dune
> 
