[dune-pdelab] Scalability issue in overlapping solvers

Linus Seelinger mail at linusseelinger.de
Mon Jan 3 16:48:12 CET 2022


Hi Aswin,

first of all, I think you might be misled by how periodic boundaries are handled in DUNE. Periodic boundaries (at least with YaspGrid) require a parallel run (i.e. more than one MPI rank), since they essentially reuse the overlapping communication framework that otherwise handles "regular" overlaps between subdomains. Think of a 2D grid with its periodic ends glued together: on the resulting cylinder, subdomains at the periodic boundary are just regular neighbors and can be handled accordingly.
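For illustration, a minimal sketch of such a setup looks like this (domain size, resolution and the periodic direction are placeholders, not taken from your test):

  #include <array>
  #include <bitset>
  #include <dune/common/parallel/mpihelper.hh>
  #include <dune/common/fvector.hh>
  #include <dune/grid/yaspgrid.hh>

  int main(int argc, char** argv)
  {
    // YaspGrid realizes the periodic wrap-around through the MPI
    // communication layer, so initialize MPI first and run with
    // at least two ranks (mpirun -np 2 or more).
    auto& helper = Dune::MPIHelper::instance(argc, argv);

    constexpr int dim = 2;
    Dune::FieldVector<double, dim> upperRight(1.0);  // unit square (placeholder)
    std::array<int, dim> cells{{32, 32}};            // placeholder resolution
    std::bitset<dim> periodic;
    periodic[0] = true;                              // glue the two x-boundaries
    const int overlap = 1;                           // periodic neighbors exchange data via the overlap

    Dune::YaspGrid<dim> grid(upperRight, cells, periodic, overlap);
    return 0;
  }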

Long story short, I think the sequential (-np 1) run does not give you a correct solution (write the solution to VTK and inspect the output to confirm) and is therefore not a good reference. The other runs do not look as bad in terms of scalability.
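Continuing the sketch above, a quick way to check is to tag each cell with its owner rank and write that next to the solution (variable names follow the previous snippet; attach your solution vector analogously via addVertexData):

  #include <vector>
  #include <dune/grid/io/file/vtk/vtkwriter.hh>

  // ... inside main() from the previous sketch, after constructing the grid:
  auto gv = grid.leafGridView();
  // one value per cell, filled with the rank that owns it
  std::vector<double> rank(gv.size(0), helper.rank());
  Dune::VTKWriter<decltype(gv)> vtkwriter(gv);
  vtkwriter.addCellData(rank, "rank");
  vtkwriter.write("periodic-check");  // one .vtu per rank plus a collecting .pvtu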

If you still need better scalability, you might look into more advanced methods. The appropriate choice then depends a lot on the particular problem you want to solve, so more detail from you would be helpful.

One example might be GenEO, which we could scale up to around 16000 cores (see https://doi.org/10.1007/978-3-030-43229-4_11 ). It might be overkill, though, depending on what you want to do.
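Whichever method you end up with: in PDELab the linear solver backend is swapped in a single place. As a hedged sketch (GFS, CC, gfs and cc stand for the grid function space and constraints container that already exist in your application; please check dune/pdelab/backend/istl/ovlpistlsolverbackend.hh for the exact constructor arguments):

  #include <dune/pdelab/backend/istl.hh>

  // Overlapping BiCGSTAB preconditioned with a few SSOR sweeps per
  // subdomain; GFS, CC, gfs and cc are assumed from your existing setup.
  using LS = Dune::PDELab::ISTLBackend_OVLP_BCGS_SSORk<GFS, CC>;
  LS ls(gfs, cc,
        5000,  // max linear iterations
        3,     // SSOR sweeps per preconditioner application
        1);    // verbosity

More smoother sweeps make each iteration more expensive but can reduce the iteration count, so that parameter is worth a quick experiment before moving to two-level methods.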

Best,
Linus

On Monday, 3 January 2022, 11:09:14 CET, Aswin vs wrote:
> Hello,
> 
> Can somebody suggest how to get good scalability with the overlapping
> solvers in DUNE? I tried the following test example in DUNE PDELab, but I
> am not getting good scalability.
> Thank you.
> 
> $ mpirun -np 1 ./test-heat-instationary-periodic
> .
> .
> .
> STAGE 2 time (to):   1.5000e-02.
> === matrix setup skipped (matrix already allocated)
> === matrix assembly (max) 0.9944 s
> === residual assembly (max) 0.4861 s
> === solving (reduction: 1e-10) === Dune::BiCGSTABSolver
> === rate=0.8397, T=12.57, TIT=0.09524, IT=132
> 12.68 s
> ::: timesteps           2 (2)
> ::: nl iterations     565 (565)
> ::: lin iterations    565 (565)
> ::: assemble time    8.0477e+00 (8.0477e+00)
> ::: lin solve time   5.3414e+01 (5.3414e+01)
> ---------------------------------------------------------------------------------------
> $ mpirun -np 2 ./test-heat-instationary-periodic
> .
> .
> .
> STAGE 2 time (to):   1.5000e-02.
> === matrix setup skipped (matrix already allocated)
> === matrix assembly (max) 0.5053 s
> === residual assembly (max) 0.2465 s
> === solving (reduction: 1e-10) === Dune::BiCGSTABSolver
> === rate=0.9268, T=26.95, TIT=0.08895, IT=303
> 27.05 s
> ::: timesteps           2 (2)
> ::: nl iterations    1254 (1254)
> ::: lin iterations   1254 (1254)
> ::: assemble time    4.0910e+00 (4.0910e+00)
> ::: lin solve time   1.1201e+02 (1.1201e+02)
> ---------------------------------------------------------------------------------------
> $ mpirun -np 4 ./test-heat-instationary-periodic
> .
> .
> .
> STAGE 2 time (to):   1.5000e-02.
> === matrix setup skipped (matrix already allocated)
> === matrix assembly (max) 0.271 s
> === residual assembly (max) 0.1318 s
> === solving (reduction: 1e-10) === Dune::BiCGSTABSolver
> === rate=0.9232, T=26.02, TIT=0.0894, IT=291
> 26.11 s
> ::: timesteps           2 (2)
> ::: nl iterations    1249 (1249)
> ::: lin iterations   1249 (1249)
> ::: assemble time    2.1746e+00 (2.1746e+00)
> ::: lin solve time   1.1165e+02 (1.1165e+02)
> ---------------------------------------------------------------------------------------
> $ mpirun -np 8 ./test-heat-instationary-periodic
> .
> .
> .
> STAGE 2 time (to):   1.5000e-02.
> === matrix setup skipped (matrix already allocated)
> === matrix assembly (max) 0.1772 s
> === residual assembly (max) 0.08259 s
> === solving (reduction: 1e-10) === Dune::BiCGSTABSolver
> === rate=0.9288, T=30.81, TIT=0.09751, IT=316
> 30.89 s
> ::: timesteps           2 (2)
> ::: nl iterations    1329 (1329)
> ::: lin iterations   1329 (1329)
> ::: assemble time    1.3485e+00 (1.3485e+00)
> ::: lin solve time   1.2796e+02 (1.2796e+02)