[dune-pdelab] reg.: parallel solver for coupled FEM and FV
Shubhangi Gupta
sgupta at geomar.de
Mon Feb 8 11:32:23 CET 2021
Dear Peter,
Thanks a lot for your reply.
I am trying to solve a coupled system of equations consisting of multiple
PDEs, a couple of ODEs, and some algebraic equations (AEs).
I will use the example of a simple two-phase flow model to explain my
problem:
# 1D two-phase flow problem
# MODEL:
# -------
# PDE1: \partial_t S       + \nabla \cdot ( -[K] \frac{kr1(S)}{mu1} \nabla P1 ) = q1
# PDE2: \partial_t (1 - S) + \nabla \cdot ( -[K] \frac{kr2(S)}{mu2} \nabla P2 ) = q2
# AE:   S - qs(P2 - P1) = 0
# Primary variables:
# -------------------
# P1 (wetting-phase pressure), P2 (non-wetting-phase pressure), S (wetting-phase saturation)
Here, I want to discretize, say, PDE1 and PDE2 with cG-FEM and the AE
with the lowest possible order (like cell-centred FV). (I say FV because,
in my model, I also have some PDEs which I solve using cc-FV.)
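For concreteness, the AE above couples the saturation S to the capillary pressure P2 - P1 through some closure qs. A minimal sketch of what such a closure and the corresponding AE residual could look like (assuming a hypothetical Brooks-Corey-type curve; the function form, entry pressure pe, and pore-size index lambda are illustrative, not the ones from the attached example):

```cpp
#include <cmath>

// Hypothetical Brooks-Corey-type inverse capillary pressure curve for the
// closure S = qs(pc). The entry pressure pe and pore-size index lambda are
// illustrative placeholder values.
double qs(double pc, double pe = 1.0e4, double lambda = 2.0)
{
  if (pc <= pe)
    return 1.0;                      // fully wetting-saturated below entry pressure
  return std::pow(pe / pc, lambda);  // effective saturation S = (pe/pc)^lambda
}

// The AE residual (the quantity assembled in the P0 alpha-volume terms):
//   r = S - qs(P2 - P1)
double ae_residual(double S, double P1, double P2)
{
  return S - qs(P2 - P1);
}
```

At equilibrium the residual vanishes, e.g. for P1 = 1e5, P2 = 1.2e5 the saturation satisfying the AE would be qs(2e4) = 0.25 with the placeholder parameters above.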
Consider the following combinations of the composite GFS:
GFS1:
using VBE = Dune::PDELab::ISTL::VectorBackend<Dune::PDELab::ISTL::Blocking::none>;
using OrderingTag = Dune::PDELab::EntityBlockedOrderingTag;
using GFS = Dune::PDELab::CompositeGridFunctionSpace<VBE,OrderingTag,GFS_P,GFS_P,GFS_S>;
GFS gfs(gfs_p1,gfs_p2,gfs_s);
GFS2:
using VBE = Dune::PDELab::ISTL::VectorBackend<Dune::PDELab::ISTL::Blocking::fixed>;
using OrderingTag = Dune::PDELab::EntityBlockedOrderingTag;
using GFS = Dune::PDELab::CompositeGridFunctionSpace<VBE,OrderingTag,GFS_P,GFS_P,GFS_S>;
GFS gfs(gfs_p1,gfs_p2,gfs_s);
When I use a full cG scheme, the parallel BCGS_AMG_SSORk solver works
like a charm, especially with GFS2.
However, with my intended cG-FEM + FV scheme, GFS2 does not seem to work
at all. GFS1 works quite well with SEQ_SUPERLU, but fails completely
with the parallel BCGS_AMG_SSORk linear solver (and with all other
parallel solvers too).
I am wondering why this is so.
I have made a minimal example of the two-phase flow problem in which
this behaviour is reproduced.
You can switch between the full cG scheme and the cG + FV scheme using
the #define S_FV option in main.cc.
All other options can be found in the input.ini file. In particular, you
can switch between the SuperLU solver and the BCGS_AMG_SSORk solver by
setting linear_solver.superlu=true or false.
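For reference, the corresponding entry in input.ini would be something along these lines (the section/key layout is assumed from the description above, not copied from the attachment):

```ini
[linear_solver]
# true  -> SEQ_SUPERLU (sequential direct solver)
# false -> parallel BCGS_AMG_SSORk
superlu = true
```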
The input.ini file gives a short summary. If you need any further
information, please let me know.
Thanks once again for your help!
With warm wishes, Shubhangi
PS: I also have the same implementation with a dG scheme instead of cG,
and I face the same problem there too!
On 06.02.21 14:17, Bastian, Prof. Dr. Peter wrote:
> Dear Shubhangi,
>
> Could you share more detail about what you are trying to solve?
>
> Best, Peter
>
>> Am 06.02.2021 um 10:24 schrieb Shubhangi Gupta <sgupta at geomar.de>:
>>
>> Dear all,
>>
>> I am solving a system of two equations: the first is a PDE and the second is an AE.
>> I use cG-FEM for the PDE and the P0 space for the AE (so, something like finite volume, with only alpha volume terms), and I implement them as a fully coupled system.
>>
>> The problem is, none of the parallel solvers seem to be working!
>>
>> I have already tried all different combinations of ordering and blocking for the composite grid function space.
>>
>> I will appreciate any hints and suggestions to figure out how I can get the parallel solvers to work for this problem.
>>
>> Thanks, and warm wishes, Shubhangi
>>
>>
>>
>> _______________________________________________
>> dune-pdelab mailing list
>> dune-pdelab at lists.dune-project.org
>> https://lists.dune-project.org/mailman/listinfo/dune-pdelab
-------------- next part --------------
A non-text attachment was scrubbed...
Name: problem00.zip
Type: application/zip
Size: 11692 bytes
Desc: not available
URL: <https://lists.dune-project.org/pipermail/dune-pdelab/attachments/20210208/3d58e917/attachment.zip>