[dune-pdelab] artifact in DiscreteGridFunctionGradient in parallel run
Michael Wenske
m_wens01 at wwu.de
Mon Aug 5 10:39:11 CEST 2019
Hi!
I now see where it's coming from. The effect should be present for other
finite elements, too!
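To make the mechanism concrete, here is a tiny stand-alone illustration
(just my own numbers, no DUNE involved): for P1/Q1 the derivative of the
interpolant is constant per element, so it jumps at shared vertices and a
conforming writer has to pick a side.

#include <iostream>

// P1/Q1 interpolation of f(x) = x*x on the 1D mesh {0, 1, 2}: the
// derivative of the interpolant is constant on each element and jumps
// at the shared vertex x = 1.
int main()
{
  const double f0 = 0.0, f1 = 1.0, f2 = 4.0; // nodal values of x*x
  const double gradLeft  = (f1 - f0) / 1.0;  // slope on [0,1]: 1
  const double gradRight = (f2 - f1) / 1.0;  // slope on [1,2]: 3
  std::cout << "from the left element:  " << gradLeft  << '\n'
            << "from the right element: " << gradRight << '\n'
            << "exact derivative:       " << 2.0       << '\n';
  // a conforming, vertex-based VTK writer must pick one of the two
  return 0;
}

The same thing happens along every grid line in higher dimensions.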
For my use case, writing the gradient in a nonconforming way should be
just what I need. That way I can do some statistics on the gradients
throughout my field without any domain-boundary or processor-boundary
effects.
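For reference, this is roughly the setup I use for the nonconforming
output. It is only a minimal sketch along the lines of the PDELab
tutorials, written against a 2.6-ish API: the grid size, the interpolated
function x*x+y*y and the exact type names are my own choices and may need
adjusting for other versions.

#include <array>
#include <memory>

#include <dune/common/fvector.hh>
#include <dune/common/parallel/mpihelper.hh>
#include <dune/grid/yaspgrid.hh>
#include <dune/grid/io/file/vtk/vtkwriter.hh>
#include <dune/pdelab.hh>

int main(int argc, char** argv)
{
  Dune::MPIHelper::instance(argc, argv);

  // overlapping structured grid, decomposed over the MPI ranks
  constexpr int dim = 2;
  using Grid = Dune::YaspGrid<dim>;
  Dune::FieldVector<double, dim> upperRight(1.0);
  std::array<int, dim> cells;
  cells.fill(16);
  Grid grid(upperRight, cells);
  auto gv = grid.leafGridView();
  using GV = decltype(gv);

  // Q1 grid function space
  using FEM = Dune::PDELab::QkLocalFiniteElementMap<GV, double, double, 1>;
  FEM fem(gv);
  using VBE = Dune::PDELab::ISTL::VectorBackend<>;
  using GFS = Dune::PDELab::GridFunctionSpace<
    GV, FEM, Dune::PDELab::NoConstraints, VBE>;
  GFS gfs(gv, fem);

  // interpolate f(x,y) = x^2 + y^2, the function from Jö's example
  using X = Dune::PDELab::Backend::Vector<GFS, double>;
  X x(gfs, 0.0);
  auto f = Dune::PDELab::makeGridFunctionFromCallable(
    gv, [](const auto& xg) { return xg[0]*xg[0] + xg[1]*xg[1]; });
  Dune::PDELab::interpolate(f, gfs, x);

  // write the gradient in nonconforming mode: every element gets its own
  // copies of the vertices, so no value is silently picked at element or
  // processor boundaries
  using Grad = Dune::PDELab::DiscreteGridFunctionGradient<GFS, X>;
  Grad grad(gfs, x);
  Dune::VTKWriter<GV> vtkwriter(gv, Dune::VTK::nonconforming);
  vtkwriter.addVertexData(
    std::make_shared<Dune::PDELab::VTKGridFunctionAdapter<Grad>>(
      grad, "gradient"));
  vtkwriter.write("gradient-nonconforming", Dune::VTK::appendedraw);
  return 0;
}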
It still leaves me a bit worried for other cases, though. Say somebody
wants to evaluate the gradient convergence of a new method; this behavior
could then produce a false negative. I have not checked the situation
with even more ranks. If I remember correctly, starting from about 8
ranks the domain decomposition also divides in the y-direction, which
might show more interesting behavior at the crossing points.
The only consistent remedies I can think of are either to average the
available gradients at each vertex, which would smooth the result in
high-curvature regions, or to restrict writing the gradient to
nonconforming mode in some way, so as not to hide the inherent ambiguity.
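For the averaging option, this is the kind of loop I have in mind. Again
only a sketch, reusing gv, dim and grad from the code above; the
communication of sums and counts across processor boundaries that a truly
consistent parallel result would need is omitted here.

#include <vector>
#include <dune/geometry/referenceelements.hh>

// accumulate the per-element gradients at every vertex, then divide by
// the number of contributing elements
const auto& indexSet = gv.indexSet();
std::vector<Dune::FieldVector<double, dim> >
  sum(gv.size(dim), Dune::FieldVector<double, dim>(0.0));
std::vector<int> count(gv.size(dim), 0);

for (const auto& e : elements(gv, Dune::Partitions::interior))
{
  const auto& ref = Dune::ReferenceElements<double, dim>::general(e.type());
  for (unsigned int i = 0; i < e.subEntities(dim); ++i)
  {
    Grad::Traits::RangeType g;
    grad.evaluate(e, ref.position(i, dim), g); // gradient at vertex i
    const auto vi = indexSet.subIndex(e, i, dim);
    sum[vi] += g;
    count[vi] += 1;
  }
}

for (std::size_t i = 0; i < sum.size(); ++i)
  if (count[i] > 0)
    sum[i] /= count[i]; // averaged gradient at vertex i

The averaged values could then be attached as conforming vertex data, at
the price of the smoothing in high-curvature regions mentioned above.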
Thank you for your time,
Michael
On 02.08.19 17:58, Jö Fahlke wrote:
> Hi!
>
> The reason for this behaviour is that you are using Q1 finite elements. For
> Q1 the gradients cannot be continuous across element boundaries.
>
> So at each vertex the VTK writer will use the gradient obtained in one of the
> adjacent elements -- and since yaspgrid is a structured grid manager, which
> element's value ends up being used is not arbitrary.
>
> Also, this happens not only on the parallel boundary but also on other domain
> boundaries -- it just so happens that it is very obvious on the parallel
> boundary, and your choice of function to interpolate makes it very difficult
> to see at the domain boundaries. I've attached output for the function
> x*x+y*y instead, which shows the effect better.
>
> You can use the VTKWriter's non-conforming mode to get a better visualization
> of what the finite element method thinks the gradients are. Whether that is
> good enough for further analysis, however, is another question.
>
> Regards,
> Jö.
>
> On Thu, 1 Aug 2019, 18:06:08 +0200, Jö Fahlke wrote:
>> Here is the output produced by the program and a screenshot of the defect. It
>> shows the x-component of the gradient, which is perpendicular to the partition
>> boundary. Note that on the right partition, the x-value of the gradient seems
>> to have been copied from the first interior vertex to the closest border vertex.
>>
>> Regards,
>> Jö.
>>
>> --
>> Jorrit (Jö) Fahlke, Institute for Computational and Applied Mathematics,
>> University of Münster, Orleans-Ring 10, D-48149 Münster
>> Tel: +49 251 83 35146 Fax: +49 251 83 32729
>>
>> Kiss a non-smoker; taste the difference.
>> -- fortune
>
>> <?xml version="1.0"?>
>> <VTKFile type="PUnstructuredGrid" version="0.1" byte_order="LittleEndian">
>> <PUnstructuredGrid GhostLevel="0">
>> <PPointData Scalars="field" Vectors="gradient">
>> <PDataArray type="Float32" Name="field" NumberOfComponents="1"/>
>> <PDataArray type="Float32" Name="gradient" NumberOfComponents="3"/>
>> </PPointData>
>> <PCellData>
>> </PCellData>
>> <PPoints>
>> <PDataArray type="Float32" Name="Coordinates" NumberOfComponents="3"/>
>> </PPoints>
>> <Piece Source="s0002-p0000-output-00000.vtu"/>
>> <Piece Source="s0002-p0001-output-00000.vtu"/>
>> </PUnstructuredGrid>
>> </VTKFile>