[Dune] How to implement gridview.communicate() ?
Markus Blatt
markus at dr-blatt.de
Thu Jan 8 12:03:34 CET 2015
Hi,
On Tue, Jan 06, 2015 at 02:57:28PM +0100, Aleksejs Fomins wrote:
> I am currently implementing the Data Handle interface for the
> CurvilinearGrid. I suppose that the communication paradigm is as follows:
>
> 1) User creates DataHandle, where they put in data using .gather() and
> receive data using .scatter() procedures.
Exactly. Please bear in mind how much freedom this gives the
user: basically they can send anything in the gather method, e.g. the
process number, or even rubbish.
On the receiving end in scatter there might also be some filtering or
data modification, e.g. scatter might overwrite entries of a vector or
add to them.
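
To make 1) concrete, here is a minimal sketch of such a DataHandle
(the name and the payload are made up for illustration; note that the
method is spelled fixedsize() in the Dune releases current at the time
of writing):

  #include <dune/grid/common/datahandleif.hh>

  // Sketch: every process sends its rank for each codim-0 entity;
  // scatter on the receiving side just reads the value.
  struct RankExchange
    : Dune::CommDataHandleIF<RankExchange, int>
  {
    int rank;
    explicit RankExchange(int r) : rank(r) {}

    bool contains(int /*dim*/, int codim) const { return codim == 0; }
    bool fixedsize(int /*dim*/, int /*codim*/) const { return true; }

    template<class Entity>
    std::size_t size(const Entity&) const { return 1; }

    // sending side
    template<class Buffer, class Entity>
    void gather(Buffer& buff, const Entity&) const { buff.write(rank); }

    // receiving side: free to filter, accumulate or overwrite
    template<class Buffer, class Entity>
    void scatter(Buffer& buff, const Entity&, std::size_t /*n*/)
    { int other; buff.read(other); /* e.g. remember the neighbour's rank */ }
  };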
> 2) User calls gridview.communicate(DataHandle, InterfaceType,
> CommunicationDirection)
> 2.1) grid loops over provided send-interface, calls gather
> 2.2) internal MessageBuffers are synchronized over all processes
> 2.3) grid loops over provided receive-interface, calls scatter
>
Seems sound.
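As a sketch, step 2) with the handle from above (gv being a leaf or
level grid view; the interface/direction pair is just one of the
possible choices):

  // gather on interior and border entities, scatter on every entity
  // that holds a copy (overlap/ghost included)
  RankExchange handle(gv.comm().rank());
  gv.communicate(handle,
                 Dune::InteriorBorder_All_Interface,
                 Dune::ForwardCommunication);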
>
> Questions:
>
> --------------------------------------------------------------------
>
> 2.1) I do not understand the exact meaning of the InterfaceType.
>
> "InteriorBorder_InteriorBorder_Interface - send/receive interior and
> border entities"
>
> Do I understand correctly that here each process boundary communicates
> to itself on the neighbouring process, and each InteriorBorder element
> communicates to a ghost element(s) on the neighbouring process(es)?
>
If I am not mistaken, no codim 0 entities are communicated: if they
are interior to one process, then they can be neither interior nor
border on another one. See the grid howto for pictures.
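In code this boils down to the partition type of the entities; e.g.
with the entity ranges of a recent dune-grid (illustration only,
nothing grid specific):

  // Count the vertices that would take part in an
  // InteriorBorder_InteriorBorder communication on this process.
  std::size_t n = 0;
  for (const auto& v : vertices(gv))
    if (v.partitionType() == Dune::InteriorEntity
        || v.partitionType() == Dune::BorderEntity)
      ++n;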
> "All_All_Interface - send all and receive all entities"
>
> What is the meaning of this Interface? Does it mean that for each
> entity on the process some data is communicated?
No.
> Then how are the
> destination processes determined, or is the data sent to all
> processes?
That is the task of the grid. It has to know where replications
(entities with the same global id) exist.
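The piece of information that is equal across processes is the global
id, so that is what the grid can use to figure out where copies live
(sketch):

  // The global id set assigns the same id to all copies of an entity,
  // on all processes, so it can be used to match replications when the
  // communication interfaces are set up.
  const auto& gis = gv.grid().globalIdSet();
  auto gid = gis.id(element);   // 'element' is some codim-0 entity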
> And finally, how is such data received on the other
> processes. In order to request data from the DataHandle, an entity
> reference must be provided, but each process normally only has
> references to its own entities?
>
Your grid needs to have some kind of map where it stores, for each
interface and each process, which entities it needs to send to /
receive from. There also needs to be some kind of ordering in the data
such that the grid can identify which data belongs where when
receiving.
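One conceivable layout for such a map (purely a sketch, not how CpGrid
or any other grid actually stores it):

  #include <array>
  #include <map>
  #include <vector>

  // Per interface type and per neighbouring rank keep an ordered list
  // of entities.  The ordering is the implicit protocol: the i-th
  // entity in the send list for rank B on rank A matches the i-th
  // entity in the receive list for rank A on rank B (e.g. both sorted
  // by global id), so the receiver knows which scatter() call each
  // part of the message belongs to.
  template<class Grid>
  struct CommunicationMaps
  {
    using Seed = typename Grid::template Codim<0>::EntitySeed;
    struct Lists
    {
      std::map<int, std::vector<Seed> > send;     // rank -> entities to gather
      std::map<int, std::vector<Seed> > receive;  // rank -> entities to scatter
    };
    std::array<Lists, 5> maps;   // one entry per Dune::InterfaceType
  };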
> -----------------------------------------------------------------------
>
> 2.2) If I understand the synchronization step correctly, then each
> process sends specific data to each other process, for example,
> communicating between neighbouring process boundary faces. If I am not
> mistaken, in MPI this operation is called MPI_Alltoallv(); however,
> this operation cannot send arbitrary data, only fixed data types like
> int and double. When I look at Dune::CollectiveCommunication, I only
> see methods to communicate 1-to-all or all-to-1 or all-to-all, but not
> each-to-each.
You will probably need MPI calls here.
>
> What is the best way of performing such communication in Dune?
>
Who knows. Every grid might do this in its own way, whatever is best
for its use case.
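If you stay with plain MPI, one simple (though not necessarily
optimal) pattern is point-to-point: serialize the gather() output for
each neighbouring rank into a byte buffer, exchange the sizes first
and then the payload. Everything below is illustrative and not taken
from any existing grid:

  #include <mpi.h>
  #include <map>
  #include <vector>

  // Exchange one variable-sized byte buffer with each neighbouring
  // rank.  Assumes a symmetric neighbourhood (every rank we send to
  // also sends to us), which is the usual situation for grid
  // interfaces.
  void exchangeBuffers(MPI_Comm comm,
                       const std::map<int, std::vector<char> >& sendBufs,
                       std::map<int, std::vector<char> >& recvBufs)
  {
    std::vector<MPI_Request> reqs;
    std::map<int, unsigned long> sendSize, recvSize;

    // 1) exchange the message sizes
    for (const auto& kv : sendBufs)
      sendSize[kv.first] = kv.second.size();
    for (auto& kv : sendSize) {
      reqs.emplace_back();
      MPI_Isend(&kv.second, 1, MPI_UNSIGNED_LONG, kv.first, 0, comm, &reqs.back());
    }
    for (const auto& kv : sendBufs) {
      reqs.emplace_back();
      MPI_Irecv(&recvSize[kv.first], 1, MPI_UNSIGNED_LONG, kv.first, 0, comm, &reqs.back());
    }
    MPI_Waitall(static_cast<int>(reqs.size()), reqs.data(), MPI_STATUSES_IGNORE);
    reqs.clear();

    // 2) exchange the payload itself
    for (const auto& kv : sendBufs) {
      reqs.emplace_back();
      MPI_Isend(kv.second.data(), static_cast<int>(kv.second.size()),
                MPI_BYTE, kv.first, 1, comm, &reqs.back());
    }
    for (const auto& kv : recvSize) {
      std::vector<char>& buf = recvBufs[kv.first];
      buf.resize(kv.second);
      reqs.emplace_back();
      MPI_Irecv(buf.data(), static_cast<int>(buf.size()),
                MPI_BYTE, kv.first, 1, comm, &reqs.back());
    }
    MPI_Waitall(static_cast<int>(reqs.size()), reqs.data(), MPI_STATUSES_IGNORE);
  }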
I once implemented it based on the VariableSizeCommunicator
<http://www.dune-project.org/doc/doxygen/dune-common-html/class_dune_1_1_variable_size_communicator.html>
in dune-cornerpoint
<https://github.com/OPM/dune-cornerpoint/blob/master/dune/grid/cpgrid/CpGridData.hpp>
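For reference, the data handle that the VariableSizeCommunicator
expects differs from the grid's DataHandle in that it is indexed by
positions in the communication interface rather than by entities; from
memory it looks roughly like this (please check the doxygen page above
for the authoritative signatures):

  // Rough sketch of a VariableSizeCommunicator handle; 'i' is an
  // index into the communication interface, not a grid entity.
  struct VarSizeHandle
  {
    typedef double DataType;                    // type written to the buffer

    bool fixedsize() { return false; }          // message size varies per index
    std::size_t size(std::size_t /*i*/) { return 2; }  // e.g. two doubles per index

    template<class Buffer>
    void gather(Buffer& buf, std::size_t i)
    { buf.write(1.0 * i); buf.write(2.0 * i); }

    template<class Buffer>
    void scatter(Buffer& buf, std::size_t /*i*/, std::size_t n)
    { for (std::size_t k = 0; k < n; ++k) { double v; buf.read(v); } }
  };

  // usage, roughly:
  //   Dune::VariableSizeCommunicator<> vcomm(mpiComm, interface);
  //   vcomm.forward(handle);   // or vcomm.backward(handle)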
Markus
--
Do you need more support with DUNE or HPC in general?
Dr. Markus Blatt - HPC-Simulation-Software & Services http://www.dr-blatt.de
Hans-Bunte-Str. 8-10, 69123 Heidelberg, Germany
Tel.: +49 (0) 160 97590858