/*
 * ParCommGraph.hpp
 *
 * Used to set up communication between 2 distributed meshes, in which one mesh was
 * migrated from the other (one example is an atmosphere mesh migrated to coupler pes).
 *
 * There are 3 communicators in play, one for each mesh and one joint communicator
 * that spans both sets of processes; to send mesh or tag data we need to use the
 * joint communicator, with nonblocking MPI_Isend and blocking MPI_Recv.
 *
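 * A minimal sketch of that pattern (hypothetical buffers, counts and ranks, not
 * part of this header):
 *
 *   MPI_Request req;
 *   MPI_Isend( sendBuf, count, MPI_DOUBLE, receiverRank, msgTag, jointComm, &req );
 *   // ... meanwhile, on the receiving task:
 *   MPI_Recv( recvBuf, count, MPI_DOUBLE, senderRank, msgTag, jointComm, MPI_STATUS_IGNORE );
 *   // ... back on the sender, complete the nonblocking send:
 *   MPI_Wait( &req, MPI_STATUS_IGNORE );
 *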
 * Various methods should be available to migrate meshes: trivial, using a graph
 * partitioner (Zoltan PHG), and using a geometric partitioner (Zoltan RCB).
 *
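 * A hypothetical enum distinguishing these strategies (illustrative only, not
 * necessarily the names used in this class):
 *
 *   enum MigrationMethod { TRIVIAL, GRAPH_PARTITION, GEOMETRIC_PARTITION };
 *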
 * Communicators are represented by their MPI groups, not by the communicators
 * themselves, because the groups are always defined, irrespective of which tasks
 * they are on. Communicators can be MPI_COMM_NULL, while MPI groups are always
 * defined.
 *
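 * For example (hypothetical names; jointComm, nSenders and senderRanks are
 * assumed to exist), the subgroup is defined on every task, while the derived
 * communicator is MPI_COMM_NULL on tasks outside the group:
 *
 *   MPI_Group jointGroup, senderGroup;
 *   MPI_Comm_group( jointComm, &jointGroup );
 *   MPI_Group_incl( jointGroup, nSenders, senderRanks, &senderGroup );
 *   MPI_Comm senderComm;
 *   MPI_Comm_create( jointComm, senderGroup, &senderComm );  // MPI_COMM_NULL off-group
 *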
 * Some of the methods here are executed over the sender communicator, some over
 * the receiver communicator. They can switch places: what was the sender becomes
 * the receiver, and vice versa.
 *
 * The name "graph" is in the sense of a bipartite graph, in which we can separate
 * sender and receiver tasks.
 *
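 * Such a bipartite pattern can be pictured as a map from each sender task to its
 * receiver tasks (an illustrative simplification, not the exact storage used here):
 *
 *   std::map< int, std::vector< int > > recvGraph;  // sender rank -> receiver ranks
 *   recvGraph[0] = { 2, 3 };  // component task 0 sends to coupler tasks 2 and 3
 *   recvGraph[1] = { 3 };     // component task 1 sends to coupler task 3
 *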
 * The info stored in the ParCommGraph helps in migrating fields (MOAB tags) from
 * the component to the coupler and back.
 *
 * So initially the ParCommGraph assists in mesh migration (from component to
 * coupler), and then it is used to migrate tag data from component to coupler and
 * back from coupler to component.
 *
 * The same class is used after intersection (which is done on the coupler pes,
 * between 2 different component migrated meshes), and it alters the communication
 * pattern between the original component pes and the coupler pes.
 *
 * We added a new way to send tags between 2 models; the first application of the
 * new method is to send a tag from the atm dynamics model (spectral elements, with
 * np x np tags defined on each element, according to the GLOBAL_DOFS tag
 * associated with each element) towards the atm physics model, which is just a
 * point cloud of vertices distributed differently across the physics model pes;
 * matching is done using the GLOBAL_ID tag on vertices. Right now we assume that
 * the models are on different pes, that the joint communicator covers both, and
 * that the task ids are with respect to the joint communicator.
 *
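 * A minimal sketch of the GLOBAL_ID matching on the physics side (hypothetical
 * names; mb, gidTag and verts are assumed to be already set up):
 *
 *   std::vector< int > gids( verts.size() );
 *   mb->tag_get_data( gidTag, verts, &gids[0] );  // read GLOBAL_ID for each vertex
 *   std::map< int, moab::EntityHandle > idToVert;
 *   for( size_t i = 0; i < verts.size(); i++ )
 *       idToVert[gids[i]] = verts[i];  // global id -> local vertex handle
 *   // incoming tag values keyed by global id can now be placed via idToVert
 *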
 */
#include "moab_mpi.h"
#include "moab/Interface.hpp"
#include "moab/ParallelComm.hpp"
#include <map>  // assumption: the original angle-bracket include was lost in extraction