Ghost periodic point neighbors in GhostPointNeighbors #2725
Conversation
src/base/ghost_point_neighbors.C (Outdated)
@@ -67,30 +89,97 @@ void GhostPointNeighbors::operator()
CouplingMatrix * nullcm = nullptr;

for (const auto & elem : as_range(range_begin, range_end))
{
libMesh indentation is two spaces to the brace, plus another two spaces to the code in the next block.
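For reference, a minimal self-contained sketch of that convention (the function and variable names here are purely illustrative):

#include <vector>

// Illustrative only: the libMesh convention indents the opening brace two
// spaces past the controlling statement, and the enclosed code two more.
int sum(const std::vector<int> & values)
{
  int total = 0;
  for (const int v : values)
    {
      total += v;
    }
  return total;
}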
src/base/ghost_point_neighbors.C (Outdated)
{
binfo.boundary_ids(ppn, ppn_s, ppn_bcids);
for (const auto & pb_pair : *_periodic_bcs)
  if (std::find(ppn_bcids.begin(), ppn_bcids.end(), pb_pair.first) != ppn_bcids.end())
Instead of a loop over _periodic_bcs, then a linear std::find for each, should we loop over ppn_bcids and use std::map::find for each? I think both the constant and the asymptotics (which you wouldn't think are important for BCs but then you hear about the crazy stuff Akselos does...) ought to be better that way.
👍
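A minimal sketch of the suggested restructuring; the stub type, the function name, and the plain "short" ids below are simplified stand-ins, not the actual libMesh PeriodicBoundaries / boundary_id_type API:

#include <map>
#include <vector>

// Loop over the short per-node list of boundary ids and do a logarithmic
// map lookup per id, instead of looping over the map and doing a linear
// std::find per entry: better in both the constant and the asymptotics.
struct PeriodicBoundaryStub {};
using PeriodicBCMap = std::map<short, PeriodicBoundaryStub>;

bool touches_periodic_boundary(const std::vector<short> & ppn_bcids,
                               const PeriodicBCMap & periodic_bcs)
{
  for (const short bcid : ppn_bcids)
    if (periodic_bcs.find(bcid) != periodic_bcs.end())
      return true;
  return false;
}

Factoring the check into a helper like this also gives the caller an early return instead of needing a goto to escape the nested loops.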
src/base/ghost_point_neighbors.C (Outdated)
if (std::find(ppn_bcids.begin(), ppn_bcids.end(), pb_pair.first) != ppn_bcids.end())
{
on_periodic_boundary = true;
goto jump;
Of course, even switching loop ordering doesn't get rid of this goto, which I know is the right thing to do in C/C++ in this ONE USE CASE of breaking out of nested loops, but which gives me painful Apple BASIC flashbacks nonetheless.
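For illustration, a self-contained sketch of that pattern, breaking out of nested loops with a single forward goto (the names here are made up):

#include <vector>

// Illustrative only: the one widely-accepted use of goto in C/C++ is a
// forward jump to escape nested loops in a single step.
bool contains(const std::vector<std::vector<int>> & rows, int needle)
{
  bool found = false;
  for (const auto & row : rows)
    for (const int value : row)
      if (value == needle)
        {
          found = true;
          goto done; // leave both loops at once
        }
 done:
  return found;
}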
Good workaround. I'm afraid it doesn't fix everything I know is wrong here (systems_of_equations_ex9 still gives me fits with DistributedMesh on 21 processors), but every little bit helps. Can you get that failing case into CI in MOOSE, though, please? #2703 is still mocking me, and I swear that at some point I'm going to fix distributed PBCs completely, but I fear that before that point happens I'm going to come up with attempted fixes that don't quite work (see #2590), and the more corner cases we have test coverage for, the less likely I am to let a non-working "fix" get merged.
Force-pushed from e354e42 to f16f215.
Oh definitely.
Hopefully comments are addressed
Woops. Hold on a second... Copying over the
Ok now this code makes a MOOSE test case with a distributed displaced mesh and with adaptivity and periodic boundary conditions work.
Ready for re-review/merge
Darn, my redistribute problem is because I'm comparing to the mesh processor id and not the processor id arg. Smh
Ok after the bug-fix in 3f90154 this is truly ready for re-review 😆
src/base/ghost_point_neighbors.C
Outdated
std::unique_ptr<PointLocatorBase> point_locator; | ||
if (check_periodic_bcs) | ||
{ | ||
libmesh_assert(_mesh); |
Worried the mesh might have disappeared in the last 9 lines?
Uh yea! You can never be too sure. Haha I will fix that
This fixes my distributed periodic boundary issue. Currently our periodic constraint algorithm relies on identifying a primary element in charge of constraining (vertex) dofs. That designation is determined by looking at point neighbors and comparing things like ids. If not all the point neighbors are available, then a process may make an invalid determination that it owns a primary elem when in fact the primary elem is off-process. In order to validly determine primary elems, we must have all the periodic point neighbors available.
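A toy illustration of that failure mode, assuming a deliberately simplified "lowest id wins" rule for picking the primary element (the real designation compares more than ids, but the failure is the same):

#include <cassert>
#include <set>

// Hypothetical simplification: the "primary" element at a periodic vertex
// is whichever visible point neighbor has the lowest id.
int pick_primary(const std::set<int> & visible_point_neighbor_ids)
{
  return *visible_point_neighbor_ids.begin();
}

int main()
{
  const std::set<int> all_point_neighbors {3, 7, 12}; // true primary: elem 3
  const std::set<int> without_ghosts {7, 12};         // elem 3 not ghosted locally

  // With an incomplete set of point neighbors the local decision (7) disagrees
  // with the correct global one (3), so a process can wrongly conclude it owns
  // the primary elem. Ghosting all periodic point neighbors prevents this.
  assert(pick_primary(all_point_neighbors) != pick_primary(without_ghosts));
  return 0;
}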
Because we now clone all the ghosting functors coming from the other mesh, the old code path ended up adding both the new object's default ghosting functor and a clone of the other object's default ghosting functor. With the new code path we only add one default ghosting-like object.
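A hedged sketch of the duplication being avoided; the stub classes below are stand-ins, not the libMesh GhostingFunctor API:

#include <memory>
#include <vector>

// Stand-in classes: a container that copies ghosting functors from another
// object by cloning them. If it also unconditionally installs its own
// default functor, it ends up holding two "default" ghosting objects; the
// fix is to add only one default-like object on the cloning path.
struct GhostingFunctorStub
{
  GhostingFunctorStub() = default;
  GhostingFunctorStub(const GhostingFunctorStub &) = default;
  virtual ~GhostingFunctorStub() = default;

  virtual std::unique_ptr<GhostingFunctorStub> clone() const
  {
    return std::make_unique<GhostingFunctorStub>(*this);
  }
};

struct MeshStub
{
  std::vector<std::unique_ptr<GhostingFunctorStub>> ghosting_functors;

  void copy_ghosting_from(const MeshStub & other, bool add_own_default)
  {
    // Clone everything the other object had, including its default functor.
    for (const auto & gf : other.ghosting_functors)
      ghosting_functors.push_back(gf->clone());

    // Old path: add_own_default == true duplicates the default functor.
    // New path: skip it, so only one default ghosting-like object remains.
    if (add_own_default)
      ghosting_functors.push_back(std::make_unique<GhostingFunctorStub>());
  }
};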
Force-pushed from 3f90154 to 14490ef.
This was regressed last year in libMesh#2725