Update PETSc to 3.20.3, update libmesh #26462
Conversation
@roystgnr I'd like to get one round of testing in on this before piling more on, but would you be interested in a libMesh update going in alongside this?
Probably not. I've got a minor fix that would be useful in MOOSE, but not worth the hassle alone, and I've got a major branch that isn't likely to be ready to merge before the end of the week.
Alright - will ping again if this ends up lingering through Friday or Monday.
I would be interested in a libmesh update.
Can't build #26477 without a libmesh update. A couple of cool new things in libMesh are directly leveraged in that MOOSE PR. That being said, I would be willing to wait for @roystgnr's major branch to merge (is it libMesh/libmesh#3759 or something else?).
Those are pretty good reasons for the update. I don't want to rush libMesh/libmesh#3759, though that is what I was thinking about, and its current status suggests it won't be ready soon. So we should probably update the submodule after all, and do it without that. I'd like libMesh/libmesh#3758 to make it to master, and I do have one other PR that I'm hoping to get in today too, but there's actually more than enough to justify an update as-is, now that I look over the git logs. If we wait much longer getting libMesh/libmesh#3736 downstream, I'll probably forget entirely about the MOOSE side of it...
I've pushed newsletter docs to roystgnr@1bfae7d if you want to cherry-pick that to go alongside a libmesh submodule update.
Thanks for your input! I will update this PR tomorrow to include a libMesh update as well. The only thing holding this back on testing at the moment is an FBLASLAPACK download issue during PETSc alt container creation... working with Logan on it.
If there's more futzing to be done anyway, it wouldn't hurt to get libMesh up to libMesh/libmesh@3fcc7e7 - the additional changelog entries would be: [...]
IMHO neither of those is super high priority if you just want to focus on getting the PETSc issues fixed and getting this in.
@roystgnr The GCC min / PETSc-alt failures are all tied to the FBLASLAPACK download issue (which is preventing the build of a new container for those tests), but the new app failures I am seeing are due to the libMesh additions. Do you mind taking a look today? They all seem to be related to the nodeset changes or similar.
Will do! Looks like we've got a segfault in a Malamute test and 7 (looks like 2 independent) exodiffs in Bison tests?
The good news is that I understand the MALAMUTE failure now. The bad news is that the MALAMUTE failure is coming from MOOSE doing something unsupported and, up until now, getting away with it via dumb luck. That input deck creates a mesh of [...]. I'm trying to figure out a way to make this robust.
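For readers outside the project: what follows is a minimal, generic C++ sketch of this class of bug (a pointer held across an operation that may reallocate the underlying storage), not the actual MOOSE or libMesh code. The point is that such code can appear to pass CI indefinitely, because it only fails when a reallocation actually moves the data.

```cpp
#include <iostream>
#include <vector>

int main()
{
  std::vector<int> v{1, 2, 3};

  int * p = &v[0];           // pointer into the vector's current storage

  // Fill up to exactly the current capacity: no reallocation happens yet,
  // so p *happens* to remain valid -- luck, not contract.
  while (v.size() < v.capacity())
    v.push_back(0);
  std::cout << *p << '\n';   // appears to work

  v.push_back(0);            // exceeds capacity: the storage is reallocated
  // p now dangles; dereferencing it is undefined behavior. Depending on the
  // allocator it may even still print the old value -- until one day it doesn't.
  return 0;
}
```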
Where is MOOSE doing this? Doesn't MOOSE rely on libMesh's API for converting the mesh to second order?
Haven't you been doing a bunch of work in these mesh order upgrade methods? Is it possible the behavior is changing in this update? It would seem like quite a coincidence for this to have been working for all CI runs for months.
The [...]
It's definitely triggered by this update (something is performing another allocation in between the [...]). On the bright side, the refactoring should at least make it possible to fix every case of this all in one place. On the not-so-bright side, the fix is going to make some of my new capabilities asymptotically slower: we can't safely upgrade O(N_range) elements until after we first check all O(N_elem) elements to see if they have any [...]
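A hedged sketch of the asymptotic tradeoff being described, assuming the fix takes the shape of a global pre-check before the range upgrade. Every name here (`Elem`, `needs_special_handling`, `upgrade_order`, `safe_range_upgrade`) is a hypothetical stand-in, not libMesh API:

```cpp
#include <cstddef>
#include <vector>

struct Elem { bool special = false; };

// Hypothetical predicate: does this element force the slow path?
bool needs_special_handling(const Elem & e) { return e.special; }

// Hypothetical per-element order upgrade.
void upgrade_order(Elem &) { /* ... */ }

// Upgrade only the elements in [first, last), but stay correct in the
// presence of "special" elements anywhere else in the mesh.
void safe_range_upgrade(std::vector<Elem> & mesh, std::size_t first, std::size_t last)
{
  // O(N_elem) pre-check over the WHOLE mesh, even though the caller only
  // asked for a subrange: this is the asymptotic cost being lamented above.
  for (const Elem & e : mesh)
    if (needs_special_handling(e))
      return; // fall back to a slower, globally-aware path (elided)

  // Fast path: O(N_range) work, which is all the new capability wanted to pay.
  for (std::size_t i = first; i < last && i < mesh.size(); ++i)
    upgrade_order(mesh[i]);
}

int main()
{
  std::vector<Elem> mesh(1000);
  safe_range_upgrade(mesh, 10, 20); // takes the fast path: nothing is special
  return 0;
}
```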
I can't speak to the PETSc part of the change, but everything on the libMesh side should be ready to go.
Going to add in a minor apptainer change and one more libmesh bump.
I'd like to thank Nemesis for only striking me down with a new bug report from Alex, and not with the lightning bolt that my hubris above may have deserved. I'm looking into it now.
(force-pushed from 7163db6 to fbd5908)
@milljm Could you please give us another set of eyes for these changes? Thanks!
For the Conda stuff, all looks well!
The fix for that last remaining libMesh bug is in libMesh/libmesh#3772. After we get that to master (hopefully tomorrow), we can update the submodule here, and we'll have taken care of everything outstanding that I know about. I'm thinking my approval for merging as-is still stands, though, if anyone is chomping at the bit. The updates in this PR right now include fixes for potentially more serious and definitely more wide-ranging bugs, in long-standing use cases where, as far as I can tell, we've so far been saved from data corruption by kindly compilers and dumb luck, whereas the bug fixed in libMesh/libmesh#3772 only affects Tet14 refinement, a relatively new feature.
So we didn't get Roy's last fix in?
Refs failures from the combination of idaholab#26462 and idaholab#26227. To avoid negative viscosity interpolations, I shrank the growth factor from 2 to 1.5. Once I did this, I got a CSVDiff, which signaled to me that the original result was solved to too loose a tolerance. So I removed the steady state detection, which has potential issues of its own (see idaholab#26312), and extended the `end_time` until we hit a fairly tight absolute tolerance.
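For readers unfamiliar with these knobs: the change described above lives in the test's input deck. Here is a hedged sketch of what that kind of edit looks like, assuming the deck uses MOOSE's `Transient` executioner with an `IterationAdaptiveDT` time stepper; the parameter names are real MOOSE options, but the block layout and all values are illustrative, not the actual diff:

```
[Executioner]
  type = Transient
  end_time = 1e4                   # extended (illustrative value) until a tight
                                   # absolute tolerance is reached
  # steady_state_detection = true  # removed; it has issues of its own, see idaholab#26312
  [TimeStepper]
    type = IterationAdaptiveDT
    dt = 1                         # illustrative initial step
    growth_factor = 1.5            # was 2; shrunk so successive steps don't grow
                                   # fast enough to produce negative viscosity
                                   # interpolations
  []
[]
```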
Nope; sorry. Hope you're okay working from a manual submodule update until next month.
Looks like we missed the [...]
I need to remove the [...]
Closes #26461