Some new and some improved tensor operations #76

Merged
merged 21 commits into sandialabs:main on Feb 28, 2024

Conversation

btalamini
Collaborator

@btalamini btalamini commented Jan 17, 2024

This PR contributes the following:

  • Missing common tensor operators: matrix square root, polar decomposition (and their derivatives)
  • 3x3-specific operators that outperform the more general jax versions
  • An extensible way to add symmetric matrix functions that are scalar functions applied to the eigenvalues. Converts the existing operators (matrix log, log-sqrt) to this scheme; see the sketch after this list
  • Additional testing that found a bug in the matrix power operator
  • Factored out general (as in, not 3x3-specific) matrix routines that didn't belong here into their own module: matrix log, square root
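To make the third bullet concrete, here is a minimal sketch of what "a scalar function applied to the eigenvalues" means for a symmetric matrix. The helper name and signature are illustrative, not this PR's actual API:

```python
import jax.numpy as np

def _symmetric_matrix_function(A, scalar_fn):
    # Hypothetical illustration (not this repo's API): for symmetric A with
    # eigendecomposition A = V @ diag(lam) @ V.T, define
    # f(A) = V @ diag(scalar_fn(lam)) @ V.T.
    lam, V = np.linalg.eigh(A)
    return V @ np.diag(scalar_fn(lam)) @ V.T

# The existing operators fit this pattern, e.g. (names illustrative):
#   matrix log:         _symmetric_matrix_function(A, np.log)
#   matrix square root: _symmetric_matrix_function(A, np.sqrt)
```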

I changed the matrix log/exp derivative calculations from Mike's very clever implementation, which maintains high precision with Taylor and Padé approximations. The version I put in seems to do just as well in the test cases. It doesn't use Padé approximants; it uses the built-in implementations of log1p and expm1 (which may use the same trick as Mike's under the hood, for all I know). I kept Mike's code around in case I'm wrong and these versions are not as accurate.
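For context on where log1p earns its keep: the derivative of a symmetric matrix function involves divided differences like (log λᵢ − log λⱼ)/(λᵢ − λⱼ), and subtracting two nearly equal logs cancels catastrophically as the eigenvalues coincide. A sketch of the stable form, assuming this is the rewrite being described (expm1 plays the analogous role for the exponential):

```python
import jax.numpy as np

def _log_divided_difference(lam_i, lam_j):
    # Stable evaluation of (log(lam_i) - log(lam_j)) / (lam_i - lam_j).
    # log(lam_i) - log(lam_j) = log1p((lam_i - lam_j) / lam_j), which keeps
    # full relative precision as lam_i -> lam_j, where the limit is 1/lam_j.
    d = lam_i - lam_j
    safe_d = np.where(d == 0.0, 1.0, d)  # avoid 0/0 in the unselected branch
    return np.where(d == 0.0, 1.0 / lam_j, np.log1p(safe_d / lam_j) / safe_d)
```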

Finally: there are still two calls to the eigenvalue solve when computing the derivative of the matrix functions. I think it's possible to reduce this to one, but it's trickier than I thought. I'll wait until we profile and confirm this is a bottleneck before working on that.

btalamini and others added 18 commits November 19, 2023 07:05
Before: the primal output was computed in-line in the custom jvp
function. The advantage is that this avoids a second call to the
eigendecomposition. The downside is that this in-line computation
doesn't itself have a custom jvp, so its derivative can be wrong.

After: I re-compute the primal value through the base function
(e.g., log_symm), which has the custom jvp defined on it. The
eigendecomposition is repeated. We can refactor to eliminate this
later if profiling reveals it to be a performance bottleneck.
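A minimal sketch of the "after" structure this commit message describes, assuming a log_symm of the eigendecomposition form (the tangent rule shown is the standard Daleckii-Krein formula, not necessarily the repo's exact code):

```python
import jax
import jax.numpy as np

@jax.custom_jvp
def log_symm(A):
    # Matrix log of a symmetric positive-definite matrix via eigenvalues.
    lam, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(lam)) @ V.T

@log_symm.defjvp
def _log_symm_jvp(primals, tangents):
    A, = primals
    dA, = tangents
    # "After" pattern: recompute the primal through the decorated base
    # function, so differentiating the primal path again still hits this
    # custom jvp. This repeats the eigendecomposition (the second eigenvalue
    # solve mentioned in the PR description).
    primal_out = log_symm(A)
    lam, V = np.linalg.eigh(A)
    # Daleckii-Krein rule: tangent = V (L ∘ (Vᵀ dA V)) Vᵀ, where L holds the
    # divided differences of log, computed stably via log1p as noted above.
    d = lam[:, None] - lam[None, :]
    safe_d = np.where(d == 0.0, 1.0, d)
    L = np.where(d == 0.0, 1.0 / lam[None, :],
                 np.log1p(safe_d / lam[None, :]) / safe_d)
    tangent_out = V @ (L * (V.T @ dA @ V)) @ V.T
    return primal_out, tangent_out
```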
Put a leading underscore on functions meant for internal use.
Most Python tools will ignore these when reporting contents of
a module.
Replace all calls except in the new viscoelastic model. Changes are
about to merge there and I want to handle the conflicts separately.
@btalamini btalamini requested a review from tupek2 January 18, 2024 13:31
@btalamini btalamini merged commit e920781 into sandialabs:main Feb 28, 2024
2 checks passed