Add the capability to do adjoint transforms #633
base: master
Conversation
Just to be clear: at this point, there is no interface support for the new functionality. I just want to show what kind of impact the new feature has on the existing implementation.
I've added the C and Python interfaces, as well as basic Python unit tests.
Test failures seem to be "near misses" in adjoint type 3 transforms. No idea why that direction should be less accurate, and why it only happens in some of the tests. Perhaps 1e-6 is just a bit too close to the machine epsilon for single precision.
I think this is ready for "technical" review. If there is agreement that the change is desirable, I can try to …
Added a few explanatory comments
python/finufft/examples/guru1d1f.py (outdated diff)
@@ -20,7 +20,7 @@
strt = time.time()

#plan
plan = fp.Plan(1,(N,),dtype='single')
This change is unrelated to the PR, but without it, CI simply fails. I don't know why this hasn't caused issues so far.
It seems this is because we removed the conversion for real dtypes in https://github.com/flatironinstitute/finufft/pull/606/files
python/finufft/examples/guru2d1f.py (outdated diff)
@@ -34,7 +34,7 @@
# instantiate the plan (note n_trans must be set here), also setting tolerance:
t0 = time.time()
plan = finufft.Plan(nufft_type, (N1, N2), eps=1e-4, n_trans=K, dtype='float32')
This change is unrelated to the PR, but without it, CI simply fails. I don't know why this hasn't caused issues so far.
Thanks for catching this. It seems all the *f.py examples (in python/finufft/examples) should change according to PR #606? Otherwise the check is_single_dtype(dtype) in python/finufft/finufft/_interfaces.py (line 106 at d2e0ff7: is_single = is_single_dtype(dtype)) will fail.
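For context, here is a minimal sketch of the kind of dtype check being referred to (a hypothetical stand-in, not the actual code in _interfaces.py): if only complex dtypes are accepted after PR #606, a real dtype string such as 'single' or 'float32' would make the check raise.

import numpy as np

def is_single_dtype(dtype):
    # Hypothetical stand-in for the check in _interfaces.py: accept only
    # complex dtypes, and report whether the plan is single precision.
    dtype = np.dtype(dtype)
    if dtype == np.complex64:
        return True
    if dtype == np.complex128:
        return False
    raise TypeError(f"plan dtype must be complex64 or complex128, got {dtype}")

# Under this assumption, the examples would need a complex dtype, e.g.
#   plan = finufft.Plan(nufft_type, (N1, N2), eps=1e-4, n_trans=K, dtype='complex64')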
@@ -86,6 +120,19 @@ def test_finufft3_plan(dtype, dim, n_source_pts, n_target_pts, output_arg):
utils.verify_type3(source_pts, source_coefs, target_pts, target_coefs, 1e-6)

# test adjoint type 3
plan = Plan(3, dim, dtype=dtype, isign=-1, eps=1e-5)
I'm increasing eps from 1e-6 to 1e-5 here, because I get occasional failures with single precision otherwise. Given that 1e-6 is uncomfortably close to machine epsilon, I'm not too worried about this change.
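For reference, single-precision machine epsilon can be checked directly with NumPy (nothing finufft-specific here); it is roughly 1.2e-7, so a requested tolerance of 1e-6 leaves less than a decimal digit of headroom:

import numpy as np

# eps for float32 is about 1.1920929e-07, i.e. only ~8x smaller than 1e-6.
print(np.finfo(np.float32).eps)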
@@ -154,7 +154,7 @@ def verify_type1(pts, coefs, shape, sig_est, tol):
type1_rel_err = np.linalg.norm(fk_target - fk_est) / np.linalg.norm(fk_target)

assert type1_rel_err < 25 * tol
Switching from assert to np.testing.assert_allclose here, because the latter will provide more information in case of failure, which speeds up debugging a lot.
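One way that change could look, assuming the norm-based criterion is kept (a sketch, not necessarily the exact diff in the PR): compare the scalar relative error against zero with an absolute tolerance, so a failure reports the offending value and the allowed tolerance.

import numpy as np

def check_type1_error(fk_est, fk_target, tol):
    # Same criterion as before, but a failure now prints the actual
    # relative error and the tolerance instead of a bare AssertionError.
    rel_err = np.linalg.norm(fk_target - fk_est) / np.linalg.norm(fk_target)
    np.testing.assert_allclose(rel_err, 0.0, atol=25 * tol)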
}
}
}
#endif
#else
p->fftPlan->execute(); // if thisBatchSize<batchSize it wastes some flops
This needs discussion: how do we want to deal with this situation? The trick used here is nice and simple, but it will make the adjoint call slower than the forward one.
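For readers not familiar with the trick, one common way to reuse an existing forward FFT plan for the opposite (adjoint) sign is to conjugate the data before and after execution; this is my reading of the kind of approach under discussion, not the PR's actual code. The two extra passes over the data are what make the adjoint call slower than the forward one.

import numpy as np

# Illustrative only: the adjoint of a forward (sign -1) DFT is the
# opposite-sign transform, which can be computed from the forward transform
# by conjugating before and after. Reusing one plan this way is simple, but
# the two extra conjugation passes cost additional memory traffic.
rng = np.random.default_rng(0)
x = rng.standard_normal(64) + 1j * rng.standard_normal(64)
adjoint = np.conj(np.fft.fft(np.conj(x)))        # opposite-sign DFT of x
assert np.allclose(adjoint, np.fft.ifft(x) * x.size)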
I just fixed py examples in 19a24c6
This is a first outline of how I propose to add adjoint transforms; it is mainly meant as a basis for discussion and measurements. The computation of the actual adjoint is still completely untested, but the standard functionality should still be OK, as far as my tests show.
@ahbarnett, @DiamonDinoia please let me know your thoughts on this one!
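For anyone reviewing who wants a concrete picture of what "adjoint" means here, a small self-contained NumPy sketch (independent of the finufft API, which this PR is still shaping): the adjoint of the dense type-1 matrix A, with entries exp(+i k x_j), is the opposite-sign type-2 map, and it satisfies the usual inner-product identity.

import numpy as np

# Dense 1D analogue of a type-1 NUFFT with isign=+1:
# A[k, j] = exp(+1j * k * x_j) maps strengths c_j at nonuniform points x_j to modes f_k.
rng = np.random.default_rng(0)
M, N = 50, 17
x = 2 * np.pi * rng.random(M)              # nonuniform points in [0, 2*pi)
k = np.arange(-(N // 2), (N + 1) // 2)     # Fourier mode indices
A = np.exp(1j * np.outer(k, x))            # shape (N, M)

c = rng.standard_normal(M) + 1j * rng.standard_normal(M)
f = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Adjoint identity <A c, f> = <c, A^H f>; A^H is the opposite-sign type-2 map
# (evaluate the mode expansion at the nonuniform points with isign=-1).
lhs = np.vdot(A @ c, f)
rhs = np.vdot(c, A.conj().T @ f)
assert np.allclose(lhs, rhs)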