
Fix IBL streaming tests #3718

Open · wants to merge 10 commits into main
Conversation

alejoe91 (Member)

Somehow the same PID resulted in a different number of units.

@oliche anything changed on the ONE-api side? Is it possible that a PID is now pointing to a different insertion?

alejoe91 added the extractors (Related to extractors module) label on Feb 24, 2025
@@ -198,9 +198,9 @@ def test_ibl_sorting_extractor(self):
         except:
             pytest.skip("Skipping test due to server being down.")
         sorting = read_ibl_sorting(pid=PID, one=one)
-        assert len(sorting.unit_ids) == 733
+        assert len(sorting.unit_ids) == 1091
Collaborator

It seems really risky to have a ground-truth dataset and then just test that we load that dataset, especially if it can change. When I was looking at the failed tests, I also saw a failure from subtracting a NoneType from a datetime.datetime. Is that error related to this too? It seems like broader changes on the IBL side, no?

Member Author

Yeah, maybe we can pin it to a previous version? I agree that we probably don't need to assert the exact number of units. Maybe just assert that the good units are fewer than the total?

Collaborator

That seems safer for a test from my perspective.
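The looser check discussed above could be sketched like this. `Sorting` here is a hypothetical stand-in for the object returned by `read_ibl_sorting`, and the `good_unit_ids` attribute is assumed for illustration, so this shows the shape of the assertion rather than the actual spikeinterface API:

```python
from dataclasses import dataclass, field

@dataclass
class Sorting:
    """Hypothetical stand-in for the object returned by read_ibl_sorting."""
    unit_ids: list
    good_unit_ids: list = field(default_factory=list)

def check_sorting_invariants(sorting):
    # Instead of pinning an exact unit count (733 vs 1091), assert
    # structural invariants that hold regardless of re-curation:
    # the result is non-empty and good units are a subset of all units.
    assert len(sorting.unit_ids) > 0, "expected at least one unit"
    assert set(sorting.good_unit_ids) <= set(sorting.unit_ids)
    return True
```

A test written this way keeps failing for genuinely broken queries (empty or inconsistent results) without breaking every time the dataset is re-curated.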

Contributor

Yes, I can show you how to use the revision argument that pins the query to a version of the dataset.
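The `revision` mechanism mentioned here pins a query to a dataset version. Assuming ONE revisions are ISO-date strings and a query resolves to the latest revision at or before the one requested (my reading of the revision semantics, not guaranteed), the resolution rule can be sketched in isolation; `resolve_revision` is a hypothetical helper written for illustration, not part of the ONE API:

```python
def resolve_revision(available, requested):
    """Return the latest available revision not later than `requested`.

    Assumes revisions are ISO-date strings ('YYYY-MM-DD'), which sort
    chronologically under plain string comparison.
    """
    candidates = [r for r in sorted(available) if r <= requested]
    if not candidates:
        raise ValueError(f"no revision at or before {requested!r}")
    return candidates[-1]
```

Pinning a test to a fixed revision this way means later re-curations of the dataset (which add newer revisions) no longer change what the query returns.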

Member Author

Thanks that would be great!

@oliche
Contributor

oliche commented Feb 25, 2025

Hi Alessio,
This is what I would propose to pin the IBL dataset, and also to provide users with a way to do it themselves if they wish: alejoe91#25

@alejoe91
Member Author

@zm711 I think that after adding revision we should keep the current behavior, to test that the IBL retriever is actually getting the right data!

Ready to merge on my end.

@zm711
Collaborator

zm711 commented Feb 25, 2025

> @zm711 I think that after adding revision we should keep the current behavior, to test that the IBL retriever is actually getting the right data!
>
> Ready to merge on my end.

Sounds good to me!

@zm711
Collaborator

zm711 commented Feb 25, 2025

And this should fix the datetime issue too?

@alejoe91
Member Author

> And this should fix the datetime issue too?

No :( @oliche can you help us debug this error as well?

self.ssl = SpikeSortingLoader(one=one, pid=pid)
<string>:18: in __init__
    ???
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/brainbox/io/one.py:815: in __post_init__
    self.eid, self.pname = self.one.pid2eid(self.pid)
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/one/util.py:162: in wrapper
    self.refresh_cache(mode=mode)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = One (online, https://openalyx.internationalbrainlab.org), mode = 'auto'

    def refresh_cache(self, mode='auto'):
        """Check and reload cache tables.
    
        Parameters
        ----------
        mode : {'local', 'refresh', 'auto', 'remote'}
            Options are 'local' (don't reload); 'refresh' (reload); 'auto' (reload if expired);
            'remote' (don't reload).
    
        Returns
        -------
        datetime.datetime
            Loaded timestamp.
        """
        # NB: Currently modified table will be lost if called with 'refresh';
        # May be instances where modified cache is saved then immediately replaced with a new
        # remote cache. Also it's too slow :(
        # self.save_cache()  # Save cache if modified
        if mode in {'local', 'remote'}:
            pass
        elif mode == 'auto':
>           if datetime.now() - self._cache['_meta']['loaded_time'] >= self.cache_expiry:
E           TypeError: unsupported operand type(s) for -: 'datetime.datetime' and 'NoneType'
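The failing comparison subtracts `self._cache['_meta']['loaded_time']`, which is evidently `None` here, from `datetime.now()`. A defensive version of that expiry check, extracted as a standalone sketch (treating a missing timestamp as an expired cache is my assumption about the intended semantics, not a confirmed fix to the ONE library):

```python
from datetime import datetime, timedelta

def cache_expired(loaded_time, cache_expiry):
    """Return True if the cache should be reloaded.

    A loaded_time of None (cache never loaded, or metadata missing)
    is treated as expired -- the case that raised the TypeError above.
    """
    if loaded_time is None:
        return True
    return datetime.now() - loaded_time >= cache_expiry
```

With a guard like this, the `mode == 'auto'` branch would trigger a refresh instead of raising a TypeError when the cache metadata is incomplete.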
