Improve Mini-SBIBM
Mini-SBIBM should offer an easy and fast way to test whether certain modifications impact the performance of the algorithms. For this, a small set of tasks is implemented on which different methods or configurations can be benchmarked against each other. It can be launched directly via pytest and is thus easily accessible for any developer.
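To make the idea concrete, here is a minimal sketch of what such a pytest-launched check could look like. All names (`toy_gaussian_task`, `dummy_method`) are illustrative placeholders, not the actual sbi API: a toy task with a known ground truth, a stand-in inference method, and a cheap accuracy assertion.

```python
# Hypothetical sketch of a mini-SBIBM-style pytest check; names are
# illustrative, not the real sbi interface.
import numpy as np


def toy_gaussian_task(seed=0):
    """Trivial task: infer the mean of a Gaussian from one noisy observation."""
    rng = np.random.default_rng(seed)
    theta_true = rng.normal(0.0, 1.0)
    x_obs = theta_true + rng.normal(0.0, 0.1)
    return theta_true, x_obs


def dummy_method(x_obs, num_samples=1000, seed=0):
    """Stand-in for an inference method: returns posterior samples near x_obs."""
    rng = np.random.default_rng(seed)
    return x_obs + 0.1 * rng.normal(size=num_samples)


def test_dummy_method_on_toy_task():
    theta_true, x_obs = toy_gaussian_task()
    samples = dummy_method(x_obs)
    # Cheap accuracy proxy: posterior mean should be close to the true parameter.
    assert abs(samples.mean() - theta_true) < 0.5
```

Because the check is an ordinary pytest test, a developer gets the benchmark "for free" with `pytest`, which is exactly the accessibility the issue describes.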
It currently has a few modes to test different method classes, but there is considerable room for improvement here. The areas of improvement are:
- Think about what level of customizability would be helpful. Mini-SBIBM is not meant to be a fully fledged benchmark, but a quick way to check whether something in development works reasonably well.
- Think about whether we want to allow this set of customizations through the pytest API, without requiring changes to the code. This is currently rather static and can only be changed in the code: one can only select different "modes" that run a fixed set of configurations, e.g., NPE, SNPE, or NRE (depending on what is selected).
- Properly organize configurations in some way.
- Add simple tasks for more "specialized" applications, e.g., IID data ...
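For the pytest-API point above, one option is a `conftest.py` that exposes method selection as a CLI option and parametrizes tests from it. `pytest_addoption` and `pytest_generate_tests` are real pytest hooks; the option name `--bm-methods` and the parsing helper are assumptions for illustration.

```python
# Hypothetical conftest.py sketch: select benchmarked methods from the
# pytest command line instead of hard-coded "modes".


def parse_methods(spec):
    """Turn a comma-separated CLI value like 'npe,snpe,nre' into a clean list."""
    return [m.strip().lower() for m in spec.split(",") if m.strip()]


def pytest_addoption(parser):
    # Real pytest hook; the option name --bm-methods is an assumption.
    parser.addoption(
        "--bm-methods",
        default="npe",
        help="Comma-separated methods to benchmark, e.g. 'npe,snpe,nre'.",
    )


def pytest_generate_tests(metafunc):
    # Parametrize any test that requests a 'method' fixture argument.
    if "method" in metafunc.fixturenames:
        methods = parse_methods(metafunc.config.getoption("--bm-methods"))
        metafunc.parametrize("method", methods)
```

A developer could then run, say, `pytest --bm-methods npe,nre` without touching the benchmark code, which addresses the "rather static" concern.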
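For organizing configurations, one possible sketch (not the existing layout) is a small registry of named, typed configs instead of branching on mode strings. The config fields and names below are assumptions chosen for illustration.

```python
# Sketch of a configuration registry; field names and entries are hypothetical.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class BenchmarkConfig:
    method: str                      # e.g. "npe", "snpe", "nre"
    num_simulations: int = 1000
    num_rounds: int = 1              # >1 for sequential variants like SNPE
    extra: dict = field(default_factory=dict)


CONFIGS = {
    "npe-fast": BenchmarkConfig(method="npe", num_simulations=500),
    "snpe-two-round": BenchmarkConfig(method="snpe", num_rounds=2),
    "nre-default": BenchmarkConfig(method="nre"),
}


def get_config(name):
    """Look up a named configuration, failing loudly on typos."""
    if name not in CONFIGS:
        raise KeyError(f"Unknown config {name!r}; known: {sorted(CONFIGS)}")
    return CONFIGS[name]
```

Named configs like these compose naturally with the pytest-option idea: a CLI flag could select registry keys rather than hard-coded modes.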
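For the IID-data point, a useful property of a simple task is an analytic reference posterior to compare against. The sketch below (hypothetical, not part of mini-SBIBM) uses the conjugate Gaussian case: IID observations with known noise and a Gaussian prior give a closed-form posterior.

```python
# Hypothetical toy IID task: simulator plus conjugate reference posterior.
import numpy as np


def iid_gaussian_simulator(theta, num_trials=10, rng=None):
    """Simulate num_trials IID observations x_i ~ N(theta, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    return theta + rng.normal(size=num_trials)


def analytic_posterior(x, prior_std=1.0, noise_std=1.0):
    """Conjugate posterior over theta given IID x, with a N(0, prior_std^2) prior.

    Returns the posterior mean and standard deviation.
    """
    n = len(x)
    var_n = 1.0 / (1.0 / prior_std**2 + n / noise_std**2)
    mu_n = var_n * x.sum() / noise_std**2
    return mu_n, np.sqrt(var_n)
```

Having the exact posterior available makes accuracy metrics for IID-capable methods cheap to compute, which fits the "easy and fast" goal of Mini-SBIBM.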