
Merge `rxinference` and `inference` functions into `infer` #190

Merged · 15 commits into `main` · Dec 14, 2023

Conversation

**@albertpod** (Member) commented on Nov 21, 2023:

This PR addresses issue #181 and marks a significant update to RxInfer: we merge the functionality of `inference` and `rxinference` into a single, "umbrella" function, `infer`. This change simplifies the user interface by providing a unified method for conducting both static and streaming probabilistic inference.

Whether `infer` runs on a static or a streaming dataset is now determined by the presence of the `autoupdates` keyword. This keyword controls how updated posteriors are transformed into priors for subsequent observations, which makes it applicable, and essential, for streaming (real-world) data scenarios.
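To illustrate the dispatch on `autoupdates`, here is a minimal sketch of the two call styles; `my_model`, `dataset`, `observations`, and `my_autoupdates` are hypothetical placeholders, not code from this PR:

```julia
using RxInfer

# Batch/static inference: no `autoupdates`, the full dataset is passed at once
# (`my_model` and `dataset` are hypothetical placeholders).
result = infer(
    model = my_model(),
    data  = (y = dataset,)
)

# Streaming inference: the presence of `autoupdates` switches `infer` into the
# reactive mode, where updated posteriors are fed back as priors for the next
# observation (`observations` is a Rocket.jl stream of `NamedTuple`s).
engine = infer(
    model       = my_model(),
    datastream  = observations,
    autoupdates = my_autoupdates  # created with the `@autoupdates` macro
)
```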

Key Changes:

  1. Unified Inference Functionality: The infer function consolidates all features previously divided between inference and rxinference. The original functions have been internally renamed to __inference and __rxinference and are no longer exported. This unified approach enables infer to support both batch/static and streaming/online applications.

  2. Updated Documentation: Comprehensive updates have been made to the documentation to reflect the merging of inference and rxinference. This includes a detailed and clear docstring for the infer function, ensuring ease of use and understanding.

  3. Updated Examples: All examples across the package have been updated to utilize the new infer function, demonstrating its application in various contexts.

  4. Deprecation Notices: Appropriate deprecation notices have been added for the `inference` and `rxinference` functions, guiding users towards the new unified approach. (I haven't managed to make `@deprecate inference(; kwargs...) infer(kwargs...)` work; it seems the `@deprecate` macro doesn't handle keyword arguments, so a manual shim is sketched below.)
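For reference, a minimal sketch of a manual shim that forwards keyword arguments (one possible workaround, not necessarily the exact code merged in this PR):

```julia
# `@deprecate` does not forward keyword arguments, so emit the
# deprecation warning manually and forward `kwargs` by hand.
function inference(; kwargs...)
    Base.depwarn("`inference` is deprecated, use `infer` instead.", :inference)
    return infer(; kwargs...)
end

function rxinference(; kwargs...)
    Base.depwarn("`rxinference` is deprecated, use `infer` instead.", :rxinference)
    return infer(; kwargs...)
end
```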

**@bvdmitri** (Member) commented:

I guess we can make it mandatory and allow `missing` as an input, in case someone comes up with a scenario where it is not mandatory.


@albertpod changed the title from "[WIP] Merge rxinference and inference functions" to "Merge rxinference and inference functions into infer" on Nov 23, 2023
@albertpod marked this pull request as ready for review on November 23, 2023 14:50
@albertpod requested a review from bvdmitri on November 23, 2023 14:59
**codecov bot** commented on Nov 23, 2023:

Codecov Report

Attention: 5 lines in your changes are missing coverage. Please review.

Comparison is base (792c6e3) 80.31% compared to head (a805ce9) 80.31%.

| Files | Patch % | Lines |
|---|---|---|
| src/inference.jl | 88.88% | 5 Missing ⚠️ |
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##             main     #190      +/-   ##
==========================================
- Coverage   80.31%   80.31%   -0.01%     
==========================================
  Files          11       11              
  Lines        1285     1290       +5     
==========================================
+ Hits         1032     1036       +4     
- Misses        253      254       +1     
```


**@albertpod** (Member, Author) commented:

I recommend not halting this PR for long, @bvdmitri.

**@bvdmitri** (Member) left a review:

Some documentation has been lost in the process. Also, the function documentation is quite big. Do we maybe want to make a dedicated documentation page instead (I'm not sure, just asking your opinion)?

Review thread on `src/inference.jl` (outdated):
- `events = nothing`: inference cycle events, optional, see below for more info (exclusive to streaming inference)
- `uselock = false`: specifies whether to use a lock for the inference; if set to `true`, uses `Base.Threads.SpinLock`. Accepts a custom `AbstractLock`. (exclusive to streaming inference)
- `autostart = true`: specifies whether to call `RxInfer.start` on the created engine automatically (exclusive to streaming inference)
- `warn = true`: enables/disables warnings
**@bvdmitri** (Member):

The `catch_exception` argument explanation is missing (did we have it?)

**@albertpod** (Member, Author):

I believe a short explanation was given on line 1493, but the extended description was indeed missing. Added now.
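As a side note, a minimal sketch of how the streaming-only keywords from the excerpt above fit together (model and stream names are hypothetical placeholders):

```julia
using RxInfer

# Create the engine without starting it, guarded by a spin lock.
engine = infer(
    model       = my_model(),      # hypothetical placeholder
    datastream  = observations,    # hypothetical Rocket.jl stream
    autoupdates = my_autoupdates,
    autostart   = false,           # do not start consuming data yet
    uselock     = true             # use `Base.Threads.SpinLock` internally
)

RxInfer.start(engine)  # begin processing the datastream
# ...
RxInfer.stop(engine)   # assumption: `stop` as the counterpart of `start`
```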

Review thread on `src/inference.jl` (outdated):
- `pipeline`: changes the default pipeline for each factor node in the graph
- `global_reactive_scheduler`: changes the scheduler of reactive streams, see Rocket.jl for more info, defaults to no scheduler

- ### `returnvars`
**@bvdmitri** (Member):

Something has been lost here; the `returnvars` argument behaves differently for the batch and streaming versions. I think a part of the documentation has been lost in the process.

**@albertpod** (Member, Author):

Fixed.
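For reference, a hedged sketch of the difference the review points at (the argument shapes are my reading of the pre-merge API; placeholders are hypothetical):

```julia
using RxInfer

# Batch setting: `returnvars` maps variables to posterior-collection strategies.
result = infer(
    model      = my_model(),
    data       = (y = dataset,),
    returnvars = (x = KeepLast(), τ = KeepEach())
)

# Streaming setting: `returnvars` is a tuple of variable names for which
# the engine exposes observables of posterior updates.
engine = infer(
    model       = my_model(),
    datastream  = observations,
    autoupdates = my_autoupdates,
    returnvars  = (:x, :τ)
)
```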


**Note**: The `predictvars` argument is exclusive to the batch setting.

- ### `historyvars`
**@bvdmitri** (Member):

`historyvars` is exclusive to the streaming setting.

**@albertpod** (Member, Author):

I think that is stated in both the short and the extended description.
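A minimal sketch of `historyvars` in the streaming setting (names are hypothetical; `keephistory` bounds the stored history as described in the docstring):

```julia
using RxInfer

# Streaming-only: save posteriors for `x` after each processed observation.
engine = infer(
    model       = my_model(),        # hypothetical placeholder
    datastream  = observations,
    autoupdates = my_autoupdates,
    historyvars = (x = KeepLast(),), # what to store per observation
    keephistory = 100                # how many entries to retain
)

# The stored posteriors should then be accessible via the engine history:
engine.history[:x]
```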

Review thread on `src/inference.jl` (outdated):

Specifies the number of variational (or loopy belief propagation) iterations. By default set to `nothing`, which is equivalent to doing 1 iteration.

- ### `free_energy`
**@bvdmitri** (Member):

The `free_energy` argument behaves differently for the streaming and batch versions. In the streaming version it is indeed an observable, but in the batch version it is just a vector.

**@bvdmitri** (Member):

(again, probably some documentation has been lost in the process)

**@albertpod** (Member, Author):

Fixed.

Review thread on `src/inference.jl` (outdated):
This setting specifies whether the `infer` function should create an observable of Bethe Free Energy (BFE) values. The BFE observable emits a new computed value for each VMP iteration.
Note, however, that it may not be possible to compute BFE values for every model. If `free_energy = true` and `keephistory > 0`, the engine exposes extra fields to access the history of the Bethe Free Energy updates:

- `engine.free_energy_history`: returns the free energy history averaged over the VMP iterations
**@bvdmitri** (Member):

This is exclusive to the streaming version; the batch version does not have those fields.

**@albertpod** (Member, Author):

Addressed.
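To make the difference concrete, a hedged sketch of both behaviors (the `engine.free_energy` observable name is my assumption; `engine.free_energy_history` is taken from the docstring excerpt above; placeholders are hypothetical):

```julia
using RxInfer

# Batch: `free_energy = true` yields a vector of BFE values, one per iteration.
result = infer(
    model       = my_model(),    # hypothetical placeholder
    data        = (y = dataset,),
    iterations  = 10,
    free_energy = true
)
result.free_energy               # a plain vector of 10 values

# Streaming: the engine exposes an observable of BFE values instead; with
# `keephistory > 0` the history fields from the excerpt become available.
engine = infer(
    model       = my_model(),
    datastream  = observations,
    autoupdates = my_autoupdates,
    free_energy = true,
    keephistory = 100
)
subscribe!(engine.free_energy, (bfe) -> println("BFE: ", bfe))
engine.free_energy_history       # averaged over the VMP iterations
```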

@albertpod requested a review from bvdmitri on December 11, 2023 13:35
**@bvdmitri** (Member) approved these changes:

Well done @albertpod! You can merge after the CI passes.

@albertpod merged commit afa07ef into `main` on Dec 14, 2023 (6 of 7 checks passed)
@albertpod deleted the dev-infer branch on December 14, 2023
@albertpod mentioned this pull request on Jan 4, 2024