Support React Native #118

Open · hans00 wants to merge 171 commits into main
Conversation

hans00 (Contributor) commented May 19, 2023

Make it support React Native.

I also made an example for it:
https://github.com/hans00/react-native-transformers-example

On Android, some library patches are needed.

TODO:

  • Check that the models work correctly
  • Research more efficient image processing
  • Merge v3
  • Check that everything works correctly on v3

xenova (Collaborator) commented May 19, 2023

Woah nice! There are a lot of people who are interested in doing stuff like this! Looking forward to reviewing this when you're ready! cc @pcuenca

jhen0409 commented May 20, 2023

I'm investigating a performance issue with Whisper tiny.en, and the performance in the example is not as expected.

I quickly added a log to onnxruntime-react-native/lib/backend.ts; this is the result:

# Encode result
 LOG  ONNX session run finished 767 ms

# Decode result
 LOG  ONNX session run finished 4518 ms
# ... 4 runs ...
 LOG  ONNX session run finished 4856 ms
 LOG  Time: 40016
 LOG  Result: {"text": " (buzzing)"}
The timing code I added:
const t0 = performance.now();
// this.#inferenceSession === NativeModules.Onnxruntime
const results: Binding.ReturnType = await this.#inferenceSession.run(this.#key, input, outputNames, options);
// Decode the native return payload into JS tensors.
const output = this.decodeReturnType(results);
console.log('ONNX session run finished', performance.now() - t0);

Also, looking at the logs from the native part (log added to OnnxruntimeModule.java), the native time is significantly less than the time measured on the JS side:

# Encode result
09:47:59.203 ONNXRUNTIME_RN run() is finished, time: 273 ms

# Decode result
09:48:00.280 ONNXRUNTIME_RN run() is finished, time: 339 ms
# ... 4 runs ...
09:48:23.807 ONNXRUNTIME_RN run() is finished, time: 541 ms

I think this problem may come from blocking caused by the native module passing very large data across the bridge. Requesting or helping onnxruntime to migrate to a JSI module may solve it.
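
For context, a rough self-contained sketch of the kind of overhead suspected here (purely illustrative; this is not onnxruntime-react-native code, and the payload size is an arbitrary assumption). The classic bridge exchanges JSON-like messages, so merely serializing a tensor-sized payload is costly:

// Illustrative only: estimate classic-bridge-style serialization cost
// for a tensor-sized payload (size chosen arbitrarily for the sketch).
const floats = new Float32Array(4 * 1024 * 1024); // ~16 MB of float data
const plain = Array.from(floats); // bridge arguments are plain JS values

const t0 = performance.now();
const message = JSON.stringify(plain); // bridge traffic is JSON-like
console.log('serialize:', performance.now() - t0, 'ms,', message.length, 'chars');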

EDIT: It seems the logger has some bugs that led me to think the problem was in the native bridge. I added an awaited timeout around the decodeReturnType call and found the issue comes from that function.
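
A minimal sketch of that kind of check (assuming the same backend.ts context as the snippet above; the zero-delay await is an illustrative way to let pending logs settle before timing the synchronous decode):

// Sketch only: isolate the synchronous decode step.
const results: Binding.ReturnType = await this.#inferenceSession.run(this.#key, input, outputNames, options);

// Yield to the event loop so earlier logs/timers flush first
// (illustrative assumption, not part of the library code).
await new Promise((resolve) => setTimeout(resolve, 0));

const t1 = performance.now();
const output = this.decodeReturnType(results); // synchronous, potentially heavy
console.log('decodeReturnType took', performance.now() - t1, 'ms');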

simonwh commented Nov 11, 2024

+1, would be really great to get this merged - any news, @xenova?

VikingLichens commented

Thanks @hans00, I made it work with Expo in my small project. Here's the PR that made it work on my side:

hans00 (Contributor, Author) commented Jan 20, 2025

@xenova this is ready for review.
