Running TFLite Inference Offline #524
Comments
👋 Hello @volome, thank you for raising an issue about Ultralytics HUB 🚀! Please visit our HUB Docs to learn more.
If this is a 🐛 Bug Report, please provide screenshots and steps to reproduce your problem to help us get started working on a fix. If this is a ❓ Question, please provide as much information as possible, including dataset, model, and environment details, so that we can give the most helpful response. We try to respond to all issues as promptly as possible. Thank you for your patience!
@volome hello! Thank you for reaching out with your question. Yes, you can train a model and convert it to TFLite for use on your embedded hardware without an internet connection. There is no extra cost associated with converting your model to TFLite and deploying it on your hardware for offline use. The AGPL-3.0 license allows for such use cases, provided you comply with the terms of the license. For detailed guidance on the conversion process, please refer to the Ultralytics HUB Docs. If you have any further questions or need assistance, feel free to ask. We're here to help!
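For reference, the workflow looks roughly like the sketch below: export once on a connected machine, then copy the resulting `.tflite` file to the device and run it with the TFLite interpreter, no internet required. The weights filename `best.pt`, the image size, and the `tflite_runtime` package are assumptions that may differ for your setup.

```python
# --- Step 1 (on a machine with internet): export the trained model to TFLite ---
from ultralytics import YOLO

model = YOLO("best.pt")                    # your trained weights (placeholder name)
model.export(format="tflite", imgsz=320)   # writes e.g. best_saved_model/best_float32.tflite

# --- Step 2 (on the embedded device, fully offline): run the .tflite file ---
import numpy as np
import tflite_runtime.interpreter as tflite  # or: from tensorflow import lite as tflite

interpreter = tflite.Interpreter(model_path="best_float32.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input with the shape/dtype the model expects; replace with a real preprocessed image
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

predictions = interpreter.get_tensor(output_details[0]["index"])
print(predictions.shape)  # raw detection tensor; post-process to boxes/classes as needed
```

Only step 2 runs on the device, so it works entirely offline inside your own app.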
Thank you for your response!
Very helpful.
You're very welcome, @volome! I'm glad I could assist you. 😊 If you have any more questions in the future or need further support, don't hesitate to reach out. Good luck with your project, and have a great time working with your model on your embedded hardware!
Search before asking
Question
If I train a model and need to convert it to, say, TFLite and use it on my embedded hardware that has no internet connection and runs only my own app, can I do that, and is there an extra cost?
Additional
This is not very clear in your video; you always refer to using your app.