
[Ray Serve vLLM example] 'LoRAModulePath' Cannot be found for example #50260

Closed
KPHippe opened this issue Feb 5, 2025 · 2 comments
Labels
docs: An issue or change related to documentation
triage: Needs triage (eg: priority, bug/not-bug, and owning component)

Comments


KPHippe commented Feb 5, 2025

Description

It appears the Ray Serve vLLM example uses an import path from an earlier vLLM release.

The line:

from vllm.entrypoints.openai.serving_engine import LoRAModulePath, PromptAdapterPath

needs to change to:

from vllm.entrypoints.openai.serving_models import LoRAModulePath, PromptAdapterPath

Link

https://docs.ray.io/en/latest/serve/tutorials/vllm-example.html#serve-a-large-language-model-with-vllm
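Until the docs are updated, one way to keep the example working on either side of the rename is a fallback import. The sketch below is not from the Ray docs or the official fix; it assumes only the two module paths named in this issue and that both modules expose `LoRAModulePath` and `PromptAdapterPath`.

```python
import importlib

def import_lora_paths():
    """Try the newer vLLM module path first, then fall back to the older one.

    Returns the (LoRAModulePath, PromptAdapterPath) classes, or raises
    ImportError if neither known module path provides them.
    """
    for module_name in (
        "vllm.entrypoints.openai.serving_models",  # newer vLLM releases
        "vllm.entrypoints.openai.serving_engine",  # older vLLM releases
    ):
        try:
            mod = importlib.import_module(module_name)
            return mod.LoRAModulePath, mod.PromptAdapterPath
        except (ImportError, AttributeError):
            continue
    raise ImportError(
        "LoRAModulePath/PromptAdapterPath not found in any known vLLM module"
    )
```

Pinning a vLLM version that matches the docs is the simpler fix; the fallback is only useful if you need one script to run against multiple vLLM releases.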

KPHippe added the docs and triage labels on Feb 5, 2025
@pcmoritz
Contributor

pcmoritz commented Feb 5, 2025

Thanks for bringing this up! This is being fixed in #50192 :)

@pcmoritz pcmoritz closed this as completed Feb 6, 2025
@KPHippe
Author

KPHippe commented Feb 8, 2025

Great! Thanks for the update
