How to use a LoRA. #506
Answered by gustrd
Herman5555 asked this question in Q&A
-
Does anyone know how to use a LoRA? Unfortunately, guides are sparse and the information is conflicting. What is the easiest way to apply a LoRA to a GGUF-quantized model? Any helpful information is much appreciated.
Answered by gustrd, Nov 13, 2023
-
The best way is to merge the LoRA into the base model externally, before converting to GGUF, and then convert the merged result into a new, standalone GGUF. You get the best quality and compatibility that way.
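A minimal sketch of that workflow using Hugging Face's `peft` library; the paths `"base-model"` and `"my-lora"` are placeholders for your own checkpoints, and the conversion commands in the comments assume a llama.cpp checkout (script names have varied between versions):

```python
# Sketch only: not runnable as-is, since it needs real model files on disk.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("base-model")
model = PeftModel.from_pretrained(base, "my-lora")

# Fold the LoRA deltas into the base weights, dropping the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained("merged-model")
AutoTokenizer.from_pretrained("base-model").save_pretrained("merged-model")

# Then convert and quantize the merged checkpoint with llama.cpp, e.g.:
#   python convert.py merged-model --outtype f16
#   ./quantize merged-model/ggml-model-f16.gguf merged-q4_k_m.gguf q4_k_m
```

Quantizing after the merge is what preserves quality: the LoRA deltas are applied at full precision first, and only the final combined weights are quantized.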
4 replies
I believe you will need to use the Transformers library from Hugging Face in Python to merge the weights, then convert to GGUF.
Here is an example of how the merge is done: https://discuss.huggingface.co/t/help-with-merging-lora-weights-back-into-base-model/40968/3
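To see why merging is lossless before quantization, here is the underlying arithmetic in plain NumPy (dimensions and the `alpha` scaling are illustrative): a LoRA adds a low-rank update `scale * B @ A` on top of a frozen weight `W`, so folding that product into `W` gives a single matrix that produces identical outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 4                 # hidden size, LoRA rank, scaling alpha
W = rng.standard_normal((d, d))       # frozen base weight
A = rng.standard_normal((r, d))       # LoRA down-projection
B = rng.standard_normal((d, r))       # LoRA up-projection
scale = alpha / r

x = rng.standard_normal(d)
y_adapted = W @ x + scale * (B @ (A @ x))  # base path + adapter path at inference

W_merged = W + scale * (B @ A)             # fold the adapter into the weight
y_merged = W_merged @ x

assert np.allclose(y_adapted, y_merged)    # merging preserves the outputs exactly
```

The same identity is what `merge_and_unload` applies per adapted layer, which is why the merged model can then be converted and quantized like any ordinary checkpoint.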