Trying to replace the model embeddings #11773
Unanswered
SrinivasaGogul asked this question in Q&A
Replies: 0 comments
Hello, I am trying to replace the MegatronGPT model's embedding matrix with a new one. The problem is that when I save the model, it is saved with the old embeddings rather than the new ones.
I am replacing the embeddings only with this assignment:

self.model.model.module.embedding.word_embeddings.weight.data = torch.nn.Parameter(self.total_embeddings.to(dtype=torch.bfloat16))

and I am saving the model using self.model.save_to("./exp.nemo").
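For reference, here is a minimal PyTorch sketch of replacing a registered embedding parameter so that the new values show up in the module's state_dict. It is independent of NeMo/Megatron: the small nn.Embedding stand-in, its shape, and the variable names are assumptions, not the actual MegatronGPT module structure from the question.

```python
import torch
import torch.nn as nn

# Stand-in for the model's word-embedding table; the real attribute
# path (self.model.model.module.embedding.word_embeddings) is specific
# to MegatronGPT and is assumed from the question above.
embedding = nn.Embedding(num_embeddings=8, embedding_dim=4).to(torch.bfloat16)
new_weights = torch.randn(8, 4, dtype=torch.bfloat16)

# Option 1: copy the new values into the existing parameter in place.
# The registered Parameter object is unchanged, so anything holding a
# reference to it (e.g. an optimizer) keeps working.
with torch.no_grad():
    embedding.weight.copy_(new_weights)
assert torch.equal(embedding.state_dict()["weight"], new_weights)

# Option 2: replace the Parameter object itself. nn.Module.__setattr__
# re-registers it, so state_dict again reflects the new values.
embedding.weight = nn.Parameter(new_weights.clone())
assert torch.equal(embedding.state_dict()["weight"], new_weights)
```

Note that saving through a higher-level API such as save_to may serialize a separately maintained copy of the weights (for example, sharded or mixed-precision master copies in distributed setups), in which case mutating only the local module parameter is not enough; that behavior is framework-specific and worth checking in the NeMo source.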