mini-batch and negative sampling code #2
For mini-batch training, refer to the following snippet.
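A minimal sketch of what the mini-batch loop can look like, assuming the training pairs are stored as an `(E, 2)` LongTensor of `(node, context)` edges and that the model's `forward` returns the batch loss first (all names here are illustrative, not the repository's API):

```python
import torch

def train_minibatch(model, optimizer, edge_list, batch_size=512, epochs=100):
    # `edge_list`: (E, 2) LongTensor of (node, context) pairs.
    for epoch in range(epochs):
        perm = torch.randperm(edge_list.size(0))  # reshuffle edges each epoch
        for start in range(0, edge_list.size(0), batch_size):
            batch = edge_list[perm[start:start + batch_size]]
            w, c = batch[:, 0], batch[:, 1]
            optimizer.zero_grad()
            loss, q, prior = model(w, c)  # pass extra args (e.g. temperature) as your forward requires
            loss.backward()
            optimizer.step()
```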
For negative sampling, you can refer to this repository: https://github.com/DMPierre/LINE
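That repository follows the word2vec/LINE scheme: negatives are drawn from the unigram (degree) distribution raised to the 3/4 power. A minimal sketch, assuming a precomputed `degrees` array (illustrative name):

```python
import numpy as np
import torch

def build_neg_sampler(degrees, power=0.75):
    # Noise distribution proportional to degree^0.75, as in word2vec/LINE.
    probs = np.asarray(degrees, dtype=np.float64) ** power
    probs /= probs.sum()
    def sample(batch_size, num_neg):
        # One row of `num_neg` noise nodes per positive pair in the batch.
        neg = np.random.choice(len(probs), size=(batch_size, num_neg), p=probs)
        return torch.from_numpy(neg)
    return sample
```

With a `forward` like the one below, `sampler(w.size(0), 5)` would supply five negatives per positive pair as the `neg` argument.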
Thanks for your reply. I modified the nonoverlapping.py file and ran it on the Cora dataset. The loss is decreasing, but NMI and modularity are not increasing. I implemented the `forward` function as follows:

```python
def forward(self, w, c, neg, temp):
    w = self.node_embeddings(w).to(self.device)
    c_ = self.node_embeddings(c).to(self.device)
    c_context = self.contextnode_embeddings(c).to(self.device)
    c_context_community = self.community_embeddings(c_context)
    neg_context = self.contextnode_embeddings(neg).to(self.device)
    neg_context_community = -self.community_embeddings(neg_context)  # negated for the negative-sample term
    q = self.community_embeddings(w * c_)
    prior = self.community_embeddings(w)
    prior = F.softmax(prior, dim=-1)
    mulpositivebatch = torch.mul(q, c_context_community)
    positivebatch = F.logsigmoid(torch.sum(mulpositivebatch, dim=1))
    mulnegativebatch = torch.mul(q.view(len(q), 1, self.embedding_dim), neg_context_community)
    negativebatch = torch.sum(
        F.logsigmoid(
            torch.sum(mulnegativebatch, dim=2)
        ),
        dim=1)
    loss = positivebatch + negativebatch
    return -torch.mean(loss), F.softmax(q, dim=-1), prior
```

Are there any mistakes in this function? Thank you!
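For reference, I compute the metrics from the hard assignments induced by the community probabilities, roughly like this (a sketch; `node_probs`, `y_true`, and `G` stand in for the per-node community probabilities, the ground-truth labels, and the networkx graph):

```python
import numpy as np
import networkx as nx
from sklearn.metrics import normalized_mutual_info_score

# `node_probs`: (N, K) tensor of per-node community probabilities,
# e.g. the prior evaluated on all N nodes.
pred = node_probs.argmax(dim=-1).cpu().numpy()
nmi = normalized_mutual_info_score(y_true, pred)

# Modularity of the induced hard partition; nodes of G are assumed to be
# the integers 0..N-1, matching the embedding indices.
communities = [set(np.where(pred == k)[0]) for k in np.unique(pred)]
mod = nx.algorithms.community.modularity(G, communities)
```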
Note that it will take longer to observe an increase in NMI and modularity if you use negative sampling.
Hi Fan-Yun,
Thanks for sharing your code. Do you plan to release the code with mini-batch training and negative sampling for large graphs? Thank you.