Information

The problem arises in chapter:

Describe the bug

I ran the
```python
trainer = Trainer(
    model=model,
    args=training_args,
    compute_metrics=compute_metrics,
    train_dataset=emotions_encoded["train"],
    eval_dataset=emotions_encoded["validation"],
    tokenizer=tokenizer,
)
trainer.train()
```
section, and got an error complaining about no loss in the outputs:
```
The model did not return a loss from the inputs, only the following keys: logits. For reference, the inputs it received are input_ids,attention_mask.
```
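For context, Hugging Face sequence-classification models only return a loss when a `labels` tensor is passed to the forward call; with just `input_ids` and `attention_mask` you only get logits back. Here is a minimal sketch of that behaviour (the checkpoint name and `num_labels=6` are assumptions for illustration, not taken from the notebook):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint and label count, only to illustrate the behaviour.
ckpt = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSequenceClassification.from_pretrained(ckpt, num_labels=6)

batch = tokenizer(["i feel great"], return_tensors="pt")

out = model(**batch)
print(out.loss)  # None: no labels were passed, so only logits are returned

out = model(**batch, labels=torch.tensor([1]))
print(out.loss)  # a scalar tensor: the loss is computed once labels are present
```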
It turns out `AutoModelForSequenceClassification` was expecting a `labels` column, not `label`, so I relabeled:
```python
def relabel(batch):
    return {"labels": batch["label"]}

emotions_encoded = emotions_encoded.map(relabel)
emotions_encoded.column_names
```
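An equivalent fix, sketched here with the 🤗 Datasets API (I have not re-run the notebook with it), is to rename the column directly so the original `label` column doesn't stick around:

```python
# Alternative sketch: rename the column in place instead of adding a copy.
emotions_encoded = emotions_encoded.rename_column("label", "labels")
emotions_encoded.column_names  # should now list "labels" instead of "label"
```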