ner_classification_large.log
Some weights of the model checkpoint at xlm-roberta-large were not used when initializing XLMRobertaForTokenClassification: ['lm_head.dense.bias', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.bias', 'lm_head.dense.weight']
- This IS expected if you are initializing XLMRobertaForTokenClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing XLMRobertaForTokenClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of XLMRobertaForTokenClassification were not initialized from the model checkpoint at xlm-roberta-large and are newly initialized: ['classifier.weight', 'classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
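
The messages above are the standard Hugging Face notices for putting a fresh token-classification head on a masked-LM checkpoint: the LM head weights are dropped and the classifier weights are randomly initialized. A minimal sketch of a load that produces them, assuming the model is built through transformers (num_labels=12 is an assumption taken from the label list logged below):

```python
from transformers import AutoModelForTokenClassification

# Loading xlm-roberta-large with a fresh token-classification head emits
# both notices: the MLM head is discarded, the classifier is new.
# num_labels=12 matches the 12-label list logged below (an assumption).
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-large", num_labels=12
)
```
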
INFO:simpletransformers.ner.ner_model: Converting to features started.
['B-per', 'O', 'B-deriv-per', 'B-misc', 'I-misc', 'I-per', 'B-org', 'I-org', 'B-loc', 'I-loc', 'B-*', 'I-*']
(73943, 3) (9122, 3) (9206, 3)
   sentence_id      words labels
0            0  @vukomand  B-per
1            0    Gospođo      O
2            0     Dijana  B-per
3            0       koje      O
4            0     lekove      O
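
The INFO lines and the DataFrame head above point to simpletransformers with the sentence_id/words/labels column format. A minimal sketch of the training setup under that assumption; the label list and epoch count (11) come from this log, while the toy DataFrame and all other settings are illustrative:

```python
import pandas as pd
from simpletransformers.ner import NERModel, NERArgs

custom_labels = [
    "B-per", "O", "B-deriv-per", "B-misc", "I-misc", "I-per",
    "B-org", "I-org", "B-loc", "I-loc", "B-*", "I-*",
]

# Toy training frame in the format shown above; the real one has 73943 rows.
train_df = pd.DataFrame({
    "sentence_id": [0, 0, 0, 0, 0],
    "words": ["@vukomand", "Gospođo", "Dijana", "koje", "lekove"],
    "labels": ["B-per", "O", "B-per", "O", "O"],
})

model_args = NERArgs()
model_args.num_train_epochs = 11  # matches "no. of epochs: 11" below

model = NERModel("xlmroberta", "xlm-roberta-large",
                 labels=custom_labels, args=model_args)
model.train_model(train_df)  # saves to outputs/ by default
```
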
Training started. Current model: xlm-r-large, no. of epochs: 11
/home/tajak/NER-recognition/ner/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:139: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "
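
This warning is raised by PyTorch itself whenever `lr_scheduler.step()` runs before `optimizer.step()` within a training step; its only effect is the one the message describes, skipping the first value of the learning-rate schedule. A minimal sketch of the order PyTorch expects, using a toy model, optimizer, and scheduler purely for illustration:

```python
import torch

# Toy setup just to demonstrate the call order PyTorch expects.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)

for _ in range(3):
    loss = model(torch.randn(8, 4)).sum()
    loss.backward()
    optimizer.step()     # update the weights first...
    scheduler.step()     # ...then advance the learning-rate schedule
    optimizer.zero_grad()
```
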
INFO:simpletransformers.ner.ner_model: Training of xlmroberta model complete. Saved to outputs/.
INFO:simpletransformers.ner.ner_model: Converting to features started.
Training completed.
It took 8.64 minutes for 73943 instances.
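
The dict on the next line is what simpletransformers returns from evaluation. A minimal sketch continuing the training sketch above, assuming the 9122-row dev set is in the same sentence_id/words/labels format; `dev_df` is an illustrative stand-in:

```python
# result is a dict with eval_loss, precision, recall, and f1_score,
# as logged below; dev_df is a stand-in for the 9122-row dev frame.
result, model_outputs, predictions = model.eval_model(dev_df)
print(result)
```
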
INFO:simpletransformers.ner.ner_model:{'eval_loss': 0.12421889339422403, 'precision': 0.8579710144927536, 'recall': 0.8176795580110497, 'f1_score': 0.8373408769448374}
Evaluation completed.
It took 0.35 minutes for 9122 instances.
Macro f1: 0.762, Micro f1: 0.986
Accuracy: 0.986
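
The precision/recall/f1_score in the eval dict are most likely entity-level seqeval scores (the simpletransformers default), while the macro/micro F1 and accuracy lines read as token-level metrics. A minimal scikit-learn sketch of such token-level scoring on toy label sequences; note that micro-averaged F1 over all token labels equals plain accuracy, which is consistent with the identical 0.986 values above:

```python
from sklearn.metrics import accuracy_score, f1_score

# Toy flat per-token sequences; the real ones would come from the model's
# predictions on the dev set, not from this log.
y_true = ["B-per", "O", "O", "B-loc", "O"]
y_pred = ["B-per", "O", "O", "O", "O"]

print("Macro f1:", round(f1_score(y_true, y_pred, average="macro"), 3))
print("Micro f1:", round(f1_score(y_true, y_pred, average="micro"), 3))
print("Accuracy:", round(accuracy_score(y_true, y_pred), 3))
```
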
Run 0 finished.