nb-bert-edu-scorer-lr3e4-bs32
This model is a fine-tuned version of NbAiLab/nb-bert-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.1845
- Precision: 0.5087
- Recall: 0.3283
- F1 Macro: 0.3194
- Accuracy: 0.3564
Model description
More information needed
Intended uses & limitations
More information needed
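In the absence of documented usage, the sketch below shows one plausible way to run the classifier with the Transformers API. The hub id versae/nb-bert-edu-scorer-lr3e4-bs32 is taken from this card; interpreting the six labels as a 0–5 educational-quality score is an assumption inferred from the model name and the test report below, not something confirmed by the card.

```python
# Minimal inference sketch. Assumes the checkpoint is published on the Hugging Face Hub
# as "versae/nb-bert-edu-scorer-lr3e4-bs32" and exposes six labels (0-5).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "versae/nb-bert-edu-scorer-lr3e4-bs32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Example Norwegian input (nb-bert-base is a Norwegian BERT model).
text = "Dette er en kort tekst om fotosyntese for skoleelever."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Predicted class index, read here as the 0-5 score (assumption).
score = int(logits.argmax(dim=-1).item())
print(f"Predicted score: {score}")
```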
Test results
Binary classification accuracy (threshold at label 3, i.e. labels ≥ 3 counted as positive; see the sketch after the confusion matrix): 78.18%
Report:
| Label | precision | recall | f1-score | support |
|---|---|---|---|---|
| 0 | 0.77 | 0.54 | 0.64 | 100 |
| 1 | 0.32 | 0.42 | 0.36 | 100 |
| 2 | 0.25 | 0.44 | 0.32 | 100 |
| 3 | 0.29 | 0.40 | 0.34 | 100 |
| 4 | 0.42 | 0.15 | 0.22 | 100 |
| 5 | 1.00 | 0.02 | 0.04 | 50 |
| accuracy | | | 0.36 | 550 |
| macro avg | 0.51 | 0.33 | 0.32 | 550 |
| weighted avg | 0.46 | 0.36 | 0.34 | 550 |
Confusion Matrix:
| True \ Predicted | 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| 0 | 54 | 38 | 7 | 1 | 0 | 0 |
| 1 | 13 | 42 | 38 | 7 | 0 | 0 |
| 2 | 3 | 39 | 44 | 13 | 1 | 0 |
| 3 | 0 | 11 | 46 | 40 | 3 | 0 |
| 4 | 0 | 1 | 33 | 51 | 15 | 0 |
| 5 | 0 | 0 | 7 | 25 | 17 | 1 |
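The binary accuracy quoted above can be recovered from this confusion matrix by collapsing the six classes at the stated threshold. A minimal sketch (rows are true labels, columns predicted labels, matching the per-class report):

```python
# Collapse the 6-way confusion matrix into a binary decision
# (labels >= 3 treated as positive). This reproduces the 78.18% figure above.
import numpy as np

cm = np.array([
    [54, 38,  7,  1,  0,  0],
    [13, 42, 38,  7,  0,  0],
    [ 3, 39, 44, 13,  1,  0],
    [ 0, 11, 46, 40,  3,  0],
    [ 0,  1, 33, 51, 15,  0],
    [ 0,  0,  7, 25, 17,  1],
])

# Sum counts into 2x2 blocks: true/predicted label in {0,1,2} vs {3,4,5}.
binary_cm = np.array([
    [cm[:3, :3].sum(), cm[:3, 3:].sum()],
    [cm[3:, :3].sum(), cm[3:, 3:].sum()],
])

accuracy = np.trace(binary_cm) / binary_cm.sum()
print(f"Binary accuracy: {accuracy:.4f}")  # -> 0.7818
```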
Metrics
- epoch = 20.0
- eval_accuracy = 0.3564
- eval_f1_macro = 0.3194
- eval_loss = 1.1845
- eval_precision = 0.5087
- eval_recall = 0.3283
- eval_runtime = 0:00:05.11
- eval_samples_per_second = 107.474
- eval_steps_per_second = 3.517
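The macro-averaged metrics above match what a standard `compute_metrics` callback would report for a 6-class problem. A sketch of such a callback, assuming scikit-learn for the metric computation (the evaluation code actually used is not part of this card):

```python
# Sketch of a compute_metrics callback producing macro precision/recall/F1 and accuracy.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "precision": precision,
        "recall": recall,
        "f1_macro": f1,
        "accuracy": accuracy_score(labels, preds),
    }
```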
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch in code follows the list):
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
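For reference, these hyperparameters map onto a standard `TrainingArguments` configuration roughly as follows. This is a sketch only: the output directory is a placeholder, and dataset loading, the classification head, and logging settings are not documented in this card.

```python
# Rough TrainingArguments equivalent of the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nb-bert-edu-scorer-lr3e4-bs32",  # placeholder, not taken from the card
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=0,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=20,
)
```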
Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 Macro | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 2.3345 | 0.1086 | 0.1651 | 0.0954 | 0.3476 |
| 0.862 | 0.3368 | 1000 | 0.8527 | 0.4130 | 0.2980 | 0.2791 | 0.4624 |
| 0.8114 | 0.6736 | 2000 | 0.7838 | 0.4017 | 0.3237 | 0.3203 | 0.4868 |
| 0.81 | 1.0104 | 3000 | 0.7608 | 0.3934 | 0.3218 | 0.3154 | 0.4824 |
| 0.7887 | 1.3473 | 4000 | 0.9429 | 0.3713 | 0.3335 | 0.3193 | 0.3882 |
| 0.8082 | 1.6841 | 5000 | 0.7526 | 0.3936 | 0.3315 | 0.3298 | 0.4994 |
| 0.7536 | 2.0209 | 6000 | 0.7296 | 0.4059 | 0.3360 | 0.3325 | 0.4792 |
| 0.7932 | 2.3577 | 7000 | 0.7459 | 0.4036 | 0.3359 | 0.3314 | 0.4634 |
| 0.7472 | 2.6945 | 8000 | 0.7204 | 0.3888 | 0.3467 | 0.3468 | 0.5026 |
| 0.743 | 3.0313 | 9000 | 0.7214 | 0.3964 | 0.3408 | 0.3432 | 0.5036 |
| 0.705 | 3.3681 | 10000 | 0.7082 | 0.3956 | 0.3575 | 0.3600 | 0.51 |
| 0.7321 | 3.7050 | 11000 | 0.7231 | 0.3995 | 0.3419 | 0.3414 | 0.5176 |
| 0.7346 | 4.0418 | 12000 | 0.6929 | 0.4091 | 0.3560 | 0.3585 | 0.5052 |
| 0.7125 | 4.3786 | 13000 | 0.6933 | 0.4106 | 0.3491 | 0.3482 | 0.5052 |
| 0.735 | 4.7154 | 14000 | 0.7265 | 0.3979 | 0.3369 | 0.3396 | 0.5106 |
| 0.7009 | 5.0522 | 15000 | 0.7024 | 0.4024 | 0.3435 | 0.3444 | 0.5098 |
| 0.7068 | 5.3890 | 16000 | 0.7089 | 0.3951 | 0.3476 | 0.3499 | 0.5214 |
| 0.6774 | 5.7258 | 17000 | 0.7333 | 0.4000 | 0.3315 | 0.3281 | 0.5174 |
| 0.6799 | 6.0626 | 18000 | 0.7095 | 0.4167 | 0.3356 | 0.3332 | 0.5168 |
| 0.6956 | 6.3995 | 19000 | 0.6896 | 0.3969 | 0.3609 | 0.3645 | 0.5156 |
| 0.6647 | 6.7363 | 20000 | 0.6845 | 0.4050 | 0.3533 | 0.3559 | 0.5162 |
| 0.6509 | 7.0731 | 21000 | 0.6809 | 0.4004 | 0.3521 | 0.3525 | 0.4982 |
| 0.6775 | 7.4099 | 22000 | 0.6796 | 0.4021 | 0.3584 | 0.3617 | 0.5136 |
| 0.6744 | 7.7467 | 23000 | 0.6749 | 0.3994 | 0.3510 | 0.3531 | 0.511 |
| 0.6479 | 8.0835 | 24000 | 0.6750 | 0.4103 | 0.3556 | 0.3560 | 0.5234 |
| 0.6495 | 8.4203 | 25000 | 0.6797 | 0.4007 | 0.3516 | 0.3543 | 0.5184 |
| 0.691 | 8.7572 | 26000 | 0.6801 | 0.4114 | 0.3551 | 0.3577 | 0.515 |
| 0.7 | 9.0940 | 27000 | 0.6736 | 0.4034 | 0.3564 | 0.3572 | 0.5056 |
| 0.6697 | 9.4308 | 28000 | 0.6672 | 0.4063 | 0.3597 | 0.3616 | 0.5132 |
| 0.6228 | 9.7676 | 29000 | 0.6723 | 0.4109 | 0.3553 | 0.3579 | 0.5164 |
| 0.6459 | 10.1044 | 30000 | 0.6829 | 0.3976 | 0.3518 | 0.3528 | 0.5238 |
| 0.6534 | 10.4412 | 31000 | 0.6918 | 0.4015 | 0.3486 | 0.3485 | 0.5216 |
| 0.6229 | 10.7780 | 32000 | 0.6666 | 0.4016 | 0.3571 | 0.3587 | 0.5172 |
| 0.654 | 11.1149 | 33000 | 0.6687 | 0.4045 | 0.3640 | 0.3664 | 0.5126 |
| 0.6465 | 11.4517 | 34000 | 0.6717 | 0.4093 | 0.3490 | 0.3482 | 0.516 |
| 0.64 | 11.7885 | 35000 | 0.6836 | 0.3976 | 0.3639 | 0.3687 | 0.517 |
| 0.6338 | 12.1253 | 36000 | 0.6729 | 0.3957 | 0.3554 | 0.3574 | 0.521 |
| 0.6276 | 12.4621 | 37000 | 0.6703 | 0.4042 | 0.3616 | 0.3657 | 0.5164 |
| 0.6619 | 12.7989 | 38000 | 0.6673 | 0.4027 | 0.3599 | 0.3607 | 0.5096 |
| 0.5977 | 13.1357 | 39000 | 0.6741 | 0.4030 | 0.3561 | 0.3573 | 0.5258 |
| 0.6377 | 13.4725 | 40000 | 0.6726 | 0.3944 | 0.3686 | 0.3703 | 0.5146 |
| 0.6251 | 13.8094 | 41000 | 0.6734 | 0.4048 | 0.3565 | 0.3590 | 0.5228 |
| 0.6095 | 14.1462 | 42000 | 0.6655 | 0.4027 | 0.3619 | 0.3651 | 0.516 |
| 0.6175 | 14.4830 | 43000 | 0.6741 | 0.4015 | 0.3671 | 0.3689 | 0.5058 |
| 0.5936 | 14.8198 | 44000 | 0.6637 | 0.3959 | 0.3599 | 0.3604 | 0.508 |
| 0.6491 | 15.1566 | 45000 | 0.6721 | 0.4074 | 0.3673 | 0.3713 | 0.5184 |
| 0.6345 | 15.4934 | 46000 | 0.6725 | 0.3950 | 0.3539 | 0.3558 | 0.519 |
| 0.6295 | 15.8302 | 47000 | 0.6628 | 0.4040 | 0.3571 | 0.3597 | 0.5174 |
| 0.6262 | 16.1671 | 48000 | 0.6719 | 0.3989 | 0.3686 | 0.3696 | 0.509 |
| 0.6397 | 16.5039 | 49000 | 0.6706 | 0.3995 | 0.3653 | 0.3679 | 0.5186 |
| 0.586 | 16.8407 | 50000 | 0.6640 | 0.4017 | 0.3630 | 0.3656 | 0.5218 |
| 0.631 | 17.1775 | 51000 | 0.6669 | 0.3946 | 0.3568 | 0.3598 | 0.5144 |
| 0.6026 | 17.5143 | 52000 | 0.6797 | 0.3999 | 0.3544 | 0.3569 | 0.5256 |
| 0.5906 | 17.8511 | 53000 | 0.6608 | 0.4069 | 0.3662 | 0.3690 | 0.5214 |
| 0.5529 | 18.1879 | 54000 | 0.6630 | 0.3967 | 0.3638 | 0.3655 | 0.5182 |
| 0.6216 | 18.5248 | 55000 | 0.6645 | 0.4004 | 0.3671 | 0.3692 | 0.5106 |
| 0.5945 | 18.8616 | 56000 | 0.6602 | 0.3986 | 0.3577 | 0.3593 | 0.5172 |
| 0.6105 | 19.1984 | 57000 | 0.6602 | 0.3986 | 0.3596 | 0.3610 | 0.5148 |
| 0.6245 | 19.5352 | 58000 | 0.6617 | 0.3986 | 0.3623 | 0.3646 | 0.5124 |
| 0.5857 | 19.8720 | 59000 | 0.6621 | 0.3982 | 0.3627 | 0.3649 | 0.5138 |
Framework versions
- Transformers 4.53.2
- Pytorch 2.7.1+cu126
- Datasets 4.0.0
- Tokenizers 0.21.2