# mdeberta-v3-base-finetuned-climate-support-new
This model is a fine-tuned version of [microsoft/mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) on an unspecified dataset. It achieves the following results on the evaluation set (see the inference sketch after the list):
- Loss: 0.5523
- Accuracy: 0.8990
- Accuracy Balanced: 0.8461
- Precision Macro: 0.8550
- Recall Macro: 0.8461
- F1 Macro: 0.8504
- Precision Micro: 0.8990
- Recall Micro: 0.8990
- F1 Micro: 0.8990
- Precision Weighted: 0.8978
- Recall Weighted: 0.8990
- F1 Weighted: 0.8983
- Precision Class 0: 0.7788
- Recall Class 0: 0.7521
- F1 Class 0: 0.7652
- Support Class 0: 468
- Precision Class 1: 0.9312
- Recall Class 1: 0.9401
- F1 Class 1: 0.9356
- Support Class 1: 1670
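The per-class scores above imply a binary sequence-classification head (Class 0 and Class 1), so the checkpoint can be queried with the standard `text-classification` pipeline. The snippet below is only a minimal sketch, assuming the checkpoint is published under the repo id `mljn/mdeberta-v3-base-finetuned-climate-support-new`; the example sentence is made up, and the human-readable label names are not documented in this card, so check `model.config.id2label`.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint (repo id taken from this card).
classifier = pipeline(
    "text-classification",
    model="mljn/mdeberta-v3-base-finetuned-climate-support-new",
)

# Hypothetical input; the card does not document the text domain beyond "climate support".
result = classifier("We support the expansion of renewable energy.")
print(result)  # e.g. [{'label': 'LABEL_1', 'score': ...}] unless id2label was customized
```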
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the TrainingArguments sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 4
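As a rough illustration, the hyperparameters above map onto a Hugging Face `TrainingArguments` configuration as sketched below. This is an assumption about how the run was configured, not the author's actual training script: the dataset, model loading, and `Trainer` setup are omitted because the card does not document them, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments object matching the reported hyperparameters.
# betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults for AdamW.
training_args = TrainingArguments(
    output_dir="mdeberta-v3-base-finetuned-climate-support-new",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=4,
)
```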
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Accuracy Balanced | Precision Macro | Recall Macro | F1 Macro | Precision Micro | Recall Micro | F1 Micro | Precision Weighted | Recall Weighted | F1 Weighted | Precision Class 0 | Recall Class 0 | F1 Class 0 | Support Class 0 | Precision Class 1 | Recall Class 1 | F1 Class 1 | Support Class 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4698 | 0.6046 | 500 | 0.4537 | 0.8770 | 0.7744 | 0.8435 | 0.7744 | 0.8010 | 0.8770 | 0.8770 | 0.8770 | 0.8714 | 0.8770 | 0.8701 | 0.7937 | 0.5919 | 0.6781 | 468 | 0.8932 | 0.9569 | 0.9240 | 1670 |
| 0.3089 | 1.2092 | 1000 | 0.5018 | 0.8873 | 0.7825 | 0.8692 | 0.7825 | 0.8145 | 0.8873 | 0.8873 | 0.8873 | 0.8839 | 0.8873 | 0.8798 | 0.8429 | 0.5962 | 0.6984 | 468 | 0.8954 | 0.9689 | 0.9307 | 1670 |
| 0.2397 | 1.8138 | 1500 | 0.4276 | 0.8896 | 0.8178 | 0.8480 | 0.8178 | 0.8314 | 0.8896 | 0.8896 | 0.8896 | 0.8862 | 0.8896 | 0.8871 | 0.7802 | 0.6902 | 0.7324 | 468 | 0.9159 | 0.9455 | 0.9305 | 1670 |
| 0.1779 | 2.4184 | 2000 | 0.5046 | 0.8648 | 0.8543 | 0.7991 | 0.8543 | 0.8200 | 0.8648 | 0.8648 | 0.8648 | 0.8839 | 0.8648 | 0.8705 | 0.6484 | 0.8355 | 0.7302 | 468 | 0.9498 | 0.8731 | 0.9098 | 1670 |
| 0.1417 | 3.0230 | 2500 | 0.5791 | 0.8784 | 0.8506 | 0.8168 | 0.8506 | 0.8315 | 0.8784 | 0.8784 | 0.8784 | 0.8870 | 0.8784 | 0.8815 | 0.6919 | 0.8013 | 0.7426 | 468 | 0.9417 | 0.9000 | 0.9204 | 1670 |
| 0.0910 | 3.6276 | 3000 | 0.5523 | 0.8990 | 0.8461 | 0.8550 | 0.8461 | 0.8504 | 0.8990 | 0.8990 | 0.8990 | 0.8978 | 0.8990 | 0.8983 | 0.7788 | 0.7521 | 0.7652 | 468 | 0.9312 | 0.9401 | 0.9356 | 1670 |
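The macro, micro, weighted, and per-class scores in the table are the standard aggregations over the two classes. The function below is only an illustrative sketch of how such a metric set can be computed with scikit-learn from label and prediction arrays; it is not the exact `compute_metrics` used for this run.

```python
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    precision_recall_fscore_support,
)

def compute_metrics(y_true, y_pred):
    """Compute the metric families reported in the table above (illustrative only)."""
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "accuracy_balanced": balanced_accuracy_score(y_true, y_pred),
    }
    # Averaged precision/recall/F1 over classes.
    for avg in ("macro", "micro", "weighted"):
        p, r, f1, _ = precision_recall_fscore_support(y_true, y_pred, average=avg)
        metrics[f"precision_{avg}"] = p
        metrics[f"recall_{avg}"] = r
        metrics[f"f1_{avg}"] = f1
    # Per-class scores: average=None returns one value per class, plus support counts.
    p, r, f1, support = precision_recall_fscore_support(y_true, y_pred, average=None)
    for cls in range(len(support)):
        metrics[f"precision_class_{cls}"] = p[cls]
        metrics[f"recall_class_{cls}"] = r[cls]
        metrics[f"f1_class_{cls}"] = f1[cls]
        metrics[f"support_class_{cls}"] = int(support[cls])
    return metrics
```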
### Framework versions
- Transformers 4.56.2
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.1