# finbert-ft-icar-a-v0.11-aps
This model is a fine-tuned version of project-aps/finbert-finetune on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.2343
- Accuracy: 0.8904
- Precision: 0.8787
- Recall: 0.8542
- F1: 0.8647
## Model description
More information needed
## Intended uses & limitations
More information needed
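The snippet below is a minimal inference sketch. The Hub repository id (`project-aps/finbert-ft-icar-a-v0.11-aps`) is an assumption inferred from the base model's namespace, and the label names depend on the fine-tuning dataset; adjust both to match the published checkpoint.

```python
from transformers import pipeline

# Hypothetical Hub id, inferred from the base model's "project-aps" namespace.
classifier = pipeline(
    "text-classification",
    model="project-aps/finbert-ft-icar-a-v0.11-aps",
)

# Labels and scores depend on how the classifier head was fine-tuned.
print(classifier("Quarterly revenue came in well above analyst expectations."))
```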
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 3e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 3
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
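For reproducibility, here is a hedged sketch of the equivalent `TrainingArguments`; dataset preparation, tokenization, and the `Trainer` call are omitted and would need to be supplied.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finbert-ft-icar-a-v0.11-aps",  # assumed output directory
    learning_rate=3e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=3,  # effective train batch size: 1 * 3 = 3
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```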
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|
| 3.6033 | 1.0 | 704 | 1.0430 | 0.8034 | 0.8146 | 0.7110 | 0.7320 |
| 1.9526 | 2.0 | 1408 | 0.9009 | 0.8261 | 0.8087 | 0.7546 | 0.7711 |
| 1.4631 | 3.0 | 2112 | 0.9722 | 0.8299 | 0.8069 | 0.7731 | 0.7824 |
| 1.0839 | 4.0 | 2816 | 0.8984 | 0.8526 | 0.8544 | 0.7870 | 0.8080 |
| 0.8472 | 5.0 | 3520 | 0.9364 | 0.8526 | 0.8534 | 0.7901 | 0.8112 |
| 0.6508 | 6.0 | 4224 | 0.8462 | 0.8677 | 0.8448 | 0.8274 | 0.8350 |
| 0.4698 | 7.0 | 4928 | 0.9436 | 0.8658 | 0.8570 | 0.8159 | 0.8313 |
| 0.3059 | 8.0 | 5632 | 0.9649 | 0.8620 | 0.8582 | 0.8190 | 0.8349 |
| 0.2214 | 9.0 | 6336 | 1.0001 | 0.8733 | 0.8521 | 0.8416 | 0.8466 |
| 0.1601 | 10.0 | 7040 | 1.0287 | 0.8620 | 0.8391 | 0.8248 | 0.8312 |
| 0.1301 | 11.0 | 7744 | 1.0760 | 0.8639 | 0.8366 | 0.8317 | 0.8340 |
| 0.1296 | 12.0 | 8448 | 1.1559 | 0.8677 | 0.8583 | 0.8250 | 0.8383 |
| 0.077 | 13.0 | 9152 | 1.1705 | 0.8601 | 0.8339 | 0.8315 | 0.8326 |
| 0.094 | 14.0 | 9856 | 1.1400 | 0.8752 | 0.8594 | 0.8409 | 0.8490 |
| 0.0663 | 15.0 | 10560 | 1.2298 | 0.8696 | 0.8565 | 0.8218 | 0.8345 |
| 0.065 | 16.0 | 11264 | 1.2036 | 0.8696 | 0.8429 | 0.8365 | 0.8394 |
| 0.0848 | 17.0 | 11968 | 1.1844 | 0.8752 | 0.8537 | 0.8372 | 0.8443 |
| 0.0742 | 18.0 | 12672 | 1.2367 | 0.8715 | 0.8512 | 0.8328 | 0.8403 |
| 0.0917 | 19.0 | 13376 | 1.2492 | 0.8715 | 0.8618 | 0.8261 | 0.8392 |
| 0.0501 | 20.0 | 14080 | 1.2344 | 0.8601 | 0.8325 | 0.8239 | 0.8275 |
| 0.0432 | 21.0 | 14784 | 1.2575 | 0.8752 | 0.8497 | 0.8483 | 0.8490 |
| 0.0407 | 22.0 | 15488 | 1.2300 | 0.8771 | 0.8603 | 0.8392 | 0.8484 |
| 0.0641 | 23.0 | 16192 | 1.2257 | 0.8828 | 0.8753 | 0.8400 | 0.8538 |
| 0.0561 | 24.0 | 16896 | 1.2529 | 0.8828 | 0.8718 | 0.8509 | 0.8601 |
| 0.0455 | 25.0 | 17600 | 1.3678 | 0.8639 | 0.8450 | 0.8198 | 0.8294 |
| 0.0374 | 26.0 | 18304 | 1.3127 | 0.8771 | 0.8571 | 0.8470 | 0.8517 |
| 0.0439 | 27.0 | 19008 | 1.2788 | 0.8828 | 0.8632 | 0.8552 | 0.8590 |
| 0.0403 | 28.0 | 19712 | 1.2715 | 0.8771 | 0.8572 | 0.8435 | 0.8498 |
| 0.0266 | 29.0 | 20416 | 1.2829 | 0.8752 | 0.8562 | 0.8374 | 0.8456 |
| 0.03 | 30.0 | 21120 | 1.3335 | 0.8733 | 0.8581 | 0.8387 | 0.8474 |
| 0.0455 | 31.0 | 21824 | 1.3037 | 0.8752 | 0.8580 | 0.8391 | 0.8471 |
| 0.0484 | 32.0 | 22528 | 1.2934 | 0.8771 | 0.8517 | 0.8541 | 0.8526 |
| 0.0413 | 33.0 | 23232 | 1.2343 | 0.8904 | 0.8787 | 0.8542 | 0.8647 |
| 0.0438 | 34.0 | 23936 | 1.3027 | 0.8847 | 0.8778 | 0.8407 | 0.8554 |
| 0.0319 | 35.0 | 24640 | 1.2800 | 0.8752 | 0.8497 | 0.8506 | 0.8501 |
| 0.0311 | 36.0 | 25344 | 1.2994 | 0.8790 | 0.8644 | 0.8368 | 0.8480 |
| 0.0368 | 37.0 | 26048 | 1.3318 | 0.8715 | 0.8594 | 0.8214 | 0.8359 |
| 0.0308 | 38.0 | 26752 | 1.2342 | 0.8885 | 0.8708 | 0.8657 | 0.8682 |
| 0.0412 | 39.0 | 27456 | 1.2783 | 0.8790 | 0.8691 | 0.8411 | 0.8528 |
| 0.0387 | 40.0 | 28160 | 1.2715 | 0.8828 | 0.8624 | 0.8539 | 0.8578 |
| 0.031 | 41.0 | 28864 | 1.2464 | 0.8828 | 0.8607 | 0.8533 | 0.8568 |
| 0.0289 | 42.0 | 29568 | 1.2761 | 0.8809 | 0.8569 | 0.8569 | 0.8568 |
| 0.0331 | 43.0 | 30272 | 1.2748 | 0.8809 | 0.8569 | 0.8569 | 0.8568 |
| 0.0276 | 44.0 | 30976 | 1.2644 | 0.8809 | 0.8642 | 0.8466 | 0.8544 |
| 0.0295 | 45.0 | 31680 | 1.2594 | 0.8828 | 0.8646 | 0.8528 | 0.8582 |
| 0.0227 | 46.0 | 32384 | 1.2658 | 0.8847 | 0.8671 | 0.8528 | 0.8592 |
| 0.0241 | 47.0 | 33088 | 1.2809 | 0.8828 | 0.8678 | 0.8485 | 0.8570 |
| 0.0314 | 48.0 | 33792 | 1.2853 | 0.8828 | 0.8678 | 0.8485 | 0.8570 |
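The headline metrics at the top of this card match the epoch-33 row (validation loss 1.2343, accuracy 0.8904), which suggests that checkpoint was the one kept. Below is a minimal sketch of a `compute_metrics` function that would produce these four columns; the `"macro"` averaging mode is an assumption, since the card does not state how precision, recall, and F1 were aggregated.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Averaging mode is assumed; swap "macro" for "weighted" or "micro"
    # if that is how the reported numbers were computed.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```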
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 4.4.2
- Tokenizers 0.21.2