move bcp47 out of languages tag
README.md CHANGED
```diff
@@ -3,6 +3,8 @@
 ---
 language:
 - da
+language_bcp47:
+- da
 - da-bornholm
 - da-synnejyl
 tags:
@@ -16,6 +18,7 @@ co2_eq_emissions:
 training_type: "pretraining"
 geographical_location: "Copenhagen, Denmark"
 hardware_used: "4 A100 GPUs, 91 training hours"
+emissions: 23660
 ---
 
 `dant5-small` is a 60M parameter model with architecture identical to `t5-small`. It was trained for 10 epochs on the Danish GigaWord Corpus ([official website](https://gigaword.dk), [paper](https://aclanthology.org/2021.nodalida-main.46/)).
```
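The entries moved under `language_bcp47` follow the BCP 47 pattern of a primary language subtag followed by further subtags separated by hyphens. A minimal sketch of splitting such a tag (illustrative only, with no validation against the BCP 47 subtag registry):

```python
def split_bcp47(tag: str) -> dict:
    """Split a BCP 47-style tag into its primary language subtag
    and any remaining subtags (no registry validation)."""
    primary, *rest = tag.lower().split("-")
    return {"language": primary, "subtags": rest}

# The tags moved in this commit all share the Danish primary subtag "da":
for tag in ["da", "da-bornholm", "da-synnejyl"]:
    print(split_bcp47(tag))
```

This is why the new field also repeats plain `da`: each dialect tag reduces to the same primary subtag, which stays in `language`.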