We extended the vocabulary of the base Llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.
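The token-selection step described above can be sketched as follows. This is a minimal illustration only: `extend_vocab`, the toy vocabularies, and the iteration order are hypothetical, not the authors' actual tokenizer-merging code.

```python
# Hypothetical sketch of extending a base vocabulary with non-overlapping
# tokens from a new language, capped at a budget (here 25,000, per the text).
def extend_vocab(base_vocab, candidate_tokens, max_new=25_000):
    """Append candidate tokens absent from the base vocabulary,
    assigning each the next free token id, up to max_new additions."""
    extended = dict(base_vocab)
    added = 0
    for tok in candidate_tokens:
        if added >= max_new:
            break
        if tok not in extended:
            extended[tok] = len(extended)  # next free token id
            added += 1
    return extended

# Toy example: "world" already exists, so only the two new tokens are added.
base = {"hello": 0, "world": 1}
ext = extend_vocab(base, ["hola", "world", "mundo"])
print(len(ext))  # 4
```

In practice, after extending the tokenizer one would also grow the model's embedding matrix to match; with Hugging Face `transformers` that is typically `model.resize_token_embeddings(len(tokenizer))`.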
## Evaluation

For evaluation results, see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
## Cite SambaLingo

```
@misc{csaki2024sambalingo,
      title={SambaLingo: Teaching Large Language Models New Languages},
      author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
      year={2024},
      eprint={2404.05829},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```