Update README.md
README.md CHANGED
@@ -19,7 +19,7 @@ This is a fine-tuned BERT-based language model to classify NLP-related research
 It is a multi-label classifier that can predict concepts from all levels of the NLP taxonomy.
 If the model identifies a lower-level concept, it has learned to predict both the lower-level concept and its hypernyms in the NLP taxonomy.
 The model is fine-tuned on a weakly labeled dataset of 178,521 scientific papers from the ACL Anthology, the arXiv cs.CL domain, and Scopus.
-Prior to fine-tuning, the model is initialized with weights from [allenai/
+Prior to fine-tuning, the model is initialized with weights from [allenai/specter2_base](https://huggingface.co/allenai/specter2_base).

 📄 Paper: [Exploring the Landscape of Natural Language Processing Research (RANLP 2023)](https://arxiv.org/abs/2307.10652).
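For context on the README text changed above: a multi-label classifier like this is typically queried with a per-label sigmoid rather than a softmax, so that a concept and its hypernyms can be predicted together. The sketch below illustrates this with 🤗 Transformers; the model repository ID, input text, and 0.5 decision threshold are placeholder assumptions for illustration, not details stated in this README.

```python
# Minimal multi-label inference sketch with 🤗 Transformers.
# NOTE: "your-org/nlp-taxonomy-classifier" is a hypothetical placeholder;
# substitute the actual repository ID of this model.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "your-org/nlp-taxonomy-classifier"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# A paper title (optionally concatenated with its abstract) as input.
text = "Exploring the Landscape of Natural Language Processing Research"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits[0]

# Multi-label setup: each taxonomy concept gets an independent sigmoid
# score, so several concepts (a concept plus its hypernyms) can fire at once.
scores = torch.sigmoid(logits).tolist()
predicted = [model.config.id2label[i] for i, s in enumerate(scores) if s > 0.5]
print(predicted)
```

Because the model has learned to predict a lower-level concept together with its hypernyms, the thresholded label set will generally contain paths through the NLP taxonomy rather than single leaf concepts.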