---
library_name: zeroshot_classifier
tags:
  - transformers
  - sentence-transformers
  - zeroshot_classifier
license: mit
datasets:
  - claritylab/UTCD
language:
  - en
pipeline_tag: zero-shot-classification
metrics:
  - accuracy
---

# Zero-shot Vanilla Binary BERT 

This model is a BERT-based cross-encoder for zero-shot text classification. 
It was introduced in the Findings of ACL'23 paper **Label Agnostic Pre-training for Zero-shot Text Classification** by ***Christopher Clarke, Yuzhao Heng, Yiping Kang, Krisztian Flautner, Lingjia Tang and Jason Mars***. 
The code for training and evaluating this model can be found [here](https://github.com/ChrisIsKing/zero-shot-text-classification/tree/master). 

## Model description

This model was trained with the binary classification framework: each (input text, candidate label) pair is scored independently as match or no-match (sketched below), which makes the model applicable to zero-shot text classification. 
It was trained as a baseline on the aspect-normalized [UTCD](https://huggingface.co/datasets/claritylab/UTCD) dataset. 

- **Finetuned from model:** [`bert-base-uncased`](https://huggingface.co/bert-base-uncased)
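
As a rough illustration of what the binary framework does, the sketch below pairs one input text with each candidate label and softmaxes a two-class head over every pair. It assumes the checkpoint loads as a plain `transformers` sequence-classification model and that index 1 is the "match" class; the repository's own wrapper, shown under Usage below, is the supported interface.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the checkpoint is loadable as a standard two-class
# sequence-classification cross-encoder
model_name = 'claritylab/zero-shot-vanilla-binary-bert'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

text = "I'd like to have this track onto my Classical Relaxations playlist."
labels = ['Add To Playlist', 'Get Weather', 'Play Music']

# Encode every (text, label) pair as one batched cross-encoder input
enc = tokenizer([text] * len(labels), labels, padding=True, return_tensors='pt')
with torch.no_grad():
    probs = model(**enc).logits.softmax(dim=-1)

# Column 1 is assumed to be the "match" class
print(labels[probs[:, 1].argmax().item()])
```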


## Usage

You can use the model like this:

```python
>>> from zeroshot_classifier.models import BinaryBertCrossEncoder
>>> model = BinaryBertCrossEncoder(model_name='claritylab/zero-shot-vanilla-binary-bert')

>>> text = "I'd like to have this track onto my Classical Relaxations playlist."
>>> labels = [
...     'Add To Playlist', 'Book Restaurant', 'Get Weather', 'Play Music', 'Rate Book', 'Search Creative Work',
...     'Search Screening Event'
... ]
>>> query = [[text, lb] for lb in labels]
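>>> # One binary prediction per (text, label) pair; apply_softmax=True
>>> # normalizes each row into [no-match, match] probabilities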
>>> logits = model.predict(query, apply_softmax=True)
>>> print(logits)

[[1.1909954e-04 9.9988091e-01]
 [9.9997509e-01 2.4927122e-05]
 [9.9997497e-01 2.5082643e-05]
 [2.4483365e-04 9.9975520e-01]
 [9.9996781e-01 3.2211588e-05]
 [9.9985993e-01 1.4002046e-04]
 [9.9976152e-01 2.3845369e-04]]
```
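
Each output row corresponds to one label and sums to one, with the second column holding the probability that the label matches the text. To pick the predicted label (a short follow-up, assuming `predict` returns an array as the printed output suggests):

```python
>>> import numpy as np
>>> # The label with the highest match probability (column 1) wins
>>> print(labels[int(np.argmax(np.asarray(logits)[:, 1]))])
Add To Playlist
```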