<s>
Sentence	B-General_Concept
embedding	I-General_Concept
is	O
the	O
collective	O
name	O
for	O
a	O
set	O
of	O
techniques	O
in	O
natural	B-Language
language	I-Language
processing	I-Language
(	O
NLP	B-Language
)	O
where	O
sentences	O
are	O
mapped	O
to	O
vectors	O
of	O
real	O
numbers	O
.	O
</s>
<s>
Sentence	B-General_Concept
embedding	I-General_Concept
is	O
used	O
by	O
the	O
deep	B-Algorithm
learning	I-Algorithm
software	O
libraries	O
PyTorch	B-Algorithm
and	O
TensorFlow	B-Algorithm
.	O
</s>
<s>
Popular	O
embeddings	O
are	O
based	O
on	O
the	O
hidden	O
layer	O
outputs	O
of	O
transformer	O
models	O
like	O
BERT	B-Algorithm
,	O
see	O
SBERT	O
.	O
</s>
<s>
An	O
alternative	O
direction	O
is	O
to	O
aggregate	O
word	B-General_Concept
embeddings	I-General_Concept
,	O
such	O
as	O
those	O
returned	O
by	O
Word2vec	B-Algorithm
,	O
into	O
sentence	B-General_Concept
embeddings	I-General_Concept
.	O
</s>
<s>
The	O
most	O
straightforward	O
approach	O
is	O
to	O
simply	O
compute	O
the	O
average	O
of	O
word	B-General_Concept
vectors	I-General_Concept
,	O
known	O
as	O
continuous	O
bag-of-words	O
(	O
CBOW	O
)	O
.	O
</s>
<s>
However	O
,	O
more	O
elaborate	O
solutions	O
based	O
on	O
word	B-General_Concept
vector	I-General_Concept
quantization	O
have	O
also	O
been	O
proposed	O
.	O
</s>
<s>
One	O
such	O
approach	O
is	O
the	O
vector	O
of	O
locally	O
aggregated	O
word	B-General_Concept
embeddings	I-General_Concept
(	O
VLAWE	O
)	O
,	O
which	O
demonstrated	O
performance	O
improvements	O
in	O
downstream	O
text	O
classification	O
tasks	O
.	O
</s>
<s>
In	O
the	O
best	O
results	O
are	O
obtained	O
using	O
a	O
BiLSTM	B-Algorithm
network	I-Algorithm
trained	O
on	O
the	O
.	O
</s>
<s>
A	O
slight	O
improvement	O
over	O
previous	O
scores	O
is	O
presented	O
in	O
:	O
SICK-R	O
:	O
0.888	O
and	O
SICK-E	O
:	O
87.8	O
using	O
a	O
concatenation	O
of	O
bidirectional	O
gated	B-Algorithm
recurrent	I-Algorithm
unit	I-Algorithm
.	O
</s>
