<s>
In	O
natural	B-Language
language	I-Language
processing	I-Language
(	O
NLP	B-Language
)	O
,	O
a	O
word	B-General_Concept
embedding	I-General_Concept
is	O
a	O
representation	O
of	O
a	O
word	O
.	O
</s>
<s>
Word	B-General_Concept
embeddings	I-General_Concept
can	O
be	O
obtained	O
using	O
language	B-Language
modeling	I-Language
and	O
feature	B-General_Concept
learning	I-General_Concept
techniques	O
,	O
where	O
words	O
or	O
phrases	O
from	O
the	O
vocabulary	O
are	O
mapped	O
to	O
vectors	O
of	O
real	O
numbers	O
.	O
</s>
<s>
Methods	O
to	O
generate	O
this	O
mapping	O
include	O
neural	B-Architecture
networks	I-Architecture
,	O
dimensionality	B-Algorithm
reduction	I-Algorithm
on	O
the	O
word	O
co-occurrence	B-Algorithm
matrix	I-Algorithm
,	O
probabilistic	O
models	O
,	O
explainable	O
knowledge	O
base	O
methods	O
,	O
and	O
explicit	O
representation	O
in	O
terms	O
of	O
the	O
context	O
in	O
which	O
words	O
appear	O
.	O
</s>
<s>
Word	O
and	O
phrase	O
embeddings	O
,	O
when	O
used	O
as	O
the	O
underlying	O
input	O
representation	O
,	O
have	O
been	O
shown	O
to	O
boost	O
the	O
performance	O
in	O
NLP	B-Language
tasks	O
such	O
as	O
syntactic	B-Language
parsing	I-Language
and	O
sentiment	O
analysis	O
.	O
</s>
<s>
In	O
distributional	O
semantics	O
,	O
a	O
quantitative	O
methodological	O
approach	O
to	O
understanding	O
meaning	O
in	O
observed	O
language	O
,	O
word	B-General_Concept
embeddings	I-General_Concept
or	O
semantic	O
vector	O
space	O
models	O
have	O
been	O
used	O
as	O
a	O
knowledge	O
representation	O
for	O
some	O
time	O
.	O
</s>
<s>
Curse	B-Algorithm
of	I-Algorithm
dimensionality	I-Algorithm
)	O
.	O
</s>
<s>
Reducing	O
the	O
number	O
of	O
dimensions	O
using	O
linear	O
algebraic	O
methods	O
such	O
as	O
singular	O
value	O
decomposition	O
then	O
led	O
to	O
the	O
introduction	O
of	O
latent	O
semantic	O
analysis	O
in	O
the	O
late	O
1980s	O
and	O
the	O
random	B-Algorithm
indexing	I-Algorithm
approach	O
for	O
collecting	O
word	O
co-occurrence	O
contexts	O
.	O
</s>
<s>
provided	O
in	O
a	O
series	O
of	O
papers	O
the	O
"	O
Neural	O
probabilistic	O
language	B-Language
models	I-Language
"	O
to	O
reduce	O
the	O
high	O
dimensionality	O
of	O
word	O
representations	O
in	O
contexts	O
by	O
"	O
learning	O
a	O
distributed	O
representation	O
for	O
words	O
"	O
.	O
</s>
<s>
Word	B-General_Concept
embeddings	I-General_Concept
come	O
in	O
two	O
different	O
styles	O
,	O
one	O
in	O
which	O
words	O
are	O
expressed	O
as	O
vectors	O
of	O
co-occurring	O
words	O
,	O
and	O
another	O
in	O
which	O
words	O
are	O
expressed	O
as	O
vectors	O
of	O
linguistic	O
contexts	O
in	O
which	O
the	O
words	O
occur	O
;	O
these	O
different	O
styles	O
are	O
studied	O
in	O
(	O
Lavelli	O
et	O
al.	O
,	O
2004	O
)	O
.	O
</s>
<s>
Most	O
new	O
word	B-General_Concept
embedding	I-General_Concept
techniques	O
after	O
about	O
2005	O
rely	O
on	O
a	O
neural	B-Architecture
network	I-Architecture
architecture	O
instead	O
of	O
more	O
probabilistic	O
and	O
algebraic	O
models	O
,	O
following	O
foundational	O
work	O
by	O
Yoshua	O
Bengio	O
and	O
colleagues	O
.	O
</s>
<s>
In	O
2013	O
,	O
a	O
team	O
at	O
Google	B-Application
led	O
by	O
Tomas	O
Mikolov	O
created	O
word2vec	B-Algorithm
,	O
a	O
word	B-General_Concept
embedding	I-General_Concept
toolkit	O
that	O
can	O
train	O
vector	O
space	O
models	O
faster	O
than	O
the	O
previous	O
approaches	O
.	O
</s>
<s>
The	O
word2vec	B-Algorithm
approach	O
has	O
been	O
widely	O
used	O
in	O
experimentation	O
and	O
was	O
instrumental	O
in	O
raising	O
interest	O
in	O
word	B-General_Concept
embeddings	I-General_Concept
as	O
a	O
technology	O
,	O
moving	O
the	O
research	O
strand	O
out	O
of	O
specialised	O
research	O
into	O
broader	O
experimentation	O
and	O
eventually	O
paving	O
the	O
way	O
for	O
practical	O
application	O
.	O
</s>
<s>
Historically	O
,	O
one	O
of	O
the	O
main	O
limitations	O
of	O
static	O
word	B-General_Concept
embeddings	I-General_Concept
or	O
word	B-General_Concept
vector	I-General_Concept
space	I-General_Concept
models	O
is	O
that	O
words	O
with	O
multiple	O
meanings	O
are	O
conflated	O
into	O
a	O
single	O
representation	O
(	O
a	O
single	O
vector	O
in	O
the	O
semantic	O
space	O
)	O
.	O
</s>
<s>
"	O
,	O
it	O
is	O
not	O
clear	O
if	O
the	O
term	O
club	O
is	O
related	O
to	O
the	O
word	O
sense	O
of	O
a	O
club	O
sandwich	O
,	O
baseball	O
club	O
,	O
clubhouse	O
,	O
golf	O
club	O
,	O
or	O
any	O
other	O
sense	O
that	O
club	O
might	O
have	O
.	O
</s>
<s>
The	O
necessity	O
to	O
accommodate	O
multiple	O
meanings	O
per	O
word	O
in	O
different	O
vectors	O
(	O
multi-sense	O
embeddings	O
)	O
is	O
the	O
motivation	O
for	O
several	O
contributions	O
in	O
NLP	B-Language
to	O
split	O
single-sense	O
embeddings	O
into	O
multi-sense	O
ones	O
.	O
</s>
<s>
Based	O
on	O
word2vec	B-Algorithm
skip-gram	O
,	O
Multi-Sense	O
Skip-Gram	O
(	O
MSSG	O
)	O
performs	O
word-sense	O
discrimination	O
and	O
embedding	O
simultaneously	O
,	O
improving	O
its	O
training	O
time	O
,	O
while	O
assuming	O
a	O
specific	O
number	O
of	O
senses	O
for	O
each	O
word	O
.	O
</s>
<s>
Combining	O
the	O
prior	O
knowledge	O
of	O
lexical	O
databases	O
(	O
e.g.	O
,	O
WordNet	B-General_Concept
,	O
ConceptNet	B-Application
,	O
BabelNet	B-Application
)	O
,	O
word	B-General_Concept
embeddings	I-General_Concept
and	O
word	B-General_Concept
sense	I-General_Concept
disambiguation	I-General_Concept
,	O
Most	O
Suitable	O
Sense	O
Annotation	O
(	O
MSSA	O
)	O
labels	O
word-senses	O
through	O
an	O
unsupervised	O
and	O
knowledge-based	O
approach	O
,	O
considering	O
a	O
word	O
's	O
context	O
in	O
a	O
pre-defined	O
sliding	O
window	O
.	O
</s>
<s>
Once	O
the	O
words	O
are	O
disambiguated	O
,	O
they	O
can	O
be	O
used	O
in	O
a	O
standard	O
word	B-General_Concept
embedding	I-General_Concept
technique	O
,	O
so	O
multi-sense	O
embeddings	O
are	O
produced	O
.	O
</s>
<s>
The	O
MSSA	O
architecture	O
allows	O
the	O
disambiguation	B-General_Concept
and	O
annotation	O
process	O
to	O
be	O
performed	O
recurrently	O
in	O
a	O
self-improving	O
manner	O
.	O
</s>
<s>
The	O
use	O
of	O
multi-sense	O
embeddings	O
is	O
known	O
to	O
improve	O
performance	O
in	O
several	O
NLP	B-Language
tasks	O
,	O
such	O
as	O
part-of-speech	O
tagging	O
,	O
semantic	O
relation	O
identification	O
,	O
semantic	O
relatedness	O
,	O
named	B-General_Concept
entity	I-General_Concept
recognition	I-General_Concept
and	O
sentiment	O
analysis	O
.	O
</s>
<s>
As	O
of	O
the	O
late	O
2010s	O
,	O
contextually-meaningful	O
embeddings	O
such	O
as	O
ELMo	B-General_Concept
and	O
BERT	B-General_Concept
have	O
been	O
developed	O
.	O
</s>
<s>
Unlike	O
static	O
word	B-General_Concept
embeddings	I-General_Concept
,	O
these	O
embeddings	O
are	O
at	O
the	O
token-level	O
,	O
in	O
that	O
each	O
occurrence	O
of	O
a	O
word	O
has	O
its	O
own	O
embedding	O
.	O
</s>
<s>
These	O
embeddings	O
better	O
reflect	O
the	O
multi-sense	O
nature	O
of	O
words	O
,	O
because	O
occurrences	O
of	O
a	O
word	O
in	O
similar	O
contexts	O
are	O
situated	O
in	O
similar	O
regions	O
of	O
BERT	B-General_Concept
’s	O
embedding	O
space	O
.	O
</s>
<s>
Word	B-General_Concept
embeddings	I-General_Concept
for	O
n-grams	O
in	O
biological	O
sequences	O
(	O
e.g.	O
</s>
<s>
Word	B-General_Concept
embeddings	I-General_Concept
with	O
applications	O
in	O
game	O
design	O
have	O
been	O
proposed	O
by	O
Rabii	O
and	O
Cook	O
as	O
a	O
way	O
to	O
discover	O
emergent	O
gameplay	O
using	O
logs	O
of	O
gameplay	O
data	O
.	O
</s>
<s>
The	O
process	O
requires	O
transcribing	O
actions	O
happening	O
during	O
the	O
game	O
within	O
a	O
formal	O
language	O
and	O
then	O
use	O
the	O
resulting	O
text	O
to	O
create	O
word	B-General_Concept
embeddings	I-General_Concept
.	O
</s>
<s>
The	O
results	O
presented	O
by	O
Rabii	O
and	O
Cook	O
suggest	O
that	O
the	O
resulting	O
vectors	O
can	O
capture	O
expert	O
knowledge	O
about	O
games	O
like	O
chess	B-Application
,	O
which	O
is	O
not	O
explicitly	O
stated	O
in	O
the	O
game	O
's	O
rules	O
.	O
</s>
<s>
in	O
the	O
form	O
of	O
the	O
thought	B-Algorithm
vectors	I-Algorithm
concept	O
.	O
</s>
<s>
In	O
2015	O
,	O
some	O
researchers	O
suggested	O
"	O
skip-thought	O
vectors	O
"	O
as	O
a	O
means	O
to	O
improve	O
the	O
quality	O
of	O
machine	B-Application
translation	I-Application
.	O
</s>
<s>
A	O
more	O
recent	O
and	O
popular	O
approach	O
for	O
representing	O
sentences	O
is	O
Sentence-BERT	O
,	O
or	O
SentenceTransformers	O
,	O
which	O
modifies	O
pre-trained	O
BERT	B-General_Concept
with	O
the	O
use	O
of	O
siamese	O
and	O
triplet	O
network	O
structures	O
.	O
</s>
<s>
Software	O
for	O
training	O
and	O
using	O
word	B-General_Concept
embeddings	I-General_Concept
includes	O
Tomas	O
Mikolov	O
's	O
Word2vec	B-Algorithm
,	O
Stanford	O
University	O
's	O
GloVe	B-General_Concept
,	O
GN-GloVe	O
,	O
Flair	O
embeddings	O
,	O
AllenNLP	O
's	O
ELMo	B-General_Concept
,	O
BERT	B-General_Concept
,	O
fastText	B-Application
,	O
Gensim	B-Application
,	O
Indra	O
,	O
and	O
Deeplearning4j	B-Library
.	O
</s>
<s>
Principal	B-Algorithm
Component	I-Algorithm
Analysis	I-Algorithm
(	O
PCA	O
)	O
and	O
T-Distributed	B-Algorithm
Stochastic	I-Algorithm
Neighbour	I-Algorithm
Embedding	I-Algorithm
(	O
t-SNE	B-Algorithm
)	O
are	O
both	O
used	O
to	O
reduce	O
the	O
dimensionality	O
of	O
word	B-General_Concept
vector	I-General_Concept
spaces	I-General_Concept
and	O
visualize	O
word	B-General_Concept
embeddings	I-General_Concept
and	O
clusters	B-General_Concept
.	O
</s>
<s>
For	O
instance	O
,	O
fastText	B-Application
is	O
also	O
used	O
to	O
calculate	O
word	B-General_Concept
embeddings	I-General_Concept
for	O
text	O
corpora	O
in	O
Sketch	B-Application
Engine	I-Application
that	O
are	O
available	O
online	O
.	O
</s>
<s>
Word	B-General_Concept
embeddings	I-General_Concept
may	O
contain	O
the	O
biases	O
and	O
stereotypes	O
contained	O
in	O
the	O
training	O
dataset	O
,	O
as	O
Bolukbasi	O
et	O
al.	O
point	O
out	O
in	O
“	O
Man	O
is	O
to	O
Computer	O
Programmer	O
as	O
Woman	O
is	O
to	O
Homemaker	O
?	O
Debiasing	O
Word	B-General_Concept
Embeddings	I-General_Concept
”	O
that	O
a	O
publicly	O
available	O
(	O
and	O
popular	O
)	O
word2vec	B-Algorithm
embedding	O
trained	O
on	O
Google	B-Application
News	O
texts	O
(	O
a	O
commonly	O
used	O
data	O
corpus	O
)	O
,	O
which	O
consists	O
of	O
text	O
written	O
by	O
professional	O
journalists	O
,	O
still	O
shows	O
disproportionate	O
word	O
associations	O
reflecting	O
gender	O
and	O
racial	O
biases	O
when	O
extracting	O
word	O
analogies	O
.	O
</s>
<s>
For	O
example	O
,	O
one	O
of	O
the	O
analogies	O
generated	O
using	O
the	O
aforementioned	O
word	B-General_Concept
embedding	I-General_Concept
is	O
“	O
man	O
is	O
to	O
computer	O
programmer	O
as	O
woman	O
is	O
to	O
homemaker	O
”	O
.	O
</s>
<s>
Applying	O
these	O
trained	O
word	B-General_Concept
embeddings	I-General_Concept
without	O
careful	O
oversight	O
likely	O
perpetuates	O
existing	O
bias	O
in	O
society	O
,	O
which	O
is	O
introduced	O
through	O
unaltered	O
training	O
data	O
.	O
</s>
<s>
Furthermore	O
,	O
word	B-General_Concept
embeddings	I-General_Concept
can	O
even	O
amplify	O
these	O
biases	O
.	O
</s>
<s>
Given	O
word	B-General_Concept
embeddings	I-General_Concept
'	O
popular	O
usage	O
in	O
NLP	B-Language
applications	O
such	O
as	O
search	O
ranking	O
,	O
CV	O
parsing	O
and	O
recommendation	O
systems	O
,	O
the	O
biases	O
that	O
exist	O
in	O
pre-trained	O
word	B-General_Concept
embeddings	I-General_Concept
may	O
have	O
further-reaching	O
impact	O
than	O
we	O
realize	O
.	O
</s>
