<s>
Mixture	B-Algorithm
of	I-Algorithm
experts	I-Algorithm
(	O
MoE	B-Algorithm
)	O
is	O
a	O
machine	O
learning	O
technique	O
where	O
multiple	O
expert	O
networks	O
(	O
learners	O
)	O
are	O
used	O
to	O
divide	O
a	O
problem	O
space	O
into	O
homogeneous	O
regions	O
.	O
</s>
<s>
It	O
differs	O
from	O
ensemble	B-Algorithm
techniques	I-Algorithm
in	O
that	O
typically	O
only	O
one	O
or	O
a	O
few	O
expert	O
models	O
will	O
be	O
run	O
,	O
rather	O
than	O
combining	O
results	O
from	O
all	O
models	O
.	O
</s>
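The sentence above describes sparse routing: a gating function scores the experts and only the top-k are actually run, instead of averaging all of them as an ensemble would. A minimal NumPy sketch of this idea (all names, shapes, and the linear experts are illustrative assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, n_experts, k = 4, 3, 8, 2

# Each expert is a simple linear map; the gate is another linear map.
experts = [rng.standard_normal((d_in, d_out)) for _ in range(n_experts)]
gate_w = rng.standard_normal((d_in, n_experts))

def moe_forward(x):
    """Route input x to the top-k experts and mix their outputs."""
    logits = x @ gate_w                      # one gate score per expert
    top = np.argsort(logits)[-k:]            # indices of the k best experts
    # Softmax over the selected scores only (sparse gating).
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()
    # Only the chosen experts run; the rest are skipped entirely.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

y = moe_forward(rng.standard_normal(d_in))
```

With k=1 this degenerates to hard routing (a single expert per input), which is the usual contrast with ensembles drawn in the sentence above.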
<s>
An	O
example	O
from	O
computer	B-Application
vision	I-Application
is	O
combining	O
one	O
neural	B-Architecture
network	I-Architecture
model	O
for	O
human	B-General_Concept
detection	I-General_Concept
with	O
another	O
for	O
pose	B-General_Concept
estimation	I-General_Concept
.	O
</s>
<s>
If	O
the	O
output	O
is	O
conditioned	O
on	O
multiple	O
levels	O
of	O
(	O
probabilistic	O
)	O
gating	O
functions	O
,	O
the	O
mixture	O
is	O
called	O
a	O
hierarchical	O
mixture	B-Algorithm
of	I-Algorithm
experts	I-Algorithm
.	O
</s>
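The sentence above says a hierarchical mixture of experts conditions the output on multiple levels of probabilistic gating: a top-level gate chooses among groups of experts, a second-level gate chooses within the group, and the two gate probabilities multiply. A hedged sketch under the same illustrative assumptions (scalar-output linear experts, made-up names and shapes):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
n_groups, per_group = 2, 3

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# One gate over groups, and one gate per group over its experts.
top_gate = rng.standard_normal((d, n_groups))
sub_gates = rng.standard_normal((n_groups, d, per_group))
experts = rng.standard_normal((n_groups, per_group, d))  # each expert maps x to a scalar

def hierarchical_moe(x):
    """Sum over experts weighted by the product of the two gate probabilities."""
    p_group = softmax(x @ top_gate)
    out = 0.0
    for g in range(n_groups):
        p_expert = softmax(x @ sub_gates[g])
        for e in range(per_group):
            # Two levels of (probabilistic) gating multiply together.
            out += p_group[g] * p_expert[e] * (experts[g, e] @ x)
    return out

val = hierarchical_moe(rng.standard_normal(d))
```

Because each softmax sums to 1, the products `p_group[g] * p_expert[e]` form a valid probability distribution over all experts, which is what makes the gating "probabilistic" at both levels.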
<s>
It	O
uses	O
multiple	O
MoE	B-Algorithm
models	O
that	O
share	O
capacity	O
for	O
use	O
by	O
low-resource	O
language	O
models	O
with	O
relatively	O
little	O
data	O
.	O
</s>
