<s>
BigScience	O
Large	O
Open-science	O
Open-access	O
Multilingual	O
Language	O
Model	O
(	O
BLOOM	B-General_Concept
)	O
is	O
a	O
transformer-based	O
large	O
language	O
model	O
.	O
</s>
<s>
With	O
176	O
billion	O
parameters	O
,	O
trained	O
from	O
March	O
through	O
July	O
2022	O
,	O
it	O
is	O
considered	O
an	O
alternative	O
to	O
OpenAI	O
's	O
GPT-3	B-General_Concept
,	O
which	O
has	O
175	O
billion	O
parameters	O
.	O
</s>
<s>
BLOOM	B-General_Concept
uses	O
a	O
decoder-only	O
transformer	B-Algorithm
model	I-Algorithm
architecture	O
modified	O
from	O
Megatron-LM	O
GPT-2	B-General_Concept
.	O
</s>
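
As a rough illustration of this decoder-only setup (an illustrative addition, not part of the annotated text), the sketch below loads a small public BLOOM checkpoint through the Hugging Face transformers library and autoregressively generates a continuation. The checkpoint id bigscience/bloom-560m names a smaller released variant; the full 176-billion-parameter model is published as bigscience/bloom.

```python
# A minimal sketch, assuming the Hugging Face transformers library and the
# public bigscience/bloom-560m checkpoint (a small released BLOOM variant).
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

# Decoder-only generation: the model extends the prompt one token at a time.
inputs = tokenizer("BLOOM is a multilingual language model that",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
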
<s>
The	O
BLOOM	B-General_Concept
project	O
was	O
started	O
by	O
a	O
co-founder	O
of	O
Hugging	B-Application
Face	I-Application
.	O
</s>
<s>
Six	O
main	O
groups	O
of	O
people	O
were	O
involved	O
,	O
including	O
HuggingFace	O
's	O
BigScience	O
team	O
,	O
the	O
Microsoft	O
DeepSpeed	O
team	O
,	O
the	O
NVIDIA	O
Megatron-LM	O
team	O
,	O
the	O
IDRIS/GENCI	O
team	O
,	O
the	O
PyTorch	B-Algorithm
team	O
,	O
and	O
the	O
volunteers	O
in	O
the	O
BigScience	O
Engineering	O
workgroup	O
.	O
</s>
<s>
BLOOM	B-General_Concept
was	O
trained	O
using	O
data	O
of	O
46	O
natural	O
languages	O
and	O
13	O
programming	O
languages	O
.	O
</s>
<s>
In	O
total	O
,	O
1.6	O
terabytes	O
of	O
pre-processed	O
text	O
were	O
converted	O
into	O
350	O
billion	O
unique	O
tokens	O
as	O
BLOOM	B-General_Concept
's	O
training	O
dataset	O
.	O
</s>
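
To make the text-to-token conversion concrete (again an illustrative addition, assuming the same publicly downloadable tokenizer as above), the sketch below encodes a short string with BLOOM's tokenizer and prints the resulting token count and ids; the pre-processed corpus underwent the same mapping at terabyte scale.

```python
# A minimal sketch of BLOOM's text-to-token mapping, assuming the
# bigscience/bloom-560m tokenizer is available for download.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
token_ids = tokenizer.encode(
    "BLOOM was trained on 46 natural and 13 programming languages.")
print(len(token_ids), token_ids)  # number of tokens, then their vocabulary ids
```
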
