Getting the following error with vLLM: KeyError: 'ministral3'
#1 opened by avishekjana
vllm-server-1 | (APIServer pid=1)     text_config = CONFIG_MAPPING[text_config["model_type"]]
vllm-server-1 | (APIServer pid=1)                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
vllm-server-1 | (APIServer pid=1)   File "/usr/local/lib/python3.12/dist-packages/transformers/models/auto/configuration_auto.py", line 1049, in __getitem__
vllm-server-1 | (APIServer pid=1)     raise KeyError(key)
vllm-server-1 | (APIServer pid=1) KeyError: 'ministral3'
Docker image: vllm/vllm-openai:latest
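You can check which Transformers version ships inside the image before starting the server (a quick sanity check, assuming you can override the image's entrypoint):

docker run --rm --entrypoint python3 vllm/vllm-openai:latest \
  -c "import transformers; print(transformers.__version__)"

If this prints a version that predates ministral3 support, the CONFIG_MAPPING lookup above will fail exactly as shown in the log.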
The Transformers version needed an update; after upgrading, the mapping resolves:
>>> import transformers
>>> from transformers.models.auto import CONFIG_MAPPING
>>> CONFIG_MAPPING['ministral3']
<class 'transformers.models.ministral3.configuration_ministral3.Ministral3Config'>
>>> transformers.__version__
'5.0.0.dev0'
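One way to get a new enough Transformers into the container is a thin derived image (a sketch, assuming the ministral3 support you need is still only on the transformers main branch, as the 5.0.0.dev0 version above suggests):

FROM vllm/vllm-openai:latest
RUN pip install --no-cache-dir --upgrade git+https://github.com/huggingface/transformers.git

Build and use this image in place of vllm/vllm-openai:latest in your compose file; once an official release includes Ministral3Config, pinning that release is the cleaner option.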
You also need to make sure you add these flags at the end:
vllm serve ... --tokenizer_mode mistral --config_format mistral --load_format mistral \
--enable-auto-tool-choice --tool-call-parser mistral
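Putting it together, a full invocation looks like this (the model ID below is a placeholder; substitute the checkpoint you are actually serving):

vllm serve <your-ministral3-model-id> \
  --tokenizer_mode mistral --config_format mistral --load_format mistral \
  --enable-auto-tool-choice --tool-call-parser mistral

Once the server is up, you can sanity-check it with a plain chat request against the OpenAI-compatible endpoint (port 8000 is vLLM's default):

curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "<your-ministral3-model-id>", "messages": [{"role": "user", "content": "Hello"}]}'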