Anyone have a script to run this with vLLM?

#15
by Ataylorm - opened

I have been running this on vLLM for months without any issues. Last night RunPod sent me an email saying there was an issue with my pod, which turned out to mean they literally wiped it. Now I am trying to get it set up again, but the scripts I used to run it no longer work, failing with a variety of errors. I am guessing something has changed in vLLM that is causing these issues. Does anyone have a working setup script they could share?

Here is my latest error:
Errors

Here is my run script:
Run

Here is my setup script:
Setup

I know that the Docker image for vLLM v0.10.2 works, so I would try installing vLLM v0.10.2 specifically, or, if possible, use that Docker image as the base for your pod (I'm not familiar with what RunPod requires).
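As a rough sketch of the two options above (assuming a pip-based pod template; the `vllm/vllm-openai` image name is the usual official vLLM server image, so double-check it against your registry):

```shell
# Known-good release per the reply above; adjust if you confirm a different one.
VLLM_VERSION="v0.10.2"

# Option 1: pin the wheel inside an existing pod image:
#   pip install "vllm==${VLLM_VERSION#v}"

# Option 2: base the pod on the matching official Docker image:
#   docker pull vllm/vllm-openai:${VLLM_VERSION}

# ${VLLM_VERSION#v} strips the leading "v" for the pip requirement specifier.
echo "pinning vLLM to ${VLLM_VERSION#v}"
```

Pinning an exact version in the setup script also protects against the pod silently picking up a newer vLLM on the next rebuild.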

If that fails, let me know which versions of vLLM, torch, and transformers your pod is running.
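A quick way to collect those versions (a sketch, assuming `python` is on the pod's PATH and the packages go by their usual PyPI distribution names):

```shell
# Print the installed version of each package, or "not installed" if missing.
for pkg in vllm torch transformers; do
  python - "$pkg" <<'EOF'
import importlib.metadata as md
import sys

name = sys.argv[1]
try:
    print(name, md.version(name))
except md.PackageNotFoundError:
    print(name, "not installed")
EOF
done
```

Pasting that output into the thread makes version mismatches much faster to spot than error tracebacks alone.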

