Serving with vLLM's OpenAI-compatible API?
#15
by prudant
Can you provide an example of serving this model through vLLM's OpenAI-compatible API, hosting it inside a vLLM Docker container?
Regards!
There is none! I tried installing it with vLLM and it didn't run. Waiting for it, to be honest; so far the model is looking promising.
Would be much appreciated indeed. It will likely have to wait for the vLLM v0.11.1 release for this model architecture to be supported.
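
Once the architecture lands in vLLM, serving should follow the standard vLLM OpenAI-compatible Docker flow. A minimal sketch below, assuming the official `vllm/vllm-openai` image; `<org>/<model>` is a placeholder, substitute this repo's model id:

```bash
# Start the OpenAI-compatible server in the official vLLM container
# (requires a vLLM version that supports this model architecture).
docker run --gpus all --ipc=host -p 8000:8000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  vllm/vllm-openai:latest \
  --model <org>/<model>

# Smoke-test the server via the OpenAI-style chat completions endpoint.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "<org>/<model>",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

Any OpenAI SDK can then be pointed at `http://localhost:8000/v1` with a dummy API key.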