Space: Luigi / ZeroGPU-LLM-Inference
ZeroGPU-LLM-Inference runs on CPU, not ZeroGPU
#1 · opened 1 day ago by jglowa
Discussion
jglowa · 1 day ago
It actually runs on CPU instead of ZeroGPU. Please enable ZeroGPU.
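
For context, a Gradio Space normally opts into ZeroGPU in two steps: the Space hardware is set to ZeroGPU in its settings, and the GPU-dependent function in `app.py` is decorated with `@spaces.GPU`; without both, the decorator is a no-op and the app falls back to CPU. Below is a minimal sketch of that usual pattern, assuming a transformers-based Gradio app; the model name and `generate` function are illustrative and are not taken from this Space's actual code.

```python
# Minimal sketch of the usual ZeroGPU pattern (illustrative, not this Space's app.py).
import gradio as gr
import spaces  # provides the ZeroGPU decorator on Hugging Face Spaces
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Qwen/Qwen2.5-0.5B-Instruct"  # hypothetical example model
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.float16)
model.to("cuda")  # on ZeroGPU hardware, CUDA use outside the decorated function is handled at startup

@spaces.GPU  # requests a ZeroGPU slice for the duration of each call
def generate(prompt: str) -> str:
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    output_ids = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

gr.Interface(fn=generate, inputs="text", outputs="text").launch()
```

If the Space's hardware is still set to CPU basic, `@spaces.GPU` has no effect and everything runs on CPU, which would match the behavior reported above.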