---
library_name: "transformers.js"
---

This repository contains https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english with ONNX weights, to be compatible with Transformers.js.

Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
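
For reference, here is a minimal sketch of that conversion step, assuming Optimum's ONNX Runtime Python API; the model ID and output directory are placeholders, and you may still need to move the exported `.onnx` weights into an `onnx` subfolder to match the layout described above.

```python
# Sketch: export a Transformers checkpoint to ONNX with 🤗 Optimum.
# The model ID and output directory below are illustrative placeholders.
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-sst-2-english"

# export=True converts the PyTorch weights to ONNX while loading
ort_model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Save the exported files, then place the *.onnx weights under `onnx/` in your
# model repo, keeping the config and tokenizer files at the repo root.
ort_model.save_pretrained("exported-model")
tokenizer.save_pretrained("exported-model")
```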
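
The example below embeds a sentiment-analysis pipeline in a [Gradio-Lite](https://www.gradio.app/guides/gradio-lite) page that runs entirely in the browser, calling Transformers.js from Python via the `transformers_js_py` package: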

```html
<html>
  <head>
    <script type="module" crossorigin src="https://cdn.jsdelivr.net/npm/@gradio/lite/dist/lite.js"></script>
    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@gradio/lite/dist/lite.css" />
  </head>
  <body>
    <gradio-lite>
      <gradio-requirements>
        transformers_js_py
      </gradio-requirements>

      <gradio-file name="app.py" entrypoint>
        from transformers_js import import_transformers_js
        import gradio as gr

        transformers = await import_transformers_js()
        pipeline = transformers.pipeline
        pipe = await pipeline('sentiment-analysis', 'osanseviero/distilbert-base-uncased-finetuned-quantized')

        async def classify(text):
            return await pipe(text)

        demo = gr.Interface(classify, "textbox", "json")
        demo.launch()
      </gradio-file>
    </gradio-lite>
  </body>
</html>
```
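
Because Gradio-Lite runs Gradio in the browser via Pyodide, this page can be served as a plain static HTML file; no Python backend is required.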