This model was converted to ONNX format from grammarly/coedit-xl using ONNX Runtime. Refer to the original model card for more details on the model.
It was then quantized to 8-bit by A Cool student.
Model tree for IDK100boysaj/coedit-xl-onnx-8bit
Base model: grammarly/coedit-xl