# Optimum

## Docs

- [Installation](https://huggingface.co/docs/optimum/v0.0.1/installation.md)
- [Quickstart](https://huggingface.co/docs/optimum/v0.0.1/quickstart.md)
- [🤗 Optimum ONNX](https://huggingface.co/docs/optimum/v0.0.1/index.md)
- [Overview](https://huggingface.co/docs/optimum/v0.0.1/onnx/overview.md)
- [Export a model to ONNX with optimum.exporters.onnx](https://huggingface.co/docs/optimum/v0.0.1/onnx/usage_guides/export_a_model.md)
- [Adding support for an unsupported architecture](https://huggingface.co/docs/optimum/v0.0.1/onnx/usage_guides/contribute.md)
- [Configuration classes for ONNX exports](https://huggingface.co/docs/optimum/v0.0.1/onnx/package_reference/configuration.md)
- [Export functions](https://huggingface.co/docs/optimum/v0.0.1/onnx/package_reference/export.md)
- [Overview](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/overview.md)
- [Quickstart](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/quickstart.md)
- [Accelerated inference on AMD GPUs supported by ROCm](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/usage_guides/amdgpu.md)
- [Quantization](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/usage_guides/quantization.md)
- [Optimum Inference with ONNX Runtime](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/usage_guides/models.md)
- [Inference pipelines with the ONNX Runtime accelerator](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/usage_guides/pipelines.md)
- [Accelerated inference on NVIDIA GPUs](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/usage_guides/gpu.md)
- [Optimization](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/usage_guides/optimization.md)
- [ONNX Runtime Diffusion Pipelines](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/package_reference/modeling_diffusion.md)
- [Configuration](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/package_reference/configuration.md)
- [ONNX Runtime Models](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/package_reference/modeling.md)
- [Quantization](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/package_reference/quantization.md)
- [ONNX Runtime Pipelines](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/package_reference/pipelines.md)
- [Optimization](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/package_reference/optimization.md)
- [ONNX 🤝 ONNX Runtime](https://huggingface.co/docs/optimum/v0.0.1/onnxruntime/concept_guides/onnx.md)
