# Lighteval

## Docs

- [Using Inference Providers as Backend](https://huggingface.co/docs/lighteval/v0.12.0/use-inference-providers-as-backend.md)
- [Lighteval](https://huggingface.co/docs/lighteval/v0.12.0/index.md)
- [Caching System](https://huggingface.co/docs/lighteval/v0.12.0/caching.md)
- [Using Hugging Face Inference Endpoints or TGI as Backend](https://huggingface.co/docs/lighteval/v0.12.0/use-huggingface-inference-endpoints-or-tgi-as-backend.md)
- [Adding a New Metric](https://huggingface.co/docs/lighteval/v0.12.0/adding-a-new-metric.md)
- [Quick Tour](https://huggingface.co/docs/lighteval/v0.12.0/quicktour.md)
- [Using SGLang as Backend](https://huggingface.co/docs/lighteval/v0.12.0/use-sglang-as-backend.md)
- [Using VLLM as Backend](https://huggingface.co/docs/lighteval/v0.12.0/use-vllm-as-backend.md)
- [Using LiteLLM as Backend](https://huggingface.co/docs/lighteval/v0.12.0/use-litellm-as-backend.md)
- [Metric List](https://huggingface.co/docs/lighteval/v0.12.0/metric-list.md)
- [Adding a Custom Task](https://huggingface.co/docs/lighteval/v0.12.0/adding-a-custom-task.md)
- [Installation](https://huggingface.co/docs/lighteval/v0.12.0/installation.md)
- [Evaluate your model with Inspect-AI](https://huggingface.co/docs/lighteval/v0.12.0/inspect-ai.md)
- [Contributing to Multilingual Evaluations](https://huggingface.co/docs/lighteval/v0.12.0/contributing-to-multilingual-evaluations.md)
- [Saving and Reading Results](https://huggingface.co/docs/lighteval/v0.12.0/saving-and-reading-results.md)
- [Using the Python API](https://huggingface.co/docs/lighteval/v0.12.0/using-the-python-api.md)
- [Evaluating Custom Models](https://huggingface.co/docs/lighteval/v0.12.0/evaluating-a-custom-model.md)
- [Available tasks](https://huggingface.co/docs/lighteval/v0.12.0/available-tasks.md)
- [Logging](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/logging.md)
- [EvaluationTracker](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/evaluation_tracker.md)
- [Model's Output](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/models_outputs.md)
- [Doc](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/doc.md)
- [Tasks](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/tasks.md)
- [Model Configs](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/models.md)
- [Metrics](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/metrics.md)
- [Pipeline](https://huggingface.co/docs/lighteval/v0.12.0/package_reference/pipeline.md)
