Collection: Falcon-H1, a family of Hybrid-Head Language Models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B variants (pretrained & instruction-tuned).
Article: Training and Finetuning Reranker Models with Sentence Transformers v4 (published Mar 26).
Paper: A Tale of Trust and Accuracy: Base vs. Instruct LLMs in RAG Systems (arXiv:2406.14972, published Jun 21, 2024).
Paper: The Power of Noise: Redefining Retrieval for RAG Systems (arXiv:2401.14887, published Jan 26, 2024).