---
license: mit
library_name: transformers
pipeline_tag: text-generation
---

We introduce LLaDA (<b>L</b>arge <b>La</b>nguage <b>D</b>iffusion with m<b>A</b>sking), a diffusion language model at an unprecedented 8B scale, trained entirely from scratch, that rivals LLaMA3 8B in performance, as described in [the paper](https://hf.co/papers/2502.09992).

Project page: https://ml-gsai.github.io/LLaDA-demo/.

For code and sample usage, see https://github.com/ML-GSAI/SMDM.
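Since the weights are distributed in `transformers` format (see the metadata above), they can presumably be loaded with the standard Auto classes. The sketch below is a minimal, hedged example: the repository id `GSAI-ML/LLaDA-8B-Base` and the use of `trust_remote_code=True` are assumptions, so check the project page and the GitHub repository for the exact model name and the official sampling code. Note that LLaDA is a masked diffusion model, so text generation uses an iterative denoising loop rather than standard autoregressive `.generate()`.

```python
# Minimal sketch for loading LLaDA with Hugging Face transformers.
# Assumptions (not confirmed by this card): the repo id below and the
# need for trust_remote_code=True to pick up the custom model class.


def load_llada(model_id: str = "GSAI-ML/LLaDA-8B-Base"):
    """Download and return (tokenizer, model). The 8B weights are large,
    so this requires substantial disk space and memory."""
    # Imported lazily so merely defining this function has no side effects.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_llada()
    print(type(model).__name__)
```

For the actual diffusion sampling procedure (iteratively unmasking tokens), refer to the official code at https://github.com/ML-GSAI/SMDM.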