Model Card for AICrossSim/bitflip-clm-1.1b

A 1.1B-parameter bitflip-aware language model trained on 22 × 1.1B (≈24.2B) tokens from the FineWeb-Edu dataset.

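As a quick sanity check on the training budget (a sketch; the 22× tokens-per-parameter multiplier and the 1.1B parameter count are taken from the card itself):

```python
# Training-token budget implied by the card:
# 22 tokens per parameter (a Chinchilla-style ratio) times 1.1B parameters.
params = 1.1e9
tokens = 22 * params
print(f"{tokens / 1e9:.1f}B training tokens")  # → 24.2B training tokens
```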
Model Details

bitflip-aixsim-1.1B is a transformer-based language model with approximately 1.1 billion parameters (excluding embedding-layer parameters). It uses RMSNorm for normalization and is trained on the FineWeb-Edu dataset.
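The card does not include a usage snippet. Assuming the checkpoint follows the standard Hugging Face `transformers` causal-LM layout (an assumption, not confirmed by the card; the repo id below is taken from the card title), loading and generating would look like:

```python
# Hypothetical usage sketch: assumes the checkpoint loads as a standard
# transformers causal LM. Repo id taken from the card title.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "AICrossSim/bitflip-clm-1.1b"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```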

Training Details

The experiment setup and training logs are available in the accompanying Weights & Biases (wandb) run.

The checkpoint is distributed as safetensors (1B params, F32 tensor type).
