Dataset Viewer
| text (string) |
|---|
| conditioner.embedders.0.transformer.text_model.embeddings.position_embedding.weight |
| conditioner.embedders.0.transformer.text_model.embeddings.position_ids |
| conditioner.embedders.0.transformer.text_model.embeddings.token_embedding.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.layer_norm1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.layer_norm1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.layer_norm2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.layer_norm2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.mlp.fc1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.mlp.fc1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.mlp.fc2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.mlp.fc2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.k_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.k_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.out_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.out_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.q_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.q_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.v_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.0.self_attn.v_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.layer_norm1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.layer_norm1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.layer_norm2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.layer_norm2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.mlp.fc1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.mlp.fc1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.mlp.fc2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.mlp.fc2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.k_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.k_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.out_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.out_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.q_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.q_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.v_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.1.self_attn.v_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.layer_norm1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.layer_norm1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.layer_norm2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.layer_norm2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.mlp.fc1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.mlp.fc1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.mlp.fc2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.mlp.fc2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.k_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.k_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.out_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.out_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.q_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.q_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.v_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.10.self_attn.v_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.layer_norm1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.layer_norm1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.layer_norm2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.layer_norm2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.mlp.fc1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.mlp.fc1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.mlp.fc2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.mlp.fc2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.k_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.k_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.out_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.out_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.q_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.q_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.v_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.11.self_attn.v_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.layer_norm1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.layer_norm1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.layer_norm2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.layer_norm2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.mlp.fc1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.mlp.fc1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.mlp.fc2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.mlp.fc2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.k_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.k_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.out_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.out_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.q_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.q_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.v_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.2.self_attn.v_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.layer_norm1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.layer_norm1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.layer_norm2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.layer_norm2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.mlp.fc1.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.mlp.fc1.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.mlp.fc2.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.mlp.fc2.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.k_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.k_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.out_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.out_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.q_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.q_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.v_proj.bias |
| conditioner.embedders.0.transformer.text_model.encoder.layers.3.self_attn.v_proj.weight |
| conditioner.embedders.0.transformer.text_model.encoder.layers.4.layer_norm1.bias |
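
The rows above appear to be state-dict keys for the first text encoder (a CLIP text model) nested under `conditioner.embedders.0`, as found in SGM/SDXL-style checkpoints, listed in lexicographic order. A minimal sketch of how such a listing could be reproduced from a `.safetensors` checkpoint follows; the checkpoint filename is a hypothetical placeholder, and the `safetensors` library (with PyTorch available for `framework="pt"`) is assumed.

```python
# Minimal sketch: enumerate text-encoder keys from a .safetensors checkpoint.
# "sd_xl_base_1.0.safetensors" is a hypothetical placeholder path.
from safetensors import safe_open

CHECKPOINT = "sd_xl_base_1.0.safetensors"
PREFIX = "conditioner.embedders.0.transformer.text_model."

with safe_open(CHECKPOINT, framework="pt", device="cpu") as f:
    # Keep only the text-encoder keys and sort them lexicographically,
    # which matches the ordering shown in the preview table.
    keys = sorted(k for k in f.keys() if k.startswith(PREFIX))

for key in keys:
    print(key)
```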
End of preview.