Source dump statistics: one string column holding 12 distinct module names and one float64 column named `values`.

| name | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 1.299904 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.815616 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451648 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.86608 |
| megatron.core.transformer.mlp.forward.activation | 0.087232 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.843648 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.808864 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452 |
| megatron.core.transformer.attention.forward.qkv | 0.46848 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 5.015424 |
| megatron.core.transformer.attention.forward.linear_proj | 1.317184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.82464 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45184 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.867488 |
| megatron.core.transformer.mlp.forward.activation | 0.08784 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.84096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.808256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452256 |
| megatron.core.transformer.attention.forward.qkv | 0.470208 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 5.01424 |
| megatron.core.transformer.attention.forward.linear_proj | 1.30512 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.813312 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451872 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.865664 |
| megatron.core.transformer.mlp.forward.activation | 0.088064 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.843584 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.809312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451392 |
| megatron.core.transformer.attention.forward.qkv | 0.472352 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 5.013504 |
| megatron.core.transformer.attention.forward.linear_proj | 1.30336 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.812992 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451392 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.86576 |
| megatron.core.transformer.mlp.forward.activation | 0.08768 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.851168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.81664 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451936 |
| megatron.core.transformer.attention.forward.qkv | 0.469952 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 5.019552 |
| megatron.core.transformer.attention.forward.linear_proj | 1.28496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.798624 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45152 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.868704 |
| megatron.core.transformer.mlp.forward.activation | 0.087232 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.840256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.808224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.45296 |
| megatron.core.transformer.attention.forward.qkv | 0.471104 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003072 |
| megatron.core.transformer.attention.forward.core_attention | 5.018624 |
| megatron.core.transformer.attention.forward.linear_proj | 1.309184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.824896 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451552 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.869088 |
| megatron.core.transformer.mlp.forward.activation | 0.088064 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.845184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.814176 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452192 |
| megatron.core.transformer.attention.forward.qkv | 0.472736 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 5.009888 |
| megatron.core.transformer.attention.forward.linear_proj | 1.311104 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.817472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451808 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.864 |
| megatron.core.transformer.mlp.forward.activation | 0.08784 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.854208 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.81792 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452416 |
| megatron.core.transformer.attention.forward.qkv | 0.472192 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 5.014368 |
| megatron.core.transformer.attention.forward.linear_proj | 1.299616 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.809728 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452448 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.866112 |
| megatron.core.transformer.mlp.forward.activation | 0.089792 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.84112 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.808928 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.45104 |
| megatron.core.transformer.attention.forward.qkv | 0.47104 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 5.023072 |
| megatron.core.transformer.attention.forward.linear_proj | 1.28912 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.80688 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452128 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.865216 |
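The 12 module names repeat in per-layer groups, so the table is easiest to read after aggregating by name. Below is a minimal sketch that does this with pandas; it assumes the dump is available as a two-column CSV with the reconstructed headers `name` and `values` (the file path and the interpretation of `values` as a per-call measurement, e.g. a forward time, are assumptions, not stated by the dump itself).

```python
import pandas as pd

# Hypothetical path: the table above exported as a two-column CSV
# with header "name,values".
df = pd.read_csv("megatron_forward_values.csv")

# Mean measurement and sample count per module name, largest first.
summary = (
    df.groupby("name")["values"]
      .agg(["mean", "count"])
      .sort_values("mean", ascending=False)
)
print(summary)
```

On the rows shown above, such an aggregation would surface `core_attention` (around 5.0) and `linear_fc2` (around 1.84) as the largest per-module `values`, with `adjust_key_value` and `rotary_pos_emb` close to zero.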