| 0 (stringclasses, 12 classes) | 1 (float64, 0 – 2.17k) |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.471296 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 5.027552 |
| megatron.core.transformer.attention.forward.linear_proj | 1.278752 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.801184 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452352 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.866464 |
| megatron.core.transformer.mlp.forward.activation | 0.08768 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.839744 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.805408 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451264 |
| megatron.core.transformer.attention.forward.qkv | 0.469888 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 5.028096 |
| megatron.core.transformer.attention.forward.linear_proj | 1.298592 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.820704 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451136 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.868448 |
| megatron.core.transformer.mlp.forward.activation | 0.088416 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.833184 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.801632 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.450528 |
| megatron.core.transformer.attention.forward.qkv | 0.468448 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 5.014816 |
| megatron.core.transformer.attention.forward.linear_proj | 1.304864 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.812192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.450816 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.867136 |
| megatron.core.transformer.mlp.forward.activation | 0.087936 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.84128 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.808256 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.450816 |
| megatron.core.transformer.attention.forward.qkv | 0.470528 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 4.99856 |
| megatron.core.transformer.attention.forward.linear_proj | 1.3136 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.806528 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451008 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.86688 |
| megatron.core.transformer.mlp.forward.activation | 0.088352 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.837856 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.804992 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.450848 |
| megatron.core.transformer.attention.forward.qkv | 0.46944 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 5.014656 |
| megatron.core.transformer.attention.forward.linear_proj | 1.308096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.816256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45072 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.866752 |
| megatron.core.transformer.mlp.forward.activation | 0.088064 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.84176 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.808896 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.450048 |
| megatron.core.transformer.attention.forward.qkv | 0.475904 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 5.018816 |
| megatron.core.transformer.attention.forward.linear_proj | 1.308128 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.827008 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452256 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.856352 |
| megatron.core.transformer.mlp.forward.activation | 0.087584 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.843552 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.799296 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452224 |
| megatron.core.transformer.attention.forward.qkv | 0.469792 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 5.03008 |
| megatron.core.transformer.attention.forward.linear_proj | 1.30256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.825696 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45216 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.864736 |
| megatron.core.transformer.mlp.forward.activation | 0.08784 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.851552 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.816 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451744 |
| megatron.core.transformer.attention.forward.qkv | 0.4712 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 5.009152 |
| megatron.core.transformer.attention.forward.linear_proj | 1.31616 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.819936 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451776 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.863136 |
| megatron.core.transformer.mlp.forward.activation | 0.088064 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.857472 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.820384 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451424 |
| megatron.core.transformer.attention.forward.qkv | 0.472064 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003072 |
| megatron.core.transformer.attention.forward.core_attention | 5.020288 |
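The rows repeat the same 12 timed sub-operations once per transformer layer, and the `self_attention` and `mlp` entries appear to aggregate the sub-operations listed above them. A minimal sanity check against the first layer's rows, assuming that aggregation and that all values share the same time unit (the dump does not state one):

```python
# Values copied from the first layer's rows in the table above.
# The "parent aggregates its children" relationship is inferred from the
# naming, not confirmed by the dump itself.
attn_parts = 0.471296 + 0.002912 + 0.002976 + 5.027552 + 1.278752
# qkv + adjust_key_value + rotary_pos_emb + core_attention + linear_proj

mlp_parts = 0.866464 + 0.08768 + 1.839744
# linear_fc1 + activation + linear_fc2

print(attn_parts)  # ~6.783488, close to the reported self_attention total of 6.801184
print(mlp_parts)   # ~2.793888, close to the reported mlp total of 2.805408
```

The small gap between each sum and the reported parent total would be consistent with per-layer overhead that is not captured by the listed sub-operations.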