Columns: `0` (string, 12 distinct operation names) and `values` (float64).

| 0 | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.573664 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.236065 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.1776 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.324384 |
| megatron.core.transformer.mlp.forward.activation | 0.037248 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.733216 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.105728 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176064 |
| megatron.core.transformer.attention.forward.qkv | 0.187264 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.520544 |
| megatron.core.transformer.attention.forward.linear_proj | 0.567136 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.297249 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176992 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325728 |
| megatron.core.transformer.mlp.forward.activation | 0.037056 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.735392 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.109504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176576 |
| megatron.core.transformer.attention.forward.qkv | 0.183712 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002816 |
| megatron.core.transformer.attention.forward.core_attention | 23.528641 |
| megatron.core.transformer.attention.forward.linear_proj | 0.572448 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.307167 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175808 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.327168 |
| megatron.core.transformer.mlp.forward.activation | 0.037792 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.735872 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.112096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.17584 |
| megatron.core.transformer.attention.forward.qkv | 0.18624 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 23.56192 |
| megatron.core.transformer.attention.forward.linear_proj | 0.562496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.333088 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175168 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.325536 |
| megatron.core.transformer.mlp.forward.activation | 0.037184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.734784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.108608 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.175392 |
| megatron.core.transformer.attention.forward.qkv | 0.18384 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002816 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 23.561632 |
| megatron.core.transformer.attention.forward.linear_proj | 0.551296 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.319391 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.175456 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.32752 |
| megatron.core.transformer.mlp.forward.activation | 0.037216 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.738656 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.114176 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.17616 |
| megatron.core.transformer.attention.forward.qkv | 0.18768 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 23.549728 |
| megatron.core.transformer.attention.forward.linear_proj | 0.576576 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.336927 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.329568 |
| megatron.core.transformer.mlp.forward.activation | 0.038304 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.732416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.111232 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176448 |
| megatron.core.transformer.attention.forward.qkv | 0.187264 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 23.558624 |
| megatron.core.transformer.attention.forward.linear_proj | 0.567264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.33584 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.177248 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328352 |
| megatron.core.transformer.mlp.forward.activation | 0.037952 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73664 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.113888 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176832 |
| megatron.core.transformer.attention.forward.qkv | 0.1864 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002848 |
| megatron.core.transformer.attention.forward.core_attention | 23.576256 |
| megatron.core.transformer.attention.forward.linear_proj | 0.5472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.331936 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176736 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328224 |
| megatron.core.transformer.mlp.forward.activation | 0.037056 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.73744 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.113504 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.176992 |
| megatron.core.transformer.attention.forward.qkv | 0.187104 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002784 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 23.582176 |
| megatron.core.transformer.attention.forward.linear_proj | 0.532864 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 24.324385 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.176608 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.328736 |