Column metadata from the source viewer: key — string, 12 distinct classes; value — float64, 0 – 26.5k. Only the preview rows below are included here.

| Key | Value |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.649632 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.076448 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08464 |
| megatron.core.transformer.attention.forward.core_attention | 11.3912 |
| megatron.core.transformer.attention.forward.linear_proj | 4.980576 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 17.510817 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 16.746529 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 7.12944 |
| megatron.core.transformer.mlp.forward.activation | 6.176672 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 10.5376 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 24.014528 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 17.66032 |
| megatron.core.transformer.attention.forward.qkv | 292.496582 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.137312 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.09824 |
| megatron.core.transformer.attention.forward.core_attention | 4,282.993164 |
| megatron.core.transformer.attention.forward.linear_proj | 3.6344 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4,581.708496 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 224.746246 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.432544 |
| megatron.core.transformer.mlp.forward.activation | 185.803131 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.340512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 189.376068 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.522976 |
| megatron.core.transformer.attention.forward.qkv | 0.598592 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.08288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.096928 |
| megatron.core.transformer.attention.forward.core_attention | 34.429153 |
| megatron.core.transformer.attention.forward.linear_proj | 0.383712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 35.905472 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.124512 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.223584 |
| megatron.core.transformer.mlp.forward.activation | 0.027072 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.504096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.766624 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.1208 |
| megatron.core.transformer.attention.forward.qkv | 265.184296 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.111776 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.099712 |
| megatron.core.transformer.attention.forward.core_attention | 4,481.565918 |
| megatron.core.transformer.attention.forward.linear_proj | 3.558336 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4,751.977051 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 300.443176 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.889696 |
| megatron.core.transformer.mlp.forward.activation | 173.172348 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.909312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 175.89209 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.663968 |
| megatron.core.transformer.attention.forward.qkv | 0.620576 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.083808 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.092512 |
| megatron.core.transformer.attention.forward.core_attention | 1,855.620728 |
| megatron.core.transformer.attention.forward.linear_proj | 0.215712 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,856.935913 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.065728 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.120128 |
| megatron.core.transformer.mlp.forward.activation | 0.017664 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.28848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.43712 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.0648 |
| megatron.core.transformer.attention.forward.qkv | 226.104477 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.11408 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.087424 |
| megatron.core.transformer.attention.forward.core_attention | 6,143.557129 |
| megatron.core.transformer.attention.forward.linear_proj | 3.459456 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6,374.62793 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 214.498459 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.970272 |
| megatron.core.transformer.mlp.forward.activation | 170.084 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.872512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 172.755234 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.431488 |
| megatron.core.transformer.attention.forward.qkv | 0.545024 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.082272 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.091808 |
| megatron.core.transformer.attention.forward.core_attention | 892.340027 |
| megatron.core.transformer.attention.forward.linear_proj | 0.135968 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 893.490662 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.037184 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.06112 |
| megatron.core.transformer.mlp.forward.activation | 0.01184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.172896 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.257344 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.037472 |
| megatron.core.transformer.attention.forward.qkv | 1.300736 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 37.401024 |
| megatron.core.transformer.attention.forward.linear_proj | 0.68608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 39.41328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.233568 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.878208 |
| megatron.core.transformer.mlp.forward.activation | 0.335744 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.752576 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 5.978624 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.232672 |
| megatron.core.transformer.attention.forward.qkv | 1.297376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 37.344833 |