| 0 | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 0.669216 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 39.335487 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.230752 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.87728 |
| megatron.core.transformer.mlp.forward.activation | 0.336 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.759264 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 5.9848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.23056 |
| megatron.core.transformer.attention.forward.qkv | 2.233024 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.310368 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.372 |
| megatron.core.transformer.attention.forward.core_attention | 23.458529 |
| megatron.core.transformer.attention.forward.linear_proj | 0.340896 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 27.954271 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.119904 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.506944 |
| megatron.core.transformer.mlp.forward.activation | 0.168928 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.374784 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.06224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.119552 |
| megatron.core.transformer.attention.forward.qkv | 0.655296 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 19.97456 |
| megatron.core.transformer.attention.forward.linear_proj | 0.334656 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 20.988768 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.119424 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.510752 |
| megatron.core.transformer.mlp.forward.activation | 0.1696 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.378464 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.070656 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.119296 |
| megatron.core.transformer.attention.forward.qkv | 0.758016 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.081632 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.097088 |
| megatron.core.transformer.attention.forward.core_attention | 11.934368 |
| megatron.core.transformer.attention.forward.linear_proj | 0.177888 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 13.386368 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.064608 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.751424 |
| megatron.core.transformer.mlp.forward.activation | 0.087488 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.693952 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.544448 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.064832 |
| megatron.core.transformer.attention.forward.qkv | 0.33808 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003584 |
| megatron.core.transformer.attention.forward.core_attention | 10.962912 |
| megatron.core.transformer.attention.forward.linear_proj | 0.176608 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.502336 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.0648 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.7608 |
| megatron.core.transformer.mlp.forward.activation | 0.086688 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.697568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.55664 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.064992 |
| megatron.core.transformer.attention.forward.qkv | 0.770944 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.086208 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.094592 |
| megatron.core.transformer.attention.forward.core_attention | 36.763615 |
| megatron.core.transformer.attention.forward.linear_proj | 0.092672 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 38.149887 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.03792 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.369792 |
| megatron.core.transformer.mlp.forward.activation | 0.048288 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.343808 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.773248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.038592 |
| megatron.core.transformer.attention.forward.qkv | 0.175072 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 9.009952 |
| megatron.core.transformer.attention.forward.linear_proj | 0.092192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 9.300832 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.03808 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.373536 |
| megatron.core.transformer.mlp.forward.activation | 0.047776 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.344768 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.777568 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.039488 |
| megatron.core.transformer.attention.forward.qkv | 0.70544 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003168 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003168 |
| megatron.core.transformer.attention.forward.core_attention | 19.045376 |
| megatron.core.transformer.attention.forward.linear_proj | 0.832288 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 20.608864 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.233856 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.55376 |
| megatron.core.transformer.mlp.forward.activation | 0.169888 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.897632 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.633632 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.23088 |
| megatron.core.transformer.attention.forward.qkv | 0.696384 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003168 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
| megatron.core.transformer.attention.forward.core_attention | 18.883327 |
| megatron.core.transformer.attention.forward.linear_proj | 0.8696 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 20.475168 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.231392 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 1.55728 |
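The dump itself gives no units or grouping for these readings. As a minimal, non-authoritative sketch, the snippet below takes a handful of rows copied verbatim from the table, groups them by the trailing timer name, and compares the summed attention sub-timers (`qkv`, `adjust_key_value`, `rotary_pos_emb`, `core_attention`, `linear_proj`) against the recorded `self_attention` value. The assumption that `self_attention` spans those sub-timers is inferred only from the fact that the numbers nearly add up; the unit (possibly milliseconds) is likewise a guess.

```python
# Minimal sketch (not part of the original dump): roll up sub-timer values by
# their trailing name and compare against the recorded self_attention reading.
# Assumptions: values are durations in an unknown unit (possibly ms), and each
# dotted path ends in the timer name.
from collections import defaultdict

# A few rows copied from the table above as (name, value) pairs.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 0.655296),
    ("megatron.core.transformer.attention.forward.adjust_key_value", 0.002976),
    ("megatron.core.transformer.attention.forward.rotary_pos_emb", 0.002976),
    ("megatron.core.transformer.attention.forward.core_attention", 19.97456),
    ("megatron.core.transformer.attention.forward.linear_proj", 0.334656),
    ("megatron.core.transformer.transformer_layer._forward_attention.self_attention", 20.988768),
]

totals = defaultdict(float)
for path, value in rows:
    totals[path.rsplit(".", 1)[-1]] += value  # group by trailing timer name

attention_parts = ["qkv", "adjust_key_value", "rotary_pos_emb", "core_attention", "linear_proj"]
summed = sum(totals[name] for name in attention_parts)
print(f"sum of attention sub-timers: {summed:.6f}")
print(f"recorded self_attention:     {totals['self_attention']:.6f}")
```

For this sample the sub-timers sum to roughly 20.97 against a recorded `self_attention` of 20.988768, which is why the grouping above seems plausible, though the small residual suggests the outer timer also covers some untimed work.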