| name | values |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 5.164192 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 578.336121 |
| megatron.core.transformer.attention.forward.linear_proj | 2.701568 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 586.22522 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.890112 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 11.53056 |
| megatron.core.transformer.mlp.forward.activation | 1.310848 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 11.640512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 24.493729 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.9936 |
| megatron.core.transformer.attention.forward.qkv | 5.221824 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 587.603577 |
| megatron.core.transformer.attention.forward.linear_proj | 2.64096 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 595.489563 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.889216 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 11.562016 |
| megatron.core.transformer.mlp.forward.activation | 1.309184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 11.619168 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 24.502209 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.93888 |
| megatron.core.transformer.attention.forward.qkv | 2.669824 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 310.932373 |
| megatron.core.transformer.attention.forward.linear_proj | 1.520128 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 315.146912 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.450528 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.044576 |
| megatron.core.transformer.mlp.forward.activation | 0.661024 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.933088 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.65088 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452832 |
| megatron.core.transformer.attention.forward.qkv | 2.667392 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 312.369629 |
| megatron.core.transformer.attention.forward.linear_proj | 1.495072 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 316.556702 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45488 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.939168 |
| megatron.core.transformer.mlp.forward.activation | 0.659968 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.912416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.523328 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.453408 |
| megatron.core.transformer.attention.forward.qkv | 1.297312 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 258.820435 |
| megatron.core.transformer.attention.forward.linear_proj | 0.689248 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 260.830078 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.22928 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.93088 |
| megatron.core.transformer.mlp.forward.activation | 0.331904 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.848416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 6.122944 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.229856 |
| megatron.core.transformer.attention.forward.qkv | 1.3088 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 176.646683 |
| megatron.core.transformer.attention.forward.linear_proj | 0.672416 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 178.651169 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.230144 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 2.882944 |
| megatron.core.transformer.mlp.forward.activation | 0.331616 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 2.808416 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 6.034272 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.229376 |
| megatron.core.transformer.attention.forward.qkv | 2.852736 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 290.985291 |
| megatron.core.transformer.attention.forward.linear_proj | 3.066464 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 296.927917 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.892 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.207776 |
| megatron.core.transformer.mlp.forward.activation | 0.661184 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 7.672768 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.553216 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.89552 |
| megatron.core.transformer.attention.forward.qkv | 2.859936 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 293.756561 |
| megatron.core.transformer.attention.forward.linear_proj | 3.263328 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 299.903381 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.90048 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.214144 |
| megatron.core.transformer.mlp.forward.activation | 0.664032 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 7.66656 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.556384 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.89984 |
| megatron.core.transformer.attention.forward.qkv | 1.428672 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 156.113663 |
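The rows above repeat the same Megatron-Core module names with one timing per measurement, so summaries (e.g. total time per module across all rows) have to be aggregated by name. A minimal sketch of that aggregation in plain Python; the `rows` subset below is copied from the table, and treating the values as milliseconds is an assumption, since the table does not state a unit:

```python
# Aggregate per-module timings from (name, value) rows like those in the table.
# Values are assumed to be milliseconds; only a subset of rows is shown here.
from collections import defaultdict

rows = [
    ("megatron.core.transformer.attention.forward.qkv", 5.164192),
    ("megatron.core.transformer.attention.forward.core_attention", 578.336121),
    ("megatron.core.transformer.attention.forward.qkv", 5.221824),
    ("megatron.core.transformer.attention.forward.core_attention", 587.603577),
]

totals = defaultdict(float)
for name, ms in rows:
    leaf = name.rsplit(".", 1)[-1]  # last path component, e.g. "core_attention"
    totals[leaf] += ms

print(round(totals["core_attention"], 6))  # → 1165.939698
print(round(totals["qkv"], 6))             # → 10.386016
```

Grouping on the last dot-separated component collapses the full module path to a short key; keep the full `name` as the key instead if different modules could share a leaf name.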