2023-11-02 21:53:57.206 | INFO | mmgpt.model.builder:build_model_tokenizer:85 - LlamaTokenizer(name_or_path='/data/hypertext/yuangpeng/huggingface_cache/models--lmsys--vicuna-7b-v15', vocab_size=32000, model_max_length=2048, is_fast=False, padding_side='right', truncation_side='right', special_tokens={'bos_token': AddedToken("<s>", rstrip=False, lstrip=False, single_word=False, normalized=False), 'eos_token': AddedToken("</s>", rstrip=False, lstrip=False, single_word=False, normalized=False), 'unk_token': AddedToken("<unk>", rstrip=False, lstrip=False, single_word=False, normalized=False), 'pad_token': ''}, clean_up_tokenization_spaces=False)
2023-11-02 21:54:00.632 | INFO | mmgpt.model.mmgpt.base_mmgpt:build_vision_tokenizer:52 - CLIPImageProcessor {
  "crop_size": {
    "height": 448,
    "width": 448
  },
  "do_center_crop": true,
  "do_convert_rgb": true,
  "do_normalize": true,
  "do_rescale": true,
  "do_resize": true,
  "feature_extractor_type": "CLIPFeatureExtractor",
  "image_mean": [
    0.48145466,
    0.4578275,
    0.40821073
  ],
  "image_processor_type": "CLIPImageProcessor",
  "image_std": [
    0.26862954,
    0.26130258,
    0.27577711
  ],
  "resample": 3,
  "rescale_factor": 0.00392156862745098,
  "size": {
    "shortest_edge": 448
  }
}
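These settings describe the standard CLIP preprocessing pipeline: resize the shortest edge to 448, center-crop to 448x448, rescale by 1/255, then normalize per channel. A minimal sketch of the rescale and normalize steps on one RGB pixel, using only the values printed above (illustrative Python, not the mmgpt implementation):

```python
# Rescale + normalize one RGB pixel exactly as the CLIPImageProcessor
# config specifies: x * rescale_factor, then (x - mean) / std per channel.

IMAGE_MEAN = (0.48145466, 0.4578275, 0.40821073)
IMAGE_STD = (0.26862954, 0.26130258, 0.27577711)
RESCALE_FACTOR = 1 / 255  # printed as 0.00392156862745098 in the config

def preprocess_pixel(rgb):
    """Map one uint8 RGB pixel to the normalized floats the vision tower sees."""
    scaled = [c * RESCALE_FACTOR for c in rgb]
    return [(s - m) / d for s, m, d in zip(scaled, IMAGE_MEAN, IMAGE_STD)]

# Example: a mid-gray pixel (128, 128, 128) after rescale + normalize.
out = preprocess_pixel((128, 128, 128))
print([round(v, 3) for v in out])
```

Resizing and center-cropping are omitted here; for a 448x448 input they are identity operations anyway.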
2023-11-02 21:54:07.019 | INFO | mmgpt.model.mmgpt.base_mmgpt:build_vision_tokenizer:64 - 2 new tokens are added to be trained.
2023-11-02 21:54:07.145 | INFO | mmgpt.model.builder:build_model_tokenizer:148 - MMGPTLlamaForCausalLM(
  (model): MMGPTLlamaModel(
    (embed_tokens): Embedding(32003, 4096)
    (layers): ModuleList(
      (0-31): 32 x LlamaDecoderLayer(
        (self_attn): LlamaAttention(
          (q_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (k_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (v_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (o_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (rotary_emb): LlamaRotaryEmbedding()
        )
        (mlp): LlamaMLP(
          (gate_proj): Linear(in_features=4096, out_features=11008, bias=False)
          (up_proj): Linear(in_features=4096, out_features=11008, bias=False)
          (down_proj): Linear(in_features=11008, out_features=4096, bias=False)
          (act_fn): SiLUActivation()
        )
        (input_layernorm): LlamaRMSNorm()
        (post_attention_layernorm): LlamaRMSNorm()
      )
    )
    (norm): LlamaRMSNorm()
    (vision_tower): CLIPVisionTower(
      (vision_tower): CLIPVisionModel(
        (vision_model): CLIPVisionTransformer(
          (embeddings): CLIPVisionEmbeddings(
            (patch_embedding): Conv2d(3, 1024, kernel_size=(14, 14), stride=(14, 14), bias=False)
            (position_embedding): Embedding(1025, 1024)
          )
          (pre_layrnorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          (encoder): CLIPEncoder(
            (layers): ModuleList(
              (0-23): 24 x CLIPEncoderLayer(
                (self_attn): CLIPAttention(
                  (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
                  (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
                  (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
                  (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
                )
                (layer_norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
                (mlp): CLIPMLP(
                  (activation_fn): QuickGELUActivation()
                  (fc1): Linear(in_features=1024, out_features=4096, bias=True)
                  (fc2): Linear(in_features=4096, out_features=1024, bias=True)
                )
                (layer_norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              )
            )
          )
          (post_layernorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        )
      )
    )
    (projector): ConvProjector(
      (projector): Conv2d(1024, 4096, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
    )
  )
  (lm_head): Linear(in_features=4096, out_features=32003, bias=False)
)
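The shapes in this dump can be cross-checked with a few lines of arithmetic. The sketch below uses the standard Conv2d output-size formula; it is an illustration, not code from mmgpt:

```python
# Sanity-check the tensor shapes reported in the model dump above.

def conv2d_out(size, kernel, stride, padding):
    """Standard Conv2d output-size formula (dilation = 1)."""
    return (size + 2 * padding - kernel) // stride + 1

# CLIP ViT-L/14 at 448x448: patch_embedding is a 14x14 conv with stride 14.
patches_per_side = conv2d_out(448, kernel=14, stride=14, padding=0)
num_patches = patches_per_side ** 2
print(patches_per_side, num_patches)   # 32 patches per side, 1024 in total

# position_embedding has 1025 rows: 1024 patch positions plus 1 CLS token.
assert num_patches + 1 == 1025

# ConvProjector: Conv2d(1024, 4096, kernel_size=3, stride=2, padding=1)
# halves each side of the 32x32 patch grid before handing it to the LLM.
proj_side = conv2d_out(patches_per_side, kernel=3, stride=2, padding=1)
print(proj_side, proj_side ** 2)       # 16 per side -> 256 visual tokens of dim 4096

# Text side: embed_tokens and lm_head are 32003 rows, i.e. the base vocab of
# 32000 plus 3 added rows (the log reports 2 newly trained tokens; the third
# extra row is presumably a padding token).
assert 32000 + 3 == 32003
```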
2023-11-02 21:54:20.723 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.embed_tokens.weight
2023-11-02 21:54:20.724 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.self_attn.q_proj.weight
2023-11-02 21:54:20.724 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.self_attn.k_proj.weight
2023-11-02 21:54:20.724 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.self_attn.v_proj.weight
2023-11-02 21:54:20.724 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.self_attn.o_proj.weight
2023-11-02 21:54:20.724 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.mlp.gate_proj.weight
2023-11-02 21:54:20.725 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.mlp.up_proj.weight
2023-11-02 21:54:20.725 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.mlp.down_proj.weight
2023-11-02 21:54:20.725 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.input_layernorm.weight
2023-11-02 21:54:20.725 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.0.post_attention_layernorm.weight
... (the same nine parameter tensors, self_attn.{q,k,v,o}_proj.weight, mlp.{gate,up,down}_proj.weight, input_layernorm.weight, and post_attention_layernorm.weight, are logged as Trainable Parameters for model.layers.1 through model.layers.22; the excerpt breaks off at model.layers.23.mlp.gate_proj.weight)
2023-11-02 21:54:20.762 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.23.mlp.up_proj.weight
2023-11-02 21:54:20.762 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.23.mlp.down_proj.weight
2023-11-02 21:54:20.762 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.23.input_layernorm.weight
2023-11-02 21:54:20.762 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.23.post_attention_layernorm.weight
2023-11-02 21:54:20.763 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.self_attn.q_proj.weight
2023-11-02 21:54:20.763 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.self_attn.k_proj.weight
2023-11-02 21:54:20.763 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.self_attn.v_proj.weight
2023-11-02 21:54:20.763 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.self_attn.o_proj.weight
2023-11-02 21:54:20.763 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.mlp.gate_proj.weight
2023-11-02 21:54:20.764 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.mlp.up_proj.weight
2023-11-02 21:54:20.764 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.mlp.down_proj.weight
2023-11-02 21:54:20.764 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.input_layernorm.weight
2023-11-02 21:54:20.764 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.24.post_attention_layernorm.weight
2023-11-02 21:54:20.764 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.self_attn.q_proj.weight
2023-11-02 21:54:20.764 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.self_attn.k_proj.weight
2023-11-02 21:54:20.765 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.self_attn.v_proj.weight
2023-11-02 21:54:20.765 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.self_attn.o_proj.weight
2023-11-02 21:54:20.765 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.mlp.gate_proj.weight
2023-11-02 21:54:20.765 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.mlp.up_proj.weight
2023-11-02 21:54:20.765 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.mlp.down_proj.weight
2023-11-02 21:54:20.765 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.input_layernorm.weight
2023-11-02 21:54:20.766 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.25.post_attention_layernorm.weight
2023-11-02 21:54:20.766 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.self_attn.q_proj.weight
2023-11-02 21:54:20.766 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.self_attn.k_proj.weight
2023-11-02 21:54:20.766 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.self_attn.v_proj.weight
2023-11-02 21:54:20.766 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.self_attn.o_proj.weight
2023-11-02 21:54:20.767 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.mlp.gate_proj.weight
2023-11-02 21:54:20.767 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.mlp.up_proj.weight
2023-11-02 21:54:20.767 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.mlp.down_proj.weight
2023-11-02 21:54:20.767 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.input_layernorm.weight
2023-11-02 21:54:20.767 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.26.post_attention_layernorm.weight
2023-11-02 21:54:20.767 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.self_attn.q_proj.weight
2023-11-02 21:54:20.768 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.self_attn.k_proj.weight
2023-11-02 21:54:20.768 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.self_attn.v_proj.weight
2023-11-02 21:54:20.768 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.self_attn.o_proj.weight
2023-11-02 21:54:20.768 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.mlp.gate_proj.weight
2023-11-02 21:54:20.768 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.mlp.up_proj.weight
2023-11-02 21:54:20.768 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.mlp.down_proj.weight
2023-11-02 21:54:20.769 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.input_layernorm.weight
2023-11-02 21:54:20.769 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.27.post_attention_layernorm.weight
2023-11-02 21:54:20.769 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.self_attn.q_proj.weight
2023-11-02 21:54:20.769 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.self_attn.k_proj.weight
2023-11-02 21:54:20.769 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.self_attn.v_proj.weight
2023-11-02 21:54:20.770 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.self_attn.o_proj.weight
2023-11-02 21:54:20.770 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.mlp.gate_proj.weight
2023-11-02 21:54:20.770 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.mlp.up_proj.weight
2023-11-02 21:54:20.770 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.mlp.down_proj.weight
2023-11-02 21:54:20.770 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.input_layernorm.weight
2023-11-02 21:54:20.770 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.28.post_attention_layernorm.weight
2023-11-02 21:54:20.771 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.self_attn.q_proj.weight
2023-11-02 21:54:20.771 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.self_attn.k_proj.weight
2023-11-02 21:54:20.771 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.self_attn.v_proj.weight
2023-11-02 21:54:20.771 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.self_attn.o_proj.weight
2023-11-02 21:54:20.771 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.mlp.gate_proj.weight
2023-11-02 21:54:20.771 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.mlp.up_proj.weight
2023-11-02 21:54:20.772 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.mlp.down_proj.weight
2023-11-02 21:54:20.772 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.input_layernorm.weight
2023-11-02 21:54:20.772 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.29.post_attention_layernorm.weight
2023-11-02 21:54:20.772 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.self_attn.q_proj.weight
2023-11-02 21:54:20.772 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.self_attn.k_proj.weight
2023-11-02 21:54:20.773 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.self_attn.v_proj.weight
2023-11-02 21:54:20.773 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.self_attn.o_proj.weight
2023-11-02 21:54:20.773 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.mlp.gate_proj.weight
2023-11-02 21:54:20.773 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.mlp.up_proj.weight
2023-11-02 21:54:20.773 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.mlp.down_proj.weight
2023-11-02 21:54:20.773 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.input_layernorm.weight
2023-11-02 21:54:20.774 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.30.post_attention_layernorm.weight
2023-11-02 21:54:20.774 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.self_attn.q_proj.weight
2023-11-02 21:54:20.774 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.self_attn.k_proj.weight
2023-11-02 21:54:20.774 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.self_attn.v_proj.weight
2023-11-02 21:54:20.774 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.self_attn.o_proj.weight
2023-11-02 21:54:20.774 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.mlp.gate_proj.weight
2023-11-02 21:54:20.775 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.mlp.up_proj.weight
2023-11-02 21:54:20.775 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.mlp.down_proj.weight
2023-11-02 21:54:20.775 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.input_layernorm.weight
2023-11-02 21:54:20.775 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.layers.31.post_attention_layernorm.weight
2023-11-02 21:54:20.775 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.norm.weight
2023-11-02 21:54:20.775 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.embeddings.class_embedding
2023-11-02 21:54:20.776 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.embeddings.patch_embedding.weight
2023-11-02 21:54:20.776 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.embeddings.position_embedding.weight
2023-11-02 21:54:20.776 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.pre_layrnorm.weight
2023-11-02 21:54:20.776 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.pre_layrnorm.bias
2023-11-02 21:54:20.776 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.k_proj.weight
2023-11-02 21:54:20.777 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.k_proj.bias
2023-11-02 21:54:20.777 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.v_proj.weight
2023-11-02 21:54:20.777 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.v_proj.bias
2023-11-02 21:54:20.777 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.q_proj.weight
2023-11-02 21:54:20.777 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.q_proj.bias
2023-11-02 21:54:20.777 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.out_proj.weight
2023-11-02 21:54:20.778 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.self_attn.out_proj.bias
2023-11-02 21:54:20.778 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.layer_norm1.weight
2023-11-02 21:54:20.778 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.layer_norm1.bias
2023-11-02 21:54:20.778 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.mlp.fc1.weight
2023-11-02 21:54:20.778 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.mlp.fc1.bias
2023-11-02 21:54:20.779 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.mlp.fc2.weight
2023-11-02 21:54:20.779 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.mlp.fc2.bias
2023-11-02 21:54:20.779 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.layer_norm2.weight
2023-11-02 21:54:20.779 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.0.layer_norm2.bias
2023-11-02 21:54:20.779 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.k_proj.weight
2023-11-02 21:54:20.779 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.k_proj.bias
2023-11-02 21:54:20.780 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.v_proj.weight
2023-11-02 21:54:20.780 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.v_proj.bias
2023-11-02 21:54:20.780 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.q_proj.weight
2023-11-02 21:54:20.780 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.q_proj.bias
2023-11-02 21:54:20.780 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.out_proj.weight
2023-11-02 21:54:20.780 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.self_attn.out_proj.bias
2023-11-02 21:54:20.781 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.layer_norm1.weight
2023-11-02 21:54:20.781 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.layer_norm1.bias
2023-11-02 21:54:20.781 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.mlp.fc1.weight
2023-11-02 21:54:20.781 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.mlp.fc1.bias
2023-11-02 21:54:20.781 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.mlp.fc2.weight
2023-11-02 21:54:20.781 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.mlp.fc2.bias
2023-11-02 21:54:20.782 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.layer_norm2.weight
2023-11-02 21:54:20.782 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.1.layer_norm2.bias
2023-11-02 21:54:20.782 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.k_proj.weight
2023-11-02 21:54:20.782 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.k_proj.bias
2023-11-02 21:54:20.782 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.v_proj.weight
2023-11-02 21:54:20.783 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.v_proj.bias
2023-11-02 21:54:20.783 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.q_proj.weight
2023-11-02 21:54:20.783 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.q_proj.bias
2023-11-02 21:54:20.783 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.out_proj.weight
2023-11-02 21:54:20.783 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.self_attn.out_proj.bias
2023-11-02 21:54:20.783 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.layer_norm1.weight
2023-11-02 21:54:20.784 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.layer_norm1.bias
2023-11-02 21:54:20.784 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.mlp.fc1.weight
2023-11-02 21:54:20.784 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.mlp.fc1.bias
2023-11-02 21:54:20.784 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.mlp.fc2.weight
2023-11-02 21:54:20.784 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.mlp.fc2.bias
2023-11-02 21:54:20.784 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.layer_norm2.weight
2023-11-02 21:54:20.785 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.2.layer_norm2.bias
2023-11-02 21:54:20.785 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.k_proj.weight
2023-11-02 21:54:20.785 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.k_proj.bias
2023-11-02 21:54:20.785 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.v_proj.weight
2023-11-02 21:54:20.785 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.v_proj.bias
2023-11-02 21:54:20.785 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.q_proj.weight
2023-11-02 21:54:20.786 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.q_proj.bias
2023-11-02 21:54:20.786 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.out_proj.weight
2023-11-02 21:54:20.786 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.self_attn.out_proj.bias
2023-11-02 21:54:20.786 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.layer_norm1.weight
2023-11-02 21:54:20.786 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.layer_norm1.bias
2023-11-02 21:54:20.787 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.mlp.fc1.weight
2023-11-02 21:54:20.787 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.mlp.fc1.bias
2023-11-02 21:54:20.787 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.mlp.fc2.weight
2023-11-02 21:54:20.787 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.mlp.fc2.bias
2023-11-02 21:54:20.787 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.layer_norm2.weight
2023-11-02 21:54:20.787 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.3.layer_norm2.bias
2023-11-02 21:54:20.788 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.k_proj.weight
2023-11-02 21:54:20.788 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.k_proj.bias
2023-11-02 21:54:20.788 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.v_proj.weight
2023-11-02 21:54:20.788 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.v_proj.bias
2023-11-02 21:54:20.788 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.q_proj.weight
2023-11-02 21:54:20.788 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.q_proj.bias
2023-11-02 21:54:20.789 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.out_proj.weight
2023-11-02 21:54:20.789 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.self_attn.out_proj.bias
2023-11-02 21:54:20.789 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.layer_norm1.weight
2023-11-02 21:54:20.789 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.layer_norm1.bias
2023-11-02 21:54:20.789 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.mlp.fc1.weight
2023-11-02 21:54:20.789 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.mlp.fc1.bias
2023-11-02 21:54:20.790 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.mlp.fc2.weight
2023-11-02 21:54:20.790 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.mlp.fc2.bias
2023-11-02 21:54:20.790 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.layer_norm2.weight
2023-11-02 21:54:20.790 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.4.layer_norm2.bias
2023-11-02 21:54:20.790 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.k_proj.weight
2023-11-02 21:54:20.790 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.k_proj.bias
2023-11-02 21:54:20.791 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.v_proj.weight
2023-11-02 21:54:20.791 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.v_proj.bias
2023-11-02 21:54:20.791 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.q_proj.weight
2023-11-02 21:54:20.791 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.q_proj.bias
2023-11-02 21:54:20.791 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.out_proj.weight
2023-11-02 21:54:20.792 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.self_attn.out_proj.bias
2023-11-02 21:54:20.792 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.layer_norm1.weight
2023-11-02 21:54:20.792 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.layer_norm1.bias
2023-11-02 21:54:20.792 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.mlp.fc1.weight
2023-11-02 21:54:20.792 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.mlp.fc1.bias
2023-11-02 21:54:20.792 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.mlp.fc2.weight
2023-11-02 21:54:20.793 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.mlp.fc2.bias
2023-11-02 21:54:20.793 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.layer_norm2.weight
2023-11-02 21:54:20.793 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.5.layer_norm2.bias
2023-11-02 21:54:20.793 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.k_proj.weight
2023-11-02 21:54:20.793 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.k_proj.bias
2023-11-02 21:54:20.793 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.v_proj.weight
2023-11-02 21:54:20.794 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.v_proj.bias
2023-11-02 21:54:20.794 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.q_proj.weight
2023-11-02 21:54:20.794 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.q_proj.bias
2023-11-02 21:54:20.794 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.out_proj.weight
2023-11-02 21:54:20.794 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.self_attn.out_proj.bias
2023-11-02 21:54:20.794 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.layer_norm1.weight
2023-11-02 21:54:20.795 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.layer_norm1.bias
2023-11-02 21:54:20.795 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.mlp.fc1.weight
2023-11-02 21:54:20.795 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.mlp.fc1.bias
2023-11-02 21:54:20.795 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.mlp.fc2.weight
2023-11-02 21:54:20.795 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.mlp.fc2.bias
2023-11-02 21:54:20.796 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.layer_norm2.weight
2023-11-02 21:54:20.796 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.6.layer_norm2.bias
2023-11-02 21:54:20.796 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.k_proj.weight
2023-11-02 21:54:20.796 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.k_proj.bias
2023-11-02 21:54:20.796 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.v_proj.weight
2023-11-02 21:54:20.796 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.v_proj.bias
2023-11-02 21:54:20.797 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.q_proj.weight
2023-11-02 21:54:20.797 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.q_proj.bias
2023-11-02 21:54:20.797 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.out_proj.weight
2023-11-02 21:54:20.797 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.self_attn.out_proj.bias
2023-11-02 21:54:20.797 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.layer_norm1.weight
2023-11-02 21:54:20.797 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.layer_norm1.bias
2023-11-02 21:54:20.798 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.mlp.fc1.weight
2023-11-02 21:54:20.798 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.mlp.fc1.bias
2023-11-02 21:54:20.798 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.mlp.fc2.weight
2023-11-02 21:54:20.798 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.mlp.fc2.bias
2023-11-02 21:54:20.798 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.layer_norm2.weight
2023-11-02 21:54:20.798 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.7.layer_norm2.bias
2023-11-02 21:54:20.799 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.k_proj.weight
2023-11-02 21:54:20.799 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.k_proj.bias
2023-11-02 21:54:20.799 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.v_proj.weight
2023-11-02 21:54:20.799 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.v_proj.bias
2023-11-02 21:54:20.799 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.q_proj.weight
2023-11-02 21:54:20.799 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.q_proj.bias
2023-11-02 21:54:20.800 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.out_proj.weight
2023-11-02 21:54:20.800 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.self_attn.out_proj.bias
2023-11-02 21:54:20.800 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.layer_norm1.weight
2023-11-02 21:54:20.800 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.layer_norm1.bias
2023-11-02 21:54:20.800 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.mlp.fc1.weight
2023-11-02 21:54:20.801 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.mlp.fc1.bias
2023-11-02 21:54:20.801 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.mlp.fc2.weight
2023-11-02 21:54:20.801 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.mlp.fc2.bias
2023-11-02 21:54:20.801 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.layer_norm2.weight
2023-11-02 21:54:20.801 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.8.layer_norm2.bias
2023-11-02 21:54:20.801 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.k_proj.weight
2023-11-02 21:54:20.802 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.k_proj.bias
2023-11-02 21:54:20.802 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.v_proj.weight
2023-11-02 21:54:20.802 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.v_proj.bias
2023-11-02 21:54:20.802 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.q_proj.weight
2023-11-02 21:54:20.802 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.q_proj.bias
2023-11-02 21:54:20.802 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.out_proj.weight
2023-11-02 21:54:20.803 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.self_attn.out_proj.bias
2023-11-02 21:54:20.803 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.layer_norm1.weight
2023-11-02 21:54:20.803 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.layer_norm1.bias
2023-11-02 21:54:20.803 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.mlp.fc1.weight
2023-11-02 21:54:20.803 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.mlp.fc1.bias
2023-11-02 21:54:20.803 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.mlp.fc2.weight
2023-11-02 21:54:20.804 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.mlp.fc2.bias
2023-11-02 21:54:20.804 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.layer_norm2.weight
2023-11-02 21:54:20.804 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.9.layer_norm2.bias
2023-11-02 21:54:20.804 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.k_proj.weight
2023-11-02 21:54:20.804 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.k_proj.bias
2023-11-02 21:54:20.804 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.v_proj.weight
2023-11-02 21:54:20.805 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.v_proj.bias
2023-11-02 21:54:20.805 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.q_proj.weight
2023-11-02 21:54:20.805 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.q_proj.bias
2023-11-02 21:54:20.805 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.out_proj.weight
2023-11-02 21:54:20.805 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.self_attn.out_proj.bias
2023-11-02 21:54:20.806 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.layer_norm1.weight
2023-11-02 21:54:20.806 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.layer_norm1.bias
2023-11-02 21:54:20.806 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.mlp.fc1.weight
2023-11-02 21:54:20.806 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.mlp.fc1.bias
2023-11-02 21:54:20.806 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.mlp.fc2.weight
2023-11-02 21:54:20.806 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.mlp.fc2.bias
2023-11-02 21:54:20.807 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.layer_norm2.weight
2023-11-02 21:54:20.807 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.10.layer_norm2.bias
2023-11-02 21:54:20.807 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.k_proj.weight
2023-11-02 21:54:20.807 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.k_proj.bias
2023-11-02 21:54:20.807 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.v_proj.weight
2023-11-02 21:54:20.807 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.v_proj.bias
2023-11-02 21:54:20.808 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.q_proj.weight
2023-11-02 21:54:20.808 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.q_proj.bias
2023-11-02 21:54:20.808 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.out_proj.weight
2023-11-02 21:54:20.808 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.self_attn.out_proj.bias
2023-11-02 21:54:20.808 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.layer_norm1.weight
2023-11-02 21:54:20.808 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.layer_norm1.bias
2023-11-02 21:54:20.809 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.mlp.fc1.weight
2023-11-02 21:54:20.809 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.mlp.fc1.bias
2023-11-02 21:54:20.809 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.mlp.fc2.weight
2023-11-02 21:54:20.809 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.mlp.fc2.bias
2023-11-02 21:54:20.809 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.layer_norm2.weight
2023-11-02 21:54:20.809 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.11.layer_norm2.bias
2023-11-02 21:54:20.810 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.k_proj.weight
2023-11-02 21:54:20.810 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.k_proj.bias
2023-11-02 21:54:20.810 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.v_proj.weight
2023-11-02 21:54:20.810 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.v_proj.bias
2023-11-02 21:54:20.810 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.q_proj.weight
2023-11-02 21:54:20.811 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.q_proj.bias
2023-11-02 21:54:20.811 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.out_proj.weight
2023-11-02 21:54:20.811 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.self_attn.out_proj.bias
2023-11-02 21:54:20.811 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.layer_norm1.weight
2023-11-02 21:54:20.811 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.layer_norm1.bias
2023-11-02 21:54:20.811 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.mlp.fc1.weight
2023-11-02 21:54:20.812 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.mlp.fc1.bias
2023-11-02 21:54:20.812 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.mlp.fc2.weight
2023-11-02 21:54:20.812 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.mlp.fc2.bias
2023-11-02 21:54:20.812 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.layer_norm2.weight
2023-11-02 21:54:20.812 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.12.layer_norm2.bias
2023-11-02 21:54:20.812 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.k_proj.weight
2023-11-02 21:54:20.813 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.k_proj.bias
2023-11-02 21:54:20.813 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.v_proj.weight
2023-11-02 21:54:20.813 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.v_proj.bias
2023-11-02 21:54:20.813 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.q_proj.weight
2023-11-02 21:54:20.813 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.q_proj.bias
2023-11-02 21:54:20.814 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.out_proj.weight
2023-11-02 21:54:20.814 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.self_attn.out_proj.bias
2023-11-02 21:54:20.814 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.layer_norm1.weight
2023-11-02 21:54:20.814 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.layer_norm1.bias
2023-11-02 21:54:20.814 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.mlp.fc1.weight
2023-11-02 21:54:20.814 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.mlp.fc1.bias
2023-11-02 21:54:20.815 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.mlp.fc2.weight
2023-11-02 21:54:20.815 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.mlp.fc2.bias
2023-11-02 21:54:20.815 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.layer_norm2.weight
2023-11-02 21:54:20.815 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.13.layer_norm2.bias
2023-11-02 21:54:20.815 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.k_proj.weight
2023-11-02 21:54:20.815 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.k_proj.bias
2023-11-02 21:54:20.816 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.v_proj.weight
2023-11-02 21:54:20.816 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.v_proj.bias
2023-11-02 21:54:20.816 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.q_proj.weight
2023-11-02 21:54:20.816 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.q_proj.bias
2023-11-02 21:54:20.816 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.out_proj.weight
2023-11-02 21:54:20.816 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.self_attn.out_proj.bias
2023-11-02 21:54:20.817 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.layer_norm1.weight
2023-11-02 21:54:20.817 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.layer_norm1.bias
2023-11-02 21:54:20.817 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.mlp.fc1.weight
2023-11-02 21:54:20.817 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.mlp.fc1.bias
2023-11-02 21:54:20.817 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.mlp.fc2.weight
2023-11-02 21:54:20.817 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.mlp.fc2.bias
2023-11-02 21:54:20.818 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.layer_norm2.weight
2023-11-02 21:54:20.818 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.14.layer_norm2.bias
2023-11-02 21:54:20.818 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.k_proj.weight
2023-11-02 21:54:20.818 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.k_proj.bias
2023-11-02 21:54:20.818 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.v_proj.weight
2023-11-02 21:54:20.819 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.v_proj.bias
2023-11-02 21:54:20.819 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.q_proj.weight
2023-11-02 21:54:20.819 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.q_proj.bias
2023-11-02 21:54:20.819 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.out_proj.weight
2023-11-02 21:54:20.819 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.self_attn.out_proj.bias
2023-11-02 21:54:20.819 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.layer_norm1.weight
2023-11-02 21:54:20.820 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.layer_norm1.bias
2023-11-02 21:54:20.820 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.mlp.fc1.weight
2023-11-02 21:54:20.820 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.mlp.fc1.bias
2023-11-02 21:54:20.820 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.mlp.fc2.weight
2023-11-02 21:54:20.820 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.mlp.fc2.bias
2023-11-02 21:54:20.820 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.layer_norm2.weight
2023-11-02 21:54:20.821 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.15.layer_norm2.bias
2023-11-02 21:54:20.821 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.k_proj.weight
2023-11-02 21:54:20.821 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.k_proj.bias
2023-11-02 21:54:20.821 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.v_proj.weight
2023-11-02 21:54:20.821 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.v_proj.bias
2023-11-02 21:54:20.821 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.q_proj.weight
2023-11-02 21:54:20.822 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.q_proj.bias
2023-11-02 21:54:20.822 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.out_proj.weight
2023-11-02 21:54:20.822 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.self_attn.out_proj.bias
2023-11-02 21:54:20.822 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.layer_norm1.weight
2023-11-02 21:54:20.822 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.layer_norm1.bias
2023-11-02 21:54:20.823 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.mlp.fc1.weight
2023-11-02 21:54:20.823 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.mlp.fc1.bias
2023-11-02 21:54:20.823 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.mlp.fc2.weight
2023-11-02 21:54:20.823 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.mlp.fc2.bias
2023-11-02 21:54:20.823 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.layer_norm2.weight
2023-11-02 21:54:20.823 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.16.layer_norm2.bias
2023-11-02 21:54:20.824 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.k_proj.weight
2023-11-02 21:54:20.824 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.k_proj.bias
2023-11-02 21:54:20.824 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.v_proj.weight
2023-11-02 21:54:20.824 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.v_proj.bias
2023-11-02 21:54:20.824 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.q_proj.weight
2023-11-02 21:54:20.824 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.q_proj.bias
2023-11-02 21:54:20.825 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.out_proj.weight
2023-11-02 21:54:20.825 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.self_attn.out_proj.bias
2023-11-02 21:54:20.825 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.layer_norm1.weight
2023-11-02 21:54:20.825 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.layer_norm1.bias
2023-11-02 21:54:20.825 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.mlp.fc1.weight
2023-11-02 21:54:20.825 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.mlp.fc1.bias
2023-11-02 21:54:20.826 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.mlp.fc2.weight
2023-11-02 21:54:20.826 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.mlp.fc2.bias
2023-11-02 21:54:20.826 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.layer_norm2.weight
2023-11-02 21:54:20.826 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.17.layer_norm2.bias
2023-11-02 21:54:20.826 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.k_proj.weight
2023-11-02 21:54:20.826 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.k_proj.bias
2023-11-02 21:54:20.827 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.v_proj.weight
2023-11-02 21:54:20.827 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.v_proj.bias
2023-11-02 21:54:20.827 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.q_proj.weight
2023-11-02 21:54:20.827 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.q_proj.bias
2023-11-02 21:54:20.827 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.out_proj.weight
2023-11-02 21:54:20.827 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.self_attn.out_proj.bias
2023-11-02 21:54:20.828 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.layer_norm1.weight
2023-11-02 21:54:20.828 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.layer_norm1.bias
2023-11-02 21:54:20.828 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.mlp.fc1.weight
2023-11-02 21:54:20.828 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.mlp.fc1.bias
2023-11-02 21:54:20.828 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.mlp.fc2.weight
2023-11-02 21:54:20.828 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.mlp.fc2.bias
2023-11-02 21:54:20.829 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.layer_norm2.weight
2023-11-02 21:54:20.829 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.18.layer_norm2.bias
2023-11-02 21:54:20.829 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.k_proj.weight
2023-11-02 21:54:20.829 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.k_proj.bias
2023-11-02 21:54:20.829 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.v_proj.weight
2023-11-02 21:54:20.830 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.v_proj.bias
2023-11-02 21:54:20.830 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.q_proj.weight
2023-11-02 21:54:20.830 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.q_proj.bias
2023-11-02 21:54:20.830 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.out_proj.weight
2023-11-02 21:54:20.830 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.self_attn.out_proj.bias
2023-11-02 21:54:20.830 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.layer_norm1.weight
2023-11-02 21:54:20.831 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.layer_norm1.bias
2023-11-02 21:54:20.831 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.mlp.fc1.weight
2023-11-02 21:54:20.831 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.mlp.fc1.bias
2023-11-02 21:54:20.831 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.mlp.fc2.weight
2023-11-02 21:54:20.831 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.mlp.fc2.bias
2023-11-02 21:54:20.831 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.layer_norm2.weight
2023-11-02 21:54:20.832 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.19.layer_norm2.bias
2023-11-02 21:54:20.832 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.k_proj.weight
2023-11-02 21:54:20.832 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.k_proj.bias
2023-11-02 21:54:20.832 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.v_proj.weight
2023-11-02 21:54:20.832 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.v_proj.bias
2023-11-02 21:54:20.832 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.q_proj.weight
2023-11-02 21:54:20.833 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.q_proj.bias
2023-11-02 21:54:20.833 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.out_proj.weight
2023-11-02 21:54:20.833 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.self_attn.out_proj.bias
2023-11-02 21:54:20.833 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.layer_norm1.weight
2023-11-02 21:54:20.833 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.layer_norm1.bias
2023-11-02 21:54:20.833 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.mlp.fc1.weight
2023-11-02 21:54:20.834 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.mlp.fc1.bias
2023-11-02 21:54:20.834 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.mlp.fc2.weight
2023-11-02 21:54:20.834 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.mlp.fc2.bias
2023-11-02 21:54:20.834 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.layer_norm2.weight
2023-11-02 21:54:20.834 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.20.layer_norm2.bias
2023-11-02 21:54:20.835 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.k_proj.weight
2023-11-02 21:54:20.835 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.k_proj.bias
2023-11-02 21:54:20.835 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.v_proj.weight
2023-11-02 21:54:20.835 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.v_proj.bias
2023-11-02 21:54:20.835 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.q_proj.weight
2023-11-02 21:54:20.835 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.q_proj.bias
2023-11-02 21:54:20.836 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.out_proj.weight
2023-11-02 21:54:20.836 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.self_attn.out_proj.bias
2023-11-02 21:54:20.836 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.layer_norm1.weight
2023-11-02 21:54:20.836 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.layer_norm1.bias
2023-11-02 21:54:20.836 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.mlp.fc1.weight
2023-11-02 21:54:20.836 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.mlp.fc1.bias
2023-11-02 21:54:20.837 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.mlp.fc2.weight
2023-11-02 21:54:20.837 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.mlp.fc2.bias
2023-11-02 21:54:20.837 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.layer_norm2.weight
2023-11-02 21:54:20.837 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.21.layer_norm2.bias
2023-11-02 21:54:20.837 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.k_proj.weight
2023-11-02 21:54:20.837 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.k_proj.bias
2023-11-02 21:54:20.838 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.v_proj.weight
2023-11-02 21:54:20.838 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.v_proj.bias
2023-11-02 21:54:20.838 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.q_proj.weight
2023-11-02 21:54:20.838 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.q_proj.bias
2023-11-02 21:54:20.838 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.out_proj.weight
2023-11-02 21:54:20.838 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.self_attn.out_proj.bias
2023-11-02 21:54:20.839 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.layer_norm1.weight
2023-11-02 21:54:20.839 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.layer_norm1.bias
2023-11-02 21:54:20.839 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.mlp.fc1.weight
2023-11-02 21:54:20.839 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.mlp.fc1.bias
2023-11-02 21:54:20.839 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.mlp.fc2.weight
2023-11-02 21:54:20.839 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.mlp.fc2.bias
2023-11-02 21:54:20.840 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.layer_norm2.weight
2023-11-02 21:54:20.840 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.22.layer_norm2.bias
2023-11-02 21:54:20.840 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.k_proj.weight
2023-11-02 21:54:20.840 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.k_proj.bias
2023-11-02 21:54:20.840 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.v_proj.weight
2023-11-02 21:54:20.840 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.v_proj.bias
2023-11-02 21:54:20.841 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.q_proj.weight
2023-11-02 21:54:20.841 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.q_proj.bias
2023-11-02 21:54:20.841 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.out_proj.weight
2023-11-02 21:54:20.841 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.self_attn.out_proj.bias
2023-11-02 21:54:20.841 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.layer_norm1.weight
2023-11-02 21:54:20.842 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.layer_norm1.bias
2023-11-02 21:54:20.842 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.mlp.fc1.weight
2023-11-02 21:54:20.842 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.mlp.fc1.bias
2023-11-02 21:54:20.842 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.mlp.fc2.weight
2023-11-02 21:54:20.842 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.mlp.fc2.bias
2023-11-02 21:54:20.842 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.layer_norm2.weight
2023-11-02 21:54:20.843 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.encoder.layers.23.layer_norm2.bias
2023-11-02 21:54:20.843 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.post_layernorm.weight
2023-11-02 21:54:20.843 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.vision_tower.vision_tower.vision_model.post_layernorm.bias
2023-11-02 21:54:20.843 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.projector.projector.weight
2023-11-02 21:54:20.843 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: model.projector.projector.bias
2023-11-02 21:54:20.843 | INFO | mmgpt.utils.logger:log_model_parameters:194 - -> Trainable Parameters: lm_head.weight
2023-11-02 21:54:20.847 | INFO | mmgpt.utils.logger:log_model_parameters:199 - >> Total params: 6752.17M
2023-11-02 21:54:20.847 | INFO | mmgpt.utils.logger:log_model_parameters:200 - >> Train params: 6752.17M, Ratio 100.00%
2023-11-02 21:54:20.864 | INFO | mmgpt.data.dataset.pair_webdataset:__init__:53 - 1666666 interleaved (6-merged) image-text pairs (split across 48 workers) are sampled from dataset: laion2b_10m_6merge.
2023-11-02 21:54:21.089 | INFO | mmgpt.data.dataset.pair_webdataset:__init__:53 - 833333 interleaved (6-merged) image-text pairs (split across 48 workers) are sampled from dataset: grit_5m_6merge.
2023-11-02 21:54:21.099 | INFO | mmgpt.data.dataset.interpair_webdataset:__init__:51 - 500000 interleaved (2-merged) image-text pairs (split across 48 workers) are sampled from dataset: track_1m_v1_2merge.
2023-11-02 21:54:21.109 | INFO | mmgpt.data.dataset.interpair_webdataset:__init__:51 - 1250000 interleaved (4-merged) image-text pairs (split across 48 workers) are sampled from dataset: det_5m_v1_en_4merge.
2023-11-02 21:54:21.110 | INFO | mmgpt.data.builder:build_dataloader:65 - After processing, 4249999 samples are involved in total.
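The per-dataset counts logged above are consistent with dividing each source's raw size by its merge factor (the number of image-text pairs interleaved into one training sample). A minimal sketch reproducing the arithmetic — the raw sizes are inferred from the dataset names (e.g. "laion2b_10m" read as 10M), which is an assumption, not something the log states:

```python
# Hypothetical reconstruction of the logged sample counts.
# Raw sizes are inferred from the dataset names (an assumption); each source
# is merged into interleaved groups of `merge` pairs per training sample.
datasets = {
    "laion2b_10m_6merge":  (10_000_000, 6),
    "grit_5m_6merge":      (5_000_000, 6),
    "track_1m_v1_2merge":  (1_000_000, 2),
    "det_5m_v1_en_4merge": (5_000_000, 4),
}

# Integer division gives the number of merged samples per source.
counts = {name: raw // merge for name, (raw, merge) in datasets.items()}
total = sum(counts.values())
print(counts["laion2b_10m_6merge"])  # 1666666
print(total)                         # 4249999
```

Under that reading, the four counts (1666666, 833333, 500000, 1250000) and the 4249999 total in the builder line all follow from the same merge-factor division.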
2023-11-02 21:54:21.264 | INFO | mmgpt.engine.train.trainer:create_optimizer:62 - ->> Number of Optimizer Groups: 50
2023-11-02 21:54:21.265 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 0: 233 groups of parameters maintain a learning rate of 5e-05
2023-11-02 21:54:21.265 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 1: 2 groups of parameters maintain a learning rate of 5e-06
2023-11-02 21:54:21.265 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 2: 6 groups of parameters maintain a learning rate of 4.923854510918059e-06
2023-11-02 21:54:21.265 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 3: 6 groups of parameters maintain a learning rate of 5.470949456575621e-06
2023-11-02 21:54:21.265 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 4: 6 groups of parameters maintain a learning rate of 6.078832729528468e-06
2023-11-02 21:54:21.266 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 5: 6 groups of parameters maintain a learning rate of 6.7542585883649645e-06
2023-11-02 21:54:21.266 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 6: 6 groups of parameters maintain a learning rate of 7.504731764849959e-06
2023-11-02 21:54:21.266 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 7: 6 groups of parameters maintain a learning rate of 8.338590849833288e-06
2023-11-02 21:54:21.266 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 8: 6 groups of parameters maintain a learning rate of 9.265100944259208e-06
2023-11-02 21:54:21.266 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 9: 6 groups of parameters maintain a learning rate of 1.0294556604732453e-05
2023-11-02 21:54:21.266 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 10: 6 groups of parameters maintain a learning rate of 1.1438396227480504e-05
2023-11-02 21:54:21.267 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 11: 6 groups of parameters maintain a learning rate of 1.2709329141645005e-05
2023-11-02 21:54:21.267 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 12: 6 groups of parameters maintain a learning rate of 1.4121476824050005e-05
2023-11-02 21:54:21.267 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 13: 6 groups of parameters maintain a learning rate of 1.5690529804500005e-05
2023-11-02 21:54:21.267 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 14: 6 groups of parameters maintain a learning rate of 1.7433922005000004e-05
2023-11-02 21:54:21.267 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 15: 6 groups of parameters maintain a learning rate of 1.9371024450000006e-05
2023-11-02 21:54:21.267 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 16: 6 groups of parameters maintain a learning rate of 2.1523360500000007e-05
2023-11-02 21:54:21.268 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 17: 6 groups of parameters maintain a learning rate of 2.3914845000000007e-05
2023-11-02 21:54:21.268 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 18: 6 groups of parameters maintain a learning rate of 2.6572050000000003e-05
2023-11-02 21:54:21.268 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 19: 6 groups of parameters maintain a learning rate of 2.9524500000000005e-05
2023-11-02 21:54:21.268 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 20: 6 groups of parameters maintain a learning rate of 3.2805e-05
2023-11-02 21:54:21.268 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 21: 6 groups of parameters maintain a learning rate of 3.6450000000000005e-05
2023-11-02 21:54:21.268 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 22: 6 groups of parameters maintain a learning rate of 4.05e-05
2023-11-02 21:54:21.269 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 23: 6 groups of parameters maintain a learning rate of 4.5e-05
2023-11-02 21:54:21.269 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 24: 6 groups of parameters maintain a learning rate of 5.555555555555556e-05
2023-11-02 21:54:21.269 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 25: 76 groups of parameters maintain a learning rate of 5e-05
2023-11-02 21:54:21.269 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 26: 5 groups of parameters maintain a learning rate of 5e-06
2023-11-02 21:54:21.269 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 27: 10 groups of parameters maintain a learning rate of 4.923854510918059e-06
2023-11-02 21:54:21.269 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 28: 10 groups of parameters maintain a learning rate of 5.470949456575621e-06
2023-11-02 21:54:21.270 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 29: 10 groups of parameters maintain a learning rate of 6.078832729528468e-06
2023-11-02 21:54:21.270 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 30: 10 groups of parameters maintain a learning rate of 6.7542585883649645e-06
2023-11-02 21:54:21.270 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 31: 10 groups of parameters maintain a learning rate of 7.504731764849959e-06
2023-11-02 21:54:21.270 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 32: 10 groups of parameters maintain a learning rate of 8.338590849833288e-06
2023-11-02 21:54:21.270 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 33: 10 groups of parameters maintain a learning rate of 9.265100944259208e-06
2023-11-02 21:54:21.271 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 34: 10 groups of parameters maintain a learning rate of 1.0294556604732453e-05
2023-11-02 21:54:21.271 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 35: 10 groups of parameters maintain a learning rate of 1.1438396227480504e-05
2023-11-02 21:54:21.271 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 36: 10 groups of parameters maintain a learning rate of 1.2709329141645005e-05
2023-11-02 21:54:21.271 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 37: 10 groups of parameters maintain a learning rate of 1.4121476824050005e-05
2023-11-02 21:54:21.271 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 38: 10 groups of parameters maintain a learning rate of 1.5690529804500005e-05
2023-11-02 21:54:21.271 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 39: 10 groups of parameters maintain a learning rate of 1.7433922005000004e-05
2023-11-02 21:54:21.272 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 40: 10 groups of parameters maintain a learning rate of 1.9371024450000006e-05
2023-11-02 21:54:21.272 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 41: 10 groups of parameters maintain a learning rate of 2.1523360500000007e-05
2023-11-02 21:54:21.272 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 42: 10 groups of parameters maintain a learning rate of 2.3914845000000007e-05
2023-11-02 21:54:21.272 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 43: 10 groups of parameters maintain a learning rate of 2.6572050000000003e-05
2023-11-02 21:54:21.272 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 44: 10 groups of parameters maintain a learning rate of 2.9524500000000005e-05
2023-11-02 21:54:21.272 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 45: 10 groups of parameters maintain a learning rate of 3.2805e-05
2023-11-02 21:54:21.273 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 46: 10 groups of parameters maintain a learning rate of 3.6450000000000005e-05
2023-11-02 21:54:21.273 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 47: 10 groups of parameters maintain a learning rate of 4.05e-05
2023-11-02 21:54:21.273 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 48: 10 groups of parameters maintain a learning rate of 4.5e-05
2023-11-02 21:54:21.273 | INFO | mmgpt.engine.train.trainer:create_optimizer:64 - *********>> 49: 10 groups of parameters maintain a learning rate of 5.555555555555556e-05
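The ladder of learning rates above (4.5e-05, 4.05e-05, 3.645e-05, ...) matches layer-wise learning-rate decay: the deepest encoder layer keeps a base rate and each shallower layer is scaled by a constant decay factor. A hedged sketch — the base rate 4.5e-5 and decay 0.9 are inferred from the logged values, not confirmed anywhere in the log:

```python
# Sketch of layer-wise learning-rate decay consistent with the rates above.
# base_lr = 4.5e-5 and decay = 0.9 are inferred from the logged values (an
# assumption); 22 layer groups correspond to optimizer groups 2..23 above.
base_lr, decay, num_layers = 4.5e-5, 0.9, 22

# Shallowest layer (index 0) gets the smallest rate, deepest keeps base_lr.
lrs = [base_lr * decay ** (num_layers - 1 - i) for i in range(num_layers)]
print(lrs[0])   # smallest rate, ~4.92e-06 (matches group 2 above)
print(lrs[-1])  # deepest layer, 4.5e-05 (matches group 23 above)
```

Group 24's 5.555...e-05 is then 5e-05 / 0.9, i.e. one decay step above the flat 5e-05 groups, which fits the same geometric progression.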
2023-11-02 21:54:32.290 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeds max length 2048; ignoring last 1 sample!
2023-11-02 21:54:32.290 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'shelf:[001, 306, 261, 999].')
2023-11-02 21:54:36.938 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:54:36.938 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Maintain strict adherence to the format category:[xmin,ymin,xmax,ymax] when presenting your answer.', 'Backpack:[760, 116, 999, 764];Cup:[002, 507, 047, 739];Person:[150, 002, 829, 1000];Necklace:[466, 455, 593, 728];Potted Plant:[808, 002, 916, 091];Chair:[764, 117, 999, 999];Cabinet:[002, 002, 276, 734];Desk:[002, 226, 264, 946];Laptop:[002, 382, 134, 622].')
2023-11-02 21:55:11.374 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:55:11.375 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'caption: Prem Chandra [171, 012, 494, 931] demonstrates how she sprays down furniture [436, 347, 997, 996] with an atomized disinfectant [390, 274, 531, 442] using the Clorox 360 machine [015, 507, 396, 997].')
2023-11-02 21:55:49.860 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:55:49.861 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[559, 566, 706, 805] A father and [606, 621, 710, 792] daughter sit on [510, 777, 852, 997] rocks looking out over [000, 481, 998, 882] a loch')
2023-11-02 21:56:17.643 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:56:17.644 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Street Lights,Ambulance and Car in this image.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'Street Lights:[027, 048, 116, 358],[192, 127, 252, 347],[138, 089, 211, 273],[238, 164, 294, 286],[282, 193, 335, 264];Ambulance:[265, 201, 658, 744];Car:[830, 397, 1000, 661],[608, 344, 731, 487],[145, 284, 188, 347],[000, 283, 037, 359].')
2023-11-02 21:56:26.185 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:56:26.185 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[646, 314, 731, 596] and track its trajectory.When detailing trajectories in your response, adhere to the Frame t:[xmin,ymin,xmax,ymax] format.', 'This is a personFrame1:[646, 314, 731, 596];Frame2:[180, 269, 266, 502];Frame3:[075, 339, 159, 552];Frame4:[204, 347, 272, 591].')
2023-11-02 21:57:02.140 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:02.141 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track carFrame1:[129, 482, 177, 585],carFrame1:[000, 394, 114, 707],carFrame1:[363, 481, 426, 581],carFrame1:[061, 430, 163, 632],carFrame1:[224, 489, 256, 545] in this video clip.To respond correctly, utilize the specified classFrame t:[xmin,ymin,xmax,ymax] format.', 'carFrame1:[129, 482, 177, 585];Frame2:[081, 497, 144, 637],carFrame1[000, 394, 114, 707];Frame2:No object,carFrame1:[363, 481, 426, 581];Frame2:[364, 484, 436, 593],carFrame1:[061, 430, 163, 632];Frame2:[000, 410, 097, 711],carFrame1:[224, 489, 256, 545];Frame2:[214, 494, 251, 556].')
2023-11-02 21:57:09.808 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:09.809 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,what is the track of the step backward and stumble and fall?Maintain consistency in your answer by using the TrackiFrame t:[xmin,ymin,xmax,ymax] format for trajectories.', 'There is one step backward and stumble and fall.Track1frame:1:[696, 250, 908, 533];frame:2:[790, 254, 961, 548];frame:3:[803, 261, 957, 559].')
2023-11-02 21:57:18.556 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:18.556 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'Car:[028, 129, 963, 961],[753, 080, 999, 267];SUV:[054, 232, 093, 315];Desk:[138, 271, 163, 304],[163, 266, 176, 304],[213, 268, 243, 308],[253, 262, 271, 300];Chair:[388, 326, 429, 364];Person:[000, 264, 043, 477],[031, 239, 089, 451],[115, 241, 132, 308],[151, 228, 169, 267],[668, 122, 685, 204],[686, 110, 697, 134],[687, 103, 714, 223],[716, 061, 764, 216],[753, 057, 803, 275],[787, 054, 826, 280],[809, 035, 852, 285],[964, 016, 1000, 309],[935, 039, 977, 096],[338, 319, 351, 366],[371, 324, 390, 365],[404, 301, 415, 336],[621, 135, 637, 170];Bracelet:[817, 157, 826, 165];Tent:[000, 000, 998, 331];Sneakers:[967, 282, 990, 309],[028, 445, 043, 461],[000, 455, 012, 477],[054, 423, 066, 451],[076, 423, 089, 438];Glasses:[789, 071, 799, 079];Hat:[716, 087, 730, 104],[037, 239, 056, 256].')
2023-11-02 21:57:34.477 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:34.477 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3,frame4 and frame5,can you tell me what is thisFrame1:[235, 129, 715, 827] and track its trajectory.For clarity, represent trajectories using the Frame t:[xmin,ymin,xmax,ymax] format in your response.', 'This is an iceboatFrame1:[235, 129, 715, 827];Frame2:[470, 216, 835, 739];Frame3:[395, 339, 667, 741];Frame4:[370, 360, 626, 739];Frame5:[350, 362, 595, 731].')
2023-11-02 21:57:40.356 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:40.356 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[473, 623, 1000, 1000] and track its trajectory.Use the specified Frame t:[xmin,ymin,xmax,ymax] format for all trajectories in your reply.', 'This is a bicycle by a man on the road with other bicyclesFrame1:[473, 623, 1000, 1000];Frame2:[432, 637, 995, 1000];Frame3:[371, 544, 1000, 1000];Frame4:[332, 698, 1000, 1000].')
2023-11-02 21:57:45.530 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:45.531 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Maintain strict adherence to the format category:[xmin,ymin,xmax,ymax] when presenting your answer.', 'Person:[080, 437, 148, 546],[469, 301, 501, 447],[645, 334, 794, 847],[776, 372, 903, 730],[859, 599, 999, 999].')
2023-11-02 21:57:52.006 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:57:52.007 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'person:[421, 171, 540, 387],[239, 183, 349, 382];wheel:[120, 451, 182, 898],[147, 477, 211, 957],[557, 703, 623, 926],[591, 369, 651, 584],[616, 461, 695, 965];tire:[118, 805, 146, 917],[143, 565, 207, 960],[570, 788, 622, 926],[575, 368, 645, 614],[626, 557, 693, 965];car:[120, 193, 705, 967].')
2023-11-02 21:58:53.933 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:58:53.934 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'Helmet:[556, 054, 625, 108],[087, 002, 128, 071],[356, 094, 480, 249];Backpack:[509, 002, 653, 075];Motorcycle:[007, 002, 610, 1000],[418, 002, 968, 715],[019, 002, 275, 238].')
2023-11-02 21:59:02.027 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:59:02.028 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'One hispanic and one african american males, on [005, 909, 991, 996] a roof wearing [657, 002, 758, 113] [278, 342, 392, 482] hardhats. [121, 169, 461, 998] One man is waving and [501, 003, 779, 997] the other man is holding [469, 478, 589, 649] a drill. Both are wearing [110, 496, 373, 773] [550, 111, 730, 460] safety harnesses and smiling.')
2023-11-02 21:59:08.745 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:59:08.745 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'table:[510, 711, 713, 965];chair:[007, 595, 375, 999];woman:[742, 368, 926, 999];footwear:[523, 857, 595, 905];window:[388, 255, 551, 765],[830, 177, 999, 745].')
2023-11-02 21:59:51.246 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:59:51.247 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'flag:[029, 000, 155, 779];man:[000, 591, 168, 998],[162, 597, 349, 998],[334, 632, 480, 994],[400, 163, 620, 998],[606, 884, 647, 998],[636, 819, 731, 998],[724, 878, 780, 998],[752, 813, 869, 998],[839, 206, 998, 998];clothing:[000, 675, 151, 998],[157, 681, 338, 998],[325, 697, 457, 998],[400, 263, 608, 998],[632, 884, 745, 998],[764, 880, 863, 998],[862, 302, 998, 998];human face:[224, 632, 259, 703],[391, 638, 417, 701],[863, 218, 925, 326].')
2023-11-02 21:59:59.522 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 21:59:59.523 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Maintain strict adherence to the format category:[xmin,ymin,xmax,ymax] when presenting your answer.', 'Person:[534, 410, 986, 975],[614, 288, 987, 782],[590, 124, 707, 464],[543, 114, 636, 429],[330, 117, 577, 654],[015, 145, 396, 964],[011, 036, 185, 422],[101, 022, 217, 262];Glasses:[791, 542, 903, 587],[252, 235, 346, 278],[869, 357, 942, 393];Hat:[196, 147, 348, 305];Satchel:[539, 255, 597, 347];Plate:[366, 720, 495, 818],[406, 631, 571, 735],[567, 619, 726, 707],[627, 736, 758, 827],[493, 798, 642, 903];Desk:[183, 477, 780, 980],[477, 385, 606, 510].')
2023-11-02 22:00:06.269 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:00:06.269 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Flower:[775, 548, 925, 722],[613, 505, 718, 615],[540, 462, 613, 557],[478, 431, 548, 516],[437, 435, 480, 487];Vase:[717, 682, 999, 861],[578, 594, 780, 715],[494, 534, 633, 625],[443, 503, 549, 579],[403, 476, 494, 537];Bench:[275, 472, 388, 545],[460, 776, 973, 1000];Truck:[522, 271, 815, 498];Van:[872, 298, 999, 731];Person:[923, 477, 999, 761],[733, 365, 801, 636],[703, 370, 751, 601],[108, 326, 140, 478],[060, 335, 094, 506];Backpack:[694, 410, 744, 504].')
2023-11-02 22:00:53.715 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:00:53.716 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,what is thisFrame1:[280, 489, 326, 552] and track its trajectory.All trajectories in your reply should conform to the Frame t:[xmin,ymin,xmax,ymax] pattern.', 'This is a border terrierFrame1:[280, 489, 326, 552];Frame2:[365, 452, 410, 500];Frame3:[333, 481, 366, 527];Frame4:[411, 435, 450, 489].')
2023-11-02 22:01:57.161 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:01:57.161 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,can you point out the trajectory of the people sitting on a sailboat shifting in circles?Your trajectory details should strictly align with the TrackiFrame t:[xmin,ymin,xmax,ymax] format in your submission.', 'There is one people sitting on a sailboat shifting in circles.Track1frame:1:[406, 688, 558, 925];frame:2:[515, 696, 636, 871];frame:3:[545, 690, 656, 855].')
2023-11-02 22:02:10.650 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:02:10.651 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect ball,woman,office building and clothing in this image.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'ball:[000, 641, 053, 826],[000, 532, 055, 597];woman:[456, 342, 578, 843];office building:[011, 002, 950, 918];clothing:[060, 403, 190, 889],[470, 411, 584, 799].')
2023-11-02 22:02:14.318 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:02:14.319 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[633, 487, 670, 514] and track its trajectory.The Frame t:[xmin,ymin,xmax,ymax] format should be employed for the trajectories within your answer.', 'This is a border terrierFrame1:[633, 487, 670, 514];Frame2:[508, 510, 531, 541];Frame3:[495, 535, 528, 564];Frame4:[518, 518, 545, 547].')
2023-11-02 22:02:23.124 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:02:23.124 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[448, 760, 781, 996] and track its trajectory.For clarity, represent trajectories using the Frame t:[xmin,ymin,xmax,ymax] format in your response.', 'This is a slothFrame1:[448, 760, 781, 996];Frame2:[492, 769, 744, 999];Frame3:[467, 781, 783, 999];Frame4:[489, 760, 808, 965].')
2023-11-02 22:02:35.588 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:02:35.588 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,please tell me the trajectory of the flock of avian creatures in flight.Your trajectory details should strictly align with the TrackiFrame t:[xmin,ymin,xmax,ymax] format in your submission.', 'There are 11 flock of avian creatures in flight.Track1frame:1:[059, 425, 205, 529];frame:2:[235, 410, 377, 566];frame:3:[182, 279, 344, 491],Track2frame:1:[978, 486, 999, 495],Track3frame:1:[769, 364, 866, 508],Track4frame:1:[754, 458, 935, 634],Track5frame:1:[578, 238, 668, 379];frame:2:[801, 225, 900, 373];frame:3:[804, 284, 876, 353],Track6frame:1:[487, 253, 661, 420];frame:2:[700, 172, 868, 384];frame:3:[666, 268, 847, 376],Track7frame:1:[338, 353, 481, 564];frame:2:[516, 385, 648, 594];frame:3:[506, 432, 643, 531],Track8frame:1:[460, 318, 582, 440];frame:2:[678, 224, 795, 398];frame:3:[659, 306, 751, 370],Track9frame:1:[336, 288, 448, 392];frame:2:[544, 203, 654, 399];frame:3:[492, 372, 641, 457],Track10frame:1:[283, 314, 434, 455];frame:2:[467, 335, 655, 467];frame:3:[452, 250, 629, 385],Track11frame:1:[324, 447, 393, 479];frame:2:[607, 348, 666, 431];frame:3:[433, 348, 610, 393].')
2023-11-02 22:02:57.705 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:02:57.705 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,please tell me what is thisFrame1:[330, 323, 625, 687] and track its trajectory.All trajectories in your reply should conform to the Frame t:[xmin,ymin,xmax,ymax] pattern.', 'This is a mountain bikeFrame1:[330, 323, 625, 687];Frame2:[307, 345, 645, 716];Frame3:[309, 512, 716, 991];Frame4:[217, 150, 725, 866].')
2023-11-02 22:03:07.544 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:03:07.545 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Maintain strict adherence to the format category:[x0,y0,x1,y1] when presenting your answer.', 'table:[020, 741, 361, 980],[000, 573, 150, 796],[173, 502, 466, 745],[500, 530, 999, 998],[724, 415, 899, 525];man:[006, 317, 177, 761],[280, 287, 410, 784],[328, 612, 574, 998],[442, 271, 553, 761],[515, 216, 626, 637],[587, 226, 633, 497],[606, 180, 758, 521];tree:[327, 000, 650, 316];clothing:[000, 356, 182, 760],[240, 363, 306, 463],[295, 306, 416, 780],[305, 782, 581, 998],[456, 304, 558, 796],[520, 254, 613, 534],[560, 755, 705, 998],[605, 250, 767, 504],[784, 287, 856, 386],[936, 283, 999, 417];building:[000, 000, 999, 410].')
2023-11-02 22:03:21.086 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:03:21.086 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'Street Lights:[281, 351, 319, 518],[003, 346, 038, 494];Soccer:[460, 687, 532, 792];Sneakers:[594, 690, 636, 750];Person:[430, 083, 656, 750],[139, 217, 282, 679],[376, 414, 413, 582].')
2023-11-02 22:03:53.138 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:03:53.138 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'A large group of athletes [001, 447, 181, 577] [624, 422, 739, 514] [328, 634, 491, 806] [332, 232, 483, 332] [829, 372, 966, 484] [390, 734, 576, 944] [541, 567, 700, 766] swimming in open water.')
2023-11-02 22:03:58.669 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:03:58.670 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,what is the track of the motorized-bicycle shift forward?Ensure the trajectories in your answer follow the TrackiFrame t:[xmin,ymin,xmax,ymax] structure.', 'There is no motorized-bicycle shift forward.')
2023-11-02 22:04:48.779 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:04:48.780 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'Scooter:[804, 495, 1000, 944],[825, 447, 1000, 737],[404, 567, 774, 998],[121, 564, 498, 999];Car:[000, 378, 350, 806];Truck:[570, 267, 684, 447];Helmet:[342, 362, 421, 462],[540, 335, 626, 442],[621, 351, 668, 413],[803, 357, 861, 403];Gloves:[885, 502, 924, 548],[685, 664, 734, 737],[449, 685, 501, 756],[315, 673, 377, 734];Hat:[545, 401, 614, 453];Boots:[803, 674, 851, 754];Other Shoes:[689, 721, 738, 788];Person:[804, 359, 946, 840],[802, 361, 917, 752],[621, 352, 701, 545],[449, 336, 743, 1000],[167, 363, 449, 1000];Flag:[701, 188, 828, 459].')
2023-11-02 22:06:41.603 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:06:41.604 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect girl,man and woman in this image.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'girl:[130, 131, 998, 999],[164, 265, 596, 999],[607, 250, 999, 999];man:[046, 241, 320, 945],[730, 162, 999, 620];woman:[125, 151, 999, 999],[579, 258, 999, 999].')
2023-11-02 22:06:42.894 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:06:42.895 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'Lamp:[960, 489, 998, 623],[901, 491, 939, 624];Person:[162, 664, 216, 842],[289, 671, 340, 788],[750, 647, 839, 966],[831, 625, 900, 966],[967, 635, 1000, 765],[746, 687, 782, 839],[486, 685, 552, 908].')
2023-11-02 22:06:43.966 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:06:43.966 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'Glasses:[688, 485, 883, 610],[888, 454, 1000, 598];Person:[865, 141, 999, 817],[695, 290, 996, 997],[690, 641, 863, 911],[002, 174, 133, 572];Desk:[002, 692, 939, 1000];Cup:[123, 632, 235, 871],[638, 784, 778, 1000],[002, 840, 089, 1000];Plate:[432, 650, 543, 888],[002, 782, 141, 884].')
2023-11-02 22:07:08.130 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:07:08.130 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Maintain strict adherence to the format category:[xmin,ymin,xmax,ymax] when presenting your answer.', 'Car:[667, 223, 1000, 436],[202, 213, 707, 470],[065, 256, 315, 452],[716, 209, 1000, 306];Person:[851, 089, 960, 520],[609, 117, 728, 532],[229, 137, 645, 949],[141, 116, 528, 948],[002, 297, 262, 913];Other Balls:[549, 195, 620, 305].')
2023-11-02 22:07:30.797 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:07:30.797 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'person:[480, 799, 533, 979],[530, 799, 583, 983];tree:[000, 000, 081, 085],[462, 781, 488, 845],[480, 746, 520, 831],[498, 682, 563, 809],[738, 000, 999, 379],[880, 607, 908, 878],[908, 606, 976, 881];umbrella:[320, 762, 415, 825].')
2023-11-02 22:08:11.989 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:08:11.990 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Bottle:[380, 011, 834, 996];Person:[002, 314, 911, 998].')
2023-11-02 22:08:25.823 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:08:25.824 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'Person:[107, 110, 999, 889];Tie:[281, 339, 486, 605];Leather Shoes:[676, 790, 917, 893],[889, 762, 997, 865];Chair:[076, 295, 629, 918],[029, 081, 304, 510],[573, 111, 999, 573];shelf:[019, 043, 792, 406];Trash bin Can:[545, 252, 700, 410].')
2023-11-02 22:08:34.797 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:08:34.797 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Maintain strict adherence to the format category:[xmin,ymin,xmax,ymax] when presenting your answer.', 'Leather Shoes:[013, 084, 097, 159];High Heels:[145, 074, 224, 159],[254, 076, 334, 159],[367, 078, 452, 160],[472, 078, 557, 160],[030, 242, 116, 308],[137, 233, 214, 308],[237, 233, 322, 311],[349, 226, 432, 308],[457, 227, 545, 310],[582, 236, 660, 311],[691, 253, 766, 311],[792, 218, 876, 307],[900, 743, 986, 846],[777, 769, 865, 846],[668, 770, 751, 842],[557, 760, 635, 841],[448, 759, 529, 840],[456, 611, 532, 692],[559, 611, 644, 692],[667, 626, 750, 694],[778, 616, 862, 694],[892, 588, 979, 689],[901, 397, 988, 502],[791, 425, 868, 498],[695, 424, 782, 498],[562, 421, 643, 502],[453, 422, 538, 502],[690, 327, 769, 392],[799, 313, 878, 392],[912, 312, 992, 383].')
2023-11-02 22:09:21.659 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:09:21.659 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,what is thisFrame1:[214, 162, 597, 833] and track its trajectory.Your response should highlight trajectories using the established Frame t:[xmin,ymin,xmax,ymax] structure.', 'This is a pheasantFrame1:[214, 162, 597, 833];Frame2:[295, 161, 413, 763];Frame3:[216, 163, 382, 690];Frame4:[185, 212, 385, 697].')
2023-11-02 22:09:43.903 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:09:43.903 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'overhead bride [386, 275, 681, 934] and groom [358, 264, 541, 795] first dance at the cordelle rustic wedding')
2023-11-02 22:10:15.932 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:10:15.932 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Picture:[273, 373, 307, 430],[270, 429, 306, 523],[383, 442, 420, 539];Person:[391, 229, 457, 380],[491, 254, 604, 635],[599, 209, 706, 554],[766, 229, 814, 480];Boat:[248, 217, 519, 739],[555, 268, 893, 761];Dog:[326, 507, 395, 608],[369, 527, 433, 621].')
2023-11-02 22:10:19.178 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:10:19.179 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,please tell me the trajectory of the bicycles moving to left.Maintain consistency in your answer by using the TrackiFrame t:[xmin,ymin,xmax,ymax] format for trajectories.', 'There are 3 bicycles moving to left.Track1frame:1:[320, 390, 449, 640];frame:2:[218, 402, 363, 679];frame:3:[052, 427, 161, 748],Track2frame:1:[417, 422, 572, 769];frame:2:[298, 445, 524, 836];frame:3:[071, 480, 410, 939],Track3frame:1:[502, 525, 669, 848];frame:2:[329, 564, 551, 927];frame:3:[034, 655, 338, 999].')
2023-11-02 22:11:01.598 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:01.598 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'man:[171, 366, 256, 692],[000, 345, 999, 903];woman:[000, 385, 097, 740],[047, 414, 169, 750],[095, 416, 114, 496],[205, 380, 331, 747],[290, 343, 790, 829],[738, 376, 950, 892],[825, 348, 889, 481],[924, 339, 999, 548],[928, 346, 950, 425];girl:[204, 382, 338, 752],[288, 389, 432, 765],[343, 372, 509, 760],[576, 341, 765, 799],[921, 334, 999, 553];footwear:[205, 643, 253, 753],[346, 652, 396, 730],[399, 682, 438, 760],[477, 609, 525, 684],[585, 739, 650, 796],[663, 771, 733, 835],[730, 824, 799, 899].')
2023-11-02 22:11:05.361 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:05.361 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'Machinery Vehicle:[315, 267, 937, 751],[078, 444, 434, 772];Street Lights:[243, 314, 268, 449];Truck:[062, 437, 154, 571];Trash bin Can:[067, 575, 084, 619],[044, 572, 068, 620],[020, 573, 043, 623].')
2023-11-02 22:11:08.280 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:08.280 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,what is the track of the green bird moving to a plate?Your trajectory details should strictly align with the TrackiFrame t:[xmin,ymin,xmax,ymax] format in your submission.', 'There is one green bird moving to a plate.Track1frame:1:[603, 270, 742, 462];frame:2:[504, 288, 601, 609];frame:3:[451, 381, 565, 651].')
2023-11-02 22:11:39.262 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:39.262 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Mighty fjords [007, 054, 991, 357] rise from the sea in the Westfjords Peninsula, northwestern Iceland. The landscape [007, 348, 991, 925] under the fjords is full of brooks and flowers. poster [001, 013, 993, 960]')
2023-11-02 22:11:47.647 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:47.647 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'A rectangle [036, 145, 958, 863] of dough is spread with finely chopped chicken in buffalo sauce, and sprinkled with shredded cheese.')
2023-11-02 22:11:49.819 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:49.820 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'tree:[000, 255, 998, 546].')
2023-11-02 22:11:52.562 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:11:52.562 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Surfboard,Glasses,Desk and Chair in this image.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'Glasses:[259, 469, 359, 543];Desk:[379, 773, 1000, 1000];Chair:[089, 676, 465, 999].')
2023-11-02 22:11:55.008 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:11:55.009 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'woman:[633, 553, 891, 992],[134, 377, 221, 657],[223, 405, 268, 651],[267, 382, 365, 682],[326, 421, 365, 509],[340, 655, 632, 999],[355, 411, 463, 655],[876, 544, 999, 919];man:[643, 545, 908, 980],[418, 395, 495, 549],[485, 392, 511, 496],[501, 351, 563, 490],[526, 342, 653, 743],[639, 359, 741, 688],[645, 388, 665, 452],[746, 349, 856, 630],[812, 322, 899, 593],[889, 320, 953, 565],[925, 344, 983, 542];clothing:[021, 430, 233, 1000],[140, 383, 999, 735],[329, 828, 632, 999],[604, 705, 886, 999],[871, 645, 999, 925];girl:[141, 378, 178, 510],[204, 398, 305, 701],[253, 411, 281, 481],[266, 380, 375, 683],[320, 415, 358, 492],[335, 649, 637, 999],[355, 415, 471, 666],[390, 400, 430, 484],[643, 561, 905, 999],[870, 522, 999, 919];human face:[095, 385, 145, 506],[436, 706, 525, 877],[749, 598, 813, 732],[958, 565, 999, 671].')
2023-11-02 22:12:15.278 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:12:15.278 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[002, 242, 231, 496] Horse lovers will love learning about [141, 681, 353, 998] [404, 567, 612, 709] their favorite animals with this amazing collection of [001, 001, 994, 997] nonfiction horse books! Perfect for ages 5-12!')
2023-11-02 22:12:52.847 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:12:52.847 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video cluo including frame1,frame2 and frame3, please tell me what this oneframe:1:[457, 304, 596, 877];frame:2:[443, 302, 581, 879];frame:3:[429, 300, 557, 877] is doing?Briefly articulate your response by taking cues from the provided image frames and trajectory coordinates.', 'This is a/an people are walking')
2023-11-02 22:13:45.413 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:13:45.413 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'A portion of [511, 578, 702, 919] the border wall is under construction in Guadalupe Canyon, Arizona, which is a wildlife corridor for Mexican gray wolves and endangered jaguars.')
2023-11-02 22:14:28.914 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:14:28.915 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Boat in this image.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Boat:[830, 721, 999, 957],[875, 754, 940, 858],[683, 732, 791, 911],[476, 681, 551, 777],[346, 782, 497, 891],[194, 819, 323, 879],[339, 717, 387, 762],[123, 866, 268, 1000],[263, 905, 375, 980],[052, 719, 093, 759].')
2023-11-02 22:14:30.511 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:14:30.512 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[944, 310, 987, 504] [003, 077, 723, 811] The Metricup volunteer fire brigade were one of many to attend the Carbunup escaped burn.')
2023-11-02 22:14:50.096 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:14:50.097 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'An illustration of people video conferencing on a computer screen [328, 463, 753, 696] with speech bubbles above it and a house [072, 287, 427, 514] behind it.')
2023-11-02 22:15:37.352 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:15:37.352 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track catFrame1:[440, 377, 714, 704],catFrame1:[359, 310, 460, 537],catFrame1:[818, 022, 1000, 185] in this video clip.Do not deviate from the established classFrame t:[xmin,ymin,xmax,ymax] format in your answer.', 'catFrame1:[440, 377, 714, 704];Frame2:[584, 300, 860, 647],catFrame1:[359, 310, 460, 537];Frame2:[453, 225, 592, 483],catFrame1:[818, 022, 1000, 185];Frame2:[912, 002, 1000, 108].')
2023-11-02 22:16:07.902 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:16:07.903 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'tent:[581, 200, 849, 395];tree:[000, 000, 540, 506],[490, 021, 999, 325].')
2023-11-02 22:16:26.773 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:16:26.773 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track personFrame1:[485, 159, 624, 833],blanketFrame1:[543, 243, 609, 534],towelFrame1:[487, 287, 591, 497] in this video clip.Use only the categoryFrame t:[xmin,ymin,xmax,ymax] format to structure your answer.', 'personFrame1:[485, 159, 624, 833];Frame2:[564, 165, 682, 905],blanketFrame1:[543, 243, 609, 534];Frame2:[627, 255, 689, 573],towelFrame1:[487, 287, 591, 497];Frame2:[611, 316, 675, 575].')
2023-11-02 22:16:28.709 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:16:28.710 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Open-book spread [023, 015, 976, 981] of Imaginary Explosions, page 85. Above black-and-white graphic of mountain topography, text reads, “The event is a narrative (or is it)”')
2023-11-02 22:16:47.740 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:16:47.740 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect man,human face,woman,girl and apple in this image.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'man:[398, 680, 999, 999];human face:[018, 380, 092, 498],[137, 298, 195, 411],[206, 411, 288, 546],[363, 437, 428, 537],[420, 539, 478, 642],[475, 687, 585, 840],[510, 577, 575, 666],[578, 396, 645, 495],[713, 451, 778, 570],[840, 427, 926, 555];woman:[555, 379, 716, 733],[000, 354, 157, 940],[025, 393, 368, 999],[073, 270, 338, 605],[328, 410, 488, 856],[830, 389, 999, 830];girl:[075, 265, 341, 912],[328, 402, 489, 860],[341, 510, 511, 995],[490, 560, 702, 718],[557, 339, 728, 733],[713, 411, 855, 830].')
2023-11-02 22:16:59.778 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:16:59.778 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'Bowl:[008, 494, 163, 571],[152, 506, 222, 562],[220, 495, 255, 518],[910, 532, 999, 650],[802, 505, 941, 585],[751, 521, 904, 611];bucket:[251, 718, 311, 896];Tong:[836, 823, 951, 986];Lamp:[669, 001, 763, 137],[454, 232, 494, 288],[499, 164, 543, 223],[168, 133, 226, 298],[201, 232, 243, 346],[219, 332, 256, 373],[431, 308, 509, 405];shelf:[512, 278, 683, 492],[460, 411, 507, 494];Storage box:[566, 337, 604, 398],[595, 332, 684, 387],[596, 428, 661, 462],[639, 443, 686, 474],[507, 477, 540, 505],[486, 503, 522, 539],[452, 653, 485, 757];Fan:[026, 112, 148, 232],[307, 318, 341, 376],[094, 223, 171, 314];Stool:[377, 620, 423, 730];Clock:[351, 332, 389, 388];Power outlet:[791, 389, 820, 416];Carpet:[223, 866, 541, 1000];Trash bin Can:[336, 561, 355, 601],[531, 301, 575, 378];Blackboard:[205, 379, 268, 434];Scale:[660, 520, 762, 622];Pliers:[837, 823, 956, 991];Hat:[397, 439, 439, 478],[425, 482, 455, 523];Boots:[408, 654, 434, 707],[411, 692, 431, 725];Person:[382, 436, 465, 726],[422, 479, 516, 565].')
2023-11-02 22:17:54.970 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:17:54.970 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Decorative element. Colorful flyer [049, 039, 946, 963] with multiple images of aquaculture describes content of Great Lakes Aquaculture Day. All information on flyer is included in text of article.')
2023-11-02 22:18:02.823 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:18:02.824 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Mechanicville to Malta Brewery Ride')
2023-11-02 22:18:07.625 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:18:07.625 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'couch:[473, 088, 918, 960];coffee table:[106, 381, 401, 1000].')
2023-11-02 22:18:29.836 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:18:29.836 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.For your response, please adhere to the specified category:[xmin,ymin,xmax,ymax] format.', 'Chair:[071, 525, 396, 998],[431, 534, 769, 998];Frame:[413, 126, 500, 315];Lamp:[001, 013, 062, 099];shelf:[001, 067, 212, 300];Desk:[759, 554, 999, 1000];Person:[476, 330, 748, 987],[257, 151, 534, 908],[093, 363, 398, 1000],[620, 277, 699, 475];Satchel:[557, 792, 742, 956];Coffee Machine:[226, 242, 306, 345].')
2023-11-02 22:18:45.950 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:18:45.950 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'Desk:[001, 720, 863, 999],[123, 444, 550, 793],[173, 380, 468, 449];Cabinet:[907, 044, 1000, 526],[459, 217, 507, 377];Person:[332, 343, 415, 471],[193, 283, 285, 483],[245, 285, 273, 336],[130, 403, 187, 468],[001, 261, 059, 438],[001, 337, 164, 619],[001, 563, 039, 818],[178, 426, 526, 913],[391, 001, 896, 1000];Book:[231, 561, 727, 981],[167, 491, 259, 562],[126, 507, 179, 578];Hat:[001, 264, 042, 346];Pen:[221, 760, 260, 905];Moniter:[147, 316, 202, 397],[357, 306, 407, 370].')
2023-11-02 22:19:03.955 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:19:03.955 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'One of [608, 313, 965, 698] Momina’s daughters is preparing naan. Conditions are simple; this family is officially categorized as one in a difficult life situation.')
2023-11-02 22:19:19.234 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:19:19.234 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,please tell me what is thisFrame1:[208, 071, 665, 797] and track its trajectory.For clarity, represent trajectories using the Frame t:[xmin,ymin,xmax,ymax] format in your response.', 'This is a black-necked cobraFrame1:[208, 071, 665, 797];Frame2:[236, 037, 715, 745];Frame3:[310, 067, 766, 617];Frame4:[292, 000, 778, 479].')
2023-11-02 22:19:45.056 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:19:45.057 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Consistently apply the category:[xmin,ymin,xmax,ymax] format to your answer.', 'Chair:[739, 325, 856, 381];Couch:[669, 303, 880, 364];Lamp:[913, 220, 955, 327],[617, 228, 662, 322];Desk:[829, 393, 1000, 569];Tea pot:[394, 351, 443, 430];Refrigerator:[001, 152, 231, 846];Oven:[302, 486, 476, 797];Induction Cooker:[306, 414, 516, 514].')
2023-11-02 22:19:46.837 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:19:46.838 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Romantic couple [363, 390, 494, 905] [466, 323, 656, 903] walks in the apple orchard [004, 007, 993, 998] in summer and holding hands [450, 480, 498, 538].')
2023-11-02 22:19:47.436 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:19:47.437 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track carFrame1:[836, 351, 999, 623],carFrame1:[354, 346, 449, 480],carFrame1:[497, 284, 623, 491],carFrame1:[000, 376, 100, 597],carFrame1:[191, 378, 388, 586] in this video clip.The prescribed format for your answer is classFrame t:[xmin,ymin,xmax,ymax]. Please follow it closely.', 'carFrame1[836, 351, 999, 623];Frame2:No object,carFrame1:[354, 346, 449, 480];Frame2:[286, 332, 423, 511],carFrame1:[497, 284, 623, 491];Frame2:[497, 273, 628, 486],carFrame1[000, 376, 100, 597];Frame2:No object,carFrame1:[191, 378, 388, 586];Frame2:[000, 377, 314, 711].')
2023-11-02 22:19:57.993 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:19:57.993 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,please tell me what is thisFrame1:[310, 292, 673, 447] and track its trajectory.For the trajectories included in the answer, please use the format Frame t:[xmin,ymin,xmax,ymax].', 'This is a droneFrame1:[310, 292, 673, 447];Frame2:[317, 392, 690, 549];Frame3:[319, 394, 695, 555];Frame4:[322, 387, 697, 554].')
2023-11-02 22:20:27.815 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:20:27.816 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'SUV:[001, 690, 058, 806],[018, 687, 200, 864],[156, 586, 225, 681],[229, 672, 423, 818],[442, 660, 612, 784],[624, 650, 758, 756],[764, 645, 876, 740],[879, 643, 979, 730],[969, 633, 1000, 716];Car:[041, 671, 151, 728],[151, 658, 262, 755],[265, 642, 412, 716],[351, 626, 469, 702],[949, 563, 998, 621],[669, 567, 763, 640],[616, 583, 725, 654],[509, 571, 600, 624],[540, 593, 648, 669],[485, 603, 608, 685],[423, 617, 553, 677],[763, 565, 833, 634],[830, 562, 870, 626].')
2023-11-02 22:20:46.833 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:20:46.833 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'person:[114, 478, 778, 754],[518, 580, 553, 754];musical instrument:[043, 571, 071, 644],[085, 569, 114, 642],[140, 523, 679, 690],[446, 511, 481, 596],[606, 507, 661, 586],[693, 536, 723, 604],[738, 549, 770, 619].')
2023-11-02 22:20:55.811 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:20:55.811 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Consistently apply the category:[xmin,ymin,xmax,ymax] format to your answer.', 'building:[000, 128, 999, 998];window:[000, 958, 143, 996],[000, 572, 156, 923],[000, 281, 171, 510],[854, 268, 999, 513],[869, 569, 999, 923],[888, 947, 999, 998].')
2023-11-02 22:21:45.229 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:21:45.229 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Ring and Glasses in this image.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'Ring:[405, 305, 414, 320],[099, 777, 108, 797];Glasses:[766, 180, 839, 215],[551, 227, 626, 260],[439, 246, 489, 277].')
2023-11-02 22:22:04.257 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:22:04.257 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Watchtower [655, 218, 916, 769] near the Iranian border, eastern Turkey in this undated file photo. (Shutter')
2023-11-02 22:22:13.563 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:22:13.563 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Guitar,Backpack and Blackboard/Whiteboard in this image.For your response, please adhere to the specified category:[xmin,ymin,xmax,ymax] format.', 'Guitar:[392, 692, 977, 952];Backpack:[764, 516, 887, 689];Blackboard:[326, 113, 677, 416].')
2023-11-02 22:23:00.705 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:23:00.705 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Timber decking use on a pedestrian bridge [085, 008, 994, 994] over the river [460, 401, 997, 997]')
2023-11-02 22:23:01.626 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:23:01.626 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'clothing:[572, 366, 998, 998],[000, 226, 080, 522],[036, 303, 291, 998],[286, 436, 614, 998];man:[503, 049, 999, 998];human face:[650, 292, 843, 578].')
2023-11-02 22:23:24.954 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:23:24.954 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'Picture:[288, 002, 511, 127];Dinning Table:[081, 778, 787, 999];Chair:[305, 648, 636, 778];Other Fish:[374, 808, 449, 951];Person:[659, 275, 1000, 942],[261, 222, 677, 813];Glasses:[444, 337, 575, 418];Tie:[460, 501, 528, 720];Pie:[131, 788, 278, 857];Basin:[366, 708, 536, 835],[002, 832, 146, 1000],[681, 900, 916, 1000];Chopsticks:[272, 770, 361, 825];Plate:[484, 868, 729, 998],[261, 823, 521, 956],[105, 779, 291, 866],[126, 939, 426, 1000].')
2023-11-02 22:23:39.542 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:23:39.542 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Bear in this image.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'Bear:[365, 307, 812, 731].')
2023-11-02 22:23:39.731 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:23:39.732 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'A cardboard cut out of a percentage symbol [013, 199, 284, 886] and a house [260, 218, 580, 892] with the Sanctuary Homes logo on it on a wooden table [002, 650, 995, 995]')
2023-11-02 22:23:44.056 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:23:44.057 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect houseplant,dress,woman and suit in this image.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'dress:[106, 454, 252, 988],[243, 409, 376, 915],[611, 406, 735, 999];woman:[081, 237, 265, 1000],[241, 256, 380, 999],[361, 288, 488, 999],[475, 254, 618, 999],[611, 230, 773, 999];suit:[757, 300, 913, 999].')
2023-11-02 22:23:50.855 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:23:50.855 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The format category:[xmin,ymin,xmax,ymax] should be strictly observed in your answer.', 'Lamp:[200, 001, 657, 358],[751, 328, 841, 414];Mirror:[614, 290, 873, 625];Flower:[685, 489, 805, 609];shelf:[639, 625, 932, 906],[310, 406, 412, 525],[270, 418, 311, 500],[240, 422, 298, 519],[205, 424, 240, 518],[122, 421, 203, 517],[001, 588, 127, 723],[722, 389, 823, 569];Chair:[332, 578, 404, 699],[270, 585, 331, 679],[189, 600, 313, 734],[135, 699, 459, 1000],[605, 708, 894, 999],[522, 603, 649, 770];Towel:[292, 678, 341, 725],[363, 742, 421, 802],[473, 702, 532, 753];Carpet:[053, 851, 752, 995];Wine Glass:[600, 720, 635, 821],[476, 714, 506, 809],[411, 687, 441, 769];Plate:[615, 826, 706, 879];Fork:[593, 850, 668, 901].')
2023-11-02 22:25:11.527 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:25:11.527 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, "Grabbing late-night tacos and [257, 573, 507, 732] burritos at a place in San Jose. It's always fun to take [422, 149, 796, 995] [006, 238, 479, 996] the girls to new places.")
2023-11-02 22:25:53.718 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:25:53.718 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Surrealist sculpture and [082, 136, 914, 902] functional occasional table is shown from above at a high angle; the work features [289, 337, 722, 903] realistic cast bronze crane legs holding [080, 139, 916, 354] a round wooden, gold-plated tabletop.')
2023-11-02 22:26:19.692 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:26:19.692 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Maintain strict adherence to the format category:[x0,y0,x1,y1] when presenting your answer.', 'boat:[017, 282, 1000, 796];person:[000, 000, 181, 753],[295, 000, 429, 333],[486, 000, 535, 098];clothing:[000, 000, 199, 759],[298, 000, 426, 338],[485, 000, 533, 101],[713, 000, 748, 070],[836, 166, 999, 450].')
2023-11-02 22:26:50.133 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:26:50.134 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'thin strips of carrot, cucumber, avocado, and tofu are being placed on [152, 547, 779, 947] a nori sheet with rice')
2023-11-02 22:26:53.619 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:26:53.620 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,can you tell me the trajectory of the the left one of the two parrots standing on the hand in this video clip?As you detail the trajectories in your reply, the TrackiFrame t:[xmin,ymin,xmax,ymax] format is the preferred structure.', 'There is one the left one of the two parrots standing on the hand.Track1frame:1:[022, 408, 709, 999];frame:2:[000, 437, 718, 999];frame:3:[000, 462, 716, 999].')
2023-11-02 22:27:05.340 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:27:05.340 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect beer,suit,bottle and person in this image.Maintain strict adherence to the format category:[x0,y0,x1,y1] when presenting your answer.', 'beer:[255, 059, 634, 349];suit:[574, 000, 977, 286];bottle:[008, 009, 126, 190],[095, 000, 224, 176],[180, 057, 262, 181],[285, 059, 362, 281],[333, 059, 395, 284],[375, 067, 454, 298],[465, 076, 529, 310],[503, 086, 621, 330];person:[000, 000, 493, 334],[590, 000, 999, 298].')
2023-11-02 22:27:42.043 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:27:42.044 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Compact cars [163, 292, 861, 716] such as the Nissan Sentra finished March with an industry-leading value-retention rate of 2.5%, according to the latest numbers from Black Book. Photo courtesy [004, 007, 989, 988] Nissan USA')
2023-11-02 22:28:24.385 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:28:24.385 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[376, 341, 642, 760] and track its trajectory.Any trajectory details should be arranged according to the Frame t:[xmin,ymin,xmax,ymax] format for clarity.', 'This is a grizzlyFrame1:[376, 341, 642, 760];Frame2:[390, 334, 675, 778];Frame3:[447, 170, 640, 767];Frame4:[451, 257, 683, 746].')
2023-11-02 22:28:34.979 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:28:34.980 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The format category:[xmin,ymin,xmax,ymax] should be strictly observed in your answer.', 'Toiletry:[105, 647, 150, 738],[065, 661, 098, 739],[019, 644, 072, 736],[073, 631, 107, 736];Bathtub:[001, 644, 999, 999];Towel:[261, 797, 501, 888].')
2023-11-02 22:28:56.230 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:28:56.230 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Person:[689, 318, 965, 1000],[519, 871, 602, 1000],[557, 226, 701, 999],[529, 291, 616, 486],[065, 191, 270, 997],[892, 053, 981, 346];Sneakers:[196, 904, 257, 996],[132, 867, 192, 947];Hat:[527, 869, 603, 987];Microphone:[612, 457, 688, 541],[104, 305, 159, 415];Tripod:[417, 620, 539, 925];Speaker:[233, 469, 361, 708],[000, 806, 070, 994],[641, 631, 741, 874],[839, 674, 1000, 1000];Guitar:[091, 320, 365, 536],[538, 353, 616, 561],[721, 481, 781, 811],[885, 052, 998, 353];Cymbal:[372, 392, 465, 472];Drum:[370, 581, 531, 824],[488, 519, 550, 582],[408, 497, 496, 609].')
2023-11-02 22:29:14.364 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:29:14.364 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The format category:[xmin,ymin,xmax,ymax] should be strictly observed in your answer.', 'window:[018, 617, 099, 858],[175, 908, 237, 999],[181, 617, 243, 778],[418, 549, 534, 738],[441, 853, 520, 999],[570, 551, 669, 792],[607, 859, 666, 999],[711, 609, 741, 807],[740, 926, 771, 999].')
2023-11-02 22:29:19.142 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:29:19.143 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track carFrame1:[542, 514, 618, 617],carFrame1:[000, 491, 169, 646],carFrame1:[682, 496, 999, 779],carFrame1:[515, 507, 569, 588],carFrame1:[165, 501, 225, 601] in this video clip.Keep your response consistent with the classFrame t:[xmin,ymin,xmax,ymax] format.', 'carFrame1:[542, 514, 618, 617];Frame2:[594, 498, 710, 633],carFrame1:[000, 491, 169, 646];Frame2:[000, 539, 068, 661],carFrame1[682, 496, 999, 779];Frame2:No object,carFrame1:[515, 507, 569, 588];Frame2:[552, 492, 627, 601],carFrame1:[165, 501, 225, 601];Frame2:[064, 482, 180, 642].')
2023-11-02 22:29:35.581 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:29:35.581 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,can you point out the trajectory of the grey rabbit running left?As you detail the trajectories in your reply, the TrackiFrame t:[xmin,ymin,xmax,ymax] format is the preferred structure.', 'There is no grey rabbit running left.')
2023-11-02 22:29:55.364 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:29:55.364 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[437, 130, 738, 615] [251, 186, 468, 724] a family next to [591, 267, 694, 542] [462, 096, 838, 434] [000, 001, 284, 405] [060, 542, 210, 846] sheep in a painting')
2023-11-02 22:30:55.827 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:30:55.828 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'vídeos de stock, filmes e b-roll de shows exterior shots [257, 157, 638, 994] uk labour party leader jeremy corbyn walking out of polling station after voting in the uk general election 2017 on 8th june... - 2017')
2023-11-02 22:31:14.891 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:31:14.891 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'Wild Bird:[192, 557, 209, 568],[064, 563, 071, 568],[147, 327, 172, 344],[842, 657, 846, 662],[859, 657, 870, 662],[379, 607, 398, 622];Boat:[130, 216, 581, 847];Street Lights:[770, 514, 779, 538],[905, 508, 916, 533];Flag:[287, 246, 315, 272],[310, 296, 338, 318];Person:[639, 611, 642, 621],[022, 639, 026, 647];Lifesaver:[413, 591, 423, 620],[293, 599, 312, 627].')
2023-11-02 22:31:54.701 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:31:54.702 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'Cake:[488, 625, 685, 835],[237, 611, 426, 807];Glasses:[502, 130, 621, 178];Chair:[344, 398, 406, 542];Candle:[360, 540, 398, 635];Desk:[002, 510, 999, 999];Person:[132, 042, 397, 534],[298, 049, 682, 579],[529, 076, 973, 799];Watch:[783, 632, 816, 684];Bracelet:[742, 638, 814, 713];Plate:[002, 656, 086, 811];Spoon:[027, 641, 075, 726],[002, 621, 060, 731];Cup:[051, 702, 123, 829],[118, 502, 173, 596];Pen:[130, 583, 198, 651].')
2023-11-02 22:32:18.220 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:32:18.220 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'Couch:[245, 028, 999, 999].')
2023-11-02 22:33:08.768 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:33:08.768 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'shelf:[002, 585, 216, 994],[678, 589, 964, 997],[422, 528, 482, 846],[482, 530, 552, 854],[002, 030, 161, 378],[161, 042, 364, 234],[324, 059, 436, 348],[436, 060, 552, 345],[552, 030, 791, 364];Lamp:[803, 002, 999, 066];Sink:[713, 545, 899, 609],[838, 585, 1000, 657];Gas stove:[144, 503, 420, 583];Oven:[202, 548, 423, 934];Extractor:[155, 230, 370, 302].')
2023-11-02 22:33:15.455 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:33:15.455 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[583, 427, 615, 493] and track its trajectory.For the trajectories included in the answer, please use the format Frame t:[xmin,ymin,xmax,ymax].', 'This is a border terrierFrame1:[583, 427, 615, 493];Frame2:[660, 483, 700, 525];Frame3:[561, 502, 598, 545];Frame4:[551, 529, 596, 570].')
2023-11-02 22:33:56.356 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:33:56.357 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Chopsticks:[104, 414, 340, 999];Plate:[026, 192, 995, 894],[577, 151, 942, 268];Dining Table:[002, 115, 999, 999];Cabinet:[101, 002, 535, 126],[845, 002, 977, 114].')
2023-11-02 22:34:04.063 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:34:04.063 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'Chair:[001, 732, 298, 999],[353, 634, 571, 986],[511, 597, 655, 780],[001, 569, 072, 748],[001, 543, 186, 721],[095, 539, 271, 700],[188, 536, 349, 669],[254, 533, 342, 598],[892, 673, 999, 999],[889, 631, 999, 876],[886, 605, 999, 717],[753, 583, 835, 702],[628, 544, 706, 638];Lamp:[015, 138, 254, 290],[145, 196, 348, 323],[242, 239, 412, 347],[312, 280, 466, 369];Person:[740, 521, 788, 570];Desk:[083, 596, 478, 973],[329, 571, 560, 841].')
2023-11-02 22:34:21.964 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:34:21.965 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,please tell me what is thisFrame1:[000, 000, 784, 998] and track its trajectory.For the trajectories included in the answer, please use the format Frame t:[xmin,ymin,xmax,ymax].', 'This is a horseless carriageFrame1:[000, 000, 784, 998];Frame2:[027, 143, 558, 877];Frame3:[158, 237, 524, 741];Frame4:[297, 262, 586, 657].')
2023-11-02 22:34:30.589 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:34:30.590 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, "Help, I'm trapped in a females body [119, 431, 764, 912], and the DMs are including complimentary Slip n' slides! 😫")
2023-11-02 22:35:37.476 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:35:37.477 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Slippers and Cow in this image.Consistently apply the category:[xmin,ymin,xmax,ymax] format to your answer.', 'Slippers:[331, 478, 404, 723];Cow:[343, 391, 894, 947].')
2023-11-02 22:36:07.415 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:07.415 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Knife,Fork and Picture/Frame in this image.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'Knife:[710, 805, 794, 851],[220, 781, 373, 810];Fork:[761, 813, 801, 845],[257, 750, 348, 797];Picture:[490, 463, 518, 505].')
2023-11-02 22:36:18.463 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:18.463 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("Given a video clip including frame1,frame2,frame3 and frame4,can you tell me what is thisFrame1:[351, 375, 455, 578] and track its trajectory.If you're including trajectory details in your reply, the Frame t:[xmin,ymin,xmax,ymax] format is imperative.", 'This is a urialFrame1:[351, 375, 455, 578];Frame2:[308, 374, 427, 574];Frame3:[303, 374, 424, 574];Frame4:[293, 374, 407, 565].')
2023-11-02 22:36:20.784 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:20.784 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'students [766, 428, 846, 745] [269, 415, 374, 791] [359, 383, 448, 773] [055, 406, 171, 844] [700, 428, 782, 753] [834, 443, 919, 752] [164, 407, 278, 816] [628, 421, 709, 756] standing in front of a CTE Works sign holding certificates [714, 503, 767, 559] [301, 480, 359, 543] [843, 487, 899, 542] [643, 501, 698, 562] [780, 495, 835, 552] [386, 494, 441, 555] [081, 536, 153, 612] [203, 521, 266, 588]')
2023-11-02 22:36:29.670 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:29.671 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,please tell me the trajectory of the the tiger running from afar to the vicinity of the water pool, located on the right-hand side..To ensure accuracy, apply the TrackiFrame t:[xmin,ymin,xmax,ymax] template for every trajectory in your response.', 'There is one the tiger running from afar to the vicinity of the water pool, located on the right-hand side..Track1frame:1:[434, 051, 736, 254];frame:2:[440, 055, 748, 252];frame:3:[486, 091, 822, 263].')
2023-11-02 22:36:30.609 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:30.609 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Consistently apply the category:[xmin,ymin,xmax,ymax] format to your answer.', 'Bench:[878, 466, 999, 711];Car:[000, 163, 130, 479],[247, 186, 464, 279],[546, 191, 759, 257],[934, 195, 1000, 265],[854, 184, 979, 265],[778, 180, 953, 263],[696, 176, 804, 229],[538, 169, 617, 217],[563, 141, 601, 170],[536, 138, 567, 171],[510, 149, 563, 193],[449, 152, 539, 224],[434, 143, 484, 195],[410, 135, 443, 166],[344, 137, 391, 193],[343, 133, 378, 179],[306, 123, 339, 169],[272, 124, 295, 167],[235, 121, 282, 174],[221, 128, 249, 193],[082, 111, 135, 152],[034, 111, 115, 161],[000, 114, 100, 170];SUV:[578, 153, 660, 199],[611, 146, 650, 178],[358, 135, 443, 223],[112, 105, 238, 211];Van:[394, 115, 431, 143];Street Lights:[896, 092, 925, 184].')
2023-11-02 22:36:47.368 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:47.369 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,please tell me what is thisFrame1:[353, 413, 724, 676] and track its trajectory.Your response should highlight trajectories using the established Frame t:[xmin,ymin,xmax,ymax] structure.', 'This is a leopardFrame1:[353, 413, 724, 676];Frame2:[324, 390, 596, 672];Frame3:[397, 440, 689, 687];Frame4:[410, 436, 642, 731].')
2023-11-02 22:36:48.183 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:36:48.184 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'Person:[418, 627, 505, 830],[522, 645, 604, 821],[437, 580, 507, 749];Cabinet:[529, 643, 700, 752],[494, 515, 625, 591];Carpet:[322, 775, 505, 823].')
2023-11-02 22:37:07.318 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:37:07.319 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("Given frame1: and frame2:,track carFrame1:[739, 823, 1000, 999],carFrame1:[390, 765, 424, 828],carFrame1:[599, 798, 798, 1000],carFrame1:[000, 771, 070, 902],carFrame1:[284, 755, 322, 819],carFrame1:[453, 778, 517, 874],carFrame1:[407, 765, 455, 840] in this video clip.It's imperative that you stick to the categoryFrame t:[xmin,ymin,xmax,ymax] format for your answer.", 'carFrame1[739, 823, 1000, 999];Frame2:No object,carFrame1:[390, 765, 424, 828];Frame2:[400, 758, 448, 830],carFrame1:[599, 798, 798, 1000];Frame2:[773, 818, 999, 1000],carFrame1[000, 771, 070, 902];Frame2:No object,carFrame1:[284, 755, 322, 819];Frame2:[283, 751, 320, 813],carFrame1:[453, 778, 517, 874];Frame2:[487, 778, 591, 899],carFrame1:[407, 765, 455, 840];Frame2:[421, 759, 485, 847].')
2023-11-02 22:37:16.388 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:37:16.389 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'tree:[000, 000, 976, 1000];flower:[387, 377, 591, 603],[000, 844, 212, 999],[263, 418, 397, 544],[310, 592, 440, 750],[465, 000, 581, 201],[473, 689, 574, 815],[627, 084, 874, 194].')
2023-11-02 22:37:30.246 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:37:30.247 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'New special agents and [408, 473, 760, 997] [107, 613, 366, 997] [763, 551, 997, 997] intelligence analysts view an exhibit at the 9/11 Memorial & Museum in New York City on Saturday, March 9, 2019.')
2023-11-02 22:37:34.275 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:37:34.276 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'Desk:[001, 504, 267, 785],[251, 428, 406, 675];Cabinet:[526, 298, 824, 484];Storage box:[627, 355, 709, 449],[668, 349, 735, 407];Chair:[714, 629, 875, 999],[650, 558, 795, 757],[532, 554, 679, 762];Person:[135, 745, 337, 999],[493, 742, 720, 1000],[242, 058, 416, 917],[407, 350, 661, 835],[580, 363, 838, 760],[669, 359, 966, 981],[761, 545, 1000, 998],[774, 441, 1000, 931];Leather Shoes:[374, 823, 411, 865];Boots:[415, 642, 498, 757],[479, 728, 553, 835];Sneakers:[705, 907, 784, 975],[609, 671, 671, 732];Satchel:[001, 821, 121, 997].')
2023-11-02 22:38:06.302 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:38:06.303 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Potted Plant:[820, 609, 870, 694],[731, 625, 835, 762],[620, 620, 676, 697],[340, 618, 387, 698],[195, 611, 288, 762],[143, 612, 192, 694],[021, 623, 056, 674].')
2023-11-02 22:38:07.288 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:38:07.288 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'cabinetry:[000, 670, 190, 999],[000, 000, 515, 443],[263, 542, 500, 981];oven:[096, 181, 296, 425],[210, 701, 398, 996];gas stove:[068, 615, 366, 717];refrigerator:[363, 325, 594, 914];countertop:[000, 546, 492, 999],[543, 600, 999, 861];microwave oven:[194, 722, 401, 999].')
2023-11-02 22:38:26.529 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:38:26.529 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Maintain strict adherence to the format category:[xmin,ymin,xmax,ymax] when presenting your answer.', 'Bed:[191, 544, 913, 1000];Cabinet:[001, 002, 222, 787];Carpet:[001, 781, 318, 999];Handbag:[017, 775, 167, 866].')
2023-11-02 22:38:40.416 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 3 samples!
2023-11-02 22:38:40.416 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'fixed-wing aircraft:[000, 209, 999, 827],[450, 000, 964, 272].')
2023-11-02 22:39:58.886 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:39:58.887 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'woman:[564, 453, 763, 992],[731, 382, 984, 998];man:[060, 313, 360, 975],[000, 541, 055, 998],[000, 402, 073, 892],[034, 392, 085, 493],[045, 413, 137, 773],[334, 233, 574, 998];suit:[000, 545, 027, 871],[000, 492, 064, 972],[000, 461, 079, 891],[065, 450, 130, 821],[339, 433, 569, 998];girl:[555, 447, 753, 998];human face:[006, 398, 039, 472],[065, 418, 086, 488],[100, 443, 147, 527],[190, 348, 290, 527],[363, 282, 455, 475],[590, 478, 679, 655],[741, 422, 840, 605].')
2023-11-02 22:40:16.152 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:40:16.152 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The format category:[xmin,ymin,xmax,ymax] should be strictly observed in your answer.', 'person:[732, 505, 820, 765],[000, 462, 159, 853],[169, 519, 227, 582],[171, 446, 380, 829],[272, 456, 488, 855],[490, 488, 563, 766],[523, 484, 620, 763],[629, 498, 706, 753],[859, 493, 998, 716],[940, 510, 993, 706];chair:[522, 545, 690, 793],[000, 646, 056, 950],[066, 548, 131, 847],[092, 618, 156, 770],[144, 635, 207, 761],[216, 550, 376, 835],[314, 537, 517, 845],[416, 615, 561, 785],[626, 566, 725, 750],[739, 555, 850, 766],[756, 556, 865, 729],[846, 566, 969, 740],[872, 569, 935, 710];table:[022, 574, 405, 895],[637, 573, 826, 757],[913, 560, 999, 718].')
2023-11-02 22:40:27.752 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:40:27.753 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track carFrame1:[174, 471, 254, 549],carFrame1:[544, 476, 580, 562],carFrame1:[313, 466, 380, 542],carFrame1:[000, 418, 121, 590],carFrame1:[239, 473, 282, 534],carFrame1:[602, 461, 732, 617],carFrame1:[670, 476, 881, 689] in this video clip.Ensure you use the exact format categoryFrame t:[xmin,ymin,xmax,ymax] in your response.', 'carFrame1:[174, 471, 254, 549];Frame2:[107, 464, 212, 565],carFrame1:[544, 476, 580, 562];Frame2:[571, 475, 627, 582],carFrame1:[313, 466, 380, 542];Frame2:[228, 456, 341, 575],carFrame1[000, 418, 121, 590];Frame2:No object,carFrame1:[239, 473, 282, 534];Frame2:[199, 472, 246, 545],carFrame1:[602, 461, 732, 617];Frame2:[675, 448, 932, 695],carFrame1:[670, 476, 881, 689];Frame2:[849, 504, 1000, 763].')
2023-11-02 22:40:42.282 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:40:42.282 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track personFrame1:[637, 388, 700, 598],personFrame1:[455, 711, 514, 998],personFrame1:[750, 344, 803, 537],personFrame1:[319, 636, 362, 875],personFrame1:[317, 483, 375, 722],personFrame1:[455, 504, 512, 706],personFrame1:[582, 565, 648, 759] in this video clip.The prescribed format for your answer is classFrame t:[xmin,ymin,xmax,ymax]. Please follow it closely.', 'personFrame1:[637, 388, 700, 598];Frame2:[643, 377, 687, 590],personFrame1:[455, 711, 514, 998];Frame2:[485, 726, 564, 998],personFrame1:[750, 344, 803, 537];Frame2:[740, 330, 801, 519],personFrame1:[319, 636, 362, 875];Frame2:[354, 641, 410, 873],personFrame1:[317, 483, 375, 722];Frame2:[296, 463, 350, 673],personFrame1:[455, 504, 512, 706];Frame2:[436, 518, 518, 720],personFrame1:[582, 565, 648, 759];Frame2:[558, 590, 636, 787].')
2023-11-02 22:40:51.781 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:40:51.781 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect man,human face,girl and glasses in this image.Maintain strict adherence to the format category:[x0,y0,x1,y1] when presenting your answer.', 'man:[182, 000, 417, 949],[365, 000, 549, 325],[848, 400, 999, 765];human face:[140, 118, 228, 392],[229, 000, 298, 066],[405, 000, 457, 083],[865, 415, 908, 505];girl:[000, 070, 339, 999],[435, 245, 738, 994],[486, 437, 972, 999];glasses:[116, 207, 218, 262].')
2023-11-02 22:40:59.785 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:40:59.785 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[543, 002, 922, 451] [226, 576, 537, 998] Promotion Concept promo bags in [003, 004, 999, 996] a tree')
2023-11-02 22:41:09.479 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:09.480 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Person:[548, 136, 919, 999],[945, 521, 999, 577];Power outlet:[456, 426, 487, 512];Potted Plant:[190, 789, 311, 974];Storage box:[108, 570, 229, 768],[141, 740, 235, 878];Sink:[876, 567, 953, 675];Bottle:[002, 780, 064, 1000],[225, 654, 281, 776],[310, 561, 350, 785];Cup:[408, 666, 460, 788],[472, 535, 520, 673];Cutting:[913, 532, 992, 590];Oven:[927, 359, 1000, 494].')
2023-11-02 22:41:18.202 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:18.202 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Hockey Stick:[417, 632, 778, 720],[129, 420, 351, 508],[000, 460, 087, 529];Person:[349, 449, 602, 846],[470, 199, 652, 721],[296, 160, 500, 617];Bottle:[740, 311, 786, 369];Helmet:[541, 197, 593, 304],[541, 449, 594, 511],[414, 162, 461, 223];Gloves:[383, 576, 429, 664],[536, 632, 596, 702];Sneakers:[350, 721, 413, 815],[405, 794, 470, 845],[296, 551, 353, 615],[300, 492, 346, 573].')
2023-11-02 22:41:27.558 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:27.558 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'Cup:[961, 488, 978, 543];Plate:[965, 529, 993, 555],[935, 395, 982, 421];Person:[000, 420, 200, 1000],[121, 349, 148, 491],[009, 395, 026, 434],[025, 352, 055, 451],[031, 342, 109, 531],[087, 354, 115, 492],[314, 293, 391, 420],[559, 267, 641, 374];Sandals:[970, 705, 999, 735];Chair:[863, 431, 928, 547],[897, 425, 952, 525];Ambulance:[133, 021, 885, 916].')
2023-11-02 22:41:36.092 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:36.093 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Carpet:[002, 934, 286, 999],[757, 934, 998, 1000];Cabinet:[783, 431, 999, 912],[715, 484, 788, 769],[586, 513, 714, 751],[341, 592, 651, 988],[140, 585, 303, 785],[001, 442, 154, 852],[151, 564, 217, 681],[302, 642, 345, 713];Storage box:[634, 945, 747, 974],[647, 919, 745, 952],[648, 885, 737, 924],[647, 833, 712, 863];Person:[153, 585, 265, 905];Umbrella:[122, 796, 164, 833];Handbag:[206, 705, 282, 776];High Heels:[429, 813, 471, 870],[574, 825, 625, 877],[817, 747, 865, 780],[833, 762, 881, 793],[859, 772, 905, 802],[468, 641, 516, 679],[508, 642, 558, 678],[445, 715, 490, 753],[862, 446, 906, 475],[899, 778, 948, 809],[663, 855, 730, 896],[649, 841, 717, 876],[924, 715, 971, 747],[868, 709, 936, 739],[482, 732, 533, 782],[588, 821, 639, 857],[847, 764, 894, 797],[820, 754, 871, 785];Boots:[176, 863, 243, 910],[201, 839, 251, 885],[672, 123, 713, 154].')
2023-11-02 22:41:36.927 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:36.927 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Wild Bird:[514, 284, 621, 348],[009, 249, 123, 321],[108, 232, 228, 302],[213, 260, 331, 321],[307, 113, 422, 210].')
2023-11-02 22:41:37.817 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:37.817 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Fairies clipart sad. The real story [015, 011, 984, 988] of')
2023-11-02 22:41:39.902 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:39.902 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The format category:[xmin,ymin,xmax,ymax] should be strictly observed in your answer.', 'bicycle wheel:[018, 343, 584, 990],[575, 404, 719, 767];bicycle:[001, 112, 716, 999];wheel:[000, 388, 552, 999],[576, 406, 714, 767];tire:[000, 387, 551, 999],[562, 401, 725, 765].')
2023-11-02 22:41:41.446 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:41.446 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'desk:[101, 097, 870, 960],[880, 285, 998, 997];table:[091, 139, 921, 958];cupboard:[085, 071, 898, 965];mug:[431, 496, 528, 635],[458, 315, 565, 459];coffee cup:[433, 491, 528, 633],[456, 320, 566, 459].')
2023-11-02 22:41:50.147 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:41:50.147 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[033, 018, 959, 994] Children with Thanakha makeup on [481, 101, 847, 412] their faces')
2023-11-02 22:42:07.325 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:42:07.325 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[093, 430, 303, 808] Penthouse in [008, 004, 994, 998] the city, holiday rental in Rehovot')
2023-11-02 22:42:20.375 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:42:20.376 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3,frame4 and frame5,what is thisFrame1:[441, 343, 873, 625] and track its trajectory.Your response should highlight trajectories using the established Frame t:[xmin,ymin,xmax,ymax] structure.', 'This is a marine iguanaFrame1:[441, 343, 873, 625];Frame2:[444, 291, 873, 531];Frame3:[446, 270, 874, 545];Frame4:[445, 287, 874, 547];Frame5:[435, 293, 873, 540].')
2023-11-02 22:42:36.629 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:42:36.629 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'Other Fish:[049, 109, 947, 681];Potted Plant:[264, 022, 304, 108],[302, 021, 387, 115];Vase:[705, 051, 764, 111],[478, 057, 651, 130];Desk:[000, 595, 1000, 999];Bowl:[066, 029, 186, 107].')
2023-11-02 22:43:17.673 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:43:17.674 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'exterior shots of wind turbines [674, 418, 817, 832] [125, 252, 318, 809] and a rainbow [239, 004, 702, 459] in a farm field [005, 794, 994, 995] in northamptonshire, united kingdom - sustainable energy stock videos & royalty-free footage')
2023-11-02 22:44:31.708 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:44:31.709 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'shows [243, 669, 745, 913] the wood ledge shelves with multiple wood frames on [008, 005, 765, 994] gray wall in a hallway')
2023-11-02 22:45:18.105 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:45:18.105 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'A group of [415, 282, 555, 635] [674, 231, 822, 951] [474, 555, 582, 981] [664, 602, 777, 995] [837, 201, 967, 745] [795, 564, 975, 995] [548, 536, 690, 996] [393, 536, 499, 997] [187, 234, 304, 983] [290, 289, 414, 950] [024, 240, 155, 996] students protesting for awareness about the climate crisis.')
2023-11-02 22:45:20.576 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:45:20.577 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect man and microphone in this image.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'man:[144, 100, 610, 989];microphone:[002, 563, 157, 654],[308, 258, 376, 340],[528, 480, 681, 534],[616, 830, 770, 907].')
2023-11-02 22:45:47.149 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:45:47.149 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'boy:[470, 032, 970, 985];human body:[215, 055, 741, 989],[543, 017, 999, 999],[855, 113, 999, 306];human hair:[202, 033, 563, 619],[523, 029, 783, 480],[896, 130, 972, 219];human head:[255, 024, 554, 520],[517, 037, 796, 521],[880, 148, 960, 270];mammal:[108, 000, 703, 966],[000, 283, 143, 537],[526, 026, 999, 999],[852, 135, 997, 314];clothing:[638, 340, 1000, 1000],[169, 465, 663, 995];human eye:[402, 261, 468, 321],[653, 216, 724, 284];human mouth:[434, 389, 511, 458],[621, 382, 701, 448];human ear:[310, 313, 352, 382],[769, 226, 798, 325];man:[538, 030, 999, 999];girl:[161, 003, 665, 999];human face:[353, 122, 541, 511],[538, 098, 778, 508],[891, 164, 954, 258].')
2023-11-02 22:45:59.356 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:45:59.357 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect man,human face,woman and clothing in this image.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'man:[000, 091, 439, 999];human face:[209, 161, 420, 607],[449, 169, 614, 512],[619, 320, 785, 638];woman:[564, 252, 968, 997],[048, 034, 661, 999];clothing:[035, 395, 660, 999],[617, 570, 998, 999].')
2023-11-02 22:46:09.026 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:46:09.026 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'A herd of [764, 402, 905, 641] [067, 362, 249, 627] [225, 349, 384, 621] [358, 443, 537, 630] elephants with speech bubbles filled with [175, 081, 994, 384] various charts and data')
2023-11-02 22:46:17.423 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:46:17.423 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.For your response, please adhere to the specified category:[xmin,ymin,xmax,ymax] format.', 'man:[040, 710, 171, 997],[170, 662, 282, 999],[237, 730, 317, 905],[320, 719, 395, 903],[409, 650, 464, 836],[526, 623, 586, 838],[596, 709, 635, 805],[643, 639, 705, 841];clothing:[033, 740, 170, 996],[000, 828, 020, 999],[131, 876, 251, 999],[218, 763, 280, 999],[242, 735, 472, 870],[410, 665, 465, 837],[444, 165, 543, 242],[525, 645, 583, 843],[595, 717, 635, 795],[633, 720, 653, 791],[641, 662, 701, 829],[707, 707, 756, 890],[793, 710, 817, 798],[925, 645, 954, 777];palm tree:[695, 023, 962, 625],[074, 000, 320, 894],[388, 367, 514, 632];woman:[126, 748, 258, 999],[253, 730, 303, 879],[435, 730, 473, 835],[628, 708, 653, 801],[707, 675, 759, 903];building:[000, 101, 273, 720],[235, 203, 432, 692],[331, 227, 983, 770].')
2023-11-02 22:46:35.521 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:46:35.521 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect person,snack and doughnut in this image.Consistently apply the category:[xmin,ymin,xmax,ymax] format to your answer.', 'person:[022, 000, 494, 187],[352, 000, 999, 451];snack:[048, 318, 999, 969].')
2023-11-02 22:46:37.203 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:46:37.203 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[460, 088, 769, 995] daughter whispering to [128, 031, 589, 994] mom on [001, 198, 999, 991] the couch while mom discovers how parents can influence their children')
2023-11-02 22:46:41.615 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:46:41.615 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'human body:[766, 280, 935, 955],[002, 261, 094, 999],[139, 119, 759, 999],[605, 349, 620, 430],[650, 212, 805, 988],[959, 411, 999, 984];human hair:[237, 103, 518, 493],[732, 207, 806, 356],[827, 270, 895, 518];human head:[236, 128, 522, 535],[000, 260, 053, 506],[739, 209, 807, 330],[831, 268, 893, 414];man:[135, 188, 728, 984],[640, 211, 832, 992];glasses:[306, 306, 519, 399];tree:[002, 000, 999, 623],[145, 000, 337, 701];human face:[007, 372, 040, 499],[316, 231, 496, 559];human arm:[051, 586, 108, 784],[143, 464, 545, 999],[564, 637, 746, 999],[648, 370, 681, 711],[746, 386, 810, 784],[777, 434, 859, 604],[868, 436, 928, 609],[950, 439, 999, 699];sports uniform:[666, 340, 825, 919];human nose:[422, 320, 471, 420].')
2023-11-02 22:47:08.042 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:47:08.042 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'plant:[000, 036, 934, 984];food:[176, 318, 267, 429],[201, 231, 281, 326],[250, 271, 340, 376],[342, 497, 446, 617],[343, 615, 457, 753],[406, 042, 490, 137],[415, 126, 489, 215],[426, 535, 534, 671],[640, 778, 728, 887],[646, 666, 732, 770],[710, 730, 810, 850],[806, 388, 842, 464],[848, 413, 935, 504].')
2023-11-02 22:47:20.841 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:47:20.842 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Former World Series co-MVP Curt Schilling [005, 019, 916, 994] "has been advised that ... his employment with ESPN has been terminated," the network said.')
2023-11-02 22:47:23.626 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:47:23.626 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,please tell me what is thisFrame1:[430, 630, 527, 934] and track its trajectory.Any trajectory details should be arranged according to the Frame t:[xmin,ymin,xmax,ymax] format for clarity.', 'This is a mountain bikeFrame1:[430, 630, 527, 934];Frame2:[400, 524, 486, 756];Frame3:[388, 540, 463, 743].')
2023-11-02 22:47:43.717 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:47:43.717 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect woman,man,mammal,human head,human face,human nose,vase and computer mouse in this image.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'woman:[468, 325, 960, 1000],[353, 418, 520, 800],[482, 534, 568, 709],[863, 526, 941, 712];man:[000, 181, 547, 999];mammal:[499, 383, 966, 975],[000, 188, 557, 999],[353, 411, 521, 795],[490, 524, 565, 713],[593, 539, 640, 605],[828, 565, 900, 634],[850, 525, 940, 740],[980, 894, 999, 999],[981, 482, 999, 906];human head:[611, 290, 789, 545],[211, 198, 405, 537],[353, 417, 484, 596],[483, 530, 566, 661],[595, 543, 658, 600],[831, 565, 898, 629],[978, 485, 999, 598];human face:[236, 234, 398, 547],[503, 554, 558, 666],[626, 327, 763, 589];human nose:[330, 356, 366, 419].')
2023-11-02 22:47:46.306 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:47:46.306 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3 and frame4,please tell me what is thisFrame1:[248, 460, 528, 554] and track its trajectory.To maintain consistency, ensure that trajectories in your answer match the Frame t:[xmin,ymin,xmax,ymax] setup.', 'This is a pt boatFrame1:[248, 460, 528, 554];Frame2:[240, 468, 526, 560];Frame3:[245, 468, 539, 572];Frame4:[253, 470, 550, 570].')
2023-11-02 22:48:01.997 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:48:01.997 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Leather Shoes,Glasses,Wine Glass,Cup and Boots in this image.Consistently apply the category:[xmin,ymin,xmax,ymax] format to your answer.', 'Leather Shoes:[343, 930, 370, 999];Glasses:[799, 668, 831, 685],[622, 542, 639, 550],[387, 572, 430, 590],[350, 502, 359, 511];Wine Glass:[949, 808, 994, 932],[566, 665, 591, 737],[919, 543, 932, 576],[783, 526, 794, 547],[746, 533, 757, 558],[646, 553, 658, 576],[482, 584, 498, 630],[024, 563, 037, 601],[250, 569, 264, 611];Cup:[593, 676, 619, 720],[568, 685, 595, 715],[641, 631, 662, 693],[685, 631, 705, 670],[842, 576, 856, 596],[743, 548, 751, 558],[322, 613, 341, 641],[273, 653, 295, 684],[177, 579, 193, 604];Boots:[154, 768, 209, 924].')
2023-11-02 22:48:10.328 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:48:10.328 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[112, 066, 627, 972] A person dressed for yoga leans into a pose near [000, 237, 992, 680] a lake at sunrise. Exercising mindfully can increase your likelihood of making your workouts a consistent habit.')
2023-11-02 22:48:15.447 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:48:15.447 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'The year’s top quotes [367, 549, 616, 759]: ‘Wear a mask [321, 286, 680, 993]’ and ‘I can’t breathe’')
2023-11-02 22:48:46.362 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:48:46.362 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect girl and clothing in this image.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'girl:[005, 634, 093, 812],[164, 615, 218, 775],[789, 605, 966, 995],[851, 657, 983, 985];clothing:[000, 675, 057, 999],[000, 668, 090, 784],[101, 711, 190, 794],[121, 340, 460, 992],[681, 694, 774, 995],[788, 716, 968, 999],[855, 735, 971, 985].')
2023-11-02 22:48:57.405 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:48:57.406 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'tree:[000, 000, 999, 773].')
2023-11-02 22:49:33.391 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:49:33.392 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect Bracelet and Handbag/Satchel in this image.When composing your answer, be sure to consistently utilize the category:[xmin,ymin,xmax,ymax] structure.', 'Bracelet:[471, 615, 544, 647];Handbag:[487, 393, 542, 425].')
2023-11-02 22:49:47.257 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:49:47.258 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please make sure your answer follows the category:[xmin,ymin,xmax,ymax] configuration precisely.', 'bucket:[679, 465, 693, 491];Person:[283, 415, 297, 462],[986, 424, 999, 570];SUV:[777, 439, 1000, 571],[000, 386, 275, 632],[435, 427, 515, 525],[488, 427, 543, 503],[674, 434, 734, 484];Truck:[786, 374, 1000, 485],[000, 355, 153, 460],[188, 365, 418, 474],[802, 444, 827, 463];Street Lights:[803, 242, 834, 463],[629, 372, 637, 461];Pickup Truck:[271, 415, 465, 569],[508, 422, 555, 473];Car:[579, 440, 607, 466];Desk:[269, 453, 283, 467];Chair:[918, 458, 934, 488];Hat:[986, 424, 1000, 443].')
2023-11-02 22:50:14.799 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:50:14.799 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Congregation members [006, 112, 981, 990] personalized the banner [143, 134, 738, 790] with written messages and signatures. (Stephen F. Brown-Pearn [001, 119, 295, 996]')
2023-11-02 22:50:48.670 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:50:48.670 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2,frame3,frame4 and frame5,can you tell me what is thisFrame1:[783, 562, 857, 900] and track its trajectory.For the trajectories included in the answer, please use the format Frame t:[xmin,ymin,xmax,ymax].', 'This is a personFrame1:[783, 562, 857, 900];Frame2:[794, 575, 876, 933];Frame3:[781, 593, 858, 912];Frame4:[767, 587, 834, 923];Frame5:[744, 577, 814, 916].')
2023-11-02 22:51:01.024 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:51:01.025 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect hair spray,man,dress,suit and glasses in this image.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'man:[236, 349, 365, 992],[311, 338, 397, 635],[343, 377, 391, 492],[391, 315, 473, 580],[487, 369, 552, 561],[680, 000, 999, 999],[846, 378, 895, 469],[879, 395, 943, 545];dress:[345, 583, 555, 922];suit:[560, 196, 897, 1000];glasses:[876, 202, 984, 280],[420, 462, 505, 506],[611, 229, 728, 264].')
2023-11-02 22:51:15.479 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:51:15.479 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'This white lace midi dress [217, 344, 758, 779] is the perfect summer wardrobe staple for all your warm weather events! It comes in so many fantastic colors too, and is on sale now!')
2023-11-02 22:51:18.250 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:51:18.250 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'food:[745, 641, 849, 752];drink:[888, 758, 954, 808],[955, 880, 999, 998];woman:[000, 175, 162, 998],[200, 000, 520, 450],[436, 314, 741, 998];man:[071, 198, 531, 998],[642, 298, 725, 550],[754, 334, 938, 611];clothing:[074, 461, 544, 998],[223, 157, 516, 448],[441, 520, 750, 998],[587, 504, 714, 628],[641, 350, 725, 545],[806, 477, 882, 608];human face:[278, 015, 368, 158],[316, 247, 497, 591],[522, 322, 632, 511],[798, 370, 842, 457].')
2023-11-02 22:51:38.808 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:51:38.809 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect auto part and person in this image.Maintain strict adherence to the format category:[x0,y0,x1,y1] when presenting your answer.', 'auto part:[149, 432, 275, 504],[182, 383, 577, 519],[554, 275, 832, 500],[670, 447, 976, 593];person:[000, 412, 026, 688],[011, 396, 087, 674],[044, 414, 091, 689],[083, 391, 163, 676],[261, 442, 323, 683],[383, 426, 455, 704],[460, 407, 549, 699],[580, 416, 633, 698],[706, 402, 770, 688],[900, 424, 973, 694].')
2023-11-02 22:51:54.234 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:51:54.235 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Members of the military [512, 227, 600, 772] [408, 177, 573, 889] walk the hallway of Cell Block C in the Camp 5 detention facility in January 2012.')
2023-11-02 22:52:05.605 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:52:05.605 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, '[191, 027, 814, 995] Jamie Foxx set to star in Netflix series inspired by relationship with his daughter')
2023-11-02 22:52:22.879 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:52:22.880 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,can you tell me the trajectory of the ladies dancing in this video clip?Ensure the trajectories in your answer follow the TrackiFrame t:[xmin,ymin,xmax,ymax] structure.', 'There are 4 ladies dancing.Track1frame:1:[468, 483, 565, 702];frame:2:[285, 452, 375, 656];frame:3:[465, 466, 525, 585],Track2frame:1:[757, 481, 850, 685];frame:2:[760, 462, 851, 683];frame:3:[717, 485, 834, 685],Track3frame:1:[198, 472, 273, 637];frame:2:[184, 458, 250, 677];frame:3:[220, 466, 257, 679],Track4frame:1:[381, 464, 439, 716];frame:2:[232, 452, 285, 662];frame:3:[343, 456, 457, 706].')
2023-11-02 22:52:27.543 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:52:27.543 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,please tell me the trajectory of the avian creature moving forward.Use the specified TrackiFrame t:[xmin,ymin,xmax,ymax] format for all trajectories in your reply.', 'There are 3 avian creature moving forward.Track1frame:1:[000, 151, 889, 998];frame:2:[101, 001, 876, 998];frame:3:[157, 001, 868, 998],Track2frame:1:[650, 165, 771, 643];frame:2:[653, 157, 776, 625];frame:3:[648, 144, 771, 607],Track3frame:1:[670, 243, 862, 849];frame:2:[677, 234, 869, 863];frame:3:[669, 224, 863, 904].')
2023-11-02 22:52:28.224 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:52:28.224 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\ndetect Leather Shoes,Person,Other Shoes and Street Lights in this image.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Person:[662, 069, 882, 1000],[560, 158, 704, 815],[351, 178, 514, 850];Other Shoes:[573, 752, 602, 800],[639, 758, 666, 814];Street Lights:[650, 041, 688, 272].')
2023-11-02 22:53:01.726 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:53:01.727 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\ndetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Person:[001, 200, 096, 996],[150, 337, 218, 534],[274, 365, 378, 603],[273, 389, 381, 777],[382, 336, 465, 672],[460, 331, 573, 818],[620, 369, 729, 732],[737, 311, 844, 794];Hat:[001, 204, 080, 284].')
2023-11-02 22:53:03.128 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:53:03.128 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect shorts,human face,boy,man and ladder in this image.To maintain clarity, use the prescribed category:[x0,y0,x1,y1] format for your answer.', 'shorts:[325, 654, 559, 760];human face:[384, 145, 564, 298];boy:[300, 132, 692, 901];man:[214, 156, 295, 434].')
2023-11-02 22:53:30.290 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:53:30.290 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'cat:[180, 000, 768, 999].')
2023-11-02 22:53:37.284 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:53:37.284 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect clothing,chair,table and person in this image.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'clothing:[000, 553, 228, 970],[128, 573, 201, 653],[208, 640, 350, 849],[371, 750, 505, 999],[382, 481, 415, 544],[405, 580, 498, 688],[591, 393, 643, 509],[643, 648, 873, 957],[826, 438, 869, 569],[873, 549, 943, 665],[900, 580, 999, 923];chair:[755, 801, 920, 993],[000, 754, 160, 999],[671, 476, 706, 561],[901, 759, 999, 994];table:[616, 496, 710, 576];person:[000, 486, 123, 662],[000, 460, 253, 999],[076, 377, 110, 442],[126, 500, 212, 654],[167, 497, 221, 574],[184, 474, 222, 545],[198, 560, 366, 866],[231, 500, 283, 591],[233, 528, 293, 626],[267, 436, 313, 516],[301, 486, 345, 580],[317, 501, 372, 606],[368, 525, 401, 591],[376, 455, 460, 586],[378, 689, 522, 999],[405, 541, 504, 725],[588, 377, 654, 556],[648, 531, 868, 962],[699, 521, 737, 684],[775, 537, 818, 627],[828, 406, 873, 571],[858, 500, 947, 661],[878, 480, 999, 924].')
2023-11-02 22:54:05.927 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:54:05.927 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'clothing:[696, 000, 999, 814].')
2023-11-02 22:54:22.749 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:54:22.750 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Ensure your response adheres strictly to the format category:[xmin,ymin,xmax,ymax]', 'tree:[000, 076, 263, 787],[454, 304, 996, 800];train:[383, 447, 821, 883];car:[000, 603, 042, 884],[197, 758, 309, 855].')
2023-11-02 22:55:00.229 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:55:00.229 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'skull:[554, 151, 768, 484];clothing:[421, 304, 961, 978];human face:[565, 172, 766, 515].')
2023-11-02 22:55:14.735 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:55:14.735 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track personFrame1:[468, 587, 532, 772],personFrame1:[646, 423, 695, 562],personFrame1:[428, 515, 479, 666],personFrame1:[703, 376, 764, 505],personFrame1:[507, 430, 543, 586] in this video clip.The prescribed format for your answer is classFrame t:[xmin,ymin,xmax,ymax]. Please follow it closely.', 'personFrame1:[468, 587, 532, 772];Frame2:[456, 573, 512, 769],personFrame1:[646, 423, 695, 562];Frame2:[667, 420, 711, 565],personFrame1:[428, 515, 479, 666];Frame2:[462, 516, 506, 668],personFrame1:[703, 376, 764, 505];Frame2:[707, 372, 747, 505],personFrame1:[507, 430, 543, 586];Frame2:[496, 412, 535, 572].')
2023-11-02 22:55:16.997 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:55:16.998 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your answer should be structured precisely according to the category:[xmin,ymin,xmax,ymax] format.', 'Lamp:[045, 069, 107, 203],[123, 059, 180, 193],[211, 070, 266, 206],[280, 069, 335, 207],[371, 075, 426, 213],[435, 079, 486, 214],[536, 080, 593, 213],[602, 080, 659, 219],[693, 084, 747, 215],[768, 088, 816, 224],[821, 088, 872, 225],[882, 093, 934, 229];Guitar:[751, 523, 915, 678],[371, 640, 403, 806],[909, 649, 962, 808];Drum:[579, 714, 635, 787],[476, 734, 583, 895],[446, 735, 481, 788],[241, 741, 317, 850],[565, 786, 621, 900];Violin:[060, 548, 126, 643];Person:[001, 451, 040, 908],[010, 469, 106, 908],[124, 538, 237, 888],[445, 433, 605, 905],[760, 421, 870, 909];Hat:[517, 431, 554, 476];Pickup Truck:[107, 654, 465, 815];Van:[735, 702, 915, 839];Tripod:[281, 648, 329, 914],[483, 526, 578, 918],[205, 525, 271, 921];Speaker:[062, 798, 207, 921],[322, 804, 493, 923],[610, 802, 772, 923],[900, 806, 999, 924];Microphone:[680, 614, 718, 654].')
2023-11-02 22:55:56.059 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:55:56.060 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The format category:[xmin,ymin,xmax,ymax] should be strictly observed in your answer.', 'Flower:[002, 182, 079, 710];Person:[430, 113, 750, 999],[229, 256, 473, 773];Hat:[574, 117, 741, 416].')
2023-11-02 22:56:02.233 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:02.234 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\ndetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Person:[312, 001, 793, 999],[945, 374, 999, 1000],[001, 719, 052, 822];Satchel:[001, 787, 099, 998].')
2023-11-02 22:56:08.086 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:08.086 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ("\nDetect all.It's essential that your answer aligns with the category:[x0,y0,x1,y1] format.", 'Potted Plant:[613, 741, 659, 890];Person:[182, 565, 281, 1000];Car:[274, 630, 450, 847].')
2023-11-02 22:56:10.013 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:10.013 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2 and frame3,please tell me the trajectory of the horse running in a circle.Ensure the trajectories in your answer follow the TrackiFrame t:[xmin,ymin,xmax,ymax] structure.', 'There are 4 horse running in a circle.Track1frame:1:[363, 650, 419, 821];frame:2:[363, 650, 419, 821];frame:3:[320, 621, 364, 771],Track2frame:1:[258, 662, 383, 862];frame:2:[258, 662, 383, 862];frame:3:[159, 679, 338, 879],Track3frame:1:[007, 698, 025, 740];frame:2:[007, 698, 025, 740];frame:3:[001, 737, 042, 824],Track4frame:1:[000, 668, 321, 999];frame:2:[000, 668, 321, 999];frame:3:[145, 707, 650, 999].')
2023-11-02 22:56:41.752 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 2 samples!
2023-11-02 22:56:41.752 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Your output should conform exactly to the category:[xmin,ymin,xmax,ymax] format.', 'Horse:[127, 105, 866, 1000].')
2023-11-02 22:56:45.833 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:45.834 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'Potted Plant:[001, 388, 104, 943],[417, 487, 502, 748],[597, 487, 634, 762];Mirror:[062, 158, 166, 746],[188, 231, 275, 727],[287, 280, 352, 707];Bench:[482, 685, 602, 757];Desk:[132, 635, 329, 832];Flower:[220, 463, 268, 622],[193, 463, 235, 616];Vase:[223, 606, 269, 658],[198, 608, 224, 653].')
2023-11-02 22:56:50.780 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:50.780 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given frame1: and frame2:,track carFrame1:[301, 517, 381, 622],carFrame1:[808, 493, 965, 612],carFrame1:[690, 529, 768, 591] in this video clip.Adhere strictly to the format categoryFrame t:[xmin,ymin,xmax,ymax] when providing your answer.', 'carFrame1:[301, 517, 381, 622];Frame2:[298, 512, 377, 617],carFrame1:[808, 493, 965, 612];Frame2:No object,carFrame1:[690, 529, 768, 591];Frame2:[783, 513, 894, 602].')
2023-11-02 22:56:53.756 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:53.756 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.The category:[xmin,ymin,xmax,ymax] format should be rigorously followed in your response.', 'Storage box:[926, 813, 1000, 1000];Hat:[079, 201, 207, 334];Boots:[706, 676, 783, 809];Basket:[927, 814, 999, 1000];Person:[595, 241, 732, 715],[535, 246, 783, 803],[428, 257, 577, 875],[227, 203, 519, 1000].')
2023-11-02 22:56:56.209 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:56:56.209 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'Hat:[883, 443, 997, 536],[464, 796, 534, 854],[661, 711, 697, 752],[538, 753, 576, 789],[673, 622, 709, 686],[398, 846, 424, 897];Glasses:[472, 811, 505, 849];Person:[082, 883, 155, 1000],[158, 933, 191, 999],[190, 930, 224, 999],[215, 816, 340, 1000],[296, 920, 351, 999],[294, 831, 331, 927],[331, 850, 417, 999],[440, 794, 561, 999],[426, 803, 497, 996],[495, 748, 560, 887],[538, 752, 574, 840],[546, 763, 609, 950],[562, 658, 689, 1000],[642, 685, 673, 750],[661, 711, 699, 761],[688, 663, 772, 999],[682, 620, 817, 1000],[642, 642, 675, 684],[682, 587, 718, 635],[671, 614, 712, 698],[687, 628, 713, 721],[771, 619, 795, 733],[783, 610, 833, 999],[773, 575, 801, 622],[778, 560, 820, 611],[804, 550, 834, 599],[827, 439, 1000, 998],[819, 464, 909, 999],[842, 564, 910, 657];Flag:[948, 327, 999, 408],[829, 256, 949, 490],[847, 420, 992, 571],[845, 305, 951, 428],[730, 416, 823, 560],[619, 506, 692, 579],[631, 557, 659, 679],[647, 552, 677, 618],[485, 433, 580, 621],[550, 491, 615, 704],[599, 518, 632, 649],[530, 597, 563, 659],[479, 578, 518, 680],[491, 625, 535, 680],[459, 539, 510, 761],[411, 625, 472, 722],[394, 637, 461, 769],[335, 671, 406, 753],[288, 577, 336, 834],[213, 667, 306, 775],[215, 776, 256, 825],[261, 772, 295, 836],[162, 868, 219, 938],[421, 767, 469, 824],[200, 797, 226, 864],[258, 638, 302, 789],[816, 103, 836, 134];Bottle:[830, 704, 860, 816],[684, 738, 705, 797].')
2023-11-02 22:57:15.274 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:57:15.274 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, "[003, 031, 674, 998] Fred Davis served a four-game suspension for violating the NFL's substance abuse policy in 2011. [002, 000, 997, 993] (Washington Post/Getty Images")
2023-11-02 22:57:53.435 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:57:53.436 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'The groom and [714, 132, 896, 955] [100, 184, 378, 963] [636, 130, 767, 940] [316, 127, 482, 947] [550, 155, 712, 956] [414, 179, 576, 970] his friends, each with [466, 509, 534, 972] [351, 118, 398, 389] [037, 221, 265, 271] [698, 052, 751, 463] [102, 522, 200, 966] [789, 117, 880, 408] a medieval sword')
2023-11-02 22:58:09.857 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:58:09.857 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\nDetect all.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'Cabinet:[000, 000, 982, 968];Lamp:[158, 299, 806, 379];Bottle:[676, 041, 716, 176],[646, 043, 679, 232],[560, 611, 604, 715];Plate:[647, 245, 899, 293],[372, 261, 642, 313],[163, 296, 382, 329],[044, 306, 163, 334],[218, 431, 415, 524],[070, 481, 125, 542],[267, 451, 469, 530],[028, 586, 160, 596],[262, 537, 425, 597],[489, 425, 635, 520],[516, 562, 599, 590],[684, 428, 889, 476],[679, 543, 916, 573],[737, 773, 910, 888],[575, 692, 821, 830],[565, 828, 738, 896],[357, 696, 585, 785],[342, 780, 591, 860],[228, 834, 375, 892],[069, 818, 239, 888];Basin:[240, 790, 291, 836];Lemon:[847, 769, 890, 827],[726, 466, 765, 504],[760, 393, 794, 437],[406, 495, 455, 552],[109, 483, 168, 531],[181, 805, 228, 857],[182, 855, 227, 903];Watermelon:[326, 186, 423, 280];Hamimelon:[606, 171, 670, 267];Toilet Paper:[861, 053, 919, 271];Person:[550, 224, 687, 722].')
2023-11-02 22:58:16.528 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 22:58:16.529 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Photo of a male DJ [323, 369, 493, 752] in front of a painting [301, 167, 800, 745] at the Blanton Museum of Art while he is mixing on his deck.')
2023-11-02 23:00:07.311 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 23:00:07.312 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect girl,clothing,shelf and human face in this image.When submitting your answer, maintain the category:[xmin,ymin,xmax,ymax] structure consistently.', 'girl:[243, 000, 998, 996];clothing:[240, 617, 999, 998];shelf:[000, 329, 214, 671],[394, 000, 961, 718];human face:[383, 200, 721, 726].')
2023-11-02 23:00:16.869 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:102 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 23:00:16.870 | INFO | mmgpt.data.dataset.pair_webdataset:token_processor:103 - (None, 'Peruvian archaeologists [442, 424, 553, 855] command a drone [366, 341, 578, 605] to search for architectural ruins. (New York Times')
2023-11-02 23:01:09.300 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 23:01:09.300 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('\ndetect wheelchair,man and woman in this image.Please ensure that your response strictly adheres to the category:[xmin,ymin,xmax,ymax] format.', 'wheelchair:[425, 425, 768, 998];man:[000, 000, 078, 759],[474, 034, 712, 445],[474, 030, 715, 960],[665, 418, 734, 614];woman:[000, 000, 173, 889],[160, 000, 429, 998],[494, 231, 836, 998].')
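The repeated "exceeding max length 2048, ignore last N samples!" entries above come from a length filter in the dataset token processors: samples whose tokenized length would push a packed sequence past the tokenizer's model_max_length (2048 here) are dropped and counted. A minimal sketch of such a filter follows; the function name, signature, and packing logic are assumptions for illustration, not mmgpt's actual implementation.

```python
# Hypothetical sketch of a token-length filter like the one implied by the
# "exceeding max length 2048, ignore last N samples!" log lines.
# `token_processor`, `tokenize`, and `max_length` are illustrative names,
# not the real mmgpt API.

MAX_LENGTH = 2048

def token_processor(samples, tokenize, max_length=MAX_LENGTH):
    """Keep samples while the running token count fits in max_length;
    count and discard the ones that would overflow."""
    kept, total, dropped = [], 0, 0
    for sample in samples:
        n_tokens = len(tokenize(sample))
        if total + n_tokens > max_length:
            dropped += 1  # sample would overflow the packed sequence
            continue
        kept.append(sample)
        total += n_tokens
    if dropped:
        print(f"exceeding max length {max_length}, ignore last {dropped} samples!")
    return kept
```

Under this reading, the log simply records how many trailing samples of each packed batch were sacrificed to stay within the 2048-token context window configured on the LlamaTokenizer.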
2023-11-02 23:01:12.865 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:114 - exceeding max length 2048, ignore last 1 samples!
2023-11-02 23:01:12.865 | INFO | mmgpt.data.dataset.interpair_webdataset:token_processor:115 - ('Given a video clip including frame1,frame2