TypeError: 'HybridCache' object is not iterable

#60
by kingcreatorpulga

I ran my code in Google Colab without any issue, but when I run it in another notebook I get this error. How can I handle it?

Facing the same issue with gemma2 (paligemma2-3b-pt-224).
Oddly, I only get this error when I set eval_strategy="epoch" instead of "steps" in TrainingArguments(). A rough sketch of the arguments is below.
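
For context, the relevant arguments look roughly like this (the output path, batch sizes, and epoch count here are placeholders, not my exact values; eval_strategy is the only knob that changes the behavior):

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./paligemma2-finetune",  # hypothetical path
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    num_train_epochs=3,
    eval_strategy="epoch",  # crashes at the end-of-epoch evaluation; "steps" does not (per the report above)
    save_strategy="epoch",
)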

Error trace:

File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 2171, in train
return inner_training_loop(
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/accelerate/utils/memory.py", line 159, in decorator
return function(batch_size, *args, **kwargs)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 2625, in _inner_training_loop
self._maybe_log_save_evaluate(tr_loss, grad_norm, model, trial, epoch, ignore_keys_for_eval, start_time)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 3071, in _maybe_log_save_evaluate
metrics = self._evaluate(trial, ignore_keys_for_eval)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 3025, in _evaluate
metrics = self.evaluate(ignore_keys=ignore_keys_for_eval)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 4073, in evaluate
output = eval_loop(
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 4267, in evaluation_loop
losses, logits, labels = self.prediction_step(model, inputs, prediction_loss_only, ignore_keys=ignore_keys)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 4483, in prediction_step
loss, outputs = self.compute_loss(model, inputs, return_outputs=True)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/trainer.py", line 3731, in compute_loss
outputs = model(**inputs)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/parallel/data_parallel.py", line 194, in forward
return self.gather(outputs, self.output_device)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/parallel/data_parallel.py", line 217, in gather
return gather(outputs, output_device, dim=self.dim)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 135, in gather
res = gather_map(outputs)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 127, in gather_map
return type(out)((k, gather_map([d[k] for d in outputs])) for k in out)
File "", line 9, in init
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/transformers/utils/generic.py", line 392, in post_init
for idx, element in enumerate(iterator):
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 127, in
return type(out)((k, gather_map([d[k] for d in outputs])) for k in out)
File "/home/user_name/miniconda3/envs/vlm_ws/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 130, in gather_map
return type(out)(map(gather_map, zip(*outputs)))
TypeError: 'HybridCache' object is not iterable
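
The bottom of the trace points at the cause: with more than one visible GPU, the Trainer wraps the model in torch.nn.DataParallel, and after the per-replica forward passes gather_map tries to re-gather every field of the model output. On gemma2 the past_key_values field is a HybridCache, which is not a tensor, a dict, or an iterable tuple of layer tensors, so the gather fails. Two workarounds that should sidestep the gather (a sketch under those assumptions, not a verified fix):

# Option 1: stop the model from returning a cache during training/eval,
# so there is no HybridCache field in the outputs to gather.
model.config.use_cache = False

# Option 2: avoid the DataParallel wrapper entirely by exposing a single
# GPU, set at the very top of the script before any CUDA work happens.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"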

Google org

Hi, does this issue still persist? If so, please provide more details about your setup, including the installed versions of the torch and transformers libraries, as well as the specific environment you are using to run this model. Thank you.
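
For reference, a minimal snippet to print the two key versions (the fuller environment report from transformers-cli env is even more useful):

import torch
import transformers

# Print the library versions relevant to this report.
print("torch:", torch.__version__)
print("transformers:", transformers.__version__)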
