
Request for Batch Processing Support in Inference

#1
by jamessyx

Excellent work on UniViTAR!

I have a question regarding the inference capabilities. From the provided code, it appears that UniViTAR currently only supports inference on a single (packed) sample at a time and does not support batch processing of multiple samples.

Issue:
For larger workloads (e.g., batch_size=1000), concatenating all samples into a single packed sequence seems impractical and inefficient, since memory use and attention cost grow with the packed sequence length. Could you please provide a batch-processing inference script that can handle multiple samples efficiently? This would be extremely helpful for scenarios where we need to process large numbers of samples.
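In the meantime, here is a minimal sketch of what I have in mind: greedily pack variable-length samples into chunks under a fixed token budget, then run each chunk as one packed forward pass. Note that `model.encode_packed(tokens, cu_seqlens)`, the mean-pooled readout, and the assumption that each image has already been preprocessed into an `(L_i, D)` patch-token tensor are hypothetical stand-ins, not the actual UniViTAR API:

```python
import torch

def batched_inference(model, samples, max_tokens_per_pack=16384, device="cuda"):
    """Greedily pack variable-length samples into chunks under a token budget,
    run each packed chunk through the model once, and return per-sample features.

    `samples` is a list of (L_i, D) patch-token tensors; `model.encode_packed`
    is an assumed entry point, not the real UniViTAR interface.
    """
    feats = [None] * len(samples)
    pack, pack_idx, budget = [], [], 0
    # Visit short samples together so packs stay dense.
    order = sorted(range(len(samples)), key=lambda i: samples[i].shape[0])

    def flush():
        nonlocal pack, pack_idx, budget
        if not pack:
            return
        tokens = torch.cat(pack, dim=0).to(device)          # (sum_i L_i, D)
        seqlens = torch.tensor([p.shape[0] for p in pack])
        # Cumulative sample boundaries, so attention can be kept isolated
        # per sample as in FlashAttention-style varlen kernels.
        cu_seqlens = torch.cat([torch.zeros(1, dtype=torch.int32),
                                seqlens.cumsum(0).to(torch.int32)]).to(device)
        with torch.no_grad():
            out = model.encode_packed(tokens, cu_seqlens)   # assumed signature
        for j, i in enumerate(pack_idx):
            s, e = cu_seqlens[j].item(), cu_seqlens[j + 1].item()
            feats[i] = out[s:e].mean(dim=0).cpu()           # assumed mean-pool readout
        pack, pack_idx, budget = [], [], 0

    for i in order:
        t = samples[i]
        if budget + t.shape[0] > max_tokens_per_pack:
            flush()
        pack.append(t)
        pack_idx.append(i)
        budget += t.shape[0]
    flush()
    return feats
```

If the per-sample attention boundaries are not exposed, the fallback would be padding each mini-batch to its longest sample with an attention mask, which is simpler but wastes compute on padding for native-resolution inputs.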
