# Agibot2UnitreeG1Retarget Dataset
Paper | Project Page | Code
## Description
This dataset contains action retargeting data from Agibot to the Unitree G1 humanoid robot.
## Dataset Size
- Total size: ~30GB
- Split into 7 parts (A2UG1_dataset.tar.gz.aa to A2UG1_dataset.tar.gz.ag)
- Each part is approximately 4GB
## Usage
### Method 1: Using Hugging Face Hub (Recommended)
```bash
pip install huggingface-hub
```
```python
from huggingface_hub import snapshot_download

# Download the entire dataset
snapshot_download(
    repo_id="l2aggle/Agibot2UnitreeG1Retarget",
    repo_type="dataset",
    local_dir="./Agibot2UnitreeG1Retarget",
)
```
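If you only want to fetch the archive parts one at a time (for example, on an unreliable connection), `snapshot_download` also accepts glob filters via its `allow_patterns` argument. A minimal sketch; the pattern below is illustrative:

```python
from huggingface_hub import snapshot_download

# Download a single archive part; repeat with .ab through .ag as needed.
# All seven parts are required before extraction.
snapshot_download(
    repo_id="l2aggle/Agibot2UnitreeG1Retarget",
    repo_type="dataset",
    local_dir="./Agibot2UnitreeG1Retarget",
    allow_patterns=["A2UG1_dataset.tar.gz.aa"],
)
```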
### Method 2: Using Git with LFS
```bash
# Make sure git-lfs is installed
git lfs install

# Clone the repository without downloading the large files yet
# (GIT_LFS_SKIP_SMUDGE=1 checks out LFS pointer files only)
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/datasets/l2aggle/Agibot2UnitreeG1Retarget
cd Agibot2UnitreeG1Retarget

# Download the actual large files
git lfs pull
```
### Method 3: Manual Download
Download individual parts through the Hugging Face web interface: https://huggingface.co/datasets/l2aggle/Agibot2UnitreeG1Retarget/tree/main
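The individual parts can also be fetched programmatically with `hf_hub_download` from `huggingface_hub`. A minimal sketch for a single part:

```python
from huggingface_hub import hf_hub_download

# Fetch one archive part; repeat with .ab through .ag as needed.
path = hf_hub_download(
    repo_id="l2aggle/Agibot2UnitreeG1Retarget",
    repo_type="dataset",
    filename="A2UG1_dataset.tar.gz.aa",
    local_dir="./Agibot2UnitreeG1Retarget",
)
print(path)
```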
## Extract Dataset
After downloading, extract the complete dataset:
```bash
# Combine and extract all parts
cat A2UG1_dataset.tar.gz.* | tar -xzf -
```
This will create the complete `A2UG1_dataset` folder with all original files.
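If you prefer to stay in Python, the same combine-and-extract step can be done with the standard library. A sketch, assuming the seven parts are in the current directory; note that unlike the shell one-liner above, this writes an intermediate ~30GB combined archive to disk:

```python
import glob
import shutil
import tarfile

# Concatenate the parts in order into a single archive.
parts = sorted(glob.glob("A2UG1_dataset.tar.gz.*"))
with open("A2UG1_dataset.tar.gz", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)

# Extract the combined archive into the current directory.
with tarfile.open("A2UG1_dataset.tar.gz", "r:gz") as tar:
    tar.extractall(".")
```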
## File Structure
```
A2UG1_dataset/
├── [your dataset structure will be shown here after extraction]
```
## Requirements
- At least 60GB of free disk space (30GB for the download + 30GB for extraction; a quick check is sketched below)
- For Method 1: Python 3.6+ with the `huggingface-hub` package
- For Method 2: Git with Git LFS support
- `tar` utility (standard on Linux/macOS, available on Windows via WSL or Git Bash)
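As referenced above, a small standard-library sketch to verify the disk-space requirement before downloading (the 60GB figure comes from the list above):

```python
import shutil

# Check free space on the drive holding the current directory.
free_gb = shutil.disk_usage(".").free / 1024**3
required_gb = 60  # 30GB download + 30GB extraction

if free_gb < required_gb:
    print(f"Only {free_gb:.1f}GB free; {required_gb}GB recommended.")
else:
    print(f"{free_gb:.1f}GB free; enough to download and extract.")
```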
## Installation Requirements
```bash
# For Method 1
pip install huggingface-hub

# For Method 2 (if git-lfs is not installed)
# Ubuntu/Debian:
sudo apt install git-lfs
# macOS:
brew install git-lfs
# Windows: download from https://git-lfs.github.io/
```
## License
Apache 2.0