---
language:
- en
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- audio-text-to-text
tags:
- audio-retrieval
- multimodal
- moment-retrieval
library_name: lighthouse
configs:
- config_name: default
  data_files:
  - split: train
    path: train/*.tar
  - split: valid
    path: valid/*.tar
  - split: test
    path: test/*.tar
---

# Clotho-Moment
This repository provides wav files used in [Language-based Audio Moment Retrieval](https://arxiv.org/abs/2409.15672).

Each sample consists of a long audio recording containing several audio events, together with temporal and textual annotations.

- Project page: https://h-munakata.github.io/Language-based-Audio-Moment-Retrieval/
- Code: https://github.com/line/lighthouse

## Split
- Train
  - train/train-{000..715}.tar
  - 37930 audio samples
- Valid
  - valid/valid-{000..108}.tar
  - 5741 audio samples
- Test
  - test/test-{000..142}.tar
  - 7569 audio samples
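
The shard patterns above can be expanded into streaming URLs programmatically. A minimal sketch, assuming the same base URL as the Webdataset example below; the `shard_url` helper is illustrative and not part of the dataset tooling:

```python
# Base URL of this repository on the Hugging Face Hub.
BASE = "https://huggingface.co/datasets/lighthouse-emnlp2024/Clotho-Moment/resolve/main"

# Shard patterns taken from the split listing above.
SPLITS = {
    "train": "train/train-{000..715}.tar",
    "valid": "valid/valid-{000..108}.tar",
    "test": "test/test-{000..142}.tar",
}

def shard_url(split: str) -> str:
    """Return a pipe: URL that streams every shard of the given split via curl."""
    return f"pipe:curl -s -L {BASE}/{SPLITS[split]}"
```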

## Using Webdataset
```python
import webdataset as wds

# Stream two training shards directly from the Hub; {001..002} is brace-expanded
# by WebDataset into one URL per shard.
url = "https://huggingface.co/datasets/lighthouse-emnlp2024/Clotho-Moment/resolve/main/train/train-{001..002}.tar"
url = f"pipe:curl -s -L {url}"

# Decode audio entries with torchaudio; shardshuffle=None disables shard shuffling.
dataset = wds.WebDataset(url, shardshuffle=None).decode(wds.torch_audio)

for sample in dataset:
    print(sample.keys())
```
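
Beyond inspecting keys, the same pipeline can unpack each sample into its audio tensor and annotation. A minimal sketch; the `"wav"` and `"json"` extensions are assumptions about the per-sample keys, so confirm them against the `sample.keys()` output from the snippet above:

```python
import webdataset as wds

url = "pipe:curl -s -L https://huggingface.co/datasets/lighthouse-emnlp2024/Clotho-Moment/resolve/main/train/train-{001..002}.tar"

dataset = (
    wds.WebDataset(url, shardshuffle=None)
    .decode(wds.torch_audio)      # audio entries become (waveform, sample_rate) tuples
    .to_tuple("wav", "json")      # assumed extensions; adjust to what sample.keys() reports
)

for (waveform, sample_rate), annotation in dataset:
    print(waveform.shape, sample_rate, annotation)
    break
```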

## Citation
```bibtex
@inproceedings{munakata2025language,
  title={Language-based Audio Moment Retrieval},
  author={Munakata, Hokuto and Nishimura, Taichi and Nakada, Shota and Komatsu, Tatsuya},
  booktitle={ICASSP 2025-2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  pages={1--5},
  year={2025},
  organization={IEEE}
}
```