Bangumi Image Base of Bokura No Ame-iro Protocol
This is the image base of the bangumi Bokura no Ame-iro Protocol. We detected 49 characters and 4,140 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise. If you intend to train models on this dataset, we recommend preprocessing the downloaded data to remove potentially noisy samples (roughly a 1% chance per image).
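The sketch below illustrates one way such a cleanup pass could be wired up. It assumes the images are distributed as per-character zip archives on the Hugging Face Hub; the repository id, archive name, output directory, and image extensions are placeholders rather than values taken from this card, and the size filter is only a rough heuristic for discarding unusable crops.

```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download  # pip install huggingface_hub
from PIL import Image                        # pip install pillow

REPO_ID = "your-namespace/bokura-no-ame-iro-protocol"  # hypothetical repo id, not from this card
ARCHIVE = "2.zip"                                      # hypothetical per-character archive name

# Download one character archive from the dataset repository on the Hub.
archive_path = hf_hub_download(repo_id=REPO_ID, filename=ARCHIVE, repo_type="dataset")

# Extract it and drop samples that fail a basic sanity check, as a crude way of
# filtering out the ~1% of potentially noisy images mentioned above.
out_dir = Path("character_2")
out_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(archive_path) as zf:
    zf.extractall(out_dir)

IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".webp"}  # assumed image formats
kept, dropped = 0, 0
for img_path in sorted(out_dir.rglob("*")):
    if img_path.suffix.lower() not in IMAGE_SUFFIXES:
        continue
    try:
        with Image.open(img_path) as img:
            img.verify()                    # detects truncated/corrupt files
        with Image.open(img_path) as img:
            if min(img.size) < 64:          # heuristic: discard very small crops
                raise ValueError("image too small")
        kept += 1
    except Exception:
        img_path.unlink()                   # remove unusable samples
        dropped += 1

print(f"kept {kept} images, dropped {dropped}")
```

The same loop can be repeated over each archive listed in the table below before assembling a training set.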
Here is an overview of the detected characters:
# | Images | Download
---|---|---
0 | 13 | Download
1 | 17 | Download
2 | 844 | Download
3 | 58 | Download
4 | 493 | Download
5 | 10 | Download
6 | 12 | Download
7 | 25 | Download
8 | 24 | Download
9 | 57 | Download
10 | 10 | Download
11 | 180 | Download
12 | 228 | Download
13 | 24 | Download
14 | 15 | Download
15 | 126 | Download
16 | 39 | Download
17 | 14 | Download
18 | 35 | Download
19 | 423 | Download
20 | 109 | Download
21 | 20 | Download
22 | 25 | Download
23 | 10 | Download
24 | 9 | Download
25 | 9 | Download
26 | 525 | Download
27 | 16 | Download
28 | 9 | Download
29 | 6 | Download
30 | 12 | Download
31 | 15 | Download
32 | 96 | Download
33 | 85 | Download
34 | 60 | Download
35 | 14 | Download
36 | 165 | Download
37 | 21 | Download
38 | 8 | Download
39 | 41 | Download
40 | 7 | Download
41 | 8 | Download
42 | 6 | Download
43 | 6 | Download
44 | 21 | Download
45 | 6 | Download
46 | 9 | Download
47 | 96 | Download
noise | 79 | Download