# Dataset Card for Wikimedia Wikipedia (MDS Format)

**NOTE:** I am not affiliated with Wikimedia or Wikipedia.

## Dataset Summary

Wikipedia dataset containing cleaned articles in all languages.

The dataset is built from the Wikipedia dumps (https://dumps.wikimedia.org/), with one subset per language, each containing a single `train` split.

Each example contains the content of one full Wikipedia article, cleaned to strip markup and unwanted sections (references, etc.).

All language subsets have been processed for the most recent dump, and you can load them per date and language like this:
```python
from datasets import load_dataset

ds = load_dataset("wikimedia/wikipedia", "20231101.en")
```
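
If you only want a quick look without downloading the full dump, the `datasets` library can also stream the data. A minimal sketch (streaming mode is a standard `datasets` feature, not something specific to this dataset):

```python
from datasets import load_dataset

# Stream the English subset instead of downloading every shard up front
ds = load_dataset("wikimedia/wikipedia", "20231101.en", streaming=True)

# Peek at the first article without materializing the whole split
print(next(iter(ds["train"]))["title"])
```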

## Data Visualization

Click the Nomic Atlas map below to visualize the 6.4 million samples in the `20231101.en` split.

[Nomic Atlas map of the 20231101.en split]

## Supported Tasks and Leaderboards

The dataset is generally used for Language Modeling.
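
For example, here is a minimal sketch of preparing the `text` field for causal language modeling. It assumes the Hugging Face `transformers` tokenizer and streaming mode; it is illustrative only and not part of this dataset's tooling:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
ds = load_dataset("wikimedia/wikipedia", "20231101.en", streaming=True)

# Tokenize article text on the fly; batching keeps the streamed map efficient
tokenized = ds["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
)

print(next(iter(tokenized))["input_ids"][:10])
```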

## Languages

You can find the list of languages here: https://meta.wikimedia.org/wiki/List_of_Wikipedias

## Dataset Structure

### Data Instances

An example looks as follows:
```python
{
    'id': '1',
    'url': 'https://simple.wikipedia.org/wiki/April',
    'title': 'April',
    'text': 'April is the fourth month...'
}
```

### Data Fields

The data fields are the same among all configurations:
- `id` (`str`): ID of the article.
- `url` (`str`): URL of the article.
- `title` (`str`): Title of the article.
- `text` (`str`): Text content of the article.
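
A minimal sketch (again using streaming, which is an assumption rather than a requirement) that prints these four fields for the first example:

```python
from datasets import load_dataset

ds = load_dataset("wikimedia/wikipedia", "20231101.en", streaming=True)

example = next(iter(ds["train"]))
for field in ("id", "url", "title", "text"):
    # Show the field name, its Python type, and a short preview of the value
    print(f"{field}: {type(example[field]).__name__} -> {str(example[field])[:60]!r}")
```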

### Data Splits

All configurations contain a single `train` split.

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

The dataset is built from the Wikipedia dumps: https://dumps.wikimedia.org
You can find the full list of languages and dates here: https://dumps.wikimedia.org/backup-index.html
The articles have been parsed using the `mwparserfromhell` tool.
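
For reference, a minimal sketch of the kind of markup stripping `mwparserfromhell` performs. This is illustrative only, not the exact cleaning pipeline used to build the dataset:

```python
import mwparserfromhell

# A tiny piece of wikitext with bold markup and an internal link
raw_wikitext = "'''April''' is the fourth [[month]] of the year."

# Parse the wikitext and strip the markup down to plain text
wikicode = mwparserfromhell.parse(raw_wikitext)
print(wikicode.strip_code())  # -> April is the fourth month of the year.
```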
When uploading the data files for the 20231101 dump, we noticed that the Wikimedia Dumps website does not contain a dump for this date for the "bbc", "dga", or "zgh" Wikipedias. We have reported the issue to the Wikimedia Phabricator: https://phabricator.wikimedia.org/T351761

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

Copyright licensing information: https://dumps.wikimedia.org/legal.html
All original textual content is licensed under the GNU Free Documentation License (GFDL) and the Creative Commons Attribution-Share-Alike 3.0 License. Some text may be available only under the Creative Commons license; see their Terms of Use for details. Text written by some authors may be released under additional licenses or into the public domain.

### Citation Information

```bibtex
@ONLINE{wikidump,
    author = "Wikimedia Foundation",
    title  = "Wikimedia Downloads",
    url    = "https://dumps.wikimedia.org"
}
```