---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
size_categories:
- n>1T
tags:
- Traditional Chinese Medicine
configs:
- config_name: TCM_Book_Corpus (Text)
  data_files: TCM_pretrain_book_corpus.json
- config_name: TCM_Web_Corpus (Text)
  data_files: TCM_pretrain_web_corpus.jsonl
- config_name: TCM_Web_Interleaved_Data (Text & Image)
  data_files: TCM_pretrain_web_vision.json
- config_name: TCM_Book_Interleaved_Data (Text & Image)
  data_files: TCM_pretrain_book_vision.json
- config_name: TCM__synthesized_vision (Text & Image)
  data_files: TCM_pretrain_synthesized_vision.json
---
## Introduction
This is the pre-training dataset for ShizhenGPT, a multimodal LLM for Traditional Chinese Medicine (TCM). We open-source the largest existing TCM text corpus (over 5B tokens), collected from TCM-related websites and books, as well as the largest-scale TCM image-text pretraining dataset to date.
For details, see our paper and GitHub repository.
## Dataset Overview
The open-sourced pre-training dataset consists of five parts:
| Subset | Modality | Description | Data Quantity |
|---|---|---|---|
| TCM_Book_Corpus | Text | A cleaned corpus of 3,256 TCM textbooks. | ~0.5B tokens |
| TCM_Web_Corpus | Text | A TCM corpus collected from the web. | Over 5B tokens |
| TCM_Book_Interleaved_Data | Text, Visual | Interleaved text-image data from 306 TCM books. | 41,459 entries, 50,690 images |
| TCM_Web_Interleaved_Data | Text, Visual | Interleaved text-image data from the TCM web corpus. | 505,465 entries, 1,143,954 images |
| TCM_pretrain_synthesized_vision | Text, Visual | TCM image-text pairs generated from images and their context using GPT-4o. | 144,239 entries, 159,534 images |
**Note:** Due to privacy and ethical concerns, TCM signal datasets (e.g., sound and pulse) are not provided. For some signal data, refer to the Instruction Dataset.
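The `configs` in the metadata block map each subset to its JSON/JSONL file, so individual subsets can be pulled directly with the Hugging Face `datasets` library. The snippet below is a minimal sketch: the repository ID is a placeholder (it is not stated on this card) and must be replaced with the actual repo path, while the config names are copied from the metadata above.

```python
from datasets import load_dataset

# Placeholder repo ID -- replace with the actual Hugging Face repo of this dataset.
REPO_ID = "your-org/ShizhenGPT-pretrain"

# Stream the web text corpus (over 5B tokens) instead of downloading it all at once.
web_corpus = load_dataset(REPO_ID, name="TCM_Web_Corpus (Text)", split="train", streaming=True)
for i, example in enumerate(web_corpus):
    print(example)  # each record is one JSON object from TCM_pretrain_web_corpus.jsonl
    if i >= 2:
        break

# Load the smaller synthesized image-text subset fully into memory.
synth_vision = load_dataset(REPO_ID, name="TCM__synthesized_vision (Text & Image)", split="train")
print(synth_vision)
```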
## Citation
If you find our data useful, please consider citing our work!
```
@misc{chen2025shizhengptmultimodalllmstraditional,
      title={ShizhenGPT: Towards Multimodal LLMs for Traditional Chinese Medicine},
      author={Junying Chen and Zhenyang Cai and Zhiheng Liu and Yunjin Yang and Rongsheng Wang and Qingying Xiao and Xiangyi Feng and Zhan Su and Jing Guo and Xiang Wan and Guangjun Yu and Haizhou Li and Benyou Wang},
      year={2025},
      eprint={2508.14706},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.14706},
}
```