Chengfengke committed
Commit 0f47b32 · verified · 1 Parent(s): 0398188

Update README.md

Files changed (1):
  1. README.md +4 -8
README.md CHANGED
@@ -11,7 +11,7 @@ pipeline_tag: fill-mask
 ---
 # Herbert: Pretrained Bert Model for Herbal Medicine
 
-**Herberta** is a pretrained model for herbal medicine research, developed based on the `bert-base-chinese` model. The model has been fine-tuned on domain-specific data from 675 ancient books and 32 Traditional Chinese Medicine (TCM) textbooks. It is designed to support a variety of TCM-related NLP tasks.
+**Herbert** is a pretrained model for herbal medicine research, developed based on the `bert-base-chinese` model. The model has been fine-tuned on domain-specific data from 675 ancient books and 32 Traditional Chinese Medicine (TCM) textbooks. It is designed to support a variety of TCM-related NLP tasks.
 
 ---
 
@@ -22,7 +22,7 @@ This model is optimized for TCM-related tasks, including but not limited to:
 - Domain-specific word embedding
 - Classification, labeling, and sequence prediction tasks in TCM research
 
-Herberta combines the strengths of modern pretraining techniques and domain knowledge, allowing it to excel in TCM-related text processing tasks.
+Herbert combines the strengths of modern pretraining techniques and domain knowledge, allowing it to excel in TCM-related text processing tasks.
 
 ---
 
@@ -40,9 +40,6 @@ Herberta combines the strengths of modern pretraining techniques and domain knowledge
 }
 ### requirements
 "transformers_version": "4.45.1"
-```bash
-pip install herberta
-```
 
 ### Quickstart
 
@@ -92,9 +89,8 @@ outputs = model(**inputs)
 If you find our work helpful, feel free to give us a cite.
 
 ```bibtex
-@misc{herberta-embedding,
-title = {Herberta: A Pretrain_Bert_Model for TCM_herb and downstream Tasks as Text Embedding Generation},
-url = {https://github.com/15392778677/herberta},
+@misc{herbert-embedding,
+title = {Herbert: A Pretrain_Bert_Model for TCM_herb and downstream Tasks as Text Embedding Generation},
 author = {Yehan Yang,Xinhan Zheng},
 month = {December},
 year = {2024}
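The Quickstart body itself falls outside these hunks, but the `pipeline_tag: fill-mask` front matter and the `outputs = model(**inputs)` context line suggest standard `transformers` masked-LM usage. A minimal sketch of that flow follows; the Hub repo id `Chengfengke/herbert` is a placeholder assumption, not confirmed anywhere in this diff.

```python
# Minimal fill-mask sketch matching the README's pipeline_tag.
# NOTE: "Chengfengke/herbert" is a hypothetical repo id for illustration only.
from transformers import AutoModelForMaskedLM, AutoTokenizer
import torch

repo_id = "Chengfengke/herbert"  # placeholder; substitute the actual Hub id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# Example TCM-style sentence with one masked token.
text = "人参味甘，[MASK]温。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)  # same call as the hunk's context line

# Decode the highest-scoring token at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = outputs.logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```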