dek924 committed
Commit 3798dc6 · verified · 1 Parent(s): 86b2d29

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +45 -9
README.md CHANGED
@@ -1,9 +1,45 @@
- ---
- tags:
- - model_hub_mixin
- - pytorch_model_hub_mixin
- ---
-
- This model has been pushed to the Hub using the [PytorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- - Library: [More Information Needed]
- - Docs: [More Information Needed]
+ # EHRXDiff
+ Model card for our paper: [Towards Predicting Temporal Changes in a Patient's Chest X-ray Images based on Electronic Health Records](https://arxiv.org/abs/2409.07012).
+ We provide two versions of the **EHRXDiff** model:
+ * **EHRXDiff** – trained without the null-based augmentation technique.
+ * **EHRXDiff<sub>w_null</sub>** – trained with the null-based augmentation technique.
+ This card describes the **EHRXDiff** model.
+ For implementation details, please refer to the [EHRXDiff repository](https://github.com/dek924/EHRXDiff).
+
+
+ ## Installation
+ First, clone the repository and install the required packages:
+ ```bash
+ git clone https://github.com/dek924/EHRXDiff.git
+ cd EHRXDiff
+
+ pip install "pip<24.1"
+ pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113
+ pip install -r requirements.txt
+ ```
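+ As a quick sanity check (not part of the official instructions), you can confirm that the pinned CUDA 11.3 build of PyTorch is installed and can see a GPU:
+ ```python
+ import torch
+
+ # The pinned install above targets torch 1.11.0 built against CUDA 11.3.
+ print(torch.__version__)          # expected: 1.11.0+cu113
+ print(torch.cuda.is_available())  # should be True on a CUDA-capable machine
+ ```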
+
+ ## Loading the model
+ You can load the model directly in Python:
+ ```python
+ from cheff.ldm.models.diffusion.ddpm_tab import EHRXDiff
+
+ model = EHRXDiff.from_pretrained("dek924/ehrxdiff")
+ model.eval()
+ ```
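+ Once loaded, the model is a regular `torch.nn.Module`, so standard PyTorch usage applies. A minimal sketch (generic PyTorch, not an EHRXDiff-specific API) for placing it on a GPU and checking its size:
+ ```python
+ import torch
+
+ # Standard PyTorch: move the model to a GPU if one is available.
+ device = "cuda" if torch.cuda.is_available() else "cpu"
+ model = model.to(device)
+
+ # Report the number of parameters in the loaded diffusion model.
+ n_params = sum(p.numel() for p in model.parameters())
+ print(f"EHRXDiff loaded on {device} with {n_params / 1e6:.1f}M parameters")
+ ```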
+ Alternatively, you can download the weights via the Hugging Face Hub:
+
+ ```python
+ from huggingface_hub import hf_hub_download
+
+ wt_path = hf_hub_download("dek924/ehrxdiff", "pytorch_model.bin")
+ ```
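+ By default the file is stored in the local Hugging Face cache. If you prefer a fixed checkpoint directory for the evaluation step below, `hf_hub_download` also accepts a `local_dir` argument; the path used here is only a placeholder:
+ ```python
+ from huggingface_hub import hf_hub_download
+
+ # "./checkpoints/ehrxdiff" is a hypothetical location; any directory works
+ # and can then serve as ${CHECKPOINT_PATH} for scripts/eval.py below.
+ wt_path = hf_hub_download(
+     "dek924/ehrxdiff",
+     "pytorch_model.bin",
+     local_dir="./checkpoints/ehrxdiff",
+ )
+ print(wt_path)
+ ```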
+ Then run the evaluation script included in our GitHub repository (`scripts/eval.py`):
+ ```bash
+ # IMG_META_DIR: directory containing metadata for MIMIC-CXR-JPG
+ # IMG_ROOT_DIR: directory containing preprocessed images
+ # TAB_ROOT_DIR: directory containing tabular data
+ python scripts/eval.py \
+   --sdm_path=${CHECKPOINT_PATH}/pytorch_model.bin \
+   --save_dir=${CHECKPOINT_PATH}/images/seed${RAND_SEED} \
+   --img_meta_dir=${IMG_META_DIR} \
+   --img_root_dir=${IMG_ROOT_DIR} \
+   --tab_root_dir=${TAB_ROOT_DIR} \
+   --seed=${RAND_SEED} \
+   --batch_size=${BATCHSIZE}
+ ```