
IJCV (2025): TryOn-Adapter

This repository is the official implementation of TryOn-Adapter.

TryOn-Adapter: Efficient Fine-Grained Clothing Identity Adaptation for High-Fidelity Virtual Try-On

Jiazheng Xing, Chao Xu, Yijie Qian, Yang Liu, Guang Dai, Baigui Sun, Yong Liu, Jingdong Wang

arXiv Paper


TODO List

  • Release Texture Highlighting Map and Segmentation Map
  • Release Data Preparation Code
  • Release Inference Code
  • Release Model Weights

Getting Started

Installation

  1. Clone the repository
git clone https://github.com/jiazheng-xing/TryOn-Adapter.git
cd TryOn-Adapter
  2. Install Python dependencies
conda env create -f environment.yaml
conda activate tryon-adapter
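
After activating the environment, a quick sanity check can confirm that PyTorch and CUDA are visible. This is a minimal illustrative sketch; the exact package versions are pinned in environment.yaml:

# Optional sanity check (illustrative only; versions come from environment.yaml).
import torch

print(f"PyTorch {torch.__version__}")
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"Device: {torch.cuda.get_device_name(0)}")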

Data Preparation

VITON-HD

  1. The VITON-HD dataset serves as a benchmark. Download the VITON-HD dataset.

  2. In addition to the above, TryOn-Adapter uses several additional preprocessed conditions. The preprocessed data can be downloaded from the links below; for detailed information and code, see data_preparation/README.md.

     Content                    Google   Baidu
     Segmentation Map           link     link
     Highlighting Texture Map   link     link
  3. Generate the warped cloth and warped mask with GP-VTON.

Once everything is set up, the folders should be organized like this:

β”œβ”€β”€ VITON-HD
β”‚   β”œβ”€β”€ test_pairs.txt
β”‚   β”œβ”€β”€ train_pairs.txt
β”‚   β”œβ”€β”€ [train | test]
β”‚   β”‚   β”œβ”€β”€ image
β”‚   β”‚   β”‚   β”œβ”€β”€ [000006_00.jpg | 000008_00.jpg | ...]
β”‚   β”‚   β”œβ”€β”€ cloth
β”‚   β”‚   β”‚   β”œβ”€β”€ [000006_00.jpg | 000008_00.jpg | ...]
β”‚   β”‚   β”œβ”€β”€ cloth-mask
β”‚   β”‚   β”‚   β”œβ”€β”€ [000006_00.jpg | 000008_00.jpg | ...]
β”‚   β”‚   β”œβ”€β”€ image-parse-v3
β”‚   β”‚   β”‚   β”œβ”€β”€ [000006_00.png | 000008_00.png | ...]
β”‚   β”‚   β”œβ”€β”€ openpose_img
β”‚   β”‚   β”‚   β”œβ”€β”€ [000006_00_rendered.png | 000008_00_rendered.png | ...]
β”‚   β”‚   β”œβ”€β”€ openpose_json
β”‚   β”‚   β”‚   β”œβ”€β”€ [000006_00_keypoints.json | 000008_00_keypoints.json | ...]
β”‚   β”‚   β”œβ”€β”€ [train_paired | test_paired | test_unpaired]
β”‚   β”‚   β”‚   β”œβ”€β”€ mask      [000006_00.png | 000008_00.png | ...]
β”‚   β”‚   β”‚   β”œβ”€β”€ seg_preds [000006_00.png | 000008_00.png | ...]
β”‚   β”‚   β”‚   β”œβ”€β”€ warped    [000006_00.png | 000008_00.png | ...]
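
To catch a misplaced folder early, a small script like the following hypothetical helper (not part of this repository) can walk the expected layout and report anything missing:

# Hypothetical layout checker for the VITON-HD structure shown above.
from pathlib import Path

SUBDIRS = ["image", "cloth", "cloth-mask", "image-parse-v3",
           "openpose_img", "openpose_json"]

def check_viton_hd(root):
    root = Path(root)
    for pairs in ("train_pairs.txt", "test_pairs.txt"):
        if not (root / pairs).is_file():
            print(f"missing file: {pairs}")
    for split in ("train", "test"):
        for sub in SUBDIRS:
            d = root / split / sub
            if d.is_dir():
                print(f"{split}/{sub}: {sum(1 for _ in d.iterdir())} entries")
            else:
                print(f"missing directory: {split}/{sub}")

check_viton_hd("VITON-HD")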

DressCode

  1. The DressCode dataset serves as a benchmark. Download the DressCode dataset.

  2. In addition to the above, TryOn-Adapter uses several additional preprocessed conditions. For detailed information and code, see data_preparation/README.md.

  3. Generate the warped cloth and warped mask with GP-VTON.

Once everything is set up, the folders should be organized like this:

β”œβ”€β”€ DressCode
β”‚   β”œβ”€β”€ test_pairs_paired.txt
β”‚   β”œβ”€β”€ test_pairs_unpaired.txt
β”‚   β”œβ”€β”€ train_pairs.txt
β”‚   β”œβ”€β”€ [test_paired | test_unpaired | train_paired]
β”‚   β”‚   β”œβ”€β”€ [dresses | lower_body | upper_body]
β”‚   β”‚   β”‚   β”œβ”€β”€ mask      [013563_1.png | 013564_1.png | ...]
β”‚   β”‚   β”‚   β”œβ”€β”€ seg_preds [013563_1.png | 013564_1.png | ...]
β”‚   β”‚   β”‚   β”œβ”€β”€ warped    [013563_1.png | 013564_1.png | ...]
β”‚   β”œβ”€β”€ [dresses | lower_body | upper_body]
β”‚   β”‚   β”œβ”€β”€ test_pairs_paired.txt
β”‚   β”‚   β”œβ”€β”€ test_pairs_unpaired.txt
β”‚   β”‚   β”œβ”€β”€ train_pairs.txt
β”‚   β”‚   β”œβ”€β”€ images
β”‚   β”‚   β”‚   β”œβ”€β”€ [013563_0.jpg | 013563_1.jpg | 013564_0.jpg | 013564_1.jpg | ...]
β”‚   β”‚   β”œβ”€β”€ masks
β”‚   β”‚   β”‚   β”œβ”€β”€ [013563_1.png | 013564_1.png | ...]
β”‚   β”‚   β”œβ”€β”€ keypoints
β”‚   β”‚   β”‚   β”œβ”€β”€ [013563_2.json | 013564_2.json | ...]
β”‚   β”‚   β”œβ”€β”€ label_maps
β”‚   β”‚   β”‚   β”œβ”€β”€ [013563_4.png | 013564_4.png | ...]
β”‚   β”‚   β”œβ”€β”€ skeletons
β”‚   β”‚   β”‚   β”œβ”€β”€ [013563_5.jpg | 013564_5.jpg | ...]
β”‚   β”‚   β”œβ”€β”€ dense
β”‚   β”‚   β”‚   β”œβ”€β”€ [013563_5.png | 013563_5_uv.npz | 013564_5.png | 013564_5_uv.npz | ...]
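
The *_pairs.txt files list one person/garment pairing per line; a minimal reader, assuming the whitespace-separated two-column format used by the dataset, might look like this:

# Minimal pairs-file reader (assumes two whitespace-separated columns per
# line: <person_image> <garment_image>; verify against your copy of the data).
from pathlib import Path

def load_pairs(pairs_file):
    pairs = []
    for line in Path(pairs_file).read_text().splitlines():
        if line.strip():
            person, garment = line.split()[:2]
            pairs.append((person, garment))
    return pairs

pairs = load_pairs("DressCode/upper_body/test_pairs_paired.txt")
print(f"{len(pairs)} pairs, first: {pairs[0]}")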

Inference

Please download the pretrained model from HuggingFace. To perform inference on the VITON-HD or DressCode dataset, use the following command:

python [test_viton.py | test_dresscode.py] --plms --gpu_id 0 \
--ddim_steps 100 \
--outdir <path> \
--config [configs/viton.yaml | configs/dresscode.yaml] \
--dataroot <path> \
--ckpt <path> \
--ckpt_elbm_path <path> \
--use_T_repaint [True | False] \
--n_samples 1 \
--seed 23 \
--scale 1 \
--H 512 \
--W 512 \
--unpaired

--ddim_steps <int>         number of sampling steps
--outdir <str>             output directory path
--config <str>             config file path for VITON-HD / DressCode
--ckpt <str>               diffusion model checkpoint path
--ckpt_elbm_path <str>     ELBM module checkpoint directory path
--use_T_repaint <bool>     whether to use the T-Repaint technique
--n_samples <int>          number of samples per inference
--unpaired                 whether to use the unpaired setting
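
For batch experiments, the same command can be assembled from Python. The sketch below is illustrative only; every path is a placeholder to replace with your own data and checkpoints:

# Illustrative wrapper around the inference CLI shown above; all paths are
# placeholders, not real locations.
import subprocess

cmd = [
    "python", "test_viton.py", "--plms",
    "--gpu_id", "0",
    "--ddim_steps", "100",
    "--outdir", "results/viton_hd",
    "--config", "configs/viton.yaml",
    "--dataroot", "/path/to/VITON-HD",
    "--ckpt", "/path/to/model.ckpt",
    "--ckpt_elbm_path", "/path/to/elbm",
    "--use_T_repaint", "True",
    "--n_samples", "1",
    "--seed", "23",
    "--scale", "1",
    "--H", "512",
    "--W", "512",
    "--unpaired",
]
subprocess.run(cmd, check=True)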

Or simply run:

bash test_viton.sh
bash test_dresscode.sh

Acknowledgements

Our code borrows heavily from Paint-by-Example. We also thank GP-VTON, which we use to generate the warped garments.

Citation

@article{xing2025tryon,
  title={TryOn-Adapter: Efficient Fine-Grained Clothing Identity Adaptation for High-Fidelity Virtual Try-On},
  author={Xing, Jiazheng and Xu, Chao and Qian, Yijie and Liu, Yang and Dai, Guang and Sun, Baigui and Liu, Yong and Wang, Jingdong},
  journal={International Journal of Computer Vision},
  pages={1--22},
  year={2025},
  publisher={Springer}
}