This repository is the official PyTorch implementation of "Discriminator-free Unsupervised Domain Adaptation for Multi-label Image Classification", presented at WACV 2024. If you find this work useful, please consider citing:
```bibtex
@inproceedings{singh2023discriminatorfree,
  title={Discriminator-free Unsupervised Domain Adaptation for Multi-label Image Classification},
  author={Singh, Inder Pal and Ghorbel, Enjie and Kacem, Anis and Rathinam, Arunkumar and Aouada, Djamila},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  year={2024}
}
```
We provide a collection of models trained with the proposed GMM-based discrepancy on various multi-label domain adaptation datasets.
Source | Target | mAP (%) |
---|---|---|
AID | UCM | 63.2 |
UCM | AID | 54.9 |
AID | DFC | 62.1 |
UCM | DFC | 70.6 |
VOC | Clipart | 61.4 |
Clipart | VOC | 77.0 |
Cityscapes | Foggy Cityscapes | 62.3 |
Create a virtual environment:
```bash
$ python3 -m venv dda_mlic
```
Activate the virtual environment:
```bash
$ source dda_mlic/bin/activate
```
Upgrade pip to the latest version:
```bash
$ pip install --upgrade pip
```
Install PyTorch with a compatible CUDA version:
```bash
$ pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchaudio==0.9.0 -f https://download.pytorch.org/whl/torch_stable.html
```
Install the remaining required packages from requirements.txt:
```bash
$ pip install -r requirements.txt
```
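As a quick sanity check (not part of the repo), you can confirm that the installed build sees your GPU:

```python
# Verify the installed PyTorch build and CUDA visibility.
import torch

print(torch.__version__)          # expected: 1.9.0+cu111
print(torch.cuda.is_available())  # True if the CUDA 11.1 build finds a GPU
```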
Syntax: Source → Target
```bash
$ python main.py --phase test -s <name_of_source_dataset> -t <name_of_target_dataset> -s-dir <path_to_source_dataset_dir> -t-dir <path_to_target_dataset_dir> --model-path <path_to_pretrained_weights>
```
Example: AID → UCM
```bash
$ python main.py --phase test -s AID -t UCM -s-dir datasets/AID -t-dir datasets/UCM --model-path models/aid2ucm_best_63-2.pth
```
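For reference, the mAP values in the table above follow the standard multi-label protocol: average precision per class, averaged over classes. A minimal sketch with scikit-learn (illustrative only, not necessarily the repo's exact evaluation code; the function name `multilabel_map` is ours):

```python
# Illustrative multi-label mAP computation (not the repo's exact code).
# y_true: (N, C) binary ground-truth labels; y_score: (N, C) sigmoid scores.
# Assumes every class has at least one positive sample in y_true.
import numpy as np
from sklearn.metrics import average_precision_score

def multilabel_map(y_true: np.ndarray, y_score: np.ndarray) -> float:
    # Average precision per class, then mean over classes.
    aps = [average_precision_score(y_true[:, c], y_score[:, c])
           for c in range(y_true.shape[1])]
    return float(np.mean(aps))
```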
Download the ImageNet-pretrained weights for TResNet-M.
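To confirm the download is usable before training, you can inspect the checkpoint. The `model`/`state_dict` key names below are guesses to unwrap a possible wrapper dict; third-party checkpoints vary:

```python
# Hypothetical checkpoint inspection; top-level key names are assumptions.
import torch

ckpt = torch.load("models/tresnet_m_miil_21k.pth", map_location="cpu")
if isinstance(ckpt, dict):
    ckpt = ckpt.get("model", ckpt.get("state_dict", ckpt))
print(f"checkpoint contains {len(ckpt)} entries")
```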
Syntax: Source → Target
```bash
$ python main.py -s <name_of_source_dataset> -t <name_of_target_dataset> -s-dir <path_to_source_dataset_dir> -t-dir <path_to_target_dataset_dir> --model-path <path_to_imagenet_pretrained_weights>
```
Example: AID → UCM
```bash
$ python main.py -s AID -t UCM -s-dir datasets/AID -t-dir datasets/UCM --model-path models/tresnet_m_miil_21k.pth
```
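For intuition on what training optimizes: the method replaces the adversarial discriminator with a discrepancy between GMMs fitted to source and target predictions. The sketch below illustrates that concept only — fitting a two-component GMM to each set of sigmoid scores and comparing matched components via the closed-form Fréchet (2-Wasserstein) distance between Gaussians. The function name `gmm_discrepancy` is ours, and this is not the paper's exact (differentiable, in-training) loss:

```python
# Conceptual sketch of a GMM-based discrepancy between source and target
# prediction distributions (illustration only; not the paper's exact loss).
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_discrepancy(src_scores: np.ndarray, tgt_scores: np.ndarray) -> float:
    """Fit a 2-component 1-D GMM to each set of sigmoid scores and compare
    matched components with the closed-form Frechet distance between
    Gaussians: d^2 = (mu1 - mu2)^2 + (sigma1 - sigma2)^2."""
    gm_s = GaussianMixture(n_components=2, random_state=0).fit(src_scores.reshape(-1, 1))
    gm_t = GaussianMixture(n_components=2, random_state=0).fit(tgt_scores.reshape(-1, 1))
    # Sort components by mean so the "negative"/"positive" modes are matched.
    order_s = np.argsort(gm_s.means_.ravel())
    order_t = np.argsort(gm_t.means_.ravel())
    mu_s, mu_t = gm_s.means_.ravel()[order_s], gm_t.means_.ravel()[order_t]
    sd_s = np.sqrt(gm_s.covariances_.ravel()[order_s])
    sd_t = np.sqrt(gm_t.covariances_.ravel()[order_t])
    d2 = (mu_s - mu_t) ** 2 + (sd_s - sd_t) ** 2
    return float(d2.sum())
```

For example, `gmm_discrepancy(sigmoid_src.ravel(), sigmoid_tgt.ravel())` on the flattened per-class sigmoid outputs of both domains gives a scalar that shrinks as the two prediction distributions align.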
Our code builds on the following repositories:
Thanks to the authors.