MMICT: Boosting Multi-Modal Fine-Tuning with In-Context Examples

Source code for TOMM 2024 paper "MMICT: Boosting Multi-Modal Fine-Tuning with In-Context Examples" [arXiv preprint].

Environment

All required dependencies are listed in requirements.txt.
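A typical setup might look like the following; the virtual-environment name `mmict` is illustrative, and the install step assumes you run it from the repository root where requirements.txt lives:

```shell
# Create and activate an isolated environment ("mmict" is an illustrative name)
python3 -m venv mmict
source mmict/bin/activate

# Install the pinned dependencies from the repository root
pip install -r requirements.txt
```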

Dataset Preparation

We train and test our model on:

How to run

To train the model:

```shell
bash run.sh
```

Acknowledgments

We thank the developers of LAVIS, BLIP-2, and CLIP for their publicly released code.
