Black-box-Adversarial-Reprogramming
Model reprogramming
An approach to adapting machine learning models to new tasks using scarce data and limited resources by learning an input transformation and an output label mapping, leaving the pre-trained model's architecture and parameters unchanged.
Code for "Transfer Learning without Knowing: Reprogramming Black-box Machine Learning Models with Scarce Data and Limited Resources" (ICML 2020).
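The idea can be sketched in a few lines of NumPy: a frozen "black-box" classifier is only queried (never differentiated), a universal input perturbation `delta` is trained with a zeroth-order, query-based gradient estimate, and a many-to-one label mapping converts source-class probabilities into target-class predictions. This is a minimal illustration of the technique under toy assumptions, not the repository's implementation; the model, data, and helper names (`black_box`, `zeroth_order_grad`, etc.) are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "black-box" source model: a fixed 10-class softmax classifier
# over 64-dim inputs that we may only query, never differentiate.
W = 0.1 * rng.normal(size=(64, 10))

def black_box(x):
    """Query-only access: returns source-class probabilities."""
    z = x @ W
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Many-to-one label mapping: source class i -> target class i % 2.
label_map = np.arange(10) % 2

def target_probs(x, delta):
    """Reprogrammed prediction: add the trainable input program `delta`,
    query the frozen model, then aggregate source-class probabilities
    into the 2 target classes via the label mapping."""
    p = black_box(x + delta)
    out = np.zeros((x.shape[0], 2))
    for s, t in enumerate(label_map):
        out[:, t] += p[:, s]
    return out

def loss(x, y, delta):
    """Cross-entropy of the reprogrammed model on the target task."""
    p = target_probs(x, delta)
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

def zeroth_order_grad(x, y, delta, q=20, mu=0.01):
    """One-sided zeroth-order gradient estimate built purely from
    loss queries, as required in the black-box setting."""
    base = loss(x, y, delta)
    g = np.zeros_like(delta)
    for _ in range(q):
        u = rng.normal(size=delta.shape)
        g += (loss(x, y, delta + mu * u) - base) / mu * u
    return g / q

# Toy target task: two clusters separated along the first coordinate.
x = rng.normal(size=(100, 64))
y = (rng.random(100) < 0.5).astype(int)
x[:, 0] += np.where(y == 1, 2.0, -2.0)

# Train the input program with zeroth-order gradient descent.
delta = np.zeros(64)
loss_before = loss(x, y, delta)
for _ in range(150):
    delta -= 0.05 * zeroth_order_grad(x, y, delta)
loss_after = loss(x, y, delta)
```

The paper's setting replaces the toy classifier with a real deployed model (e.g. a cloud prediction API) and scales the query-efficient gradient estimation accordingly; the structure above (input program + label mapping + zeroth-order updates) is the core recipe.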
37 stars
3 watching
4 forks
Language: Python
Last commit: about 4 years ago
Topics: black-box-optimization, icml-2020, limited-transfer-learning, machine-learning, medical-imaging-classification, robust-learning, social-good
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| paarthneekhara/rnn_adversarial_reprogramming | Repurposes pre-trained neural networks for new classification tasks through adversarial reprogramming of their inputs. | 6 |
| huckiyang/voice2series-reprogramming | An approach to reprogramming acoustic models for time series classification using differentiable mel-spectrograms and adversarial training. | 69 |
| prinsphield/adversarial_reprogramming | Reprograms pre-trained neural networks for new tasks by learning an adversarial input transformation, without fine-tuning the original model weights. | 33 |
| paarthneekhara/multimodal_rerprogramming | Cross-modal adversarial reprogramming that repurposes image models for text classification tasks. | 11 |
| rentruewang/koila | A lightweight wrapper around PyTorch to prevent CUDA out-of-memory errors and optimize model execution. | 1,821 |
| zygmuntz/kaggle-blackbox | A toolkit for building and training machine learning models using a simple, easy-to-use interface. | 115 |
| yunwentechnology/unilm | Provides pre-trained models for natural language understanding and generation tasks using the UniLM architecture. | 438 |
| zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks. | 987 |
| ymcui/macbert | Improves pre-trained Chinese language models by incorporating a correction task to alleviate inconsistency issues with downstream tasks. | 645 |
| ymcui/pert | Develops a pre-trained language model that learns semantic knowledge from permuted text without mask labels. | 354 |
| dodohow1011/speechadvreprogram | Develops low-resource speech command recognition systems using adversarial reprogramming and transfer learning. | 18 |
| yiren-jian/blitext | Develops and trains models for vision-language learning with decoupled language pre-training. | 24 |
| cmawer/reproducible-model | Demonstrates how to create a reproducible machine learning model using Python and version control. | 86 |
| tiger-ai-lab/uniir | Trains and evaluates a universal multimodal retrieval model on various information retrieval tasks. | 110 |
| beastbyteai/falcon | Automates machine learning model training using pre-set configurations and modular design. | 159 |