UniIR
Retrieval model trainer
Trains and evaluates a universal multimodal retrieval model to perform various information retrieval tasks.
Official code for the paper "UniIR: Training and Benchmarking Universal Multimodal Information Retrievers" (ECCV 2024).
114 stars
3 watching
13 forks
Language: Python
last commit: 3 months ago
Topics: language-model, retrieval
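The summary above is brief, so here is a minimal, hedged sketch of score-level fusion for multimodal retrieval: one common way a universal retriever can compare mixed image/text queries against mixed image/text candidates. The encoders and helper functions below are illustrative stand-ins (random linear layers), not UniIR's actual code or API.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
DIM = 512

# Stand-in encoders (assumptions): a real retriever would use trained CLIP-style
# image and text towers that project into the same embedding space.
image_encoder = torch.nn.Linear(2048, DIM)  # pooled image features -> shared space
text_encoder = torch.nn.Linear(768, DIM)    # pooled text features -> shared space

def embed(image_feats=None, text_feats=None):
    """Encode whichever modalities are present and L2-normalize each one."""
    parts = {}
    if image_feats is not None:
        parts["image"] = F.normalize(image_encoder(image_feats), dim=-1)
    if text_feats is not None:
        parts["text"] = F.normalize(text_encoder(text_feats), dim=-1)
    return parts

def score_fusion(query, candidate):
    """Sum cosine similarities over the modalities both sides provide."""
    score = torch.zeros(1)
    for modality in query.keys() & candidate.keys():
        score = score + (query[modality] * candidate[modality]).sum(dim=-1)
    return score

# Toy usage: an image+text query scored against a text-only candidate.
query = embed(image_feats=torch.randn(1, 2048), text_feats=torch.randn(1, 768))
candidate = embed(text_feats=torch.randn(1, 768))
print(score_fusion(query, candidate))  # higher score = closer match
```

Ranking a candidate pool then amounts to computing this score for every candidate and sorting; a trained system would learn the encoders with a contrastive objective rather than using random projections.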
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| microsoft/unicoder | Provides pre-trained models and code for understanding and generation tasks in multiple languages. | 89 |
| ieit-yuan/yuan2.0-m32 | A high-performance language model designed to excel in tasks such as natural language understanding, mathematical computation, and code generation. | 182 |
| 01-ai/yi | A series of large language models trained from scratch to excel in multiple NLP tasks. | 7,743 |
| xverse-ai/xverse-moe-a36b | Develops and publishes large multilingual language models with an advanced mixture-of-experts architecture. | 37 |
| shawn-ieitsystems/yuan-1.0 | A large-scale language model with improved performance on NLP tasks through distributed training and efficient data processing. | 591 |
| xverse-ai/xverse-v-13b | A large multimodal model for visual question answering, trained on a dataset of 2.1B image-text pairs and 8.2M instruction sequences. | 78 |
| yunwentechnology/unilm | Provides pre-trained models and tools for natural language understanding (NLU) and generation (NLG) tasks in Chinese. | 439 |
| xverse-ai/xverse-13b | A large language model developed to support multiple languages and applications. | 648 |
| beastbyteai/falcon | Automates machine learning model training using pre-set configurations and a modular design. | 159 |
| wyy-123-xyy/ra-fed | A Python implementation of a distributed machine learning framework for training neural networks on multiple GPUs. | 6 |
| openai/finetune-transformer-lm | Provides code and a model for improving language understanding through generative pre-training with a transformer-based architecture. | 2,167 |
| jshilong/gpt4roi | Trains and deploys large language models on computer vision tasks using region-of-interest inputs. | 517 |
| xverse-ai/xverse-65b | A large language model developed by XVERSE Technology Inc. using a transformer architecture and fine-tuned on diverse datasets for various applications. | 132 |
| csuhan/onellm | A framework for training and fine-tuning multimodal language models on various data types. | 601 |
| intelligent-machine-learning/dlrover | Automates large-scale deep learning training on distributed clusters, providing fault tolerance and fast recovery from failures. | 1,302 |