CPM-Live
Model trainer
A live training platform for large-scale deep learning models, allowing community participation and collaboration in model development and deployment.
Live Training for Open-source Big Models
511 stars
21 watching
40 forks
Language: Python
Last commit: over 1 year ago
Topics: deep-learning, multi-task-learning, natural-language-generation, natural-language-processing, natural-language-understanding, nlp, parameter-efficient-learning, pretrained-language-model
Related projects:
Repository | Description | Stars |
---|---|---|
openbmb/bmtrain | A toolkit for training large models in a distributed manner while keeping code simple and efficient. | 563 |
openbmb/bmlist | A curated list of large machine learning models tracked over time | 341 |
open-mmlab/mmengine | Provides a flexible and configurable framework for training deep learning models with PyTorch. | 1,179 |
openbmb/viscpm | A family of large multimodal models supporting multimodal conversational capabilities and text-to-image generation in multiple languages | 1,089 |
llava-vl/llava-plus-codebase | A platform for training and deploying large language and vision models that can use tools to perform tasks | 704 |
openai/finetune-transformer-lm | Provides code and a model for improving language understanding through generative pre-training with a transformer-based architecture. | 2,160 |
microsoft/mpnet | Develops a method for pre-training language understanding models that combines masked and permuted language modeling, and provides code for implementation and fine-tuning. | 288 |
csuhan/onellm | A framework for training and fine-tuning multimodal language models on various data types | 588 |
bobazooba/xllm | A tool for training and fine-tuning large language models using advanced techniques | 380 |
qinbinli/moon | A framework for collaborative machine learning model training that leverages similarity between model representations to correct local training. | 263 |
chendelong1999/polite-flamingo | Develops training methods to improve the politeness and natural flow of multimodal large language models | 63 |
huggingface/nanotron | A library for training large language models with parallel computing and mixed precision training methods | 1,244 |
maxpumperla/elephas | Enables distributed deep learning with Keras and Spark for scalable model training | 1,574 |
moses-smt/nplm | A toolkit for training neural network language models | 14 |
pytorchbearer/torchbearer | A PyTorch model fitting library designed to simplify the process of training deep learning models. | 636 |
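The common thread in several of the projects above (BMTrain, nanotron, Elephas) is distributing a model training loop across multiple devices. The sketch below is a generic PyTorch DistributedDataParallel example, not the API of CPM-Live or any toolkit in the table; the model, batch size, and loss are placeholders and launch via `torchrun` is assumed.

```python
# Minimal sketch of distributed data-parallel training with plain PyTorch.
# This is NOT the CPM-Live or BMTrain API; it only illustrates the kind of
# loop the toolkits listed above wrap and optimize for large models.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # Assumes launch via `torchrun`, which sets RANK / LOCAL_RANK / WORLD_SIZE.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy model standing in for a large language model.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(100):
        x = torch.randn(8, 1024, device=local_rank)  # placeholder batch
        loss = model(x).pow(2).mean()                # placeholder loss
        loss.backward()                              # gradients all-reduced across ranks
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Run with, for example, `torchrun --nproc_per_node=4 train.py`; dedicated toolkits layer memory optimizations, pipeline/tensor parallelism, and mixed precision on top of this basic pattern.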