nplm
Language model trainer
A toolkit for training neural network language models
A fork of http://nlg.isi.edu/software/nplm/ with some efficiency tweaks and adaptations for use in mosesdecoder (see the illustrative sketch below the project details).
14 stars
17 watching
9 forks
Language: C++
Last commit: about 9 years ago
Linked from 1 awesome list
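Since this page only catalogs the toolkit, the sketch below illustrates the basic idea behind the kind of model nplm trains: score one n-gram with a tiny feed-forward neural language model by embedding the context words, applying a hidden layer, and taking a softmax over the vocabulary. It is a self-contained C++ toy with made-up dimensions and random weights, not code from this repository or its API.

```cpp
// Conceptual sketch only (not the nplm API): score one n-gram with a tiny
// feed-forward neural language model -- embed the (n-1)-word context,
// apply one tanh hidden layer, then a softmax over the vocabulary.
#include <algorithm>
#include <cmath>
#include <cstdlib>
#include <iostream>
#include <vector>

int main() {
    const int vocab = 8, embed_dim = 4, hidden_dim = 5;  // toy sizes, not realistic ones

    // Randomly initialised parameters stand in for trained weights.
    auto randv = [](int n) {
        std::vector<double> v(n);
        for (double& x : v) x = (std::rand() / double(RAND_MAX) - 0.5) * 0.2;
        return v;
    };
    std::vector<int> ctx = {3, 5};  // word ids of the (n-1)-word context
    std::vector<double> E  = randv(vocab * embed_dim);                    // word embeddings
    std::vector<double> W1 = randv(hidden_dim * ctx.size() * embed_dim);  // input -> hidden
    std::vector<double> W2 = randv(vocab * hidden_dim);                   // hidden -> output

    // Concatenate the embeddings of the context words.
    std::vector<double> x;
    for (int w : ctx)
        x.insert(x.end(), E.begin() + w * embed_dim, E.begin() + (w + 1) * embed_dim);

    // Hidden layer: h = tanh(W1 * x).
    std::vector<double> h(hidden_dim);
    for (int i = 0; i < hidden_dim; ++i) {
        double s = 0.0;
        for (std::size_t j = 0; j < x.size(); ++j) s += W1[i * x.size() + j] * x[j];
        h[i] = std::tanh(s);
    }

    // Output layer plus softmax gives P(next word | context).
    std::vector<double> logits(vocab);
    double max_logit = -1e300;
    for (int w = 0; w < vocab; ++w) {
        double s = 0.0;
        for (int i = 0; i < hidden_dim; ++i) s += W2[w * hidden_dim + i] * h[i];
        logits[w] = s;
        max_logit = std::max(max_logit, s);
    }
    double z = 0.0;
    for (double l : logits) z += std::exp(l - max_logit);
    for (int w = 0; w < vocab; ++w)
        std::cout << "P(word " << w << " | context) = "
                  << std::exp(logits[w] - max_logit) / z << "\n";
}
```

In practice, toolkits like nplm avoid computing the full softmax during training (for example via noise-contrastive estimation), which this toy version does not attempt.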
Related projects:
Repository | Description | Stars |
---|---|---|
moses-smt/mosesdecoder | A software toolkit for machine translation. | 1,583 |
moses-smt/giza-pp | A toolkit for training statistical machine translation models and word alignment. | 264 |
microsoft/mpnet | Develops a method for pre-training language understanding models that combines masked and permuted language modeling, with code for pre-training and fine-tuning. | 288 |
csuhan/onellm | A framework for training and fine-tuning multimodal language models on various data types. | 588 |
moses-smt/mgiza | A C++ implementation of a word alignment tool with multi-threading and incremental training capabilities for machine translation. | 161 |
codingtrain/machine-learning | A collection of machine learning resources and examples for education and development. | 955 |
bobazooba/xllm | A tool for training and fine-tuning large language models using advanced techniques. | 380 |
modusdatascience/glm-sklearn | Provides Python wrappers for statsmodels GLM models, mimicking scikit-learn's API. | 23 |
openbmb/cpm-live | A live training platform for large-scale deep learning models, allowing community participation and collaboration in model development and deployment. | 511 |
yandex/faster-rnnlm | A toolkit for training efficient neural network language models on large datasets with hierarchical softmax and noise contrastive estimation. | 561 |
openai/finetune-transformer-lm | Code and a pre-trained model for improving language understanding through generative pre-training with a transformer architecture. | 2,160 |
skyworkai/skywork-moe | A high-performance mixture-of-experts model with innovative training techniques for language processing tasks. | 126 |
moses-smt/salm | A toolkit for working with suffix arrays and their applications in empirical language processing. | 11 |
yiren-jian/blitext | Develops and trains models for vision-language learning with decoupled language pre-training. | 24 |
elanmart/psmm | An implementation of a neural network model for character-level language modeling. | 50 |