femtoGPT
GPT transformer
A pure Rust implementation of a minimal Generative Pretrained Transformer architecture.
Stars: 845
Watching: 15
Forks: 52
Language: Rust
Last commit: 5 months ago
Topics: from-scratch, gpt, gpu, llm, machine-learning, neural-network, opencl, rust
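As a rough illustration of what a minimal GPT implementation involves, the sketch below shows single-head causal scaled dot-product attention, the core operation of a transformer block, written in plain Rust with no external crates. This is not femtoGPT's actual API or tensor machinery; the function names, shapes, and toy data are assumptions chosen purely for illustration.

```rust
// Illustrative sketch only: single-head causal attention in plain Rust.
// Not femtoGPT's real API; names and shapes are hypothetical.

fn softmax(xs: &mut [f32]) {
    // Subtract the max for numerical stability before exponentiating.
    let max = xs.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let mut sum = 0.0;
    for x in xs.iter_mut() {
        *x = (*x - max).exp();
        sum += *x;
    }
    for x in xs.iter_mut() {
        *x /= sum;
    }
}

/// q, k, v: one row per token, each of dimension `dim`.
/// Returns the attention output, one row per query token.
fn attention(q: &[Vec<f32>], k: &[Vec<f32>], v: &[Vec<f32>], dim: usize) -> Vec<Vec<f32>> {
    let scale = 1.0 / (dim as f32).sqrt();
    q.iter()
        .enumerate()
        .map(|(i, qi)| {
            // Causal mask: token i may only attend to tokens 0..=i.
            let mut scores: Vec<f32> = k[..=i]
                .iter()
                .map(|kj| qi.iter().zip(kj).map(|(a, b)| a * b).sum::<f32>() * scale)
                .collect();
            softmax(&mut scores);
            // Weighted sum of the value vectors.
            let mut out = vec![0.0; dim];
            for (w, vj) in scores.iter().zip(&v[..=i]) {
                for (o, x) in out.iter_mut().zip(vj) {
                    *o += w * x;
                }
            }
            out
        })
        .collect()
}

fn main() {
    // Three toy tokens with 4-dimensional embeddings standing in for Q, K, V.
    let toks = vec![
        vec![0.1, 0.2, 0.3, 0.4],
        vec![0.5, 0.1, 0.0, 0.2],
        vec![0.3, 0.3, 0.3, 0.3],
    ];
    let out = attention(&toks, &toks, &toks, 4);
    println!("{:?}", out);
}
```

A real implementation such as femtoGPT would add learned projection matrices for Q, K, and V, multiple heads, layer normalization, and a feed-forward block, but the masked softmax-weighted sum above is the piece everything else is built around.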
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A generative transformer model designed to process and generate text in various vertical domains, including computer science, finance, and more. | 217 |
| | An implementation of a post-training quantization algorithm for transformer models that reduces memory usage and improves inference speed. | 1,964 |
| | An implementation of the GPT-2 language model in PyTorch for generating text. | 976 |
| | An expert assistant for Rust programming questions and issues, with precise code interpretation, up-to-date crate information, and source code analysis. | 10 |
| | A research tool for training large transformer language models at scale. | 1,926 |
| | A resource repository providing tools and guides for analyzing and reverse engineering GPT models. | 184 |
| | An implementation of a machine-learning-based communications system using deep learning techniques. | 127 |
| | A Rust port of Google's robots.txt parser and matcher C++ library. | 89 |
| | A collection of tools and scripts for training large transformer language models at scale. | 1,342 |
| | A service that transforms and re-publishes JSON messages between MQTT topics. | 14 |
| | A deep learning framework for training vision transformers from scratch on image data. | 1,162 |
| | A Rust derive macro that automatically transforms Python dicts into Rust structs and vice versa. | 105 |
| | An efficient wrapper around CUDA FFTs for PyTorch transformations. | 315 |
| | An implementation of QANet in PyTorch for text classification tasks. | 345 |
| | A pre-trained transformer model for natural language understanding and generation tasks in Chinese. | 482 |