grok-1
MoE model
An implementation of a Mixture of Experts (MoE) model with a large parameter count and specialized features for natural language processing tasks.
Grok open release
50k stars
589 watching
8k forks
Language: Python
Last commit: 6 months ago
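The description above mentions a Mixture of Experts architecture. As a rough illustration of how MoE routing works in general, the following is a minimal top-2 routing sketch in JAX; the function name `moe_layer`, the parameter layout, and the shapes are assumptions made for illustration and are not taken from the grok-1 codebase.

```python
# Minimal, illustrative top-2 MoE routing sketch in JAX.
# Names, shapes, and parameter layout are assumptions, not grok-1's actual code.
import jax
import jax.numpy as jnp

def moe_layer(params, x, k=2):
    """Route each token to its top-k experts and mix their outputs.

    params["router"]:  (d_model, n_experts) routing weights (assumed layout)
    params["experts"]: (n_experts, d_model, d_model), one dense matrix per expert
    x:                 (n_tokens, d_model) token activations
    """
    # Router scores and per-token probabilities over experts.
    logits = x @ params["router"]                            # (n_tokens, n_experts)
    probs = jax.nn.softmax(logits, axis=-1)

    # Keep only the top-k experts per token and renormalise their gates.
    top_p, top_idx = jax.lax.top_k(probs, k)                 # (n_tokens, k)
    top_p = top_p / jnp.sum(top_p, axis=-1, keepdims=True)

    # Compute every expert densely, then gather the chosen ones.
    all_out = jnp.einsum("td,edh->teh", x, params["experts"])          # (n_tokens, n_experts, d_model)
    chosen = jnp.take_along_axis(all_out, top_idx[..., None], axis=1)  # (n_tokens, k, d_model)
    return jnp.sum(top_p[..., None] * chosen, axis=1)                  # (n_tokens, d_model)


if __name__ == "__main__":
    key = jax.random.PRNGKey(0)
    d_model, n_experts, n_tokens = 16, 8, 4
    k1, k2, k3 = jax.random.split(key, 3)
    params = {
        "router": jax.random.normal(k1, (d_model, n_experts)),
        "experts": jax.random.normal(k2, (n_experts, d_model, d_model)),
    }
    x = jax.random.normal(k3, (n_tokens, d_model))
    print(moe_layer(params, x).shape)  # (4, 16)
```

A production MoE layer dispatches tokens to experts sparsely, typically with capacity limits and load-balancing losses, rather than computing every expert densely as this sketch does for clarity.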
Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
|  | An optimized implementation of OpenAI's Whisper model for speech recognition and speech-to-text tasks using JAX. | 4,467 |
|  | A minimal PyTorch implementation of a transformer-based language model. | 20,474 |
|  | High-quality implementations of reinforcement learning algorithms for research and development purposes. | 15,885 |
|  | A fast and efficient tokeniser for natural language models based on Byte Pair Encoding (BPE). | 12,703 |
|  | Enables rapid creation and deployment of web applications for machine learning models and functions using Python. | 34,557 |
|  | An interactive code generation and execution tool using AI models. | 3,567 |
|  | A repository providing code and models for research into language modeling and multitask learning. | 22,644 |
|  | Guides software developers on how to effectively use and build systems around Large Language Models like GPT-4. | 8,487 |
|  | A tool for retraining and fine-tuning the OpenAI GPT-2 text generation model on new datasets. | 3,398 |
|  | A toolkit for developing and comparing reinforcement learning algorithms using a standardized API to interact with environments. | 34,966 |
|  | A general-purpose speech recognition system trained with large-scale weak supervision. | 72,752 |
|  | A gRPC client library with two modes, REPL and CLI, providing automatic service inspection and task automation. | 4,304 |
|  | A PyTorch-based framework for training and sampling consistency models in image generation. | 6,199 |
|  | An implementation of YOLOv3 in PyTorch for object detection and tracking. | 7,343 |
|  | An open-source toolkit for training and deploying large-scale multi-modal AI models on various downstream tasks. | 3,840 |