bitsandbytes
Language model helper
A lightweight Python wrapper around custom CUDA functions, providing 8-bit optimizers, 8-bit matrix multiplication (LLM.int8()), and 4-bit quantization for large language models.
Accessible large language models via k-bit quantization for PyTorch.
6k stars
52 watching
639 forks
Language: Python
last commit: 3 months ago
Linked from 1 awesome list
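A minimal usage sketch (an illustrative example, not taken from the repository's own docs): bitsandbytes' 8-bit optimizers are designed as drop-in replacements for their torch.optim counterparts, which is the most common entry point to the library. The model shape and hyperparameters below are arbitrary; a CUDA-capable GPU and an installed `bitsandbytes` package are assumed.

```python
import torch
import torch.nn as nn
import bitsandbytes as bnb

# A toy model; any PyTorch module works the same way.
model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024)).cuda()

# bnb.optim.Adam8bit mirrors torch.optim.Adam but keeps optimizer state in 8 bits,
# which substantially reduces optimizer memory for large models.
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-3)

# One dummy training step to show the drop-in usage.
x = torch.randn(16, 1024, device="cuda")
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

For inference-time quantization, the same library backs the 8-bit and 4-bit loading paths exposed by downstream frameworks (for example, Hugging Face Transformers' `BitsAndBytesConfig`), which is where the "k-bit quantization" in the tagline is most often encountered.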
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | A flexible Python framework for building and testing algorithmic trading strategies | 2,316 |
| | A resource monitor tool that displays system usage statistics and allows for filtering, sorting, and customization of system processes. | 10,219 |
| | A tool to monitor system resources and display detailed information in a customizable format | 21,529 |
| | A re-implementation of Llama for efficient use with quantized weights on modern GPUs. | 2,783 |
| | A PyTorch-based library for Bayesian optimization, providing a modular interface for composing and optimizing probabilistic models. | 3,126 |
| | Tools and techniques for optimizing large language models on various frameworks and hardware platforms. | 2,257 |
| | A library to support efficient mixed-precision matrix multiplications on GPUs for deep learning model deployment | 445 |
| | A high-performance distributed deep learning framework supporting multiple frameworks and networks | 3,635 |
| | A comprehensive C++ library for modeling, trading, and risk management in quantitative finance. | 5,480 |
| | An open-source Python framework for backtesting trading strategies in cryptocurrencies using machine learning and technical analysis techniques. | 20 |
| | An open-source software project that enables efficient and accurate low-bit weight quantization for large language models. | 2,593 |
| | A Python library providing tensors and dynamic neural networks with strong GPU acceleration | 84,978 |
| | A Python library for training quantum computers using programming techniques similar to neural networks | 2,409 |
| | A high-performance Python profiler that analyzes CPU, GPU, and memory usage, providing detailed information and AI-powered optimization suggestions. | 12,274 |
| | Developing and pretraining a GPT-like Large Language Model from scratch | 35,405 |