Transformers
Neural networks library
An implementation of deep neural network architectures, including Transformers, in Python.
Transformers and related deep network architectures are summarized and implemented here.
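To give a sense of what a Transformer implementation involves, here is a minimal sketch of scaled dot-product attention, the core operation of a Transformer layer. It is an illustrative NumPy example, not code taken from this repository; the function name, shapes, and toy data are assumptions.

```python
# Illustrative sketch (not from this repository): scaled dot-product
# attention, the core operation of a Transformer layer.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, seq_q, seq_k)
    # numerically stable softmax over the key dimension
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                   # (batch, seq_q, d_v)

# Toy usage: batch of 2 sequences, length 4, model dimension 8
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4, 8))
out = scaled_dot_product_attention(x, x, x)   # self-attention
print(out.shape)  # (2, 4, 8)
```

A full Transformer block would wrap this in multi-head projections, residual connections, layer normalization, and a feed-forward sublayer.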
214 stars
5 watching
27 forks
Language: Jupyter Notebook
Last commit: over 1 year ago
Linked from 1 awesome list
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A comprehensive guide to using the Transformers library for natural language processing tasks | 1,220 |
| | A lightweight library for defining and training neural networks in TensorFlow | 373 |
| | A collection of GPU-accelerated deep learning algorithms implemented in Python | 895 |
| | An implementation of a machine learning-based communications system using deep learning techniques | 127 |
| | A Python implementation of graph matching-based deep neural network fusion with applications to model ensemble and federated learning | 18 |
| | A deep invertible neural network implementation using PyTorch for image recognition and reconstruction tasks | 390 |
| | An artificial neural network library for rapid prototyping and extension in Haskell | 378 |
| | Research tool for training large transformer language models at scale | 1,926 |
| | A Python library for implementing and training various neural network architectures | 40 |
| | A Python library for training neural networks with a focus on hydrological applications using PyTorch | 372 |
| | A collection of tools and scripts for training large transformer language models at scale | 1,342 |
| | An implementation of artificial neural networks using NumPy | 98 |
| | A Python library for building and training feedforward neural networks | 35 |
| | A lightweight neural network library optimized for sparse data and single machine environments | 1,292 |
| | A Haskell library for building neural networks and working with tensors | 1,083 |