seq2seq
An attention-based sequence-to-sequence learning framework
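To show what "attention-based" means in a seq2seq setting, here is a minimal sketch of dot-product attention over encoder hidden states. It is an illustration only, not code from this repository: the function name, shapes, and random toy inputs are assumptions, and a real model would learn the encoder and decoder states rather than sample them.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(decoder_state, encoder_states):
    # Score each encoder time step by its similarity to the decoder state,
    # normalize the scores into weights, and return the weighted sum (context).
    scores = encoder_states @ decoder_state       # shape (T,)
    weights = softmax(scores)                     # shape (T,), sums to 1
    context = weights @ encoder_states            # shape (H,)
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3 (random, untrained values).
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(4, 3))
decoder_state = rng.normal(size=(3,))
context, weights = dot_product_attention(decoder_state, encoder_states)
```

At each decoding step the context vector summarizes the source sequence, weighted toward the positions most relevant to the current decoder state; the decoder then conditions its next prediction on that context.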
389 stars
26 watching
122 forks
Language: Python
Last commit: almost 6 years ago
Linked from 1 awesome list
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | An implementation of an attention mechanism using TensorFlow 2 to analyze time series data. | 7 |
| | A toolkit for building sequence classification models in Python. | 691 |
| | Provides tools and frameworks for training sequence-to-sequence models using PyTorch. | 523 |
| | A PyTorch-based framework for building and training sequence-to-sequence models. | 1,499 |
| | Automates generation of discrete sequence text using adversarially regularized autoencoders. | 20 |
| | An iterator-based fluent interface for functional programming in Python. | 11 |
| | Tools for analyzing and processing sequence data from the Online Encyclopedia of Integer Sequences. | 46 |
| | A Python NLP library for training and running sequence-to-sequence models, similar to the fast.ai library. | 283 |
| | An open-source sequence-to-sequence framework for neural machine translation built on PyTorch. | 1,212 |
| | A Python library to manipulate and transform indexable data. | 49 |
| | An implementation of sequence-to-sequence models in PyTorch with various attention mechanisms and extensions for machine translation tasks. | 738 |
| | A project demonstrating the use of a sequence-to-sequence recurrent neural network (RNN) for time series forecasting in Python using TensorFlow. | 1,082 |
| | A comprehensive Python library for developing state-of-the-art sequence understanding models, providing pre-trained models and tools for natural language processing tasks. | 426 |
| | A framework for analyzing discrete sequences using probabilistic models. | 36 |
| | An implementation of a sequence-to-sequence model with an attention mechanism, using LSTMs and character embeddings for neural machine translation. | 1,263 |