attention-translation-keras
Neural machine translation model
An implementation of an attention-based sequence-to-sequence neural machine translation model in Keras.
30 stars
6 watching
13 forks
Language: Python
Last commit: almost 7 years ago
Linked from 1 awesome list
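For orientation, the sketch below shows the general shape of an attention-based encoder-decoder translation model in Keras. It is not taken from this repository; the layer choices (bidirectional GRU encoder, built-in dot-product `Attention` layer, teacher-forced decoder) and all hyperparameters are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Placeholder hyperparameters -- not taken from this repository.
src_vocab, tgt_vocab = 8000, 8000   # source / target vocabulary sizes
src_len, tgt_len = 20, 20           # padded sequence lengths
embed_dim, hidden_dim = 128, 256

# Encoder: embed source tokens and run a bidirectional GRU, keeping the
# per-timestep outputs so the decoder can attend over them.
# merge_mode="sum" keeps the output width equal to hidden_dim, matching
# the decoder width required by dot-product attention.
enc_in = layers.Input(shape=(src_len,), name="encoder_tokens")
enc_emb = layers.Embedding(src_vocab, embed_dim)(enc_in)
enc_seq = layers.Bidirectional(
    layers.GRU(hidden_dim, return_sequences=True), merge_mode="sum")(enc_emb)

# Decoder: embed target tokens (teacher forcing) and run a GRU.
dec_in = layers.Input(shape=(tgt_len,), name="decoder_tokens")
dec_emb = layers.Embedding(tgt_vocab, embed_dim)(dec_in)
dec_seq = layers.GRU(hidden_dim, return_sequences=True)(dec_emb)

# Attention: each decoder step queries all encoder outputs
# (Luong-style dot-product attention via the built-in Attention layer).
context = layers.Attention()([dec_seq, enc_seq])

# Combine decoder state with the attended context and predict the next
# target token at every decoder position.
combined = layers.Concatenate()([dec_seq, context])
probs = layers.Dense(tgt_vocab, activation="softmax")(combined)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```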
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | An implementation of the Ladder Network architecture for semi-supervised learning using Keras and TensorFlow | 101 |
| | A PyTorch implementation of a neural network model for machine translation | 47 |
| | An implementation of the Inception-ResNet v2 deep learning model in Keras | 180 |
| | A Keras implementation of attention mechanisms and squeeze-and-excitation networks for improving neural network representation power | 363 |
| | An implementation of the SqueezeNet neural network model in the Keras framework | 404 |
| | An implementation of a deep learning model for image segmentation using Keras and dilated convolutions | 301 |
| | An implementation of a deep learning architecture for image segmentation using the Keras framework | 185 |
| | An implementation of a deep neural network architecture for real-time semantic segmentation in Python | 115 |
| | Enables distributed deep learning with Keras and Spark for scalable model training | 1,574 |
| | A unified interface to various deep learning architectures | 818 |
| | An implementation of the MnasNet architecture in Keras for TensorFlow | 99 |
| | An open-source tool to extract and visualize layer outputs and gradients in Keras models | 1,050 |
| | Demonstrates basic semantic segmentation using a deep neural network in Keras | 56 |
| | An implementation of ResNeXt models in Keras for efficient image classification | 224 |
| | An implementation of a neural network model for character-level language modeling | 50 |