Structured-Self-Attention

Self-attention model

A deep learning model that generates sentence embeddings via structured self-attention, applied to binary and multiclass classification tasks.

A Structured Self-attentive Sentence Embedding
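The core idea of the paper is to replace a single attention vector with an attention *matrix*: `r` attention "hops" over the hidden states of a bidirectional LSTM yield an `r x d` sentence embedding matrix. A minimal PyTorch sketch of that attention head is below; the parameter names (`W_s1`, `W_s2`, `d_a`, `r`) follow the paper's notation, but the dimensions and the module layout are illustrative, not this repository's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredSelfAttention(nn.Module):
    """Attention head from "A Structured Self-attentive Sentence Embedding"
    (illustrative sketch; d_hidden, d_a and r values are arbitrary here)."""

    def __init__(self, d_hidden=256, d_a=64, r=4):
        super().__init__()
        # A = softmax(W_s2 * tanh(W_s1 * H^T)) in the paper's notation
        self.W_s1 = nn.Linear(d_hidden, d_a, bias=False)
        self.W_s2 = nn.Linear(d_a, r, bias=False)

    def forward(self, H):
        # H: LSTM hidden states, shape (batch, seq_len, d_hidden)
        # Softmax over the sequence dimension, independently per hop.
        A = F.softmax(self.W_s2(torch.tanh(self.W_s1(H))), dim=1)  # (batch, seq_len, r)
        # Sentence embedding matrix: r weighted sums of the hidden states.
        M = A.transpose(1, 2) @ H  # (batch, r, d_hidden)
        return M, A

H = torch.randn(2, 10, 256)          # batch of 2 sequences, 10 tokens each
M, A = StructuredSelfAttention()(H)  # M: (2, 4, 256), A: (2, 10, 4)
```

For classification, `M` is flattened and fed to a classifier; the paper also adds a penalization term `||A^T A - I||_F^2` so the `r` hops attend to different parts of the sentence.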

GitHub

494 stars
13 watching
106 forks
Language: Python
last commit: about 5 years ago
Topics: attention, attention-mechanism, attention-model, attention-weights, classification, deep-learning, python3, pytorch, self-attention, self-attentive-rnn, sentence-embeddings, visualization

Related projects:

| Repository | Description | Stars |
|---|---|---|
| javeywang/pyramid-attention-networks-pytorch | An implementation of a deep learning model using PyTorch for semantic segmentation tasks. | 235 |
| peteanderson80/bottom-up-attention | Trains a bottom-up attention model using Faster R-CNN and Visual Genome annotations for image captioning and VQA tasks. | 1,433 |
| sinashish/multi-scale-attention | A deep learning framework for medical image segmentation using multi-scale guided attention mechanisms to improve accuracy and reduce irrelevant information. | 460 |
| openai/sparse_attention | Provides primitives for sparse attention mechanisms used in transformer models to improve computational efficiency and scalability. | 1,524 |
| speedinghzl/ccnet | An implementation of a deep learning model for semantic segmentation using a novel attention mechanism to capture long-range dependencies in images. | 1,426 |
| koichiro11/residual-attention-network | An image classification neural network implementation using attention mechanisms and residual learning. | 94 |
| emedvedev/attention-ocr | A TensorFlow model for recognizing text in images using visual attention and a sequence-to-sequence architecture. | 1,077 |
| akosiorek/attend_infer_repeat | An implementation of Attend, Infer, Repeat, a method for fast scene understanding using generative models. | 82 |
| hszhao/psanet | A deep learning framework for semantic segmentation with spatial attention mechanisms. | 216 |
| nikitakit/self-attentive-parser | An NLP parser with high-accuracy models for multiple languages. | 871 |
| lxtgh/omg-seg | Develops an end-to-end model for multiple visual perception and reasoning tasks using a single encoder, decoder, and large language model. | 1,300 |
| jiasenlu/adaptiveattention | Adaptive attention mechanism for image captioning using visual sentinels. | 334 |
| stared/keras-sequential-ascii | A tool for displaying detailed ASCII summaries of Keras sequential models. | 126 |
| szagoruyko/attention-transfer | Improves performance of convolutional neural networks by transferring knowledge from teacher models to student models using attention mechanisms. | 1,444 |
| pistony/residualattentionnetwork | A Gluon implementation of Residual Attention Network for image classification tasks. | 107 |