BAM
An implementation of BAM: Bottleneck Attention Module in TensorFlow
12 stars
2 watching
2 forks
Language: Python
Last commit: almost 6 years ago
Linked from 1 awesome list
Topics: attention-mechanism, bam, tensorflow
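BAM refines a feature map F as F' = F ⊙ (1 + σ(M_c(F) + M_s(F))), where M_c is a channel branch (global average pooling followed by a bottleneck MLP) and M_s is a spatial branch built from dilated convolutions. A minimal `tf.keras` sketch of that computation is below; it is not code from this repository, the reduction ratio 16 and dilation 4 follow the paper's defaults, and the batch normalization the paper applies before the sigmoid is omitted for brevity:

```python
import tensorflow as tf


class BAM(tf.keras.layers.Layer):
    """Sketch of the Bottleneck Attention Module (Park et al., 2018).

    Channel branch: global average pooling -> bottleneck MLP.
    Spatial branch: 1x1 reduction -> two 3x3 dilated convs -> 1x1 to one map.
    The branches are summed, squashed with a sigmoid, and the input is
    refined as F' = F * (1 + M(F)).
    """

    def __init__(self, channels, reduction=16, dilation=4, **kwargs):
        super().__init__(**kwargs)
        mid = channels // reduction
        # Channel attention branch
        self.gap = tf.keras.layers.GlobalAveragePooling2D()
        self.fc1 = tf.keras.layers.Dense(mid, activation="relu")
        self.fc2 = tf.keras.layers.Dense(channels)
        # Spatial attention branch
        self.conv_reduce = tf.keras.layers.Conv2D(mid, 1, activation="relu")
        self.conv_d1 = tf.keras.layers.Conv2D(
            mid, 3, padding="same", dilation_rate=dilation, activation="relu")
        self.conv_d2 = tf.keras.layers.Conv2D(
            mid, 3, padding="same", dilation_rate=dilation, activation="relu")
        self.conv_out = tf.keras.layers.Conv2D(1, 1)

    def call(self, x):
        # Channel attention: (B, C), reshaped to broadcast as (B, 1, 1, C)
        ca = self.fc2(self.fc1(self.gap(x)))
        ca = tf.reshape(ca, (-1, 1, 1, ca.shape[-1]))
        # Spatial attention: a single (B, H, W, 1) map
        sa = self.conv_out(self.conv_d2(self.conv_d1(self.conv_reduce(x))))
        # Broadcast-add the branches, then refine the input feature map
        m = tf.sigmoid(ca + sa)
        return x * (1.0 + m)


# Example: refine a random feature map; the output shape matches the input.
x = tf.random.normal((2, 8, 8, 32))
y = BAM(32)(x)
```

Because the module preserves the feature-map shape, it can be dropped between any two stages of a backbone network (the paper places it at each bottleneck).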
Related projects:
| Repository | Description | Stars |
|---|---|---|
| jongchan/attention-module | Implementations of attention modules for computer vision tasks using PyTorch | 2,061 |
| elbuco1/cbam | A deep learning project that implements and evaluates the CBAM attention mechanism for image classification | 104 |
| kobiso/cbam-keras | A Keras implementation that improves network representation power with attention mechanisms and squeeze-and-excitation blocks | 361 |
| luuuyi/cbam.pytorch | PyTorch implementation of the CBAM module for refining feature maps in deep networks | 1,337 |
| pistony/residualattentionnetwork | A Gluon implementation of Residual Attention Network for image classification | 107 |
| openai/sparse_attention | Primitives for sparse attention mechanisms used in transformer models to improve computational efficiency and scalability | 1,524 |
| batzner/tensorlm | A library for text generation with recurrent neural networks using TensorFlow | 61 |
| ngxbac/gain | A PyTorch implementation of an attention-guided inference network that focuses on specific regions of objects in images | 48 |
| tqtg/hierarchical-attention-networks | An implementation of a hierarchical attention network for document classification | 86 |
| jnhwkim/cbp | An implementation of compact bilinear pooling for multimodal neural networks in Torch7 | 68 |
| koichiro11/residual-attention-network | An image classification network combining attention mechanisms and residual learning | 94 |
| neuro-inc/ml-recipe-hier-attention | A hierarchical attention network for sentiment classification | 2 |
| akabe/slap | A linear algebra library with type-based static size checking for matrix operations | 88 |
| sangminwoo/recyclenet | An attention-based trash classification model using transfer learning for waste management | 39 |
| jjjkkkjjj/matft | A NumPy-like library in Swift for multi-dimensional array and matrix operations | 133 |