bertviz
Attention visualizer
An interactive tool for visualizing attention in Transformer language models (BERT, GPT-2, BART, etc.).
7k stars
73 watching
790 forks
Language: Python
last commit: about 2 years ago
Topics: bert, gpt2, machine-learning, natural-language-processing, neural-network, nlp, pytorch, roberta, transformer, transformers, visualization
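Since the catalog entry gives no usage example, here is a minimal sketch of driving the tool's head view from a Hugging Face Transformers model, following the pattern documented in the BertViz README. The model name and input sentence are arbitrary examples, and a Jupyter notebook environment is assumed, since BertViz renders interactive HTML/JavaScript.

```python
# A minimal usage sketch, following the pattern in the BertViz README.
# Assumes a Jupyter notebook and that `bertviz` and `transformers` are
# installed; the model name and sentence are arbitrary examples.
from transformers import AutoModel, AutoTokenizer
from bertviz import head_view

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# output_attentions=True makes the model return per-layer attention weights.
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer.encode("The cat sat on the mat.", return_tensors="pt")
outputs = model(inputs)
attention = outputs.attentions  # tuple of (batch, heads, seq, seq) tensors, one per layer
tokens = tokenizer.convert_ids_to_tokens(inputs[0])

# Opens an interactive view of every attention head, selectable by layer.
head_view(attention, tokens)
```

BertViz also provides a model view that shows all layers and heads at once; per its README, `model_view` takes the same attention and tokens arguments as `head_view`.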
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | An implementation of Google's 2018 BERT model in PyTorch, allowing pre-training and fine-tuning for natural language processing tasks. | 6,251 |
| | Tools to aid in designing and visualizing neural network architectures. | 4,525 |
| | Extracts and analyzes topics from large text datasets using transformer-based models and topic modeling techniques. | 6,246 |
| | TensorFlow code and pre-trained models for natural language processing tasks. | 38,374 |
| | An implementation of the Graph Attention Network model using PyTorch. | 2,935 |
| | An interactive visualization system to help non-experts learn about Convolutional Neural Networks (CNNs) by visualizing the learning process. | 8,204 |
| | An open-source tool that helps investigate specific behaviors of small language models by combining automated interpretability techniques with sparse autoencoders. | 4,047 |
| | A Keras layer implementing an attention mechanism for natural language processing tasks. | 2,803 |
| | A tool to visualize PyTorch execution graphs and traces. | 3,251 |
| | An implementation of Graph Attention Networks in TensorFlow for graph representation learning. | 3,261 |
| | An interactive visualization tool to help users understand how large language models like GPT work. | 3,604 |
| | An implementation of BERT-like NLP models in OCaml, using PyTorch bindings and pre-trained weights from popular sources. | 24 |
| | A collection of tutorials teaching deep learning with TensorFlow using Jupyter Notebooks. | 6,003 |
| | An implementation of the Transformer model in PyTorch for sequence-to-sequence tasks such as language translation. | 8,936 |
| | An implementation of RoBERTa for Chinese pre-training in TensorFlow, with PyTorch versions for loading and training. | 2,638 |