bi-att-flow
Question Answering Model
A deep learning model for natural language question answering
Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
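The query-aware context representation described above combines context-to-query and query-to-context attention. Below is a minimal NumPy sketch of that attention-flow idea under simplifying assumptions: the trainable similarity weights from the paper are replaced by a fixed all-ones vector, and the function names are illustrative, not the repository's actual API.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U):
    """Illustrative attention-flow layer (not the repo's code).

    H: context encodings, shape (T, d)
    U: query encodings,   shape (J, d)
    Returns a query-aware context representation G of shape (T, 4d).
    """
    T, d = H.shape
    J, _ = U.shape
    # Similarity matrix S[t, j] = alpha(h_t, u_j). The paper uses a trainable
    # alpha(h, u) = w^T [h; u; h * u]; here w is fixed to ones for illustration.
    S = np.zeros((T, J))
    for t in range(T):
        for j in range(J):
            S[t, j] = np.sum(np.concatenate([H[t], U[j], H[t] * U[j]]))
    # Context-to-query attention: each context word attends over the query words.
    a = softmax(S, axis=1)           # (T, J)
    U_tilde = a @ U                  # (T, d) attended query vectors
    # Query-to-context attention: weight context words by their best query match.
    b = softmax(S.max(axis=1))       # (T,)
    h_tilde = b @ H                  # (d,) single attended context vector
    H_tilde = np.tile(h_tilde, (T, 1))  # broadcast to every time step
    # Merge: G = [H; U~; H * U~; H * H~], giving a (T, 4d) representation.
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)
```

Note that neither attention direction summarizes the context into a single fixed vector before the modeling layers; the per-time-step representation G is what "without early summarization" refers to.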
2k stars
105 watching
679 forks
Language: Python
last commit: over 1 year ago
Linked from 1 awesome list
Topics: bidaf, nlp, question-answering, squad, tensorflow
Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
| | An open-source implementation of an image segmentation model that combines background removal and object detection capabilities. | 1,484 |
| | Tools and a codebase for training neural question-answering models on multiple paragraphs of text. | 435 |
| | An implementation of a method for generating sentence embeddings from pre-trained language models, using TensorFlow. | 530 |
| | A BERT model trained on scientific text for natural language processing tasks. | 1,532 |
| | A bilingual LLaMA model with enhanced reasoning ability, trained on a mix of task-oriented and conversational data. | 421 |
| | A framework for building modular, sequential application logic using flow-based programming principles. | 198 |
| | A Python framework that automates finding and training the best-performing machine learning model for regression and classification tasks on numerical data. | 176 |
| | A deep neural network architecture for human activity recognition using stacked residual bidirectional LSTM cells, implemented in TensorFlow. | 319 |
| | A neural network model that answers visual questions by combining question and image features in a residual learning framework. | 39 |
| | A cash-flow modeling tool for structured-finance professionals. | 42 |
| | A deep metric learning framework that uses an adversarial auxiliary loss to improve robustness. | 39 |
| | A re-implementation of a neural network model with batch normalization and customized training parameters. | 131 |
| | A TensorFlow implementation of the Bottleneck Attention Module. | 12 |
| | A re-implementation of a 100-layer fully convolutional network architecture for image segmentation. | 123 |
| | Automates the search for optimal neural network configurations in deep learning applications. | 468 |