paragram-word
Word embeddings trainer
Trains word embeddings from a paraphrase database to represent semantic relationships between words.
Python code for training Paragram word embeddings, which achieve human-level performance on some word similarity tasks, including SimLex-999. This code was used to obtain the results in the appendix of our 2015 TACL paper "From Paraphrase Database to Compositional Paraphrase Model and Back".
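The embeddings are learned from paraphrase pairs (PPDB) with a margin-based ranking objective: a paraphrase pair should score higher than the same word paired with a sampled negative example. The snippet below is a minimal sketch of that kind of hinge loss on toy data; the vocabulary, margin, negative-sampling choice, and the numerical-gradient update are illustrative assumptions, not the repository's actual implementation, whose full objective (negatives for both words, regularization toward the initial vectors) is described in the paper.

```python
# Illustrative sketch only: a margin-based ranking loss over a paraphrase pair,
# with one hand-picked negative word and a numerical-gradient SGD step.
import numpy as np

rng = np.random.default_rng(0)

vocab = ["car", "automobile", "vehicle", "banana"]   # toy vocabulary (assumption)
word2idx = {w: i for i, w in enumerate(vocab)}
dim = 50
W = rng.normal(scale=0.1, size=(len(vocab), dim))    # embedding matrix, one row per word

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def hinge_loss(w1, w2, neg, margin=0.4):
    # Paraphrase pair (w1, w2) should beat (w1, neg) by at least `margin`.
    return max(0.0, margin - cos(W[w1], W[w2]) + cos(W[w1], W[neg]))

# One update on a single paraphrase pair ("car", "automobile") with "banana"
# as the sampled negative example. A finite-difference gradient keeps the
# sketch short; a real trainer would use analytic gradients over minibatches.
pair = (word2idx["car"], word2idx["automobile"])
neg = word2idx["banana"]
lr, eps = 0.1, 1e-4

print("loss before:", hinge_loss(*pair, neg))
grad = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        W[i, j] += eps
        up = hinge_loss(*pair, neg)
        W[i, j] -= 2 * eps
        down = hinge_loss(*pair, neg)
        W[i, j] += eps
        grad[i, j] = (up - down) / (2 * eps)
W -= lr * grad
print("loss after: ", hinge_loss(*pair, neg))
```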
30 stars
3 watching
12 forks
Language: Python
last commit: about 9 years ago

Related projects:
| Repository | Description | Stars |
| --- | --- | --- |
|  | Code for training universal paraphrastic sentence embeddings and models on semantic similarity tasks | 193 |
|  | A collection of pre-trained models and code for training paraphrastic sentence embeddings from large machine translation datasets | 102 |
|  | A codebase for training and using models of sentence embeddings | 33 |
|  | A tool for training and using character n-gram based word and sentence embeddings in natural language processing | 125 |
|  | Provides fast and efficient word embeddings for natural language processing | 223 |
|  | A PyTorch implementation of the skip-gram model for learning word embeddings | 188 |
|  | Provides training and testing code for a CNN-based sentence embedding model | 2 |
|  | Improves word embeddings by training with adversarial objectives | 118 |
|  | An unsupervised approach to learning character-aware word and context embeddings | 0 |
|  | Provides methods for evaluating word embeddings on various benchmarks | 437 |
|  | Implements a method to incorporate morphological information into word embeddings using a neural network model | 52 |
|  | Tools and techniques for analyzing word meanings from word embeddings | 212 |
|  | Provides access to pre-trained word embeddings for NLP tasks | 81 |
|  | A framework to learn word embeddings using lexical dictionaries | 115 |
|  | A Python implementation of a neural network model for learning word embeddings from text data | 6 |