MorphologicalPriorsForWordEmbeddings
Word embedding model
A project that incorporates morphological information into word embeddings via a neural network model.
Code for EMNLP 2016 paper: Morphological Priors for Probabilistic Word Embeddings
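The paper's core idea is that a word's embedding is drawn from a prior centered on the sum of its morpheme embeddings, so morphologically related words share statistical strength. A minimal NumPy sketch of that idea follows; all names, shapes, and the fixed-variance Gaussian are illustrative assumptions, not the repo's actual Theano/Blocks code.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy morpheme inventory with random embeddings (illustrative only).
morpheme_vecs = {m: rng.standard_normal(dim) for m in ["un", "break", "able"]}

def morphological_prior_mean(morphemes):
    """Mean of the Gaussian prior over a word's embedding:
    the sum of its morpheme embeddings."""
    return np.sum([morpheme_vecs[m] for m in morphemes], axis=0)

def sample_word_embedding(morphemes, sigma=0.1):
    """Draw a word embedding from N(prior_mean, sigma^2 I).
    In the paper this latent variable is inferred variationally;
    here we just sample from the prior for illustration."""
    mean = morphological_prior_mean(morphemes)
    return mean + sigma * rng.standard_normal(dim)

emb = sample_word_embedding(["un", "break", "able"])
print(emb.shape)  # (8,)
```

Because the prior ties "unbreakable" to "un", "break", and "able", rare or unseen words still receive a sensible embedding from their morphemes alone.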
52 stars
9 watching
12 forks
Language: Python
Last commit: almost 9 years ago

Topics: blocks, emnlp, neural-network, nlp, theano, word-embeddings
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A PyTorch implementation of the skip-gram model for learning word embeddings. | 188 |
| | A Python implementation of a neural network model for learning word embeddings from text data. | 6 |
| | An unsupervised approach to learning character-aware word and context embeddings. | 0 |
| | Provides fast and efficient word embeddings for natural language processing. | 223 |
| | Develops unified sentence embedding models for NLP tasks. | 840 |
| | A collection of pre-trained subword embeddings in 275 languages, useful for natural language processing tasks. | 1,189 |
| | A utility class for generating and evaluating document representations using word embeddings. | 54 |
| | A Python implementation of a topical word embedding technique used in natural language processing and information retrieval. | 314 |
| | Represents words as multivariate Gaussian distributions, allowing scalable word embeddings. | 190 |
| | An implementation of a non-parameterized approach for building sentence representations. | 19 |
| | A fast and efficient utility package for using vector embeddings in machine learning models. | 1,635 |
| | A word embedding model that uses character n-grams and achieves state-of-the-art results on multiple NLP tasks. | 803 |
| | A deep learning model that generates word embeddings by predicting words from their dependency context. | 291 |
| | Trains word embeddings from a paraphrase database to represent semantic relationships between words. | 30 |
| | A library for training and evaluating a word embedding model that extends the original Word2Vec algorithm. | 20 |
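Several of the related projects compose word vectors from subword units, e.g. the character n-gram model in the table above. A minimal sketch of that technique, with hypothetical names and a hashed n-gram table as an assumption (real implementations such as fastText use learned, trained tables):

```python
import zlib
import numpy as np

def char_ngrams(word, n_min=3, n_max=5):
    """Extract character n-grams with boundary markers '<' and '>'."""
    w = f"<{word}>"
    return [w[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

rng = np.random.default_rng(1)
dim, buckets = 8, 1000
# Illustrative random table; in practice these vectors are learned.
ngram_table = rng.standard_normal((buckets, dim))

def word_vector(word):
    """Average the (hash-bucketed) n-gram embeddings into a word vector,
    so out-of-vocabulary words still receive a representation."""
    grams = char_ngrams(word)
    rows = [ngram_table[zlib.crc32(g.encode()) % buckets] for g in grams]
    return np.mean(rows, axis=0)

print(char_ngrams("cat", 3, 3))  # ['<ca', 'cat', 'at>']
```

Hash bucketing keeps the parameter count fixed regardless of how many distinct n-grams appear in the corpus, at the cost of occasional collisions.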