clinicalBERT

Clinical Embeddings

Provides pre-trained embeddings for clinical text data

Repository for Publicly Available Clinical BERT Embeddings

674 stars
25 watching
135 forks
Language: Python
Last commit: about 4 years ago
Linked from 2 awesome lists
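A minimal sketch of how pre-trained clinical embeddings like these might be used, via the Hugging Face transformers library. The model identifier `emilyalsentzer/Bio_ClinicalBERT` and the mean-pooling step are assumptions for illustration, not instructions from this repository.

```python
# Sketch: extracting a fixed-size embedding for a clinical note.
# Assumes the weights are published on the Hugging Face hub under
# "emilyalsentzer/Bio_ClinicalBERT" (an assumption for this example).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model.eval()

note = "Patient presents with acute shortness of breath and chest pain."
inputs = tokenizer(note, return_tensors="pt", truncation=True, max_length=128)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states over non-padding tokens to get
# one vector per note (one common pooling choice among several).
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(1) / mask.sum(1)
print(embedding.shape)  # (1, 768) for a BERT-base model
```

The pooled vector can then feed a downstream classifier, e.g. for phenotyping or readmission prediction as in the related projects below.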


Related projects:

Repository | Description | Stars
ncbi-nlp/bluebert | Pre-trained language models for biomedical natural language processing tasks | 558
dmis-lab/biobert | Pre-trained language representation models for biomedical text mining tasks | 1,954
kexinhuang12345/clinicalbert | Pre-trained models and scripts for predicting hospital readmission from clinical notes with BERT | 381
allenai/scibert | A BERT model trained on scientific text for natural language processing tasks | 1,526
davidnemeskey/embert | Pre-trained transformer-based models and tools for natural language processing tasks | 2
facebookresearch/poincare-embeddings | PyTorch implementation of the Poincaré Embeddings algorithm for hierarchical representation learning | 1,681
naver/biobert-pretrained | Pre-trained weights for a biomedical language representation model | 667
dbmdz/berts | Pre-trained language models for natural language processing tasks | 155
juliatext/embeddings.jl | Access to pre-trained word embeddings for NLP tasks | 81
ncbi-nlp/biosentvec | Pre-trained word and sentence embeddings for biomedical text analysis | 578
aphp/edsnlp | A modular NLP framework for extracting information from clinical notes, particularly French ones | 115
bohanli/bert-flow | A TensorFlow implementation of sentence embeddings from pre-trained language models | 529
nlprinceton/text_embedding | A utility class for generating and evaluating document representations using word embeddings | 54
botcenter/spanishwordembeddings | Spanish word embeddings trained with fastText on large corpora | 9
jwieting/paragram-word | Trains word embeddings from a paraphrase database to capture semantic relationships between words | 30