biobert-pretrained

Language Model

Provides pre-trained weights for a biomedical language representation model

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

GitHub: 667 stars, 26 watching, 88 forks, last commit over 4 years ago
Linked from 2 awesome lists
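
The repository distributes the pre-trained BioBERT weights themselves. As a rough sketch of how such weights are typically consumed, the snippet below loads a BioBERT checkpoint through the Hugging Face transformers library and produces contextual embeddings for a biomedical sentence. The model identifier dmis-lab/biobert-base-cased-v1.1 is an assumed Hub location for a converted checkpoint, not something listed on this page.

```python
# Minimal sketch: load BioBERT weights via Hugging Face transformers and
# embed a biomedical sentence. The checkpoint name is an assumption (a
# commonly used converted BioBERT release), not taken from this page.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "dmis-lab/biobert-base-cased-v1.1"  # assumed Hub identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

sentence = "Aspirin inhibits platelet aggregation."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One vector per token; the [CLS] token (index 0) is often used as a
# sentence-level summary for downstream classification.
token_embeddings = outputs.last_hidden_state  # shape: (1, seq_len, 768)
print(token_embeddings.shape)
```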


Related projects:

| Repository | Description | Stars |
| --- | --- | --- |
| dmis-lab/biobert | Provides pre-trained language representation models for biomedical text mining tasks | 1,954 |
| ncbi-nlp/bluebert | Pre-trained language models for biomedical natural language processing tasks | 558 |
| balavenkatesh3322/nlp-pretrained-model | A collection of pre-trained natural language processing models | 170 |
| allenai/scibert | A BERT model trained on scientific text for natural language processing tasks | 1,526 |
| turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| ncbi-nlp/biosentvec | Pre-trained word and sentence embeddings for biomedical text analysis | 578 |
| ethan-yt/guwenbert | A pre-trained language model for classical Chinese based on RoBERTa and ancient literature | 506 |
| ymcui/macbert | Improves pre-trained Chinese language models with a correction task that reduces the inconsistency with downstream fine-tuning tasks | 645 |
| langboat/mengzi | Develops lightweight yet powerful pre-trained models for natural language processing tasks | 534 |
| certainlyio/nordic_bert | Provides pre-trained BERT models for Nordic languages with limited training data | 161 |
| dbmdz/berts | Provides pre-trained language models for natural language processing tasks | 155 |
| zhuiyitechnology/pretrained-models | A collection of pre-trained language models for natural language processing tasks | 987 |
| igobronidze/hrs_training_data | Training data for a handwriting recognition system | 20 |
| ymcui/pert | Develops a pre-trained language model that learns semantic knowledge from permuted text without mask labels | 354 |
| valuesimplex/finbert | An open-source BERT-based language model pre-trained on financial text data | 677 |