# beto

BETO: a Spanish version of BERT, pre-trained on a large Spanish text corpus using the BERT architecture.
- 490 stars
- 39 watching
- 63 forks
- Last commit: about 2 years ago
- Linked from 1 awesome list
Tags: bert, bert-model, nlp, spanish, transformers, transformers-library
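Since BETO is distributed as a standard BERT checkpoint, it can be loaded with the Hugging Face transformers library like any other BERT model. The sketch below assumes the commonly used Hub model IDs published by the BETO authors (`dccuchile/bert-base-spanish-wwm-uncased` and `dccuchile/bert-base-spanish-wwm-cased`); the helper name `fill_mask_es` is illustrative, not part of the project.

```python
# Minimal sketch of using BETO through the transformers library.
# The model IDs below are assumptions; verify them on the Hugging Face Hub.
BETO_UNCASED = "dccuchile/bert-base-spanish-wwm-uncased"
BETO_CASED = "dccuchile/bert-base-spanish-wwm-cased"

def fill_mask_es(sentence: str, model_id: str = BETO_UNCASED) -> list:
    """Predict likely fillers for a [MASK] token in a Spanish sentence.

    Requires `pip install transformers torch`; the import is done lazily
    so the module can be loaded without the heavy dependency installed.
    """
    from transformers import pipeline
    fill = pipeline("fill-mask", model=model_id)
    return fill(sentence)

# Example usage (downloads the checkpoint on first run):
# fill_mask_es("Me gusta mucho la [MASK] española.")
```

The lazy import keeps the snippet importable in environments without transformers; in a real project the import would normally sit at module level.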
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | Provides pre-trained language models for natural language processing tasks | 155 |
| | Pre-trained language models for biomedical natural language processing tasks | 560 |
| | A collection of precomputed word embeddings for the Spanish language, derived from different corpora and computational methods | 354 |
| | Develops lightweight yet powerful pre-trained models for natural language processing tasks | 533 |
| | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
| | Implements BERT-like NLP models in OCaml using PyTorch bindings and pre-trained weights from popular sources | 24 |
| | Trains and evaluates a Chinese language model using adversarial training on a large corpus | 140 |
| | A collection of pre-trained natural language processing models | 170 |
| | An open-source collection of Danish language models for natural language processing tasks | 30 |
| | A collection of linguistic resources and trained word embeddings for the Spanish language | 45 |
| | A collection of pre-trained language models for natural language processing tasks | 989 |
| | An implementation of transformer models in PyTorch for natural language processing tasks | 1,257 |
| | A pre-trained language model for multiple natural language processing tasks with support for few-shot learning and transfer learning | 656 |
| | Provides pre-trained BERT models for Nordic languages with limited training data | 164 |
| | Provides pre-trained Chinese language models based on the ELECTRA framework for natural language processing tasks | 1,405 |