Chinese-BERT-wwm
Chinese language models
Develops and publishes pre-trained Chinese language models built with the Whole Word Masking (wwm) technique.
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series of models)
10k stars
143 watching
1k forks
Language: Python
Last commit: over 1 year ago
Topics: bert, bert-wwm, bert-wwm-ext, chinese-bert, nlp, pytorch, rbt, roberta, roberta-wwm, tensorflow
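The released checkpoints follow the standard BERT format, so they can be loaded with the usual BERT classes from Hugging Face Transformers. The sketch below is a minimal, non-authoritative example; it assumes the hfl/chinese-bert-wwm-ext checkpoint on the Hugging Face Hub and locally installed transformers and torch packages, and simply encodes a short Chinese sentence to obtain contextual embeddings.

```python
# Minimal usage sketch (assumption: the hfl/chinese-bert-wwm-ext checkpoint is
# available on the Hugging Face Hub; transformers and torch are installed).
import torch
from transformers import BertModel, BertTokenizer

MODEL_NAME = "hfl/chinese-bert-wwm-ext"

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertModel.from_pretrained(MODEL_NAME)
model.eval()

# Encode a short Chinese sentence ("a Chinese pre-trained model using whole word masking").
inputs = tokenizer("使用全词掩码的中文预训练模型", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one 768-dimensional contextual vector per token.
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```

The same loading pattern should apply to the other wwm variants named in the topics above (for example the RoBERTa-wwm-ext checkpoints), since the project distributes them in the BERT checkpoint format.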
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A toolkit for Chinese natural language processing tasks | 2,648 |
| | A collection of pre-trained language models and optimization techniques for efficient natural language processing | 3,039 |
| | A massive corpus of Chinese text data covering various forms and styles | 3,581 |
| | A deep learning framework for generating videos from text inputs and visual features | 3,071 |
| | Compiles and organizes key papers on pre-trained language models, providing a resource for developers and researchers | 3,331 |
| | A collection of tutorials teaching deep learning with TensorFlow using Jupyter Notebooks | 6,003 |
| | Provides tools and libraries for training and fine-tuning large language models using transformer architectures | 6,215 |
| | Tutorials and code examples for learning deep learning with PyTorch | 2,822 |
| | A high-performance neural network inference framework supporting various deep learning frameworks and hardware platforms | 4,435 |
| | Implementations of a neural network architecture for language modeling | 3,619 |
| | An NLP project offering various text classification models and techniques for deep learning exploration | 7,881 |
| | Provides pre-trained models for Chinese natural language processing tasks using the XLNet architecture | 1,652 |
| | An explanation of key concepts and advancements in the field of Machine Learning | 7,352 |
| | Implements RoBERTa for Chinese pre-training using TensorFlow and provides PyTorch versions for loading and training | 2,638 |
| | Real-time speech synthesis using state-of-the-art architectures | 3,855 |