roformer (Rotary Transformer)
An enhanced transformer model with improved relative position embeddings for natural language processing tasks.
819 stars
8 watching
50 forks
Language: Python
Last commit: over 2 years ago
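The "improved relative position embeddings" are RoFormer's rotary position embeddings (RoPE): each pair of query/key channels is rotated by an angle proportional to the token position, so attention scores depend only on relative offsets. The sketch below is a minimal NumPy illustration of that idea, not code from this repository; the function name and the toy check are chosen for illustration only.

```python
import numpy as np

def rotary_embedding(x, positions=None, base=10000):
    """Rotate each (even, odd) channel pair of x by a position-dependent angle.

    x: array of shape (seq_len, dim), with dim even.
    positions: optional integer position for each row (defaults to 0..seq_len-1).
    """
    seq_len, dim = x.shape
    half = dim // 2
    if positions is None:
        positions = np.arange(seq_len)
    # One frequency per channel pair, decaying geometrically as in the RoFormer paper.
    inv_freq = 1.0 / (base ** (np.arange(half) / half))
    angles = np.outer(positions, inv_freq)          # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin              # 2-D rotation per channel pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# Toy check: the score between a rotated query and key depends only on the
# relative offset between their positions, so shifting both positions by the
# same amount leaves the score unchanged.
q = np.random.randn(1, 64)
k = np.random.randn(1, 64)
s1 = rotary_embedding(q, np.array([3]))  @ rotary_embedding(k, np.array([7])).T
s2 = rotary_embedding(q, np.array([13])) @ rotary_embedding(k, np.array([17])).T
print(np.allclose(s1, s2))  # True
```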
Related projects:

Repository | Description | Stars |
---|---|---|
zhuiyitechnology/roformer-v2 | A faster and more effective text processing model based on the RoFormer architecture | 149 |
zhuiyitechnology/roformer-sim | An upgraded version of SimBERT model with integrated retrieval and generation capabilities | 438 |
zhuiyitechnology/gau-alpha | An implementation of a Gated Attention Unit-based Transformer model for natural language processing tasks | 96 |
tongjilibo/bert4torch | An implementation of transformer models in PyTorch for natural language processing tasks | 1,241 |
thudm/chinese-transformer-xl | A pre-trained Chinese language model based on the Transformer-XL architecture | 218 |
zhuiyitechnology/wobert | A pre-trained Chinese language model that uses word embeddings and is designed to process Chinese text | 458 |
lucidrains/reformer-pytorch | An implementation of Reformer, an efficient Transformer model for natural language processing tasks | 2,120 |
fastnlp/cpt | A pre-trained transformer model for natural language understanding and generation tasks in Chinese | 481 |
zhuiyitechnology/t5-pegasus | Chinese generation model based on T5 architecture, trained using PEGASUS method | 555 |
leviswind/pytorch-transformer | Implementation of a transformer-based translation model in PyTorch | 239 |
yangjianxin1/ofa-chinese | Transforms the OFA-Chinese model to work with the Hugging Face Transformers framework | 123 |
chrislemke/sk-transformers | Provides a collection of reusable data transformation tools | 8 |
german-nlp-group/german-transformer-training | Trains German transformer models to improve language understanding | 23 |
robostack/jupyter-ros | Enables Jupyter notebooks to interact with ROS and its Python ecosystem | 591 |
rosinality/glow-pytorch | A PyTorch implementation of Glow, a generative flow model using invertible 1x1 convolutions | 518 |