XVERSE-65B

A multilingual large language model developed by XVERSE Technology Inc., built on the transformer architecture and fine-tuned on diverse datasets for a range of applications.
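For orientation, here is a minimal inference sketch using the Hugging Face `transformers` library. The model ID `xverse/XVERSE-65B` and the `trust_remote_code=True` requirement are assumptions not confirmed by this listing; check the repository README for the exact loading instructions and hardware requirements.

```python
# Minimal inference sketch for XVERSE-65B with Hugging Face transformers.
# Assumptions (not confirmed by this listing): the weights are published
# under the model ID "xverse/XVERSE-65B" and require trust_remote_code=True.

def build_prompt(user_message: str) -> str:
    # XVERSE-65B is described as a base model, so the prompt is passed
    # through as plain text; chat variants would need their own template.
    return user_message.strip()

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Imports are local so the helper above stays usable without the
    # heavy transformers/torch dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        "xverse/XVERSE-65B", trust_remote_code=True
    )
    model = AutoModelForCausalLM.from_pretrained(
        "xverse/XVERSE-65B",
        torch_dtype=torch.bfloat16,  # 65B parameters: plan for multi-GPU or offloading
        device_map="auto",
        trust_remote_code=True,
    )
    inputs = tokenizer(build_prompt(prompt), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example (downloads the full weights; run only on suitable hardware):
# print(generate("Explain the transformer architecture in one sentence."))
```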
132 stars
5 watching
15 forks
Language: Python
Last commit: 8 months ago

Related projects:
Repository | Description | Stars |
---|---|---|
xverse-ai/xverse-13b | A large language model developed to support multiple languages and applications | 649 |
xverse-ai/xverse-moe-a36b | A large multilingual language model built on a mixture-of-experts architecture. | 36 |
xverse-ai/xverse-7b | A multilingual large language model developed by XVERSE Technology Inc. | 50 |
xverse-ai/xverse-moe-a4.2b | A multilingual mixture-of-experts large language model from XVERSE Technology Inc., fine-tuned for tasks such as conversation, question answering, and natural language understanding. | 36 |
xverse-ai/xverse-v-13b | A large multimodal model for visual question answering, trained on a dataset of 2.1B image-text pairs and 8.2M instruction sequences. | 77 |
ieit-yuan/yuan2.0-m32 | A high-performance language model designed to excel in tasks like natural language understanding, mathematical computation, and code generation | 180 |
langboat/mengzi3 | 8B and 13B language models based on the Llama architecture with multilingual capabilities. | 2,032 |
tele-ai/telechat-52b | An open-source chat model built on a 52B-parameter large language model, with improvements to position encoding, activation functions, and layer normalization. | 40 |
shawn-ieitsystems/yuan-1.0 | Large-scale language model with improved performance on NLP tasks through distributed training and efficient data processing | 591 |
microsoft/unicoder | This repository provides pre-trained models and code for understanding and generation tasks in multiple languages. | 88 |
ibm-granite/granite-3.0-language-models | A collection of lightweight state-of-the-art language models designed to support multilinguality, coding, and reasoning tasks on constrained resources. | 214 |
zhuiyitechnology/roformer-sim | An upgraded version of SimBERT model with integrated retrieval and generation capabilities | 438 |
openai/finetune-transformer-lm | Code and a pre-trained model for improving language understanding via generative pre-training with a transformer architecture. | 2,160 |
turkunlp/wikibert | Provides pre-trained language models derived from Wikipedia texts for natural language processing tasks | 34 |
brightmart/xlnet_zh | Trains a large Chinese language model on massive data and provides a pre-trained model for downstream tasks | 230 |