ChatAbstractions
Chat model wrapper
Provides a framework for building custom LangChain chat models with dynamic failover, load balancing, chaos engineering, and more.
79 stars
2 watching
6 forks
Language: Python
last commit: almost 2 years ago
Linked from 1 awesome list
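The failover and load-balancing pattern the project describes can be sketched in plain Python: wrap several model callables, optionally randomize the order for naive load balancing, and fall through to the next model when one raises. This is a hypothetical illustration of the pattern only; the class and method names below (`FailoverChatModel`, `invoke`) are assumptions, not the library's actual API.

```python
import random


class FailoverChatModel:
    """Minimal sketch of a failover/load-balancing chat wrapper.

    Illustrative only: routes a prompt to one of several model
    callables and falls back to the next model on any exception.
    """

    def __init__(self, models, shuffle=False):
        self.models = list(models)
        self.shuffle = shuffle  # naive load balancing: randomize call order

    def invoke(self, prompt):
        order = self.models[:]
        if self.shuffle:
            random.shuffle(order)
        last_error = None
        for model in order:
            try:
                return model(prompt)  # first successful model wins
            except Exception as exc:  # failover: remember error, try next
                last_error = exc
        raise RuntimeError("all models failed") from last_error


# Hypothetical models: the primary times out, the backup answers.
def flaky(prompt):
    raise TimeoutError("primary model timed out")


def backup(prompt):
    return f"backup: {prompt}"


chat = FailoverChatModel([flaky, backup])
print(chat.invoke("hello"))  # request fails over to the backup model
```

A real implementation would subclass the framework's chat model base class and add health checks or weighted routing, but the control flow is the same: try, catch, move to the next provider.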
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A conversational language model developed to improve understanding of complex instructions and Chinese vocabulary. | 62 |
| | An open-source chat model built on top of a 52B-parameter large language model, with improvements in position encoding, activation function, and layer normalization. | 40 |
| | An all-in-one solution for integrating AI models into IM chatbots. | 59 |
| | A large language model with 70 billion parameters designed for chatbot and conversational AI tasks. | 29 |
| | A no-code chat-AI toolkit built on top of LangChain. | 886 |
| | Develops and deploys conversational AI models for health-related applications by leveraging large-scale datasets and collaborative research. | 752 |
| | Pre-trained chatbot models for Chinese open-domain dialogue systems. | 306 |
| | A syntax for defining natural language understanding models for conversational interfaces. | 37 |
| | A unified multimodal language model capable of interpreting and reasoning about various modalities without paired data. | 49 |
| | A large language model designed to support long-context conversations with improved efficiency and effectiveness. | 38 |
| | Provides a suite of AI-powered models for mental health support and evaluation. | 625 |
| | An abstraction layer for integrating AI models into JavaScript and TypeScript applications. | 1,178 |
| | An implementation of a conversational model using sequence-to-sequence learning and LSTM layers in Torch. | 777 |
| | Provides Elixir bindings for interacting with a C++-based machine learning chat model. | 10 |
| | Code for training and using a multilingual chat model with 176 billion parameters. | 588 |