MLServer
Inference server
An inference server for machine learning models, with support for multiple frameworks, multi-model serving, and scalable deployment options (a minimal usage sketch follows the project metadata below).
720 stars
27 watching
183 forks
Language: Python
Last commit: 9 days ago
Linked from 1 awesome list
Tags: kfserving, lightgbm, machine-learning, mlflow, scikit-learn, seldon-core, xgboost
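
As a rough illustration of how a model can be exposed through MLServer's V2 inference protocol, the sketch below defines a minimal custom runtime in Python. The class name, tensor names, and the toy summing logic are illustrative assumptions rather than anything shipped with MLServer; in practice the runtime class would be referenced from a model-settings.json file and served with the mlserver CLI.

```python
# Minimal sketch of a custom MLServer runtime. Assumptions: the class name
# SumRuntime and the output tensor name "total" are illustrative only.
from mlserver import MLModel
from mlserver.types import InferenceRequest, InferenceResponse, ResponseOutput


class SumRuntime(MLModel):
    """Toy runtime that sums the values of the first input tensor."""

    async def load(self) -> bool:
        # A real runtime would load its model artifacts here.
        self.ready = True
        return self.ready

    async def predict(self, payload: InferenceRequest) -> InferenceResponse:
        # Take the first input tensor of the V2 inference request and sum
        # its flat data list.
        first_input = payload.inputs[0]
        total = float(sum(first_input.data))

        return InferenceResponse(
            model_name=self.name,
            outputs=[
                ResponseOutput(
                    name="total",
                    shape=[1],
                    datatype="FP64",
                    data=[total],
                )
            ],
        )
```

Such a runtime would typically be wired up by pointing the `implementation` field of a model-settings.json file at the class and running `mlserver start .` in that directory, which exposes the model over both the HTTP/REST and gRPC endpoints of the V2 protocol.
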
Related projects:
Repository | Description | Stars |
---|---|---|
seldonio/seldon-server | A platform for deploying machine learning models into production on-premise or in the cloud using Kubernetes and various machine learning frameworks. | 1,473 |
axsaucedo/seldon-core | A platform for deploying machine learning models on Kubernetes at scale. | 18 |
triton-inference-server/client | Client libraries and examples for communicating with the Triton Inference Server in various programming languages. | 567 |
seldonio/seldon-core | An MLOps framework for packaging, deploying, and managing machine learning models on Kubernetes at scale. | 4,384 |
seldonio/tempo | An MLOps Python library that enables data scientists to deploy and orchestrate machine learning pipelines for production-ready inference. | 116 |
roboflow/inference | A platform for deploying and fine-tuning computer vision models in production-ready environments. | 1,363 |
amplab/velox-modelserver | A system for serving machine learning predictions in real-time, supporting personalized predictions and model training. | 110 |
mlcommons/inference | Measures the performance of deep learning models in various deployment scenarios. | 1,236 |
mosecorg/mosec | A high-performance ML model serving framework. | 790 |
eightbec/fastapi-ml-skeleton | A FastAPI-based framework for serving machine learning models in production-ready applications. | 394 |
aria42/infer | A Clojure-based library for building machine learning and statistical models in a flexible and composable way. | 176 |
ebhy/budgetml | Simplifies deployment of machine learning models to production-ready endpoints with minimal configuration and cost. | 1,338 |
utensor/utensor | A lightweight machine learning inference framework built on TensorFlow and optimized for Arm targets. | 1,729 |
hydrospheredata/hydro-serving | An MLOps platform for deploying and versioning machine learning models in production. | 271 |
jvalegre/robert | Automated machine learning protocols for cheminformatics using Python. | 38 |