MLServer
Inference server
An inference server for your machine learning models, including support for multiple frameworks, multi-model serving and more.
737 stars
28 watching
184 forks
Language: Python
Last commit: about 1 month ago
Linked from 1 awesome list
Topics: kfserving, lightgbm, machine-learning, mlflow, scikit-learn, seldon-core, xgboost
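
To illustrate the multi-framework, custom-runtime style of serving the description refers to, below is a minimal sketch of an MLServer custom inference runtime. It assumes the `mlserver` package is installed; the `IdentityModel` class, the `custom_runtime.py` file name, and the echo logic are hypothetical placeholders for a real model, not MLServer's own examples.

```python
# custom_runtime.py -- minimal sketch of an MLServer custom runtime (illustrative only).
import numpy as np

from mlserver import MLModel
from mlserver.codecs import NumpyCodec
from mlserver.types import InferenceRequest, InferenceResponse


class IdentityModel(MLModel):
    """Toy runtime that echoes the input tensor back; swap in real model logic."""

    async def load(self) -> bool:
        # Load weights/artifacts here (e.g. from self.settings.parameters.uri).
        self.ready = True
        return self.ready

    async def predict(self, payload: InferenceRequest) -> InferenceResponse:
        # Decode the first input tensor into a NumPy array.
        data = self.decode(payload.inputs[0], default_codec=NumpyCodec)
        result = np.asarray(data)  # a real model would run inference here
        return InferenceResponse(
            model_name=self.name,
            outputs=[NumpyCodec.encode_output(name="output-0", payload=result)],
        )
```

Assuming this class sits next to a `model-settings.json` whose `implementation` field points at `custom_runtime.IdentityModel`, the server would typically be launched with `mlserver start .`, exposing REST and gRPC inference endpoints for the model.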
Related projects:
| Repository | Description | Stars |
|---|---|---|
| seldonio/seldon-server | A platform for deploying machine learning models into production, on-premise or in the cloud, using Kubernetes and various machine learning frameworks. | 1,472 |
| axsaucedo/seldon-core | Platform to deploy machine learning models on Kubernetes at scale. | 18 |
| triton-inference-server/client | Client libraries and examples for communicating with Triton using various programming languages. | 579 |
| seldonio/seldon-core | An MLOps framework for packaging, deploying, and managing machine learning models on Kubernetes at scale. | 4,409 |
| seldonio/tempo | An MLOps Python library that enables data scientists to deploy and orchestrate machine learning pipelines for production-ready inference. | 117 |
| roboflow/inference | A platform for deploying and fine-tuning computer vision models in production-ready environments. | 1,401 |
| amplab/velox-modelserver | A system for serving machine learning predictions in real time, integrating with Spark and KeystoneML. | 110 |
| mlcommons/inference | Measures the performance of deep learning models in various deployment scenarios. | 1,256 |
| mosecorg/mosec | A high-performance ML model serving framework. | 802 |
| eightbec/fastapi-ml-skeleton | A FastAPI-based framework for serving machine learning models in production-ready applications. | 412 |
| aria42/infer | A Clojure-based library for building machine learning and statistical models in a flexible and composable way. | 176 |
| ebhy/budgetml | Simplifies deployment of machine learning models to production-ready endpoints with minimal configuration and cost. | 1,341 |
| utensor/utensor | A lightweight machine learning inference framework built on TensorFlow, optimized for Arm targets. | 1,742 |
| hydrospheredata/hydro-serving | An MLOps platform for deploying and versioning machine learning models in production. | 271 |
| jvalegre/robert | Automated machine learning protocols for cheminformatics using Python. | 39 |