onnxruntime

ML accelerator

An open source software framework for high-performance machine learning inference and training acceleration

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
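
As a rough illustration of the inference workflow described above, the sketch below loads a model with ONNX Runtime's Python API and runs a single forward pass on the CPU execution provider. The file name model.onnx and the input shape are placeholders for illustration, not taken from this page.

```python
# Minimal sketch: load an ONNX model and run one inference pass with ONNX Runtime.
# "model.onnx" and the input shape below are placeholders, not tied to any specific model.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Look up the model's declared input name instead of hard-coding it.
input_name = session.get_inputs()[0].name

# Dummy batch; dtype and shape must match what the model expects.
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: batch})
print(outputs[0].shape)
```

Other execution providers (CUDA, TensorRT, DirectML, and so on) can be requested through the same providers list, so the same model file can be accelerated on different hardware without changing the calling code.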

GitHub

15k stars
246 watching
3k forks
Language: C++
Last commit: 5 days ago
Linked from 3 awesome lists

ai-framework, deep-learning, hardware-acceleration, machine-learning, neural-networks, onnx, pytorch, scikit-learn, tensorflow

Related projects:

Repository Description Stars
microsoft/onnxruntime-inference-examples Repository providing examples for using ONNX Runtime (ORT) to perform machine learning inferencing. 1,212
onnx/onnx Enables interoperability between different machine learning frameworks by providing an open standard format for AI models. 17,938
microsoft/onnxruntime-training-examples Accelerates training of large transformer models by providing optimized kernels and memory optimizations. 312
emergentorder/onnx-scala An API and backend for running ONNX models in Scala 3 using typeful, functional deep learning and classical machine learning. 138
dotnet/machinelearning A cross-platform machine learning framework for .NET that enables developers to build, train, and deploy models without prior expertise in ML. 9,035
microsoft/cntk A unified deep learning toolkit that describes neural networks as a series of computational steps via a directed graph. 17,523
xboot/libonnx A lightweight ONNX inference engine for embedded devices with hardware acceleration support. 583
kraiskil/onnx2c Generates C code from ONNX files for efficient neural network inference on microcontrollers. 223
microsoft/mmdnn A toolset to convert and manage deep learning models across multiple frameworks. 5,797
alrevuelta/connxr An ONNX runtime written in C with zero dependencies, aimed at embedded devices. 193
microsoft/deepspeed A deep learning optimization library that makes distributed training and inference easy, efficient, and effective. 35,463
triton-inference-server/server Provides an optimized cloud and edge inferencing solution for AI models. 8,342
tensorflow/serving A high-performance serving system for machine learning models in production environments. 6,185
microsoft/lightgbm A high-performance gradient boosting framework for machine learning tasks. 16,694
xiaomi/mace A framework for deep learning inference on mobile devices. 4,934