mace

Mobile AI Compute Engine

A framework for deep learning inference on mobile devices

MACE is a deep learning inference framework optimized for mobile heterogeneous computing platforms.
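Since the listing identifies MACE as a C++ inference framework for heterogeneous mobile hardware, here is a minimal, hedged sketch of what loading and running a converted model looks like through the public C++ API described in MACE's documentation (MaceEngineConfig, CreateMaceEngineFromProto, MaceTensor, MaceEngine::Run). The tensor names, shapes, and buffers below are placeholder assumptions, and exact signatures can differ between MACE releases.

```cpp
// Hedged sketch: running a converted model with MACE's documented C++ API.
// Tensor names, shapes, and model buffers are assumed placeholders.
#include <cstdint>
#include <map>
#include <memory>
#include <string>
#include <vector>

#include "mace/public/mace.h"

int RunOnce(const unsigned char *graph_proto, size_t graph_proto_size,
            const unsigned char *weights_data, size_t weights_size) {
  // Choose a runtime matching the heterogeneous back ends MACE targets:
  // CPU (NEON), GPU (OpenCL), or the Hexagon DSP (HVX).
  mace::MaceEngineConfig config(mace::DeviceType::CPU);

  // Node names must match the converted model; these are assumed examples.
  std::vector<std::string> input_names = {"input"};
  std::vector<std::string> output_names = {"output"};

  // Build the engine from the serialized graph and weights.
  std::shared_ptr<mace::MaceEngine> engine;
  mace::MaceStatus status = mace::CreateMaceEngineFromProto(
      graph_proto, graph_proto_size, weights_data, weights_size,
      input_names, output_names, config, &engine);
  if (status != mace::MaceStatus::MACE_SUCCESS) return -1;

  // Wrap caller-owned float buffers as MaceTensor (NHWC layout assumed).
  std::vector<int64_t> input_shape = {1, 224, 224, 3};
  std::vector<int64_t> output_shape = {1, 1000};
  auto in_buf = std::shared_ptr<float>(new float[1 * 224 * 224 * 3],
                                       std::default_delete<float[]>());
  auto out_buf = std::shared_ptr<float>(new float[1000],
                                        std::default_delete<float[]>());

  std::map<std::string, mace::MaceTensor> inputs;
  std::map<std::string, mace::MaceTensor> outputs;
  inputs[input_names[0]] = mace::MaceTensor(input_shape, in_buf);
  outputs[output_names[0]] = mace::MaceTensor(output_shape, out_buf);

  // Execute the graph; results land in out_buf.
  engine->Run(inputs, &outputs);
  return 0;
}
```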

GitHub

5k stars
229 watching
819 forks
Language: C++
Last commit: 5 months ago
Linked from 1 awesome list

Tags: deep-learning, hvx, machine-learning, neon, neural-network, opencl

Related projects:

| Repository | Description | Stars |
|---|---|---|
| autumnai/leaf | An open machine learning framework for building classical, deep, or hybrid models on various hardware platforms. | 5,557 |
| ludwig-ai/ludwig | A low-code framework for building custom deep learning models and neural networks. | 11,189 |
| sjtu-ipads/powerinfer | An efficient large language model inference engine leveraging consumer-grade GPUs on PCs. | 7,964 |
| tencent/pocketflow | A framework that automatically compresses and accelerates deep learning models to make them suitable for mobile devices with limited computational resources. | 2,788 |
| open-mmlab/mmdeploy | A toolset for deploying deep learning models on various devices and platforms. | 2,774 |
| google-ai-edge/mediapipe | A platform providing pre-built machine learning models and APIs for cross-platform deployment on various devices. | 27,608 |
| microsoft/deepspeed-mii | A Python library designed to accelerate model inference with high throughput and low latency. | 1,898 |
| alibaba/mnn | A lightweight deep learning framework developed by Alibaba for efficient on-device inference and training of neural networks. | 8,758 |
| dotnet/machinelearning | A cross-platform machine learning framework for .NET that enables developers to build, train, and deploy models without prior ML expertise. | 9,045 |
| utensor/utensor | A lightweight machine learning inference framework built on TensorFlow and optimized for Arm targets. | 1,729 |
| exo-explore/exo | Allows developers to run AI models on personal devices with diverse hardware configurations. | 14,829 |
| deepjavalibrary/djl | A high-level Java framework for building and deploying deep learning models. | 4,144 |
| paddlepaddle/fastdeploy | A toolkit for easy, high-performance deployment of deep learning models on various hardware platforms. | 2,998 |
| openvinotoolkit/openvino | A toolkit for optimizing and deploying artificial intelligence models in various applications. | 7,321 |
| fminference/flexllmgen | Generates large language model outputs in high-throughput mode on single GPUs. | 9,192 |