wllama

WebAssembly binding for llama.cpp, enabling in-browser LLM inference

GitHub

Stars: 350
Watching: 6
Forks: 15
Language: C++
Last commit: 7 days ago
Linked from 1 awesome list

Tags: llama, llamacpp, llm, wasm, webassembly

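For orientation, a minimal sketch of what in-browser inference with wllama might look like, assuming the `@wllama/wllama` npm package and its `Wllama` class with `loadModelFromUrl` and `createCompletion` methods; the package name, constructor arguments, option names, and model URL below are assumptions for illustration, and the repository README documents the actual API.

```ts
// Sketch of in-browser LLM inference with wllama (API names assumed;
// verify against the wllama repository README before use).
import { Wllama } from '@wllama/wllama';

// Assumed layout: paths to the single- and multi-threaded WASM builds.
const WASM_PATHS = {
  'single-thread/wllama.wasm': '/wasm/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/wasm/multi-thread/wllama.wasm',
};

async function main(): Promise<void> {
  const wllama = new Wllama(WASM_PATHS);

  // Fetch a small GGUF model over HTTP (hypothetical URL).
  await wllama.loadModelFromUrl(
    'https://example.com/models/tinyllama-q4_k_m.gguf'
  );

  // Run a completion entirely in the browser, with no server round-trip.
  const output = await wllama.createCompletion('What is WebAssembly?', {
    nPredict: 64,
    sampling: { temp: 0.7, top_p: 0.9 },
  });
  console.log(output);
}

main().catch(console.error);
```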