fedi.matsuu.org

matsuu @matsuu

A local LLM server for macOS, as an alternative to Ollama. Supports MLX. Like Ollama, you can choose from multiple models. It also supports CORS, with browser access in mind. Interesting.
---
GitHub - dinoki-ai/osaurus: Native, Apple Silicon–only local LLM server. Similar to Ollama, but built on Apple's MLX for maximum performance on M‑series chips. SwiftUI app + SwiftNIO server with OpenAI‑compatible endpoints.
https://github.com/dinoki-ai/osaurus
#bookmarks
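Since osaurus exposes OpenAI-compatible endpoints, any standard OpenAI-style client should be able to talk to it. A minimal sketch, assuming a local server; the port and model name here are placeholders, not osaurus defaults — check the project README for the actual values:

```python
# Hedged sketch: calling an OpenAI-compatible chat completions endpoint
# on a local server such as osaurus. The base URL port (8080) and the
# model name are assumptions for illustration only.
import json
import urllib.request


def build_chat_request(prompt: str, model: str = "llama-3.2-3b") -> dict:
    """Build an OpenAI-style chat completion request payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(prompt: str, base_url: str = "http://127.0.0.1:8080/v1") -> str:
    """Send the request to the local server and return the reply text."""
    payload = build_chat_request(prompt)
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


# chat("Hello")  # requires a running local server
```

Because the request shape follows the OpenAI API, existing browser-based or CLI clients can point at the local base URL without code changes — which is presumably why CORS support matters here.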
