From a45231af47035e89031625d47210663881a4e2cf Mon Sep 17 00:00:00 2001
From: Min Yoo
Date: Sun, 25 May 2025 05:18:32 +0900
Subject: [PATCH] readme: Add macLlama to community integrations (#10790)

This commit updates the README to include macLlama within the community
integrations section.

macLlama is a native macOS application built for lightweight and efficient
LLM interaction. Key features include:

* **Lightweight & Native:** Designed to be resource-friendly and perform optimally on macOS.
* **Chat-like Interface:** Provides a user-friendly, conversational interface.
* **Multiple Window Support:** Allows users to manage multiple conversations simultaneously.

The primary goal of macLlama is to offer a simple and easy-to-run LLM
experience on macOS.
---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 6a4815c1..00de95a7 100644
--- a/README.md
+++ b/README.md
@@ -406,6 +406,7 @@ See the [API documentation](./docs/api.md) for all endpoints.
 - [AppFlowy](https://github.com/AppFlowy-IO/AppFlowy) (AI collaborative workspace with Ollama, cross-platform and self-hostable)
 - [Lumina](https://github.com/cushydigit/lumina.git) (A lightweight, minimal React.js frontend for interacting with Ollama servers)
 - [Tiny Notepad](https://pypi.org/project/tiny-notepad) (A lightweight, notepad-like interface to chat with ollama available on PyPI)
+- [macLlama (macOS native)](https://github.com/hellotunamayo/macLlama) (A native macOS GUI application for interacting with Ollama models, featuring a chat interface.)

 ### Cloud