LamaCLI: Your Local LLM Assistant, Right in Your Terminal
LamaCLI is a powerful, feature-rich command-line interface that brings local Large Language Models (LLMs) directly into your workflow. Powered by Ollama, it is designed for developers who want to work with AI without ever leaving the terminal.
Key Features:
🎯 Dual Operation Modes:
Interactive Mode: A full TUI (Terminal User Interface) experience with real-time streaming, chat history, and file browsing.
CLI Mode: Quick, one-shot commands (ask, suggest, explain) for rapid queries like generating git commands or explaining regex.
🧠 Context-Aware Intelligence: Stop copy-pasting code. Use the built-in file explorer or the @ command to inject file contents and directory context directly into your prompts for code reviews and debugging.
✨ Beautiful TUI Experience: Built on the Charm stack of libraries, LamaCLI features fully rendered Markdown, syntax-highlighted code blocks, and customizable themes.
🤖 Total Model Control: Seamlessly switch between any installed Ollama models (Llama 3, Mistral, Gemma, etc.) and manage conversation history with auto-save functionality.
Built for Speed & Privacy:
LamaCLI runs locally on your machine, ensuring your code and queries remain private. It is built in Go for high performance and leverages Bubble Tea, Lipgloss, and Glamour to provide a stunning, responsive terminal interface.
Free and Open Source under the MIT License.