NativeMind: Your fully private, open-source, on-device AI assistant
Updated Mar 4, 2026 - TypeScript
The Swiss Army Knife of Offline AI. Chat, speak, and generate images - privacy first, zero internet. Download an LLM and use it on your mobile device; no data ever leaves your phone. Supports text-to-text, vision, and text-to-image models.
Local-first AI-powered document intelligence platform for investigative journalism
React Native Apple LLM plugin using Foundation Models
Cognitive memory for AI agents — learns from use, forgets what's irrelevant, strengthens what matters. Single binary, fully offline.
The AI agent script CLI for Programmable Prompt Engine.
True on-device AI for Kotlin Multiplatform (Android, iOS, Desktop, JVM, WASM). LLM, Speech-to-Text and Image Generation — powered by llama.cpp, whisper.cpp and stable-diffusion.cpp.
Run a <400ms latency Voice Agent on just 4GB VRAM. Fully offline, no API keys required. Optimized for GTX 1650 and edge robotics with zero-copy inference. (Apache 2.0)
AI-powered media search — find images and videos using natural language or visual queries
Your models on any xPU
An interactive educational game built for the Google Gemma 3n Impact Challenge.
152 open-source tools to run LLMs 100% locally – no cloud, no API keys, no censorship
Command-line telepathy: an autonomous AI agent for your terminal that turns intent into execution (Windows/Linux/Mac)
Core of Artificial Intelligence
Flutter package for local AI inference using native OS APIs - iOS Foundation Models, Android ML Kit GenAI, and Windows AI APIs. Zero model downloads required.
Local-first CLI that turns Markdown scripts into multi-speaker podcast-style audio using Coqui XTTS v2.
A real-time, fully local voice AI system optimized for low-resource devices such as an 8GB Ubuntu laptop with no GPU. Achieves sub-second STT-to-TTS latency using Ollama, Vosk, Piper, and JACK/PipeWire. Open-source and privacy-focused offline conversational AI.
OfflineAI is an artificial intelligence that runs offline and uses machine learning to perform tasks based on the provided code. It is built on two AI models from Mistral AI.
Ollama client for Android
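Several entries in this list (the Android client above, the voice pipeline, the Claude Code router) talk to a locally running Ollama server over its documented HTTP API. As a minimal sketch, the following assumes Ollama is running on its default port (11434) and that a model such as `llama3` has already been pulled; the model name and prompt are illustrative only.

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build a minimal non-streaming payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a local Ollama install and a pulled model, e.g. `ollama pull llama3`.
    print(generate("llama3", "Say hello in one word."))
```

Because everything goes through `localhost`, no prompt or response ever leaves the machine, which is the common thread of the tools on this page.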
Multi-provider routing for Claude Code CLI. Use your Copilot subscription, Ollama offline, or Anthropic Direct.