Llamatik - Local AI Chatbot

Developer: FERRAN PONS SANCHEZ · Category: Productivity · Apple App Store


App Description

Llamatik — On-Device AI Chatbot

Llamatik brings the power of lightweight open-source AI models directly to your iPhone or iPad. All text generation runs fully offline, on your device, without sending your conversations to any server. It’s fast, private, and designed to showcase what the Llamatik library can do on iOS.

With Llamatik, you can explore model-based text generation, summarize content, brainstorm ideas, or simply chat — all powered by efficient, quantized models that run locally.



Fully Offline AI

All model inference happens on your device.
Your chats are never uploaded, stored, or shared.



Lightweight & Fast Models

Llamatik supports multiple open-source models in GGUF format, including:
• Gemma 3 270M
• SmolVLM 256M / 500M
• Phi-1.5
• Qwen 2.5 0.5B (quantized)
• Llama 3.2 1B (quantized)

Choose the model that fits your device and your needs.



Built-In Model Downloader

Download and manage AI models directly in the app:
• Auto-download on first launch
• Delete models you no longer need
• Cancel ongoing downloads at any time

Switch models at any moment to compare performance and quality.



Clean and Modern Interface

Llamatik features a simple, elegant chat interface built with Kotlin Multiplatform:
• Smart suggestions
• Model selector
• Quick prompts
• Smooth onboarding
• Minimalist design



Privacy First

Your content stays entirely on your device.
No login. No API keys. No cloud processing.

The only data collected is anonymous analytics and crash reports to help improve app reliability.


Powered by the Llamatik Library

This app is a real-world demo of the Llamatik Kotlin Multiplatform library, supporting:
• On-device LLM inference (via llama.cpp)
• Embeddings
• Vector store operations
• Cross-platform support

Developers can explore the capabilities through the app before integrating Llamatik into their own projects.
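As a rough illustration of what integrating an on-device inference library like this could look like, here is a hypothetical Kotlin sketch. All names below (`LocalModel`, `generate`, `embed`, the model path) are illustrative stand-ins, not the actual Llamatik API; the stub class only shapes the call pattern a llama.cpp-backed Kotlin Multiplatform library might expose.

```kotlin
// Hypothetical sketch only: these names are NOT the real Llamatik API.
// A stub facade standing in for a llama.cpp-backed inference library.
class LocalModel(val path: String) {
    // In a real library, this would run token generation fully on-device.
    fun generate(prompt: String, maxTokens: Int = 128): String =
        "(tokens generated locally for: $prompt)"

    // In a real library, this would return a text embedding for vector search.
    fun embed(text: String): FloatArray = FloatArray(384)
}

fun main() {
    // Load a quantized GGUF model previously downloaded by the app
    // (the path is an illustrative example).
    val model = LocalModel("models/llama-3.2-1b-q4.gguf")

    // Text generation: nothing leaves the device.
    println(model.generate("Summarize: Llamatik runs LLMs offline on iOS."))

    // Embeddings feeding the library's vector-store operations.
    val vec = model.embed("private on-device search")
    println("embedding size = ${vec.size}")
}
```

The appeal of this shape is that the same Kotlin-facing surface can back both the iOS demo app and an Android or desktop target, with llama.cpp doing the platform-specific inference underneath.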



Notes
• Requires downloading at least one model to start chatting.
• Internet access is used only for model downloads and diagnostics.
• All chat content is processed locally.