We’ve launched Pocket Models.AI, a free iOS app that lets you explore, experiment with, and discover what on-device AI can really do.
It’s built on the DataSapien SDK and designed for two audiences: developers testing small language models on real mobile hardware, and brands looking to understand what Device Native AI means in practice.
Download free on the App Store
More Than a Model Runner
There are already a few apps that let you load an AI model in GGUF format onto your phone and chat with it. Pocket Models does that too. But we built it to show what happens when you go further.
Our team has spent over a decade working on on-device data and AI, long before the current wave of small language models made it mainstream. We’ve always believed that loading an AI or ML model onto a device is just the starting point. The real value is in what you build on top: persistent memory, structured experiences, and local agents that act on real personal context, all without data ever leaving the device.
That’s the difference between local inference and Device Native AI. Pocket Models is designed to make that difference tangible.
What’s Inside
Run GGUF Models Locally
Pocket Models supports GGUF models running entirely on your iPhone. No cloud, no API calls, no data leaving your device.
For this initial release, you can request any GGUF model. Before adding a model to the app, we test it on real mobile hardware, evaluating inference speed, memory footprint, and response quality across different iPhone generations. The library is expanding based on device performance and community feedback.
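The kind of evaluation described above can be sketched generically. This is not our actual test harness, and the `generate` function below is a stub standing in for any local model runtime; the point is simply the two signals we measure per model: throughput and memory footprint.

```python
import time
import tracemalloc

def generate(prompt: str) -> list[str]:
    # Stand-in stub for a real on-device model call.
    # Returns generated tokens; here it just fakes a short reply.
    return ("echo: " + prompt).split()

def benchmark(prompt: str) -> dict:
    """Measure tokens/sec and peak memory for one generation."""
    tracemalloc.start()
    start = time.perf_counter()
    tokens = generate(prompt)
    elapsed = time.perf_counter() - start
    _, peak_bytes = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return {
        "tokens": len(tokens),
        "tokens_per_sec": len(tokens) / elapsed if elapsed > 0 else float("inf"),
        "peak_mem_kb": peak_bytes / 1024,
    }

result = benchmark("How does on-device inference work?")
```

Running this per device generation (and averaging over many prompts) is enough to rank candidate models before they ever reach the app's library.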
Data Memory
This is where Pocket Models goes beyond a standard on-device chat app.
The app includes a personal data store powered by the DataSapien SDK. As you use the app, Pocket Models builds a local data profile: your preferences, context, and conversation history, stored entirely on-device.
Your local models draw on this persistent personal context via on-device memory, which will evolve into full on-device RAG. They don't start from zero every session: they remember what you've discussed, learn your preferences, and deliver increasingly relevant responses. The memory persists even when you swap between different models.
The data memory layer supports hundreds of structured data types: health and fitness signals, app usage patterns, media preferences, psychographic profiles, device context, and more. All collected, processed, and stored locally. Zero data shared with any server.
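As a mental model, and not the SDK's actual API, a persistent on-device memory layer boils down to something like the sketch below: structured facts written to local storage, surviving across sessions and model swaps, with no network path at all.

```python
import json
from pathlib import Path

class LocalMemory:
    """A toy on-device data store: everything lives in a local file,
    nothing is ever sent to a server. Illustrative only; the real
    DataSapien SDK exposes its own API."""

    def __init__(self, path: Path):
        self.path = path
        self.data = json.loads(path.read_text()) if path.exists() else {}

    def remember(self, category: str, key: str, value):
        self.data.setdefault(category, {})[key] = value
        self.path.write_text(json.dumps(self.data))  # persist locally

    def recall(self, category: str) -> dict:
        return self.data.get(category, {})

# Session 1, model A: store a preference.
mem = LocalMemory(Path("medata.json"))
mem.remember("preferences", "units", "metric")

# Session 2, model B: a fresh instance reads the same local file,
# so the context survives a model swap.
mem2 = LocalMemory(Path("medata.json"))
print(mem2.recall("preferences")["units"])  # metric
```

Because the store is keyed by category rather than by model, any model the app loads can read the same context, which is why memory carries over when you switch models.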
For developers, this is a chance to experience what on-device personalisation actually feels like. For brands, it’s a hands-on preview of what your own app could deliver using the same SDK.
Journeys
Pocket Models includes Journeys: guided, structured AI experiences that go beyond open-ended chat.
Journeys combine local model inference with your personal data to deliver specific outcomes. A personalised wellness check-in, a review of your habits, a decision-making framework grounded in your actual behavioural data. Think of the difference between a blank prompt and a purpose-built AI experience, except everything runs on your device, powered by your real data, and nothing ever touches a server.
Journeys are orchestrated using DataSapien’s no-code canvas, so new Journeys can be pushed to the app without requiring an App Store update. This is the same orchestration layer available to any developer or brand building on the DataSapien platform.
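One way to picture how a Journey can ship without an App Store update: a Journey is just data, a declarative definition interpreted by an engine that's already in the app. The format below is invented for illustration; it is not the actual output of the canvas.

```python
# A hypothetical Journey definition: steps the on-device engine walks through.
journey = {
    "name": "wellness_checkin",
    "steps": [
        {"type": "read_memory", "category": "health"},
        {"type": "prompt", "template": "Given {health}, suggest one habit to review."},
    ],
}

def run_journey(journey: dict, memory: dict, infer) -> str:
    """Interpret a Journey definition locally. New definitions can arrive
    over sync; the interpreter itself never changes."""
    context = {}
    output = ""
    for step in journey["steps"]:
        if step["type"] == "read_memory":
            cat = step["category"]
            context[cat] = memory.get(cat, "no data")
        elif step["type"] == "prompt":
            output = infer(step["template"].format(**context))
    return output

# Stub inference so the sketch runs without a real model.
result = run_journey(journey, {"health": "7h sleep avg"}, infer=lambda p: "OK: " + p)
```

Pushing a new Journey then means syncing a new definition, not shipping new binary code, which is what keeps it out of the App Store review loop.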
Built on the DataSapien SDK
Pocket Models isn’t a standalone project. It’s a working showcase of the DataSapien Device Native AI platform: the same SDK and orchestration tools available to any iOS, Android, or Flutter developer.
Everything you experience in Pocket Models, the local inference, the data memory, the Journeys, is built using production SDK features:
- On-device MeData collection: baseline device signals plus native platform data
- On-device inference: run and manage models locally
- On-device audiences and rules: segment and personalise without a server round-trip
- On-device Journeys: orchestrate structured AI experiences locally
- Orchestrator sync: pull down new logic, models, and Journey definitions without syncing user data
The architecture follows our Zero-Shared Data principle. The Orchestrator syncs logic, not user data. Your personal data store stays on your device. The platform sends down model configurations, Journey definitions, and audience rules. Never the reverse.
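The one-way flow can be made concrete with a small sketch: the sync client accepts logic from the platform and exposes no path for user data to travel back. Class and field names here are illustrative, not the SDK's API.

```python
class OrchestratorSync:
    """Logic flows down; user data has no route up. Illustrative only."""

    def __init__(self):
        self.config = {}       # models, Journey definitions, audience rules
        self.user_data = {}    # stays on-device, never serialised for sync

    def pull(self, remote_config: dict):
        # The only sync operation: receive new logic from the platform.
        self.config.update(remote_config)

    def outbound_payload(self) -> dict:
        # What leaves the device during sync: deliberately free of user data.
        return {"sdk_version": "1.0", "config_hash": hash(frozenset(self.config))}

sync = OrchestratorSync()
sync.user_data["steps_today"] = 8421
sync.pull({"journey:wellness_checkin": {"version": 2}})
```

The guarantee is structural rather than a policy toggle: there is simply no serialisation path that includes the personal data store.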
If you’re a developer or a brand thinking about on-device AI, Pocket Models is the fastest way to see what the DataSapien SDK can do. And if you want to build something similar for your own users, the SDK is ready.
Try It
Pocket Models is free and available now on iOS.
Download free on the App Store
Learn more at pocketmodels.ai
Access your own DNA Orchestrator to experiment with creating your own ML, AI, and Agentic Journeys.
Pocket Models is iOS-only for now. If there’s demand for Android too, we’ll build it. Let us know.
We’re actively expanding the model library and Journey catalogue. If you have a model you’d like to see supported, a Journey idea, or just want to talk about what Device Native AI could mean for your product, get in touch at hello@datasapien.com.

