For Developers
Build private, offline-capable AI directly into your apps. Our SDK gives you the power of large language models that run entirely on-device — no cloud, no network latency, no data leaks.
Quick Start
Initialize the SDK, load a model, and run inference in just a few lines of code. It really is that simple.
// Import the PocketAI SDK
import { PocketAI } from '@pocketai/sdk';

// Initialize with your configuration
const ai = new PocketAI({
  model: 'pocket-3b-instruct',
  device: 'auto',    // CPU, GPU, or NPU
  threads: 4,
  privacy: 'strict'  // Zero telemetry
});

// Run on-device inference
const response = await ai.generate({
  prompt: 'Summarize this medical record',
  context: patientData,
  maxTokens: 512,
  temperature: 0.3
});

console.log(response.text);
// => "Patient presents with..."
// All processing happened on-device
SDK Features
A complete toolkit for integrating on-device AI into iOS, Android, and cross-platform applications.
Native SDKs for both platforms with Swift and Kotlin APIs. Optimized for Apple Neural Engine, Qualcomm AI Engine, and MediaTek APU. Framework-agnostic with React Native and Flutter bindings available.
Fine-tune base models on your own data with our training pipeline. LoRA and QLoRA support for efficient adaptation. Export optimized GGUF or CoreML models ready for on-device deployment.
Comprehensive API reference, integration guides, and sample projects. Interactive playground for testing models before deployment. Full TypeDoc, Jazzy, and Dokka-generated references.
Visual drag-and-drop interface for prototyping AI workflows without writing code. Test prompts, chain models, and export production-ready configurations directly to your app project.
Integration
From zero to on-device AI inference in under five minutes. No server infrastructure required.
Add PocketAI to your project with your preferred package manager. Available on npm, CocoaPods, Maven, and pub.dev.
npm install @pocketai/sdk
Configure the SDK with your API key and model preferences. The SDK automatically detects hardware capabilities and optimizes accordingly.
const ai = new PocketAI(config)
Call generate, classify, embed, or any task method. All processing happens on-device with zero network requests. Ship it.
await ai.generate({ prompt })
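The task methods named above (generate, classify, embed) might be called as in the sketch below. The option and result shapes are illustrative assumptions, not the published SDK types, and a small stub stands in for the real SDK so the snippet runs on its own; consult the API reference for the actual signatures.

```typescript
// Assumed shapes for the task methods; illustrative only.
type Classification = { label: string; score: number };

interface TaskMethods {
  generate(opts: { prompt: string; maxTokens?: number }): Promise<{ text: string }>;
  classify(opts: { text: string; labels: string[] }): Promise<Classification[]>;
  embed(opts: { text: string }): Promise<{ vector: number[] }>;
}

// Stub implementation so the sketch executes without the SDK installed.
const ai: TaskMethods = {
  async generate({ prompt }) {
    return { text: `echo: ${prompt}` };
  },
  async classify({ labels }) {
    // Pretend the first candidate label wins.
    return labels.map((label, i) => ({ label, score: i === 0 ? 0.9 : 0.1 }));
  },
  async embed({ text }) {
    // Toy embedding: one normalized char code per character.
    return { vector: Array.from(text, c => c.charCodeAt(0) / 255) };
  },
};

async function main() {
  const summary = await ai.generate({ prompt: 'Summarize this note', maxTokens: 64 });
  const verdict = await ai.classify({ text: summary.text, labels: ['urgent', 'routine'] });
  const { vector } = await ai.embed({ text: summary.text });
  console.log(summary.text, verdict[0].label, vector.length);
}

main();
```

Each call resolves locally in this stub; with the real SDK, the same awaited-promise shape applies, with all inference happening on-device.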
Get free SDK access with 10,000 monthly on-device inferences. Scale to enterprise when you are ready. No credit card needed.