
On-Device AI SDK

Build private, offline-capable AI directly into your apps. Our SDK gives you the power of large language models that run entirely on-device — no cloud, no network latency, no data leaving the device.

Ship Private AI in Minutes

Initialize the SDK, load a model, and run inference in just a few lines of code. It really is that simple.

app.js
// Import the PocketAI SDK
import { PocketAI } from '@pocketai/sdk';

// Initialize with your configuration
const ai = new PocketAI({
  model:   'pocket-3b-instruct',
  device:  'auto',        // CPU, GPU, or NPU
  threads: 4,
  privacy: 'strict'       // Zero telemetry
});

// Run on-device inference (patientData is a record your app has already loaded)
const response = await ai.generate({
  prompt: 'Summarize this medical record',
  context: patientData,
  maxTokens: 512,
  temperature: 0.3
});

console.log(response.text);
// => "Patient presents with..."
// All processing happened on-device

Everything You Need to Build

A complete toolkit for integrating on-device AI into iOS, Android, and cross-platform applications.

iOS & Android SDK

Native SDKs for both platforms with Swift and Kotlin APIs. Optimized for Apple Neural Engine, Qualcomm AI Engine, and MediaTek APU. Framework-agnostic with React Native and Flutter bindings available.

Custom Model Fine-tuning

Fine-tune base models on your own data with our training pipeline. LoRA and QLoRA support for efficient adaptation. Export optimized GGUF or CoreML models ready for on-device deployment.
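Loading a fine-tuned model you have exported could look like the sketch below. This is a hypothetical illustration: the `modelPath` option and the file name are assumptions, not confirmed SDK API.

```javascript
// Hypothetical sketch: pointing the SDK at a fine-tuned model exported
// as GGUF by the training pipeline. modelPath and the file name are
// assumptions for illustration.
import { PocketAI } from '@pocketai/sdk';

const ai = new PocketAI({
  model: 'custom',
  modelPath: './models/support-bot-q4.gguf',  // exported artifact
  device: 'auto',
  privacy: 'strict'
});
```

From here, inference calls work exactly as with the built-in models.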

API & Documentation

Comprehensive API reference, integration guides, and sample projects. Interactive playground for testing models before deployment. Full TypeDoc, Jazzy, and Dokka-generated references.

Apps Studio

Visual drag-and-drop interface for prototyping AI workflows without writing code. Test prompts, chain models, and export production-ready configurations directly to your app project.
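Consuming an exported configuration in code might look like the following sketch. The file name and the shape of the exported JSON are assumptions made for illustration.

```javascript
// Hypothetical sketch: loading a workflow configuration exported from
// Apps Studio. File name and config shape are assumptions.
import { readFileSync } from 'node:fs';
import { PocketAI } from '@pocketai/sdk';

const workflow = JSON.parse(
  readFileSync('./pocketai.workflow.json', 'utf8')
);

// Initialize the SDK from the exported model settings
const ai = new PocketAI(workflow.model);
```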

How to Integrate in 3 Steps

From zero to on-device AI inference in under five minutes. No server infrastructure required.

1

Install the SDK

Add PocketAI to your project with your preferred package manager. Available on npm, CocoaPods, Maven, and pub.dev.

npm install @pocketai/sdk
2

Initialize

Configure the SDK with your API key and model preferences. The SDK automatically detects hardware capabilities and optimizes accordingly.

const ai = new PocketAI(config)
3

Run Inference

Call generate, classify, embed, or any task method. All processing happens on-device with zero network requests. Ship it.

await ai.generate({ prompt })
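The other task methods follow the same async pattern as generate. A minimal sketch of classify and embed, with the SDK stubbed so the snippet runs standalone; the option and result field names are assumptions, not confirmed API:

```javascript
// Stub standing in for an on-device PocketAI instance so this sketch
// runs without the SDK. A real model scores every label; the stub
// simply returns the first one.
const ai = {
  async classify({ text, labels }) {
    return { label: labels[0], scores: labels.map(() => 1 / labels.length) };
  },
  async embed({ text }) {
    return { vector: new Float32Array(384) };  // typical small-embedding size
  }
};

const sentiment = await ai.classify({
  text: 'Battery drains fast after the update',
  labels: ['bug', 'feature-request', 'question']
});
const { vector } = await ai.embed({ text: 'on-device search query' });

console.log(sentiment.label);  // → "bug" with the stub above
console.log(vector.length);    // → 384
```

Swapping the stub for a real `new PocketAI(config)` instance keeps the call sites unchanged.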

Start Building with PocketAI

Get free SDK access with 10,000 monthly on-device inferences. Scale to enterprise when you are ready. No credit card needed.