EngageAI SDK Docs
Add a voice + text AI assistant to any mobile app in under 30 minutes. Users speak or type what they want — EngageAI calls your app's functions to make it happen.
Overview
EngageAI is a drop-in SDK that bridges natural language to your app's business logic. You define functions — the actions your app can take — and EngageAI's Claude-powered agent decides which ones to call based on what the user says.
What you DO: tell EngageAI what your functions are, what parameters they take, and what they do. Drop a tappable character into your app. That's it.
What you DON'T do: write keyword matching, intent detection, dialog flow, or speech recognition. EngageAI handles all of that. The user talks naturally — “I want jollof rice” or “send 5,000 to mum” — and the agent maps it to the right function and asks any clarifying questions through conversation. Your function code runs in your app, not on EngageAI's servers — your secrets and business logic stay yours.
Voice + text
Users speak or type. Both modes use the same agent.
Your code runs
Handlers are closures — they call your services directly.
You stay in control
Sensitive actions require explicit user confirmation.
How it works
Quickstart
Add the SDK to your pubspec.yaml:
dependencies:
  flutter:
    sdk: flutter
  rive: ^0.14.0
  engageai_sdk:
    git:
      url: https://github.com/engageai-hq/flutter-sdk
      ref: v0.2.2
Pin a tagged release (ref: v0.2.2) for production. Use ref: main only if you're comfortable tracking breaking changes. Then run flutter pub get.
iOS — add to ios/Runner/Info.plist:
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) uses the microphone for voice commands.</string>
Android — add to android/app/src/main/AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
Initialise Rive before runApp — required for the animated character:
import 'package:flutter/material.dart';
import 'package:rive/rive.dart' hide Animation;
import 'package:engageai_sdk/engageai_sdk.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await RiveNative.init(); // must come before runApp
  runApp(const MyApp());
}
Configuration
Create an EngageAI instance with your app's credentials.
final engageAI = EngageAI(
  config: EngageAIConfig(
    serverUrl: 'https://engageai-sdk-production.up.railway.app',
    appId: 'your_app_id',
    apiKey: 'eai_...',
    appName: 'YourApp',
    domain: 'food_delivery', // food_delivery | fintech | ecommerce | healthcare | logistics | other
    description: 'A short description of what your app does.',
    debug: false,
  ),
);
| Field | Required | Description |
|---|---|---|
| serverUrl | Required | EngageAI backend URL |
| appId | Required | Your registered app ID — must match exactly |
| apiKey | Required | Your API key from the portal |
| appName | Required | Human-readable app name |
| domain | Optional | Hint to the agent about your app type |
| description | Optional | What your app does — helps the agent give better responses |
| debug | Optional | Logs API calls to the console (default: false) |
Registering functions
Functions tell the AI what your app can do. Each one has a name, a description the AI reads, and a handler that runs in your Flutter app.
Three ways to define functions
- CLI sync (fastest): run engageai sync --smart --flutter lib/services/ — Claude reads your existing Dart code and generates the setup file automatically, with no annotations needed. See the CLI sync section below.
- In code: call registerFunctions([...]) as shown below. Full control, type-safe, version-controlled with your app.
- In the portal (no-code): visit Functions, click + Add Function, and define the schema visually. The portal generates a Dart stub you paste into your registerFunctions call.
All three execute client-side — your handler code runs in your Flutter app in every case. The portal and CLI are just different ways to define the function's metadata. There is currently no server-side execution path.
engageAI.registerFunctions([
  AppFunction(
    name: 'search_restaurants',
    description:
        'Search for nearby restaurants. Can filter by cuisine type, '
        'minimum rating, and whether the restaurant is currently open.',
    parameters: {
      'type': 'object',
      'properties': {
        'cuisine': {'type': 'string', 'description': 'e.g. "nigerian", "pizza"'},
        'is_open': {'type': 'boolean', 'default': true},
        'min_rating': {'type': 'number'},
      },
    },
    handler: (params) async {
      return await foodService.searchRestaurants(
        cuisine: params['cuisine'] as String?,
        isOpen: params['is_open'] as bool? ?? true,
        minRating: (params['min_rating'] as num?)?.toDouble(),
      );
    },
  ),
]);
CLI sync
The fastest integration path. Point the CLI at your existing Dart service files — no annotations or changes to your code needed. Claude reads your functions and infers descriptions automatically, then writes a fully-wired engageai_setup.dart for you.
1 — Install the CLI (once)
npm install -g engageai-cli
engageai login
2 — Run sync
Point --smart at your service directory. The CLI reads your public methods, generates descriptions via Claude, and writes the setup file.
engageai sync --smart --flutter lib/services/
On first run the CLI asks five questions:
- App ID — a lowercase slug, e.g. quickbite
- Display name — e.g. QuickBite
- Platform — auto-detected from pubspec.yaml
- Domain — select from a list (e.g. Food Delivery, Healthcare)
- App description — one sentence the AI uses for context
Answers are saved to engageai.config.json — commit this file. Subsequent runs are instant.
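As an illustration only (the exact schema is not documented here, and these key names are assumptions based on the five questions above), the saved file could look something like:

```json
{
  "appId": "quickbite",
  "displayName": "QuickBite",
  "platform": "flutter",
  "domain": "food_delivery",
  "description": "Food delivery app in Lagos, Nigeria."
}
```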
3 — Import the generated setup file
The CLI writes lib/engageai_setup.dart with all discovered functions registered and handlers pre-wired. Call setupEngageAI() in main():
import 'package:engageai_sdk/engageai_sdk.dart';
import 'engageai_setup.dart'; // ← generated by engageai sync

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await RiveNative.init();
  setupEngageAI(); // registers all functions + initialises the AI
  runApp(const MyApp());
}
Re-run engageai sync --smart --flutter lib/services/ any time you add or rename functions. The setup file is regenerated in place — no annotations to maintain.
Tag annotations
If you prefer not to use --smart, you can tag functions directly with /// doc comments. The CLI reads these tags without calling Claude — faster and fully offline.
| Tag | Required | Description |
|---|---|---|
| /// @engageai <description> | Required | Marks the function for EngageAI and sets its description — what the AI reads to decide when to call it. |
| /// @engageai-confirm | Optional | Requires the user to confirm before the handler runs. Use for irreversible actions like placing orders or deleting data. |
| /// @param <name> - <desc> | Optional | Describes a specific parameter. Shown in the function manifest and used by the AI to fill in the right value. |
| /// @example <phrase> | Optional | A sample user phrase that should trigger this function. Add several — the AI uses them to improve intent matching. |
Example — annotated Dart function
/// @engageai Search nearby restaurants. Can filter by cuisine, rating, and open status.
/// @param cuisine - Type of food, e.g. "Nigerian" or "Italian". Optional.
/// @param minRating - Minimum star rating from 1.0 to 5.0. Optional.
/// @param isOpen - If true, only return currently open restaurants.
/// @example "find Nigerian food near me"
/// @example "show highly rated places open now"
Future<List<Restaurant>> searchRestaurants({
  String? cuisine,
  double? minRating,
  bool isOpen = true,
}) async {
  // ...
}
Example — function requiring confirmation
/// @engageai Place the cart as a delivery order. Always confirm with the user before calling this.
/// @engageai-confirm
/// @param addressId - The ID of the saved delivery address to use.
/// @param paymentMethodId - The ID of the saved payment method.
/// @example "place my order"
/// @example "confirm and checkout"
Future<Order> placeOrder({
  required String addressId,
  required String paymentMethodId,
}) async {
  // ...
}
Run engageai sync lib/services/my_service.dart (without --smart) to parse annotations. The CLI will not call Claude — it reads the tags directly and generates the setup file immediately.
Writing handlers
A handler is a Dart async function that receives a Map<String, dynamic> and returns a Map with the result.
handler: (params) async {
  final restaurantId = params['restaurant_id'] as String;
  final result = await foodService.addToCart(
    itemId: params['item_id'] as String,
    quantity: params['quantity'] as int? ?? 1,
  );
  return {
    'success': true,
    'item_name': result.itemName,
    'cart_total': result.total,
  };
},
Parameter values arrive as dynamic, so cast them explicitly with as String, as int?, etc. before use.
User context
Tell the agent who the user is. It uses these details to personalise replies and fill in defaults such as the delivery address or payment method:
engageAI.setUserContext(EngageUserContext(
  userId: 'user_123',
  displayName: 'Amara',
  data: {
    'location': 'Lagos, Nigeria',
    'default_address': 'Home — 15 Admiralty Way, Lekki Phase 1',
    'default_payment': 'Visa ending in 4242',
  },
));
Requiring confirmation
For irreversible actions, set requiresConfirmation: true. EngageAI shows a confirmation dialog before the handler runs.
AppFunction(
  name: 'place_order',
  requiresConfirmation: true,
  description: 'Place the cart as a delivery order. Always confirm with the user first.',
  parameters: { /* ... */ },
  handler: (params) async {
    return await foodService.placeOrder(
      deliveryAddressId: params['delivery_address_id'] as String,
      paymentMethodId: params['payment_method_id'] as String,
    );
  },
),
Full example — QuickBite food delivery
QuickBite is a Chowdeck-style demo where users say “I want jollof rice from somewhere nearby” and the agent searches restaurants, browses the menu, and adds items to the cart.
import 'package:engageai_sdk/engageai_sdk.dart';
import '../services/food_delivery_service.dart';
import '../models/mock_data.dart';

class EngageAISetup {
  static EngageAI create({
    required String serverUrl,
    required FoodDeliveryService foodService,
    String? apiKey,
  }) {
    final engageAI = EngageAI(
      config: EngageAIConfig(
        serverUrl: serverUrl,
        appId: 'quickbite',
        apiKey: apiKey,
        appName: 'QuickBite',
        domain: 'food_delivery',
        description: 'Food delivery app in Lagos, Nigeria. Prices in NGN.',
      ),
    );

    engageAI.setUserContext(EngageUserContext(
      userId: MockDatabase.currentUser['user_id'],
      displayName: MockDatabase.currentUser['display_name'],
      data: {
        'location': 'Lagos, Nigeria',
        'default_address': 'Home — 15 Admiralty Way, Lekki Phase 1',
        'default_payment': 'Visa ending in 4242',
      },
    ));

    engageAI.registerFunctions([
      AppFunction(
        name: 'search_restaurants',
        description: 'Search nearby restaurants. Filter by cuisine, rating, distance.',
        parameters: {
          'type': 'object',
          'properties': {
            'cuisine': {'type': 'string'},
            'max_distance_km': {'type': 'number', 'default': 5.0},
            'min_rating': {'type': 'number'},
            'is_open': {'type': 'boolean', 'default': true},
          },
        },
        handler: (params) async => await foodService.searchRestaurants(
          cuisine: params['cuisine'] as String?,
          maxDistanceKm: (params['max_distance_km'] as num?)?.toDouble() ?? 5.0,
          minRating: (params['min_rating'] as num?)?.toDouble(),
          isOpen: params['is_open'] as bool? ?? true,
        ),
      ),
      AppFunction(
        name: 'add_to_cart',
        description: 'Add a menu item to the cart.',
        parameters: {
          'type': 'object',
          'required': ['item_id'],
          'properties': {
            'item_id': {'type': 'string'},
            'quantity': {'type': 'integer', 'default': 1},
          },
        },
        handler: (params) async => await foodService.addToCart(
          itemId: params['item_id'] as String,
          quantity: params['quantity'] as int? ?? 1,
        ),
      ),
      AppFunction(
        name: 'place_order',
        description: 'Place the cart as an order. Always confirm with user first.',
        requiresConfirmation: true,
        parameters: {'type': 'object', 'properties': {}},
        handler: (params) async => await foodService.placeOrder(
          deliveryAddressId: 'addr_001',
          paymentMethodId: 'pay_001',
        ),
      ),
    ]);

    return engageAI;
  }
}
Tracking outcomes with Goals
Once you have functions registered, define goals on the Analytics page. A goal is a function whose call counts as a meaningful business outcome — like place_order, transfer_funds, or complete_lesson.
The Analytics page automatically tracks completion counts and chat-to-conversion rate for each goal. Up to 5 goals per app; one is marked primary and drives the headline conversion metric.
Credit costs
Each API call consumes credits from your monthly allowance.
| Action | Credits |
|---|---|
| Text interaction | 5 |
| Voice interaction | 15 |
| Function execution | 2 |
| AI reasoning step | 5 |
A typical text turn costs ~10 credits. A voice turn costs ~20 credits. Function calls are 2 credits each. View plans →
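The estimates above follow from the table: a typical text turn is one text interaction plus one reasoning step (5 + 5 = 10), and a voice turn swaps in the voice rate (15 + 5 = 20). As a sketch, budgeting a session looks like this (the session shape is illustrative):

```dart
// Per-action credit costs, taken from the table above.
const textInteraction = 5;
const voiceInteraction = 15;
const functionExecution = 2;
const reasoningStep = 5;

/// Estimates the credit cost of a session from action counts.
int sessionCost({
  int textTurns = 0,
  int voiceTurns = 0,
  int functionCalls = 0,
  int reasoningSteps = 0,
}) =>
    textTurns * textInteraction +
    voiceTurns * voiceInteraction +
    functionCalls * functionExecution +
    reasoningSteps * reasoningStep;

void main() {
  // One text turn with one reasoning step: 10 credits.
  print(sessionCost(textTurns: 1, reasoningSteps: 1));
  // One voice turn with one reasoning step and one function call: 22 credits.
  print(sessionCost(voiceTurns: 1, reasoningSteps: 1, functionCalls: 1));
}
```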
Troubleshooting
I tap the character but nothing happens (Flutter)
Make sure you're passing either an engageAI instance OR a custom onTap callback to EngageCharacterFab. With engageAI alone, tapping opens the voice chat widget by default. Without either, the FAB is just a static button that does nothing on tap.
Voice doesn't record / silent failure
You probably haven't added microphone permissions. iOS needs NSMicrophoneUsageDescription in Info.plist; Android needs RECORD_AUDIO in AndroidManifest.xml. See the Quickstart section. Also verify the user actually granted permission when the OS prompted.
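If you want to check and request the grant at runtime, one option is the third-party permission_handler package (an assumption here; it is not part of the EngageAI SDK):

```dart
import 'package:permission_handler/permission_handler.dart';

/// Returns true once microphone access is granted,
/// prompting the user if the OS has not asked yet.
Future<bool> ensureMicPermission() async {
  var status = await Permission.microphone.status;
  if (!status.isGranted) {
    status = await Permission.microphone.request();
  }
  return status.isGranted;
}
```

Call this before opening the voice chat; if it returns false, fall back to text mode or explain why voice is unavailable.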
flutter pub get or npm install fails with 404
The SDK is hosted on GitHub. Make sure you have network access to github.com and that you typed the URL correctly. If you're behind a corporate firewall, GitHub may be blocked entirely. Test with curl -I https://github.com/engageai-hq/flutter-sdk.
Initialise() throws "Invalid API key"
Double-check the key from API Keys — keys start with eai_. If you regenerated the key, the old one stops working immediately. Also confirm the appId matches the app the key was created for.
AI ignores my function
Most often: the function description is too vague. The AI uses descriptions to decide what to call. Rewrite the description as if explaining to a colleague who's never seen your app — be specific about what input the function expects and what it returns. Then test with phrasing close to the description.
Handler runs but the AI gives a confusing reply
Whatever your handler returns gets passed back to the AI. If you return {success: true} with no other context, the AI has nothing to summarise. Return useful fields like {balance, currency, transaction_id} so the AI can give a meaningful response.
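A sketch of the difference, using hypothetical field names for a wallet-balance handler:

```dart
// A bare {'success': true} gives the AI nothing to say.
// Returning the underlying facts lets it phrase a useful reply.
Map<String, dynamic> buildBalanceResult(double balance, String transactionId) {
  return {
    'success': true,
    'balance': balance,
    'currency': 'NGN', // lets the AI report an amount, not a bare number
    'transaction_id': transactionId,
  };
}
```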
Voice doesn't work in Expo Go (React Native)
rive-react-native requires a development build. Either build a dev APK with eas build --profile development --platform android, or omit the Rive prop on EngageVoiceChatModal for text-only mode in Expo Go.
engageai sync --smart finds 0 functions
The CLI picks up exported top-level functions and public methods on exported classes (TypeScript) or public Dart methods. Private helpers, unexported items, and methods starting with _ are ignored. Make sure you're pointing at the right file or directory, and that your service is exported — e.g. export class HealthService (TS) or a public class (Dart). Pass a specific file path if the CLI is scanning unrelated files.
engageai sync portal upload fails but setup file still generated
The upload is non-fatal — the setup file is always written. You may see this if your API key doesn't match the app ID, or if your network blocks the portal. Fix the key with engageai login and re-run.
Still stuck?
Check the Logs page in your portal — every request and function call is logged. If logs don't show your activity, your app_id or API key is likely wrong. For anything else, email help@engageai.tech.