
LLM Adapter

Configure LLM integration: Vercel AI SDK providers, custom adapters, or emotion-only mode.

molroo supports three modes for LLM integration: Vercel AI SDK providers (recommended), a custom LLMAdapter, or no LLM at all.

Three modes

| Mode | When to use |
| --- | --- |
| Vercel AI SDK provider | Fastest path. Pass any Vercel AI SDK LanguageModel directly. Supports OpenAI, Anthropic, Google, and any OpenAI-compatible provider. |
| Custom adapter | You have a non-standard LLM API, need custom preprocessing, or want full control. |
| Emotion-only | You already generate dialogue elsewhere (game engine, scripted content) and only need emotion computation. |

Vercel AI SDK provider

Pass a Vercel AI SDK provider instance directly as the llm option. The SDK wraps it internally.

import { Molroo } from '@molroo-io/sdk';
import { createOpenAI } from '@ai-sdk/openai';

const molroo = new Molroo({ apiKey: 'mk_live_...' });
const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY! });

const sera = await molroo.createPersona(personaConfig, {
  llm: openai('gpt-4o-mini'),
});

Other providers:

// Anthropic
import { createAnthropic } from '@ai-sdk/anthropic';
const anthropic = createAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY! });
const sera = await molroo.createPersona(config, { llm: anthropic('claude-sonnet-4-5-20250929') });

// Google Vertex AI
import { createVertex } from '@ai-sdk/google-vertex';
const vertex = createVertex({ project: 'my-gcp-project' });
const sera = await molroo.createPersona(config, { llm: vertex('gemini-2.0-flash') });

// OpenRouter (via its OpenAI-compatible API; reuses createOpenAI from @ai-sdk/openai)
const openrouter = createOpenAI({
  apiKey: process.env.OPENROUTER_API_KEY!,
  baseURL: 'https://openrouter.ai/api/v1',
});
const sera = await molroo.createPersona(config, { llm: openrouter('anthropic/claude-sonnet-4-5') });

The ai package is a required peer dependency. Install it alongside your chosen provider: npm install ai @ai-sdk/openai.

Custom adapter

If you need to build a custom adapter, implement the LLMAdapter interface with two methods: generateText() for plain text and generateObject() for structured output matching a Zod schema.

The API builds the system prompt from persona context (identity, emotional state, mood, somatic markers). The adapter only passes it through to the LLM; it never creates or modifies the prompt.

For the full interface definition and a complete implementation example, see the LLM Adapter Reference.
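As a rough illustration, a custom adapter might look like the sketch below. The interface shape here is an assumption (the real LLMAdapter ships with @molroo-io/sdk and uses a Zod schema for structured output; a plain parse callback stands in for Zod to keep the sketch dependency-free), and MyCustomAdapter and its transport function are hypothetical names:

```typescript
// Illustrative sketch only -- the real LLMAdapter interface comes from
// '@molroo-io/sdk' (see the LLM Adapter Reference). The shape below,
// including the parse callback standing in for a Zod schema, is an assumption.
interface LLMAdapter {
  generateText(systemPrompt: string, userMessage: string): Promise<string>;
  generateObject<T>(
    systemPrompt: string,
    userMessage: string,
    parse: (raw: unknown) => T,
  ): Promise<T>;
}

// Hypothetical adapter over a non-standard LLM API. The transport function
// stands in for your HTTP client so the sketch stays self-contained.
class MyCustomAdapter implements LLMAdapter {
  constructor(
    private transport: (system: string, prompt: string) => Promise<string>,
  ) {}

  async generateText(systemPrompt: string, userMessage: string): Promise<string> {
    // Pass the system prompt through unchanged -- the API builds it from
    // persona context; the adapter must not rewrite it.
    return this.transport(systemPrompt, userMessage);
  }

  async generateObject<T>(
    systemPrompt: string,
    userMessage: string,
    parse: (raw: unknown) => T,
  ): Promise<T> {
    // Ask for JSON, then validate the parsed result.
    const raw = await this.transport(
      `${systemPrompt}\nRespond with a single JSON object only.`,
      userMessage,
    );
    return parse(JSON.parse(raw));
  }
}
```

The transport indirection is a design choice for testability: the adapter's contract (forward the prompt, parse structured output) stays separate from how you reach your LLM endpoint.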

Emotion-only mode (no LLM)

If you already have a dialogue system and only need molroo for emotion computation, skip the LLM entirely by providing manual appraisal values via perceive(). This is useful for game engines, chatbot platforms with existing dialogue, research applications, and testing without LLM costs.

For setup and usage examples, see the LLM Adapter Reference.
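To give a feel for what manual appraisal input could look like, here is a minimal sketch. The field names, value ranges, and the perceive() call shown in the comment are all illustrative assumptions, not the documented shape; the real signature is in the LLM Adapter Reference:

```typescript
// Hypothetical appraisal payload for emotion-only mode. Field names and
// ranges are illustrative assumptions -- see the LLM Adapter Reference
// for the real perceive() signature.
interface AppraisalInput {
  valence: number; // assumed range: -1 (negative) to 1 (positive)
  arousal: number; // assumed range:  0 (calm)     to 1 (activated)
}

// Clamp engine-supplied values into range before handing them over.
function clampAppraisal(input: AppraisalInput): AppraisalInput {
  const clamp = (v: number, lo: number, hi: number) =>
    Math.min(hi, Math.max(lo, v));
  return {
    valence: clamp(input.valence, -1, 1),
    arousal: clamp(input.arousal, 0, 1),
  };
}

// Usage sketch, assuming the persona was created without an `llm` option:
// await sera.perceive(clampAppraisal({ valence: 0.8, arousal: 1.4 }));
```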
