# End-to-End UI (Interaction + Host + Adapter)

This guide wires the full path from a single turn to UI commands: Interaction core → sessions → UI adapter.

> **Demo path (3/4)** — Next up: Workflow Orchestration.


## How the UI boundary works

```mermaid
flowchart LR
  U[User message] --> H[Host handler]
  H --> S[Session]
  S --> P[Interaction pipeline]
  P --> E[EventStream]
  E --> UI[UI adapter]
```

## Prerequisites

Install the adapter dependencies:

```bash
npm install ai @assistant-ui/react
```

```bash
pnpm add ai @assistant-ui/react
```

```bash
yarn add ai @assistant-ui/react
```

```bash
bun add ai @assistant-ui/react
```

## Step 1: Set up storage and adapters

```js
import { createAssistantUiInteractionEventStream, createBuiltinModel } from "@geekist/llm-core/adapters";
import { createInteractionSession } from "@geekist/llm-core/interaction";

const sessionState = new Map();

// In-memory session persistence; swap for a database in production.
const loadSessionState = (sessionId) => sessionState.get(sessionId);
const saveSessionState = (sessionId, state) => {
  sessionState.set(sessionId, state);
};

/** @type {import("@geekist/llm-core/interaction").SessionStore} */
const store = {
  load: loadSessionState,
  save: saveSessionState,
};
```

```ts
import { createAssistantUiInteractionEventStream, createBuiltinModel } from "@geekist/llm-core/adapters";
import { createInteractionSession } from "@geekist/llm-core/interaction";
import type { SessionId, SessionStore, InteractionState } from "@geekist/llm-core/interaction";
import type { AssistantTransportCommand } from "@assistant-ui/react";

const sessionState = new Map<SessionId, InteractionState>();

// In-memory session persistence; swap for a database in production.
const loadSessionState = (sessionId: SessionId) => sessionState.get(sessionId);
const saveSessionState = (sessionId: SessionId, state: InteractionState) => {
  sessionState.set(sessionId, state);
};

const store: SessionStore = {
  load: loadSessionState,
  save: saveSessionState,
};
```

This keeps persistence and UI output in the host layer while the interaction core stays headless.


## Step 2: Handle a chat turn and emit UI commands

```js
/**
 * @typedef ChatTurnInput
 * @property {import("@geekist/llm-core/interaction").SessionId} sessionId
 * @property {string} message
 * @property {(command: import("@assistant-ui/react").AssistantTransportCommand) => void} sendCommand
 */

/**
 * @param {ChatTurnInput} input
 */
export function handleChatTurn(input) {
  const eventStream = createAssistantUiInteractionEventStream({
    sendCommand: input.sendCommand,
  });

  const session = createInteractionSession({
    sessionId: input.sessionId,
    store,
    adapters: { model: createBuiltinModel() },
    eventStream,
  });

  return session.send({ role: "user", content: input.message });
}
```

```ts
export type ChatTurnInput = {
  sessionId: SessionId;
  message: string;
  sendCommand: (command: AssistantTransportCommand) => void;
};

export function handleChatTurn(input: ChatTurnInput) {
  const eventStream = createAssistantUiInteractionEventStream({
    sendCommand: input.sendCommand,
  });

  const session = createInteractionSession({
    sessionId: input.sessionId,
    store,
    adapters: { model: createBuiltinModel() },
    eventStream,
  });

  return session.send({ role: "user", content: input.message });
}
```

Your app calls `handleChatTurn(...)` with a `sessionId`, a message, and a UI command sink. The adapter turns interaction events into UI-specific commands.


## Swap UI adapters without touching interaction logic

The adapter is the only thing that changes:

```diff
- createAssistantUiInteractionEventStream
+ createAiSdkInteractionEventStream
```

The rest of the handler stays the same.


## Run the demo locally

The `examples/interaction-node-sse` app shows the same end-to-end path (Interaction → Session → EventStream) in a tiny Node server.

```bash
# from the repo root
bun install
bun examples/interaction-node-sse/server.js
```

Then open:

http://localhost:3030

To hit the SSE endpoint directly:

http://localhost:3030/chat?sessionId=demo&message=Hello

## Agentic playground (WebSocket + agent loop)

The `examples/agentic` app is a fuller UI that wires the agent loop runtime to an assistant-ui chat and a configurable control panel (tools, skills, MCP, approvals, sub-agents) while still using the same event stream surface.

```bash
# from the repo root
bun install
bun --cwd examples/agentic/server dev
bun --cwd examples/agentic/client dev
```

Then open:

http://localhost:5173

## Next step

If you need multi-step orchestration (RAG, tools, HITL), move to full workflows: