Recipes are Assets
Define declarative flows as named, versioned Recipes and share them across teams like npm packages. Swap providers via Adapters. Stop writing spaghetti glue code.

llm-core connects your business logic to AI models without gluing them together with fragile scripts. You define the Recipe, plug in the Adapters, and let the Runtime handle the execution.
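The Adapter idea is the key to swapping providers: business logic depends on a small interface, not on any one vendor's SDK. The sketch below illustrates the pattern with a hypothetical `ChatModel` interface and two stub providers — these names are illustrative assumptions, not llm-core's actual types.

```typescript
// Hypothetical sketch of the adapter pattern (not llm-core's real types):
// any provider satisfying a shared interface can be plugged into a flow.
interface ChatModel {
  generate(prompt: string): Promise<string>;
}

// Two interchangeable stub providers behind the same interface.
const stubOpenAi: ChatModel = {
  generate: async (prompt) => `openai-stub: ${prompt}`,
};

const stubAnthropic: ChatModel = {
  generate: async (prompt) => `anthropic-stub: ${prompt}`,
};

// Business logic depends only on the interface, so switching
// providers is a one-line change at the call site.
async function summarize(model: ChatModel, text: string): Promise<string> {
  return model.generate(`Summarize: ${text}`);
}

async function main() {
  console.log(await summarize(stubOpenAi, "release notes"));
  console.log(await summarize(stubAnthropic, "release notes"));
}

main();
```

In the real library, `fromAiSdkModel` plays this role: it wraps any AI SDK model behind the `Model` adapter type.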
Define a flow, plug in your adapters, and run it.
```ts
import { recipes } from "@geekist/llm-core/recipes";
import { fromAiSdkModel } from "@geekist/llm-core/adapters";
import type { Model } from "@geekist/llm-core/adapters";
import { openai } from "@ai-sdk/openai";

// 1. Define your recipe (or load a standard one)
const agent = recipes.agent();

// 2. Plug in your adapters
const model: Model = fromAiSdkModel(openai("gpt-4o"));
const workflow = agent.defaults({ adapters: { model } }).build();

// 3. Run it
const result = await workflow.run({ input: "Build me a React app" });
if (result.status === "ok") {
  console.log(result.artefact);
}
```
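The `status` check matters because a run can fail. Assuming the result is a discriminated union along these lines (a sketch for illustration — llm-core's actual result type may differ), narrowing on `status` gives type-safe access to either the artefact or the error:

```typescript
// Hypothetical result shape for illustration; the library's real type may differ.
type RunResult =
  | { status: "ok"; artefact: string }
  | { status: "error"; error: string };

function report(result: RunResult): string {
  if (result.status === "ok") {
    // TypeScript narrows to the "ok" branch, so `artefact` is in scope.
    return result.artefact;
  }
  // In the "error" branch only `error` is in scope.
  return `run failed: ${result.error}`;
}

console.log(report({ status: "ok", artefact: "app scaffolded" }));
console.log(report({ status: "error", error: "model timeout" }));
```

Handling both branches up front keeps provider failures from leaking into your business logic as unhandled exceptions.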