llm-core: Build AI with Recipes, not Glue.

Define declarative flows using Recipes. Swap providers via Adapters. Stop writing spaghetti code.

Stop debugging prompts. Start orchestrating logic.

llm-core connects your business logic to AI models without gluing them together with fragile scripts. You define the Recipe, plug in the Adapters, and let the Runtime handle execution.

Quick start (TS/JS)

Define a flow, plug in your adapters, and run it.

```ts
import { recipes } from "@geekist/llm-core/recipes";
import { fromAiSdkModel } from "@geekist/llm-core/adapters";
import type { Model } from "@geekist/llm-core/adapters";
import { openai } from "@ai-sdk/openai";

// 1. Define your recipe (or load a standard one)
const agent = recipes.agent();

// 2. Plug in your adapters
const model: Model = fromAiSdkModel(openai("gpt-4o"));
const workflow = agent.defaults({ adapters: { model } }).build();

// 3. Run it
const result = await workflow.run({ input: "Build me a React app" });

if (result.status === "ok") {
  console.log(result.artefact);
}
```
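The `status` check above suggests the run result is a discriminated union, so the failure branch deserves handling too. The sketch below is illustrative only: it assumes a union discriminated on `status` with an `"ok"` and an `"error"` arm, and the `error` field name is a guess, not part of the documented llm-core API.

```typescript
// Illustrative result shape: the real type comes from @geekist/llm-core.
// We assume `status` discriminates the union; the `error` field is hypothetical.
type RunResult =
  | { status: "ok"; artefact: string }
  | { status: "error"; error: string };

function handleResult(result: RunResult): string {
  if (result.status === "ok") {
    // Success branch: TypeScript narrows to the "ok" arm, so `artefact` is safe.
    return result.artefact;
  }
  // Failure branch: narrowed to the "error" arm.
  return `run failed: ${result.error}`;
}
```

Narrowing on the `status` field lets the compiler prove which fields exist in each branch, so forgetting the failure path becomes a type error rather than a runtime surprise.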
