
Overview

The LlamaIndexAgent connects AG-Kit to the LlamaIndex TS workflow system. It can:
  • Build a simple single-agent workflow from llm, tools, and systemPrompt
  • Accept a fully custom workflow via workflowFactory
  • Stream AG-UI compatible events (text chunks, tool calls/results)
  • Maintain per-thread conversation memory via memoryFactory
Under the hood it uses an AguiLlamaIndexWorkflow to translate LlamaIndex workflow events into BaseEvent for @ag-ui/client.

Installation

pnpm add @ag-kit/agents @ag-kit/adapter-llamaindex llamaindex @llamaindex/openai @llamaindex/workflow zod

Exports

All exports are available from @ag-kit/adapter-llamaindex:
  • LlamaIndexAgent - Main agent class for LlamaIndex integration
  • AguiLlamaIndexWorkflow - Utility to construct runnable workflows (single or multi-agent)

Quick start (single agent)

import { LlamaIndexAgent } from "@ag-kit/adapter-llamaindex";
import { OpenAI } from "@llamaindex/openai";

const llm = new OpenAI({
  model: process.env.OPENAI_MODEL!,
  apiKey: process.env.OPENAI_API_KEY!,
  baseURL: process.env.OPENAI_BASE_URL!,
});

export const agent = new LlamaIndexAgent({
  name: "llamaindex-agent",
  description: "Single-agent with LlamaIndex LLM",
  llm,
  // tools, systemPrompt are optional here
});
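If you do pass tools and a system prompt, a minimal sketch might look like the following. The tool() helper and its option names are assumed from LlamaIndex TS (verify against your installed version), and the weather tool itself is hypothetical:

```typescript
import { LlamaIndexAgent } from "@ag-kit/adapter-llamaindex";
import { OpenAI } from "@llamaindex/openai";
import { tool } from "llamaindex"; // assumed helper; check your llamaindex version
import { z } from "zod";

// Hypothetical tool: returns a stubbed weather report for a city.
const getWeather = tool({
  name: "get_weather",
  description: "Return the current weather for a city",
  parameters: z.object({ city: z.string() }),
  execute: async ({ city }) => `It is sunny in ${city}`,
});

export const agent = new LlamaIndexAgent({
  name: "weather-agent",
  description: "Single agent with one tool",
  llm: new OpenAI({ model: "gpt-4o-mini" }),
  tools: [getWeather],
  systemPrompt: "Answer weather questions using the get_weather tool.",
});
```

The agent emits TOOL_CALL_* events when the LLM invokes the tool, so the client sees both the call arguments and the stubbed result.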

Custom workflow (multi-agent or advanced)

import { LlamaIndexAgent, AguiLlamaIndexWorkflow } from "@ag-kit/adapter-llamaindex";

export const agent = new LlamaIndexAgent({
  name: "multi-agent",
  workflowFactory: () => {
    // Build with AguiLlamaIndexWorkflow helpers
    return AguiLlamaIndexWorkflow.fromTools({
      name: "Orchestrator",
      llm,
      tools,
      systemPrompt: "You are the orchestrator",
    });
  },
});

Server integration

import { run } from "@ag-kit/server";
import { agent } from "./agent";

run({ createAgent: () => ({ agent }) });

API

class LlamaIndexAgent

new LlamaIndexAgent(config: AgentConfig & {
  // Base single-agent params (used when workflowFactory is not provided)
  llm?: LLM;
  tools?: BaseTool[];
  systemPrompt?: string;
  name?: string;
  // Advanced: supply your own runnable workflow
  workflowFactory?: () => RunnableWorkflow;
  // Optional conversation memory provider
  memoryFactory?: () => Memory;
})
  • name (string) - Unique identifier for the agent. Defaults to "Agent" if omitted.
  • description (string) - Human-readable description of the agent.
  • llm (LLM) - LlamaIndex LLM instance. Used when building the default single-agent workflow.
  • tools (BaseTool[]) - Array of LlamaIndex tools. These are called by the agent during execution.
  • systemPrompt (string) - Optional system prompt injected before user messages.
  • workflowFactory (() => RunnableWorkflow) - Provide a custom workflow. When present, it overrides the default single-agent workflow.
  • memoryFactory (() => Memory) - Factory for per-thread conversation memory.
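As a sketch of wiring up memoryFactory, assuming LlamaIndex TS exposes a createMemory helper (verify the export name in your installed version):

```typescript
import { LlamaIndexAgent } from "@ag-kit/adapter-llamaindex";
import { OpenAI } from "@llamaindex/openai";
import { createMemory } from "llamaindex"; // assumed export; check your version

export const agent = new LlamaIndexAgent({
  name: "memory-agent",
  llm: new OpenAI({ model: "gpt-4o-mini" }),
  // Invoked once per thread, so each conversation keeps isolated history.
  memoryFactory: () => createMemory(),
});
```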

Methods

run()
run(input: RunAgentInput): Observable<BaseEvent>
Executes the workflow and emits BaseEvent values from @ag-ui/client.
  • input - RunAgentInput containing messages, runId, threadId, tools, etc.
  • returns - Observable<BaseEvent>
Note: Event types follow the same semantics as LanggraphAgent (RUN_STARTED, TEXT_MESSAGE_CONTENT, TOOL_CALL_START, TOOL_CALL_ARGS, TOOL_CALL_RESULT, RUN_FINISHED, etc.).
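As a sketch of consuming the event stream: the input shape below is illustrative (RunAgentInput has more fields; consult @ag-ui/client), and the delta field on text events is assumed from AG-UI's streaming semantics.

```typescript
import { agent } from "./agent"; // a LlamaIndexAgent instance

// Illustrative input; see @ag-ui/client for the full RunAgentInput type.
const events$ = agent.run({
  threadId: "thread-1",
  runId: "run-1",
  messages: [{ id: "1", role: "user", content: "Hello!" }],
  tools: [],
  context: [],
  state: {},
} as any);

events$.subscribe({
  next: (event) => {
    // Text arrives as incremental chunks; other event types carry
    // tool call names, arguments, and results.
    if (event.type === "TEXT_MESSAGE_CONTENT") {
      process.stdout.write((event as any).delta);
    }
  },
  complete: () => console.log("\nrun finished"),
});
```

Because run() returns an RxJS-style Observable, the stream can also be piped through operators (filter, scan, etc.) before reaching the client.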

See also