The Go framework for AI agents

Type-safe tools. Multi-provider routing. Durable execution. Production observability. One dependency.

Works with

Anthropic · OpenAI · Google Gemini · OpenRouter · Ollama · LM Studio

What you can build

Customer Support Agent

Route conversations to specialized agents with type-safe handoffs.

agent := graft.NewAgent("support",
  graft.WithTools(lookupOrder, refund),
  graft.WithHandoffs(handoff.To(billing)),
)

Multi-Agent Orchestration

Compose sub-agents that run in parallel with isolated context.

graft.RunSubAgentsParallel(ctx, runner,
  []graft.SubAgent{
    {Agent: researcher},
    {Agent: writer},
  }, messages)

Durable AI Pipelines

Survive crashes with Temporal or Hatchet-backed execution.

runner := temporal.NewRunner(client,
  temporal.WithTaskQueue("agents"),
)
result, _ := runner.Run(ctx, agent, msgs)

Observable RAG

Full OpenTelemetry tracing for every token and tool call.

runner := otel.InstrumentRunner(
  graft.NewDefaultRunner(model),
  otel.WithTracerProvider(tp),
)

Developer experience, not boilerplate

Minimal code. Maximum capability.

Define a tool

tool := graft.NewTool("search", "Search the web",
  func(ctx context.Context, p struct {
    Query string `json:"query"`
  }) (string, error) {
    return search(p.Query), nil
  })

Switch providers

model := openai.New(openai.WithModel("gpt-4o"))
model := anthropic.New(anthropic.WithModel("claude-sonnet-4-20250514"))
model := google.New(google.WithModel("gemini-2.5-pro"))

Add observability

runner := graft.NewDefaultRunner(model)
runner = otel.InstrumentRunner(runner,
  otel.WithTracerProvider(tp))

Batteries included

15 packages. One dependency. Everything for production AI agents.

Type-Safe Tools

Struct tags become JSON Schema via reflection.
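A simplified plain-Go sketch of the idea (not Graft's actual implementation): walk a params struct with `reflect` and map its `json` tags to JSON Schema types. The `schemaFor` helper and its partial kind mapping are illustrative assumptions.

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// schemaFor derives a property-name → JSON Schema type map
// from a struct's `json` tags via reflection (partial sketch).
func schemaFor(v any) map[string]string {
	props := map[string]string{}
	t := reflect.TypeOf(v)
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		name := strings.Split(f.Tag.Get("json"), ",")[0]
		if name == "" {
			name = f.Name
		}
		switch f.Type.Kind() {
		case reflect.String:
			props[name] = "string"
		case reflect.Int, reflect.Int64, reflect.Float64:
			props[name] = "number"
		case reflect.Bool:
			props[name] = "boolean"
		}
	}
	return props
}

func main() {
	type params struct {
		Query string `json:"query"`
		Limit int    `json:"limit"`
	}
	fmt.Println(schemaFor(params{}))
}
```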

Multi-Provider

Anthropic, OpenAI, Gemini. Fallback + round-robin.
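The fallback + round-robin pattern can be sketched in plain Go against a hypothetical `Model` interface (Graft's real provider interface will differ): rotate the starting provider per call, and walk the ring on errors.

```go
package main

import (
	"errors"
	"fmt"
	"sync/atomic"
)

// Model is a stand-in for a provider client.
type Model interface {
	Generate(prompt string) (string, error)
}

type mockModel struct {
	name string
	fail bool
}

func (m mockModel) Generate(prompt string) (string, error) {
	if m.fail {
		return "", errors.New(m.name + ": unavailable")
	}
	return m.name + ": " + prompt, nil
}

// roundRobin rotates the starting provider per call and
// falls back to the next provider in the ring on error.
type roundRobin struct {
	models []Model
	next   atomic.Uint64
}

func (r *roundRobin) Generate(prompt string) (string, error) {
	start := int(r.next.Add(1)-1) % len(r.models)
	var lastErr error
	for i := 0; i < len(r.models); i++ {
		out, err := r.models[(start+i)%len(r.models)].Generate(prompt)
		if err == nil {
			return out, nil
		}
		lastErr = err
	}
	return "", lastErr
}

func main() {
	router := &roundRobin{models: []Model{
		mockModel{name: "anthropic", fail: true},
		mockModel{name: "openai"},
	}}
	out, _ := router.Generate("hi")
	fmt.Println(out) // falls back past the failing provider
}
```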

Agent Handoffs

LLM-driven routing between specialized agents.

Lifecycle Hooks

14+ events: pre/post generate, tool calls, errors.

Guardrails

Input, output, and tool validation out of the box.

OpenTelemetry

Tracing and metrics from day one. Vendor-neutral.

Session State

Memory and file-backed persistence. Transparent.

SubAgents

Context-isolated child agents. Run in parallel.

Streaming + SSE

Go channels with built-in SSE HTTP adapter.
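The adapter's job is to turn a token channel into Server-Sent Events frames (`data: …\n\n`) that an HTTP handler flushes per token. A minimal sketch of that wire format, independent of Graft's actual adapter:

```go
package main

import (
	"fmt"
	"strings"
)

// sseFrame encodes one token as a Server-Sent Events frame.
func sseFrame(token string) string {
	return "data: " + token + "\n\n"
}

// drain reads tokens from a channel, as a streaming runner would emit them,
// and concatenates their SSE frames (a real handler would flush each one).
func drain(tokens <-chan string) string {
	var b strings.Builder
	for tok := range tokens {
		b.WriteString(sseFrame(tok))
	}
	return b.String()
}

func main() {
	ch := make(chan string, 2)
	ch <- "Hello"
	ch <- "world"
	close(ch)
	fmt.Print(drain(ch))
}
```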

MCP Protocol

Client and server support for the MCP ecosystem.

Temporal

Durable execution with deterministic replay.

Hatchet

PostgreSQL-powered high-throughput durability.

Graph Orchestration

State machines, reducers, checkpointing.

Trigger.dev

Waitpoints, warm starts, zero-timeout tasks.

Pluggable Tracing

Braintrust, LangSmith, OTel, or custom providers.

Why Graft?

The only comprehensive Go-native AI agent framework.

|  | Graft | LangChain | OpenAI SDK | Vercel AI SDK |
| --- | --- | --- | --- | --- |
| Language | Go-native | Python | Python | TypeScript |
| Durable Execution | Temporal + Hatchet + Trigger.dev | Checkpoints | None | None |
| MCP Support | Client + Server | Limited | Basic | None |
| Observability | OTel + Braintrust + LangSmith | LangSmith | Custom | Middleware |
| Graph Orchestration | Built-in | LangGraph (separate) | None | None |
| Dependencies | 1 (OTel) | Many | Several | Many |
| Tool Type Safety | Generics + reflection | Runtime | Runtime | Zod schemas |
| Streaming | Go channels + SSE | Async iteration | SSE | streamText |

How it works

Messages → Agent → Runner → LLM → Tools → Result

Generate, execute tool calls, repeat. Handoffs switch agents mid-loop.
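The loop above can be sketched in plain Go with hypothetical types (Graft's real runner is more involved): generate a reply, execute any tool calls, feed results back, and stop when the model answers without calling a tool.

```go
package main

import "fmt"

type toolCall struct{ name, arg string }

type reply struct {
	text  string
	calls []toolCall
}

// generate stands in for an LLM turn: on the first turn it requests
// a tool call; once a tool result is in history, it answers.
func generate(history []string) reply {
	if len(history) == 1 {
		return reply{calls: []toolCall{{name: "search", arg: "go agents"}}}
	}
	return reply{text: "done: " + history[len(history)-1]}
}

// runLoop is the generate → execute → repeat loop.
func runLoop(userMsg string, tools map[string]func(string) string) string {
	history := []string{userMsg}
	for {
		r := generate(history)
		if len(r.calls) == 0 {
			return r.text // no tool calls left: final answer
		}
		for _, c := range r.calls {
			history = append(history, tools[c.name](c.arg)) // execute, feed back
		}
	}
}

func main() {
	tools := map[string]func(string) string{
		"search": func(q string) string { return "results for " + q },
	}
	fmt.Println(runLoop("find go agent frameworks", tools))
}
```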

Start in 30 seconds

1. Install

go get github.com/delavalom/graft
2. Define your agent

agent := graft.NewAgent("assistant",
  graft.WithInstructions("You are helpful."),
  graft.WithTools(myTool),
)
3. Run it

runner := graft.NewDefaultRunner(model)
result, _ := runner.Run(ctx, agent, messages)
fmt.Println(result.LastAssistantText())

Build something great with Go

go get github.com/delavalom/graft
Read the Docs →