Go SDK

The Go SDK provides native support for building AI workflow agents in Go, with generics-based type safety for graph state.

Install the SDK:
go get github.com/duragraph/duragraph-go

State flows through the graph as nodes process it. Define your state as a Go struct:

type ChatState struct {
	Messages []string `json:"messages"`
	Result   string   `json:"result,omitempty"`
}

Nodes implement the graph.Node interface:

import (
	"context"

	"github.com/duragraph/duragraph-go/graph"
	"github.com/duragraph/duragraph-go/llm"
	"github.com/duragraph/duragraph-go/llm/openai"
)

type ThinkNode struct {
	llm llm.Provider
}

func (n *ThinkNode) Execute(ctx context.Context, state *ChatState) (*ChatState, error) {
	messages := make([]llm.Message, len(state.Messages))
	for i, m := range state.Messages {
		messages[i] = llm.Message{Role: "user", Content: m}
	}

	resp, err := n.llm.Complete(ctx, messages)
	if err != nil {
		return nil, err
	}

	state.Result = resp.Content
	return state, nil
}

Connect nodes with edges:

func NewChatAgent() *graph.Graph[*ChatState] {
	g := graph.New[*ChatState]("chat_agent")

	// Add nodes
	g.AddNode("think", &ThinkNode{llm: openai.New()})
	g.AddNode("respond", &RespondNode{})

	// Connect with edges
	g.AddEdge("think", "respond")

	// Set entry point
	g.SetEntrypoint("think")

	return g
}

func main() {
	g := NewChatAgent()

	result, err := g.Run(context.Background(), &ChatState{
		Messages: []string{"Hello, how can I help you today?"},
	})
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(result.Result)
}

For dynamic branching, implement the Router interface:

type DecisionNode struct{}

func (n *DecisionNode) Execute(ctx context.Context, state *ChatState) (*ChatState, error) {
	// Analyze state and prepare for routing decision
	return state, nil
}

func (n *DecisionNode) Route(ctx context.Context, state *ChatState) (string, error) {
	// needsSearch is your own predicate over the state.
	if needsSearch(state) {
		return "search", nil
	}
	return "respond", nil
}

The SDK supports multiple LLM providers:

import (
	"github.com/duragraph/duragraph-go/llm"
	"github.com/duragraph/duragraph-go/llm/openai"
)

client := openai.New() // Uses OPENAI_API_KEY env var

resp, err := client.Complete(ctx, messages,
	llm.WithModel("gpt-4o-mini"),
	llm.WithTemperature(0.7),
)

Define tools for the LLM to call:

tools := []llm.Tool{
	{
		Name:        "search",
		Description: "Search the web for information",
		Parameters: map[string]any{
			"type": "object",
			"properties": map[string]any{
				"query": map[string]any{
					"type":        "string",
					"description": "The search query",
				},
			},
			"required": []string{"query"},
		},
	},
}

resp, err := client.Complete(ctx, messages, llm.WithTools(tools))

// Handle tool calls
for _, call := range resp.ToolCalls {
	switch call.Name {
	case "search":
		// Use the comma-ok form so a malformed argument can't panic.
		query, ok := call.Arguments["query"].(string)
		if !ok {
			continue // malformed arguments; skip this call
		}
		result := performSearch(query)
		// Add tool result to messages and continue
	}
}
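For tools with more than one argument, pulling each field out of the map individually gets tedious. A common generic pattern (plain encoding/json, not a duragraph API) is to round-trip the arguments map into a typed struct that mirrors the tool's parameter schema; SearchArgs and decodeArgs here are illustrative names:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// SearchArgs mirrors the JSON-Schema parameters declared for the "search" tool.
type SearchArgs struct {
	Query string `json:"query"`
}

// decodeArgs converts a tool call's loosely typed arguments map into a typed
// struct by marshaling to JSON and unmarshaling into the destination.
func decodeArgs(args map[string]any, dst any) error {
	b, err := json.Marshal(args)
	if err != nil {
		return err
	}
	return json.Unmarshal(b, dst)
}

func main() {
	raw := map[string]any{"query": "weather in Oslo"}

	var args SearchArgs
	if err := decodeArgs(raw, &args); err != nil {
		panic(err)
	}
	fmt.Println(args.Query) // weather in Oslo
}
```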

Configure workers for production deployment:

w := worker.New(g,
	worker.WithControlPlane("http://localhost:8081"),
	worker.WithConcurrency(10),              // Process 10 runs concurrently
	worker.WithPollInterval(time.Second),    // Poll every second
	worker.WithAPIKey(os.Getenv("API_KEY")), // Authenticate with control plane
)

Errors in nodes stop graph execution:

func (n *MyNode) Execute(ctx context.Context, state *State) (*State, error) {
	result, err := doSomething()
	if err != nil {
		// Wrap with context for debugging
		return nil, fmt.Errorf("MyNode failed: %w", err)
	}
	return state, nil
}

Full API documentation is available at pkg.go.dev/github.com/duragraph/duragraph-go.

See the examples directory for complete working examples:

  • Chat Agent - Simple conversational agent
  • RAG Agent - Retrieval-augmented generation
  • Multi-Agent - Coordinating multiple agents