Cloudflare Agents
January 30, 2026
Cloudflare Agents is an SDK for building AI agents that run on Cloudflare's edge network. Built on Durable Objects, it provides stateful, globally distributed infrastructure for autonomous systems.
The Edge Advantage
Traditional agents run on centralized servers. Cloudflare Agents run everywhere:
- Low latency: Close to users globally
- Stateful: Durable Objects persist state
- Scalable: Millions of concurrent agents
- Integrated: Native Cloudflare ecosystem
Each agent is backed by a Durable Object, which gives it persistent, durable storage scoped to that instance rather than ephemeral scratch space, making it well suited to long-running AI agents.
Quick Start
npx create-cloudflare@latest --template cloudflare/agents-starter
npm install
npm start
Defining an Agent
Agents extend the base Agent class:
import { Agent } from "agents";

export class MyAgent extends Agent {
  async onMessage(message: string) {
    // Handle incoming messages with a Workers AI model (via the env AI binding)
    const response = await this.env.AI.run("@cf/meta/llama-3-8b-instruct", {
      prompt: message,
    });
    return response;
  }

  async onSchedule(trigger: string) {
    // Handle scheduled tasks
    await this.doPeriodicWork();
  }
}
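Because each agent instance is a Durable Object, a front-door Worker can route traffic to it with the standard Durable Objects API. The sketch below assumes a MyAgent namespace binding and a one-agent-per-user naming scheme; both are illustrative choices, not part of the starter template.
// Hypothetical front-door Worker; "MyAgent" is an assumed Durable Object binding.
interface Env {
  MyAgent: DurableObjectNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Illustrative naming scheme: one agent instance per user.
    const userId = new URL(request.url).searchParams.get("user") ?? "anonymous";

    // Standard Durable Objects routing: name -> id -> stub -> forward the request.
    const id = env.MyAgent.idFromName(userId);
    const stub = env.MyAgent.get(id);
    return stub.fetch(request);
  },
};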
Core Capabilities
Built-in State
Every agent has persistent state:
class MyAgent extends Agent {
  async handleRequest(userId: string) {
    // Read state (this.state holds the agent's persisted state object)
    const count = this.state.messageCount || 0;

    // Update state; changes are persisted automatically
    await this.setState({ messageCount: count + 1 });

    // SQL queries against the agent's embedded SQLite storage
    const history = await this.sql`
      SELECT * FROM messages
      WHERE user_id = ${userId}
      ORDER BY created_at DESC
      LIMIT 10
    `;
    return history;
  }
}
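The query above assumes a messages table already exists. A minimal way to create and populate one with the same this.sql tag is sketched below; the helper names and column layout are assumptions for illustration.
class MyAgent extends Agent {
  // Hypothetical schema setup; column names mirror the query above.
  async ensureSchema() {
    await this.sql`
      CREATE TABLE IF NOT EXISTS messages (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        user_id TEXT NOT NULL,
        content TEXT NOT NULL,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
      )
    `;
  }

  // Hypothetical helper to record a message for later retrieval.
  async logMessage(userId: string, content: string) {
    await this.sql`
      INSERT INTO messages (user_id, content)
      VALUES (${userId}, ${content})
    `;
  }
}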
WebSocket Support
Real-time communication with clients:
class MyAgent extends Agent {
  // Track connected clients for this example
  sockets = new Set<WebSocket>();

  async onConnect(socket: WebSocket) {
    // New client connected
    this.sockets.add(socket);
    socket.addEventListener("close", () => this.sockets.delete(socket));
  }

  async onMessage(socket: WebSocket, message: string) {
    // Handle real-time messages
    const response = await this.processMessage(message);
    socket.send(JSON.stringify(response));
  }

  broadcast(message: string) {
    // Send to all connected clients
    for (const socket of this.sockets) {
      socket.send(message);
    }
  }
}
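On the client side, any standard WebSocket works. The URL below is a placeholder for whatever route your Worker exposes for the agent, not a fixed SDK path.
// Hypothetical browser client; adjust the URL to match your Worker's routing.
const ws = new WebSocket("wss://my-agent.example.workers.dev/agents/my-agent/user-123");

ws.addEventListener("open", () => {
  ws.send(JSON.stringify({ type: "chat", text: "hello" }));
});

ws.addEventListener("message", (event) => {
  console.log("agent says:", event.data);
});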
Scheduling
Run tasks on schedules or delays:
class MyAgent extends Agent {
  async scheduleReminder(userId: string, message: string, delay: number) {
    await this.schedule({
      type: "reminder",
      userId,
      message,
    }, Date.now() + delay);
  }

  async onSchedule(payload: any) {
    if (payload.type === "reminder") {
      await this.sendNotification(payload.userId, payload.message);
    }
  }
}
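Anything with access to the agent can call scheduleReminder; a request handler is one natural place. The onRequest endpoint and the one-hour delay below are illustrative assumptions.
class MyAgent extends Agent {
  async onRequest(request: Request) {
    // Hypothetical endpoint: schedule a reminder one hour from now.
    const { userId, message } = (await request.json()) as {
      userId: string;
      message: string;
    };
    await this.scheduleReminder(userId, message, 60 * 60 * 1000); // delay in ms
    return Response.json({ scheduled: true });
  }
}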
AI Model Access
Native integration with Cloudflare Workers AI:
class MyAgent extends Agent {
  async generate(prompt: string) {
    // Run inference on Workers AI via the env AI binding
    const response = await this.env.AI.run(
      "@cf/meta/llama-3-8b-instruct",
      { prompt }
    );

    // Or use external providers
    const openaiResponse = await fetch("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${this.env.OPENAI_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "gpt-4o",
        messages: [{ role: "user", content: prompt }],
      }),
    });
    return { workersAi: response, openai: await openaiResponse.json() };
  }
}
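These examples assume the agent's environment exposes a Workers AI binding and an OpenAI API key. A matching type declaration might look like the sketch below; the binding names are assumptions and must match your wrangler configuration and secrets, and the generic parameter mirrors the usual pattern of typing the environment, so check the SDK's type definitions if yours differ.
// Assumed environment bindings for the examples above.
interface Env {
  AI: Ai;                  // Workers AI binding
  OPENAI_API_KEY: string;  // secret, e.g. set with `wrangler secret put OPENAI_API_KEY`
}

// Typing the agent against Env makes this.env.AI and this.env.OPENAI_API_KEY available.
export class MyAgent extends Agent<Env> {
  // ...
}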
State Synchronization
Automatic sync between agent and clients:
// Agent side
class MyAgent extends Agent {
  async updateState(data: any) {
    await this.setState(data);
    // Clients automatically receive updates
  }
}

// Client side (React)
import { useAgent } from "agents/react";

function App() {
  const { state, send } = useAgent("my-agent");
  return (
    <div>
      <p>Messages: {state.messageCount}</p>
      <button onClick={() => send("hello")}>Send</button>
    </div>
  );
}
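The agent can also react when its state changes, whether the update came from its own code or from a connected client. The hook name below is an assumption; verify it against the SDK's type definitions before relying on it.
class MyAgent extends Agent {
  // Assumed state-change hook; confirm the exact name and signature in the SDK types.
  onStateUpdate(state: any) {
    console.log("agent state changed:", state);
  }
}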
Workflows Integration
Combine with Cloudflare Workflows for complex orchestration:
import { WorkflowEntrypoint, WorkflowEvent, WorkflowStep } from "cloudflare:workers";

// Placeholder helpers standing in for calls into the agent (for example via its
// Durable Object stub); they are not part of the Workflows API.
declare function fetchSources(topic: string): Promise<unknown>;
declare function analyze(data: unknown): Promise<unknown>;
declare function generateReport(analysis: unknown): Promise<string>;

type Params = { topic: string };

// Env is the Worker's environment type (e.g. generated by `wrangler types`).
export class ResearchWorkflow extends WorkflowEntrypoint<Env, Params> {
  async run(event: WorkflowEvent<Params>, step: WorkflowStep) {
    // Step 1: Gather data (each step's result is durably checkpointed)
    const data = await step.do("gather", async () => {
      return await fetchSources(event.payload.topic);
    });

    // Step 2: Analyze (retried up to 3 times on failure)
    const analysis = await step.do(
      "analyze",
      { retries: { limit: 3, delay: "10 seconds" } },
      async () => analyze(data)
    );

    // Step 3: Generate report
    return await step.do("report", async () => generateReport(analysis));
  }
}
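Workflows are started from a binding on the environment, so an agent can kick one off and track it in its own state. The RESEARCH_WORKFLOW binding name below is an assumption that would need to match your wrangler configuration.
class MyAgent extends Agent {
  async startResearch(topic: string) {
    // Assumed Workflows binding; instances are created with optional params.
    const instance = await this.env.RESEARCH_WORKFLOW.create({
      params: { topic },
    });
    // Persist the instance id so the agent can check on it later.
    await this.setState({ researchId: instance.id });
    return instance.id;
  }
}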
Use Cases
- Customer support chatbots handling queries in real time
- Workflow automators with human-in-the-loop approval
- Code assistants using models like Claude
- Multi-agent systems for data processing
Real-world example from a developer:
I built an AI Agent to handle all my emails autonomously using Cloudflare Durable Objects and Vercel AI SDK.
Deployment
# Local development
npm start
# Deploy to production
npm run deploy
Agents deploy globally on Cloudflare's network, automatically routing to the nearest edge location.
When to Use Cloudflare Agents
Good fit:
- Need global low-latency agents
- Want built-in state management
- Building real-time applications
- Already using Cloudflare
Consider alternatives when:
- Need complex LLM orchestration (use LangGraph)
- Provider-specific features needed (use native SDKs)
- Not ready to adopt the edge paradigm
Sources
- Cloudflare Agents Documentation — Official docs
- Durable Objects — State infrastructure
- Workers AI — Model inference
- GitHub: cloudflare/agents-starter — Starter template
See also: MCP Servers · Orchestration · Memory & State