Quick Start: Build Your First Agent in 10 Minutes
This tutorial will get you from zero to a working AI agent in under 10 minutes. By the end, you’ll have:
- ✅ OmniDaemon installed and running
- ✅ Your first agent listening for events
- ✅ A publisher sending events
- ✅ Results being stored and retrieved
Step 1: Install Event Bus & Storage Backend
For this Quick Start, we’ll use Redis (the current production-ready backend for both event bus and storage).
💡 OmniDaemon is pluggable! Redis Streams is our first event bus implementation. Coming soon: Kafka, RabbitMQ, NATS. For storage, we support JSON (dev) and Redis (production), with PostgreSQL, MongoDB, and S3 planned.
macOS
Ubuntu/Debian
Windows
Docker (All Platforms - Easiest!)
Verify the backend is running with `redis-cli ping`. Expected output:
PONG
❌ If you see “command not found” or a connection error, the event bus backend isn’t running. Try the Docker method above.
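If you’d rather verify from Python, a quick connectivity check works too. This is an illustration that assumes the `redis` Python package (redis-py) is installed; it is not part of OmniDaemon itself.

```python
# Illustrative connectivity check for the event bus backend.
# Assumes the redis-py package is installed: pip install redis
import redis

client = redis.Redis(host="localhost", port=6379)
print(client.ping())  # prints True when the backend is reachable
```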
Step 2: Install OmniDaemon
Using uv (Recommended - Fast!)
Using pip (Traditional)
Step 3: Create Your First Agent
Create a file called `agent_runner.py`:
SIMPLE VERSION (Required parameters only)
FULL VERSION (With all optional parameters)
Defaults for the optional parameters (a sketch of how they fit together follows this list):
- name: Auto-generated from topic
- description: None
- version: “1.0.0”
- tags: []
- config.max_retries: 3
- config.reclaim_idle_ms: 300000 (5 minutes)
- config.dlq_enabled: True
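As a rough illustration of how the pieces fit together: the parameters and defaults mirror the list above, but the method names (`register_agent`, `run`), the topic name, and the callback signature are assumptions, so check the Python SDK Reference for the actual API. A full-version agent might look roughly like this; the SIMPLE VERSION would simply omit the optional arguments.

```python
# agent_runner.py - illustrative sketch, not the definitive API.
# Method names (register_agent, run) and the callback signature are assumptions;
# the parameters and defaults mirror the list above.
import asyncio
from omnidaemon import OmniDaemonSDK

sdk = OmniDaemonSDK()

async def handle_task(message: dict) -> dict:
    """Hypothetical callback: receive an event payload, return a result."""
    text = message.get("payload", {}).get("text", "")
    return {"summary": text[:100]}

async def main():
    await sdk.register_agent(
        topic="tasks.summarize",        # example topic (assumption)
        callback=handle_task,
        name="summarizer",               # optional; default is derived from the topic
        description="Summarizes text",   # optional; default None
        version="1.0.0",                 # default "1.0.0"
        tags=["demo"],                   # default []
        config={
            "max_retries": 3,            # default 3
            "reclaim_idle_ms": 300_000,  # default 300000 (5 minutes)
            "dlq_enabled": True,         # default True
        },
    )
    await sdk.run()  # hypothetical: block and wait for events

if __name__ == "__main__":
    asyncio.run(main())
```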
Step 4: Run Your Agent
Start the agent script (for example, `python agent_runner.py`). On a successful start, it:
- Shows a “Registered agent” message
- Shows an “Agent running” message
- Doesn’t exit (it stays running, waiting for messages)
| Error | Cause | Fix |
|---|---|---|
| `Connection refused [Errno 111]` | Event bus not running | Go back to Step 1 and start the event bus backend |
| `ModuleNotFoundError: No module named 'omnidaemon'` | Not installed | Go back to Step 2 |
| `ImportError: cannot import name 'OmniDaemonSDK'` | Wrong import | Try `from omnidaemon import OmniDaemonSDK` |
Step 5: Publish an Event
Open a NEW terminal (keep the agent running in the first one!) and create `publisher.py`:
SIMPLE VERSION (Required parameters only)
FULL VERSION (With all optional parameters)
Defaults for the optional parameters (a sketch follows this list):
- webhook: None (no HTTP callback)
- reply_to: None (no response topic)
- correlation_id: Auto-generated UUID
- causation_id: None
- source: “unknown”
- tenant_id: “default”
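Again as an illustration only: the `publish` method name and its signature are assumptions, while the optional fields and their defaults come from the list above. A full-version publisher might look roughly like this; the SIMPLE VERSION would pass only the topic and payload.

```python
# publisher.py - illustrative sketch, not the definitive API.
# The publish method name and signature are assumptions; the optional
# fields and defaults mirror the list above.
import asyncio
from omnidaemon import OmniDaemonSDK

sdk = OmniDaemonSDK()

async def main():
    await sdk.publish(
        topic="tasks.summarize",    # must match the topic your agent listens on
        payload={"text": "OmniDaemon routes events to agents."},
        webhook=None,               # default: no HTTP callback
        reply_to=None,              # default: no response topic
        causation_id=None,          # default None
        source="quickstart",        # default "unknown"
        tenant_id="default",        # default "default"
        # correlation_id is auto-generated as a UUID when omitted
    )
    print("Event published; watch the agent terminal for processing output.")

if __name__ == "__main__":
    asyncio.run(main())
```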
Step 6: Check System Health
In a new terminal, run a health check with the OmniDaemon CLI (see the CLI Reference below for the exact command) to confirm the event bus, storage backend, and your agent are all healthy.
🎉 Success! What Just Happened?
You now have a fully functional event-driven AI agent runtime:
- ✅ Event Bus - Running and handling message distribution (using Redis Streams)
- ✅ Storage Backend - Persisting agents, results, and metrics (using Redis)
- ✅ OmniDaemon - Installed and operational
- ✅ Agent - Registered and listening for events
- ✅ Event Flow - Published task → Agent processed → Result stored
- ✅ Health Check - All systems verified
⚙️ Configuration (Optional)
The Quick Start uses smart defaults - you don’t need to configure anything! Defaults:
- Storage Backend: JSON files in `.omnidaemon_data/` (pluggable)
- Event Bus: Redis Streams at `localhost:6379` (pluggable)
- API: Disabled (use SDK/CLI only)
To override the defaults, set these variables in a `.env` file:
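A minimal example using the settings from the table below; the values shown are illustrative, so adjust them to your deployment.

```
# Example .env (illustrative values)
STORAGE_BACKEND=redis
REDIS_URL=redis://localhost:6379/0
OMNIDAEMON_API_ENABLED=true
LOG_LEVEL=DEBUG
```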
| Setting | Change When… |
|---|---|
| `STORAGE_BACKEND=redis` | Production deployment, need distributed storage |
| `REDIS_URL=...` | Event bus or storage on a different host/port |
| `OMNIDAEMON_API_ENABLED=true` | Want HTTP API access |
| `LOG_LEVEL=DEBUG` | Troubleshooting issues |
🐛 Quick Troubleshooting
Problem: “Event Bus connection keeps failing”
Make sure the event bus backend from Step 1 is actually running and that `REDIS_URL` points at the right host and port (see the configuration table above).
Problem: “Agent runs but doesn’t process tasks”
Check that the publisher sends to the same topic the agent subscribed to, and look for errors in the agent terminal.
Problem: “No output when running agent”
This is normal! The agent runs in the background (see the debug-logging note below if you need more output). Look for:
- ✅ “Registered agent” message
- ✅ “Listening for topics” message
- ✅ No error messages
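For more detail while troubleshooting, raise the log level. Assuming OmniDaemon logs through Python’s standard logging module (an assumption; otherwise use `LOG_LEVEL=DEBUG` from the configuration table above), you can enable debug output at the top of your agent script:

```python
# Illustrative only: enable verbose logging before starting the agent,
# assuming OmniDaemon uses Python's standard logging module.
import logging

logging.basicConfig(level=logging.DEBUG)
```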
Problem: “Can’t import OmniDaemonSDK”
Make sure OmniDaemon is installed in the active environment (Step 2) and use `from omnidaemon import OmniDaemonSDK`, as shown in the error table in Step 4.
🚀 What’s Next?
Congratulations! You’ve built your first AI agent with OmniDaemon. Here’s where to go next:
Learn More
- Core Concepts - Understand EDA deeply
- Agent Lifecycle - Registration, subscription, deletion
- Callback Pattern - Master the callback
Build Real Agents
- Use OmniCore Agent - AI agent with MCP tools
- Use Google ADK - Google’s Agent Development Kit
- Common Patterns - 7 production-ready recipes
Go to Production
- Production Setup - Deploy for real
- Monitoring - Metrics, health, DLQ
Explore the API
- Python SDK Reference - Complete API docs
- CLI Reference - All CLI commands
📖 Need Help?
Happy building! 🎉