Demo: Hello World
The simplest possible OpenAgents demo - a single agent that replies to any message in a chat channel. This is the perfect starting point to verify your installation works.
Important: If you already have an OpenAgents network running, stop it before starting this demo.
What You'll Learn
- How to start a network with the messaging mod
- How to run a YAML-configured agent
- Basic agent configuration patterns
- How to interact with agents via Studio
Architecture
┌──────────────────────────────────────────────────┐
│                 general channel                  │
│                                                  │
│   User: "Hello!"                                 │
│       │                                          │
│       ▼                                          │
│   ┌─────────┐                                    │
│   │ charlie │  "Hello! Welcome to OpenAgents!"   │
│   └─────────┘                                    │
│                                                  │
│                  messaging mod                   │
└──────────────────────────────────────────────────┘
Prerequisites
- OpenAgents installed (pip install openagents)
- An OpenAI API key
# Optional: change the base URL of the OpenAI API
export OPENAI_BASE_URL="your-base-url-here"
export OPENAI_API_KEY="your-key-here"
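Before moving on, you can sanity-check that the key is actually visible in your shell (the same check appears in Troubleshooting below):
# Prints the key if it was exported correctly
echo $OPENAI_API_KEY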
Quick Start
Terminal 1: Start the Network
openagents network start demos/00_hello_world/
You should see output indicating the network is running on ports 8700 (HTTP) and 8600 (gRPC).
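Optionally, before starting the agent, you can confirm the network is reachable by hitting the HTTP health endpoint also used in Troubleshooting below (assumes curl is installed):
# Should answer once the network is up on port 8700
curl http://localhost:8700/health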
Terminal 2: Start the Agent
openagents agent start demos/00_hello_world/agents/charlie.yaml
Terminal 3: Connect with Studio
openagents studio -s
Navigate to http://localhost:8050 and connect to localhost:8700.
Chat with Charlie
In the general channel, say hello or ask Charlie a question. You should see Charlie respond to your messages.

Understanding the Configuration Files
Network Configuration
The network uses the messaging mod to enable channel-based chat:
# demos/00_hello_world/network.yaml
network:
  name: "HelloWorld"
  mode: "centralized"
  node_id: "hello-world-1"
  transports:
    - type: "http"
      config:
        port: 8700
    - type: "grpc"
      config:
        port: 8600
mods:
  - name: "openagents.mods.workspace.messaging"
    enabled: true
    config:
      default_channels:
        - name: "general"
          description: "General chat channel"
Agent Configuration
Charlie is a simple CollaboratorAgent that responds to all messages:
# demos/00_hello_world/agents/charlie.yaml
type: "openagents.agents.collaborator_agent.CollaboratorAgent"
agent_id: "charlie"
config:
  model_name: "gpt-5-mini"
  instruction: |
    You are Charlie, a friendly agent in an OpenAgents demo.
    YOUR ROLE:
    Reply to any message you receive in a friendly and helpful manner.
    BEHAVIOR:
    - Be warm and welcoming
    - Keep responses concise (1-3 sentences)
    - If someone says hello, greet them back
    - If someone asks a question, try to help
  react_to_all_messages: true
mods:
  - name: "openagents.mods.workspace.messaging"
    enabled: true
connection:
  host: "localhost"
  port: 8700
  transport: "grpc"
Key Concepts
react_to_all_messages
When set to true, the agent responds to every message in channels it can see. This is perfect for simple greeting bots but should be used carefully in multi-agent scenarios to avoid infinite loops.
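If you reuse this agent in a multi-agent demo, you can flip the same flag in its YAML so it stops replying to every message (a minimal sketch using the field shown in charlie.yaml above):
config:
  react_to_all_messages: false  # avoid agents endlessly replying to each other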
CollaboratorAgent
The CollaboratorAgent type is an LLM-powered agent that:
- Receives messages and events
- Uses an LLM to generate responses
- Has access to mod tools (like send_channel_message)
Messaging Mod
The messaging mod provides:
- Channel-based communication
- Direct messages between agents
- Thread support for conversations
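For example, additional channels can be declared in network.yaml by extending the default_channels list shown earlier (the "random" channel here is just an illustration, not part of the demo):
default_channels:
  - name: "general"
    description: "General chat channel"
  - name: "random"
    description: "Off-topic chat channel"  # hypothetical second channel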
Try It Out
Once everything is running, try these interactions in the general channel:
- Say hello: "Hi Charlie!"
- Ask a question: "What can you help me with?"
- Share something: "I just learned about OpenAgents!"
Charlie will respond to each message with a friendly reply.
Troubleshooting
Agent Not Responding
- Check the network is running: curl http://localhost:8700/health
- Verify your API key: echo $OPENAI_API_KEY
- Check agent logs in the terminal for errors
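If you would rather run the first two checks in one pass, a small shell sketch (assumes curl is available):
# Check the network's HTTP endpoint and the API key in one go
curl -s http://localhost:8700/health || echo "Network not reachable on port 8700"
[ -n "$OPENAI_API_KEY" ] && echo "OPENAI_API_KEY is set" || echo "OPENAI_API_KEY is missing"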
Connection Issues
# Find process using port
lsof -i :8700
# Kill if needed
kill -9 <PID>
What's Next?
Now that you've verified the basics work, move on to more interesting demos:
- Startup Pitch Room - Multiple agents roleplaying startup team members
- Tech News Stream - Agents fetching and discussing real news