
Quick Start

Build and run your first AI agent with Agent Kernel in under 5 minutes!

Choose Your Framework

Agent Kernel supports multiple frameworks. Pick the one you're most comfortable with:

OpenAI Agents Quick Start

1. Install

pip install agentkernel[openai]
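
If your shell is zsh, quote the package spec so the square brackets are not treated as a glob pattern:

pip install "agentkernel[openai]"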

2. Create Your Agent

Create a file called my_agent.py:

from agents import Agent as OpenAIAgent
from agentkernel.cli import CLI
from agentkernel.openai import OpenAIModule

# Define your agents
general_agent = OpenAIAgent(
    name="general",
    handoff_description="Agent for general questions",
    instructions="You provide assistance with general queries. Give short and direct answers.",
)

math_agent = OpenAIAgent(
    name="math",
    handoff_description="Specialist agent for math questions",
    instructions="You provide help with math problems. Explain your reasoning.",
)

# Register agents with Agent Kernel
OpenAIModule([general_agent, math_agent])

if __name__ == "__main__":
    CLI.main()

3. Set API Key

export OPENAI_API_KEY=your-api-key-here

4. Run Your Agent

python my_agent.py

Testing Your Agent

Once your agent is running, you'll see an interactive CLI:

(kernel) >> Using in-memory session store
AgentKernel CLI (type !help for commands or !quit to exit):
(kernel) >> No agent was requested. Defaulting to first agent in the list
(kernel) >> Selected agent: general
(kernel) >> Starting new session: 07ec500e-f103-4b0e-8ecb-d794232f5992
(general) >>
(general) >> !list
Available agents:
general
math

(general) >> !select math
(math) >> What is 2 + 2?

[math agent responds]
(math) >> 2 + 2 = 4. This is basic addition where we combine two quantities...

(math) >>

Understanding the Structure

Every Agent Kernel application follows this pattern:

  1. Define agents using your preferred framework
  2. Wrap them in an Agent Kernel Module
  3. Run them using Agent Kernel's execution modes (CLI, API, AWS, etc.)
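
Stripped to its skeleton, the OpenAI quick start above maps onto these three steps as follows (the imports and names are the same ones used earlier in this guide; CLI could be swapped for RESTAPI or Lambda without touching the agent definitions):

# 1. Define agents using your preferred framework (OpenAI Agents shown here)
from agents import Agent as OpenAIAgent
from agentkernel.cli import CLI
from agentkernel.openai import OpenAIModule

general_agent = OpenAIAgent(
    name="general",
    instructions="You provide assistance with general queries. Give short and direct answers.",
)

# 2. Wrap them in an Agent Kernel module
OpenAIModule([general_agent])

# 3. Run them using an execution mode (CLI here; RESTAPI and Lambda follow the same pattern)
if __name__ == "__main__":
    CLI.main()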

Next Steps

Add Custom Tools

Enhance your agent with custom tools:

from crewai import Agent, Tool

def search_database(query: str) -> str:
    # Your custom logic
    return f"Results for: {query}"

search_tool = Tool(
    name="search",
    description="Search the database",
    func=search_database
)

agent = Agent(
    role="researcher",
    goal="Find information",
    backstory="You are a research assistant",
    tools=[search_tool],
    verbose=False
)
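
As with the OpenAI example, a CrewAI agent still needs to be registered with Agent Kernel before it can be run. The sketch below assumes a CrewAI counterpart to OpenAIModule; the import path is hypothetical, so check the CrewAI integration docs for the exact name:

from agentkernel.cli import CLI
from agentkernel.crewai import CrewAIModule  # assumed import path; see the CrewAI integration docs

# Register the CrewAI agent with Agent Kernel (same pattern as OpenAIModule above)
CrewAIModule([agent])

if __name__ == "__main__":
    CLI.main()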

Deploy as REST API

Run your agent as a REST API server:

# Install API dependencies
pip install agentkernel[api]

Use the RESTAPI module to run the agents:

from agents import Agent as OpenAIAgent
from agentkernel.api import RESTAPI
from agentkernel.openai import OpenAIModule

# Define your agents
general_agent = OpenAIAgent(
    name="general",
    handoff_description="Agent for general questions",
    instructions="You provide assistance with general queries. Give short and direct answers.",
)

math_agent = OpenAIAgent(
    name="math",
    handoff_description="Specialist agent for math questions",
    instructions="You provide help with math problems. Explain your reasoning.",
)

# Register agents with Agent Kernel
OpenAIModule([general_agent, math_agent])

if __name__ == "__main__":
    RESTAPI.run()

Run the agents

# Run as API server
python my_agent.py

Configure the API port by editing config.yaml:

api:
  port: 8000

Test it:

curl -X POST http://localhost:8000/run \
  -H "Content-Type: application/json" \
  -d '{"agent": "general", "message": "Hello!"}'
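
The math agent registered above can be reached through the same endpoint by changing the agent field:

curl -X POST http://localhost:8000/run \
  -H "Content-Type: application/json" \
  -d '{"agent": "math", "message": "What is 2 + 2?"}'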

Deploy to AWS Lambda

Package and deploy to AWS Lambda:

# Install AWS dependencies
pip install agentkernel[aws]

Use the Lambda module to run the agents

from agents import Agent
from agentkernel.aws import Lambda
from agentkernel.openai import OpenAIModule

math_agent = Agent(
    name="math",
    handoff_description="Specialist agent for math questions",
    instructions="You provide help with math problems. Explain your "
    "reasoning at each step and include examples. "
    "If prompted for anything else you refuse to answer.",
)

history_agent = Agent(
    name="history",
    handoff_description="Specialist agent for historical questions",
    instructions="You provide assistance with historical queries. "
    "Explain important events and context clearly.",
)

triage_agent = Agent(
    name="triage",
    instructions="You determine which agent to use based on the user's question.",
    handoffs=[history_agent, math_agent],
)

OpenAIModule([triage_agent, math_agent, history_agent])

handler = Lambda.handler

Deploy with Terraform. This requires AWS credentials configured and the Terraform module imported and configured (see examples):

terraform init
terraform apply

Configure Memory

Add Redis-backed memory for persistent sessions:

export AK_SESSION_STORAGE=redis
export AK_REDIS_URL=redis://localhost:6379
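
If you do not already have Redis running, one quick way to start a local instance is with Docker (assuming Docker is installed; any reachable Redis instance works):

docker run -d --name agentkernel-redis -p 6379:6379 redis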

Or use in-memory storage (default):

export AK_SESSION_STORAGE=in_memory

Common Patterns

Session Management

# Sessions automatically track conversation history
# Each user/conversation gets a unique session ID
# Configure via environment variables:
# AK_SESSION_STORAGE=redis
# AK_REDIS_URL=redis://localhost:6379

Explore more example snippets in our Examples section.

Troubleshooting

Agent Not Found

Make sure you're using the correct agent name when running:

# Agent names come from the framework-specific naming
# OpenAI: agent.name
# CrewAI: agent.role
# LangGraph: graph.name
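
You can also confirm the registered names at runtime with the !list command from the interactive CLI shown earlier:

(general) >> !list
Available agents:
general
math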

API Key Errors

Ensure your API key is set correctly:

# OpenAI (for OpenAI, CrewAI, LangGraph)
export OPENAI_API_KEY=sk-...

# Google (for ADK)
export GOOGLE_API_KEY=...
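
A quick sanity check that the key is actually visible to the Python process (plain Python, not an Agent Kernel API):

python -c "import os; print('OPENAI_API_KEY is set' if os.environ.get('OPENAI_API_KEY') else 'OPENAI_API_KEY is missing')"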

Import Errors

Install the correct extras package:

pip install agentkernel[your-framework]
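
For the integrations used in this guide, that means one of the following (other frameworks follow the same pattern):

pip install agentkernel[openai]   # OpenAI Agents
pip install agentkernel[api]      # REST API execution mode
pip install agentkernel[aws]      # AWS Lambda deployment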

Learn More


Need help? Check out our GitHub Issues or open a discussion!