This is a demonstration of using AIGNE Framework to build a handoff workflow. The example now supports both one-shot and interactive chat modes, along with customizable model settings and pipeline input/output.
```mermaid
flowchart LR
in(In)
out(Out)
agentA(Agent A)
agentB(Agent B)
in --> agentA --transfer to b--> agentB --> out
classDef inputOutput fill:#f9f0ed,stroke:#debbae,stroke-width:2px,color:#b35b39,font-weight:bolder;
classDef processing fill:#F0F4EB,stroke:#C2D7A7,stroke-width:2px,color:#6B8F3C,font-weight:bolder;
class in inputOutput
class out inputOutput
class agentA processing
class agentB processing
```
Workflow of a handoff between user and two agents:
```mermaid
sequenceDiagram
participant User
participant A as Agent A
participant B as Agent B
User ->> A: transfer to agent b
A ->> B: transfer to agent b
B ->> User: What do you need, friend?
loop
User ->> B: It's a beautiful day
B ->> User: Sunshine warms the earth,<br />Gentle breeze whispers softly,<br />Nature sings with joy.
end
```
- Node.js (>=20.0) and npm installed on your machine
- An OpenAI API key for interacting with OpenAI's services
- Optional dependencies (if running the example from source code):
```shell
export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key

# Run in one-shot mode (default)
npx -y @aigne/example-workflow-handoff

# Run in interactive chat mode
npx -y @aigne/example-workflow-handoff --chat

# Use pipeline input
echo "transfer to agent b" | npx -y @aigne/example-workflow-handoff
```
```shell
git clone https://github.com/AIGNE-io/aigne-framework
cd aigne-framework/examples/workflow-handoff
pnpm install
```
Set up your OpenAI API key in the `.env.local` file:

```shell
OPENAI_API_KEY="" # Set your OpenAI API key here
```
You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:

- OpenAI: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
- Anthropic: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
- Google Gemini: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
- AWS Bedrock: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
- DeepSeek: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
- OpenRouter: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
- xAI: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
- Ollama: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
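Both `MODEL` and the `--model` flag use the same `provider[:model]` spec. As a framework-free illustration of that format (the parsing below is a sketch for explanation, not the framework's actual implementation), note that model names can themselves contain colons, so only the first colon separates provider from model:

```typescript
// Parse a spec like "openai:gpt-4.1" or just "openai".
// Model names may contain ":" (e.g. "bedrock:us.amazon.nova-premier-v1:0"),
// so split only on the first colon; the rest is the model name.
function parseModelSpec(spec: string): { provider: string; model?: string } {
  const idx = spec.indexOf(":");
  if (idx === -1) return { provider: spec }; // provider only; model falls back to a default
  return { provider: spec.slice(0, idx), model: spec.slice(idx + 1) };
}

console.log(parseModelSpec("openai:gpt-4.1")); // provider "openai", model "gpt-4.1"
console.log(parseModelSpec("bedrock:us.amazon.nova-premier-v1:0")); // model keeps its inner colon
console.log(parseModelSpec("openai")); // provider only, no model
```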
```shell
# Run in one-shot mode (default)
pnpm start

# Run in interactive chat mode
pnpm start -- --chat

# Use pipeline input
echo "transfer to agent b" | pnpm start
```
The example supports the following command-line parameters:
| Parameter | Description | Default |
| --- | --- | --- |
| `--chat` | Run in interactive chat mode | Disabled (one-shot mode) |
| `--model <provider[:model]>` | AI model to use in the format `provider[:model]`, where the model name is optional. Examples: `openai` or `openai:gpt-4o-mini` | `openai` |
| `--temperature <value>` | Temperature for model generation | Provider default |
| `--top-p <value>` | Top-p sampling value | Provider default |
| `--presence-penalty <value>` | Presence penalty value | Provider default |
| `--frequency-penalty <value>` | Frequency penalty value | Provider default |
| `--log-level <level>` | Set logging level (ERROR, WARN, INFO, DEBUG, TRACE) | `INFO` |
| `--input`, `-i <input>` | Specify input directly | None |
```shell
# Run in chat mode (interactive)
pnpm start -- --chat

# Set logging level
pnpm start -- --log-level DEBUG

# Use pipeline input
echo "transfer to agent b" | pnpm start
```
The following example demonstrates how to build a handoff workflow:
```typescript
import { AIAgent, AIGNE } from "@aigne/core";
import { OpenAIChatModel } from "@aigne/core/models/openai-chat-model.js";

const { OPENAI_API_KEY } = process.env;

const model = new OpenAIChatModel({
  apiKey: OPENAI_API_KEY,
});

// A skill that returns another agent hands the conversation off to it.
function transfer_to_b() {
  return agentB;
}

const agentA = AIAgent.from({
  name: "AgentA",
  instructions: "You are a helpful agent.",
  outputKey: "A",
  skills: [transfer_to_b],
});

const agentB = AIAgent.from({
  name: "AgentB",
  instructions: "Only speak in Haikus.",
  outputKey: "B",
});

const aigne = new AIGNE({ model });

// Create a user-facing session whose entry point is Agent A.
const userAgent = aigne.invoke(agentA);

const result1 = await userAgent.invoke("transfer to agent b");
console.log(result1);
// Output:
// {
//   B: "Transfer now complete, \nAgent B is here to help. \nWhat do you need, friend?",
// }

// After the handoff, subsequent messages are handled by Agent B.
const result2 = await userAgent.invoke("It's a beautiful day");
console.log(result2);
// Output:
// {
//   B: "Sunshine warms the earth, \nGentle breeze whispers softly, \nNature sings with joy. ",
// }
```
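The handoff mechanism above can be sketched without the framework at all: a skill that returns another agent signals the runtime to switch the active agent, and subsequent messages are routed to it. The following is a minimal, framework-free TypeScript illustration of that pattern (the `Agent` type, `makeSession`, and the canned replies are all hypothetical, not AIGNE APIs):

```typescript
// An agent's handler may return either a string reply or another Agent;
// returning an Agent represents a handoff.
type Agent = {
  name: string;
  handle: (message: string) => string | Agent;
};

const agentB: Agent = {
  name: "AgentB",
  handle: () => "Sunshine warms the earth", // stand-in for the haiku agent
};

const agentA: Agent = {
  name: "AgentA",
  handle: (message) =>
    message.includes("transfer to agent b") ? agentB : `AgentA: ${message}`,
};

// Tiny "runtime": tracks the active agent and applies handoffs.
function makeSession(start: Agent) {
  let active = start;
  return (message: string): string => {
    const result = active.handle(message);
    if (typeof result === "string") return result;
    active = result; // handoff: switch the active agent
    // Re-deliver the message to the new agent (sketch assumes it replies with a string).
    return active.handle(message) as string;
  };
}

const send = makeSession(agentA);
console.log(send("transfer to agent b")); // handled by AgentB after the handoff
console.log(send("It's a beautiful day")); // AgentB stays active
```

This mirrors the example's behavior: the first message triggers the transfer, and every message after it reaches Agent B.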
This project is licensed under the MIT License.