# @agentica/pg-vector-selector

A library that significantly accelerates AI function selection through vector embeddings.

`@agentica/pg-vector-selector` drastically improves function selection speed compared to traditional LLM-based selection. By leveraging vector embeddings and semantic similarity, it can identify the most appropriate functions for a given context multiple times faster than asking an LLM to choose among them.
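The speedup comes from replacing an LLM round-trip with a nearest-neighbor search over precomputed function embeddings. The sketch below illustrates that selection step in isolation; the function names, toy embeddings, and helper functions are invented for illustration and are not the library's internals (the real library queries pgvector on a connector-hive server).

```typescript
// Illustrative sketch of embedding-based function selection.
type FunctionDoc = { name: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the top-k candidate functions for a query embedding.
function selectFunctions(
  query: number[],
  docs: FunctionDoc[],
  k: number,
): FunctionDoc[] {
  return [...docs]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding),
    )
    .slice(0, k);
}

// Toy 3-dimensional embeddings; real ones come from an embedding model.
const docs: FunctionDoc[] = [
  { name: "searchProducts", embedding: [0.9, 0.1, 0.0] },
  { name: "refundOrder", embedding: [0.0, 0.2, 0.9] },
];
const queryEmbedding = [0.8, 0.2, 0.1]; // e.g. embedding of "buy MacBook Pro"
console.log(selectFunctions(queryEmbedding, docs, 1)[0].name); // "searchProducts"
```

Because this is a pure vector computation (and, in production, an indexed pgvector query), it avoids the latency and token cost of an extra LLM call per selection.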
```typescript
import { Agentica } from "@agentica/core";
import { AgenticaPgVectorSelector } from "@agentica/pg-vector-selector";
import OpenAI from "openai";
import typia from "typia";

// Initialize with connector-hive server
const selectorExecute = AgenticaPgVectorSelector.boot<"chatgpt">(
  "https://your-connector-hive-server.com",
);

const agent = new Agentica({
  model: "chatgpt",
  vendor: {
    model: "gpt-4o-mini",
    api: new OpenAI({
      apiKey: process.env.CHATGPT_API_KEY,
    }),
  },
  controllers: [
    await fetch(
      "https://shopping-be.wrtn.ai/editor/swagger.json",
    ).then(r => r.json()),
    typia.llm.application<ShoppingCounselor>(),
    typia.llm.application<ShoppingPolicy>(),
    typia.llm.application<ShoppingSearchRag>(),
  ],
  config: {
    executor: {
      select: selectorExecute,
    },
  },
});

await agent.conversate("I wanna buy MacBook Pro");
```
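The `typia.llm.application<T>()` calls above derive LLM function schemas from plain TypeScript classes. A minimal, hypothetical shape for one such class is sketched below; the class name `ShoppingCounselor` comes from the example above, but its method and parameters are invented here for illustration.

```typescript
// Hypothetical controller class for typia.llm.application<ShoppingCounselor>().
// The method name and parameter shape are illustrative assumptions.
class ShoppingCounselor {
  /**
   * Recommend products that match the customer's request.
   *
   * @param props Search query and optional budget
   * @returns Human-readable recommendations
   */
  public recommend(props: { query: string; budget?: number }): string[] {
    // A real controller would call a catalog or search backend here.
    return [`Recommended items for "${props.query}"`];
  }
}
```

Each public method becomes a callable function that the selector can match against the conversation context.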
```bash
npm install @agentica/core @agentica/pg-vector-selector typia
npx typia setup
```
To use pg-vector-selector, you need:

- A running connector-hive server
- A PostgreSQL database connected to the connector-hive server
- The pgvector extension installed in PostgreSQL
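If the pgvector extension is not yet enabled on the target database, it can typically be added once per database (the exact privileges required depend on your PostgreSQL setup):

```sql
-- Run on the target database as a role with the CREATE privilege.
CREATE EXTENSION IF NOT EXISTS vector;
```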
First, initialize the library with your connector-hive server:
```typescript
import { AgenticaPgVectorSelector } from "@agentica/pg-vector-selector";

const selectorExecute = AgenticaPgVectorSelector.boot<YourSchemaModel>(
  "https://your-connector-hive-server.com",
);
```
Select the most appropriate functions for a given context:
```typescript
import { Agentica } from "@agentica/core";
import OpenAI from "openai";
import typia from "typia";

const agent = new Agentica({
  model: "chatgpt",
  vendor: {
    model: "gpt-4o-mini",
    api: new OpenAI({
      apiKey: process.env.CHATGPT_API_KEY,
    }),
  },
  controllers: [
    await fetch(
      "https://shopping-be.wrtn.ai/editor/swagger.json",
    ).then(r => r.json()),
    typia.llm.application<ShoppingCounselor>(),
    typia.llm.application<ShoppingPolicy>(),
    typia.llm.application<ShoppingSearchRag>(),
  ],
  config: {
    executor: {
      select: selectorExecute,
    },
  },
});

await agent.conversate("I wanna buy MacBook Pro");
```