Core shared library for CreAI assistant services. This library provides essential functionality for building AI-powered assistants with support for various messaging platforms, vector databases, and LLM integrations.
- 🤖 AI Assistant Process Context Management
- 🔄 Message Queue Integration
- 📊 Vector Database Support
- 🎯 LLM Integration
- 🔌 Channel Providers (WhatsApp, Signal, WebSocket)
- 🔒 PII Sanitization
- 🎙️ Audio Transcription
- 🔄 Integration Framework
Install the package with npm:

```bash
npm install @creai/shared-core
```
```typescript
import { AIAssistantProcessContext } from '@creai/shared-core';

// Initialize the assistant context
const assistantContext = {
  // Your assistant configuration
  specialist: 'Customer Support',
  domainTopic: 'Product Support',
  businessName: 'Your Company',
  communicationStyle: 'Professional',
  openAIConfig: {
    api_key: 'your-api-key',
    model: 'gpt-4',
    temperature: 0.7,
    max_tokens: 1000,
  },
  pineconeConfig: {
    apiKey: 'your-pinecone-key',
    environment: 'your-environment',
    projectId: 'your-project-id',
    indexName: 'your-index',
  },
  whatsappConfig: {
    phoneNumberId: 'your-phone-number-id',
    accessToken: 'your-access-token',
  },
};

// Create a message object
const message = {
  message: 'Hello, I need help with my order',
  channelId: 'whatsapp',
  assistantId: 'support-bot',
  product: 'whatsapp_business_account',
};

// Initialize the process context
const processContext = new AIAssistantProcessContext(message, assistantContext);

// Process the message
await processContext.question(message);
```
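If processing fails (for example, the LLM is unavailable), you may want to fall back to the assistant's configured `fallbackResponse`. A minimal sketch of that pattern, assuming a generic `ask` function and a hypothetical `sendReply` callback — neither is part of the library API:

```typescript
// Illustrative wrapper (not part of @creai/shared-core): on failure,
// reply with the configured fallbackResponse instead of surfacing the error.
async function askWithFallback(
  ask: (msg: string) => Promise<void>,      // e.g. (m) => processContext.question(m)
  msg: string,
  fallbackResponse: string,                 // from IAssistantContext.fallbackResponse
  sendReply: (text: string) => void,        // hypothetical channel send callback
): Promise<void> {
  try {
    await ask(msg);
  } catch {
    sendReply(fallbackResponse);
  }
}
```

This keeps the error-handling policy in one place instead of scattering try/catch blocks around every call site.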
The `IAssistantContext` interface defines the configuration for your assistant:
```typescript
interface IAssistantContext {
  specialist: string;
  domainTopic: string;
  businessName: string;
  communicationStyle: string;
  openAIConfig: {
    api_key: string;
    model: string;
    temperature: number;
    max_tokens: number;
    top_p?: number;
    presence_penalty?: number;
    frequency_penalty?: number;
    streaming?: boolean;
  };
  pineconeConfig: {
    apiKey: string;
    environment: string;
    projectId: string;
    indexName: string;
  };
  whatsappConfig?: {
    phoneNumberId: string;
    accessToken: string;
  };
  integrationConfig?: {
    salesforce?: {
      instanceUrl: string;
      accessToken: string;
    };
    zendesk?: {
      subdomain: string;
      email: string;
      apiToken: string;
    };
  };
  externalId?: string;
  interactionFlow?: string;
  fallbackResponse?: string;
  timePeriod?: string;
  knowledgeDepth?: string;
  sourcePreference?: string;
  ethicalLimitations?: string[];
  greeting?: string;
}
```
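Since several of these fields are required, it can help to validate a context object before constructing the process context. A minimal sketch, using only the field names shown above — this helper is illustrative, not part of `@creai/shared-core`:

```typescript
// Hypothetical helper (not library API): report which required
// IAssistantContext fields are missing from a candidate config object.
interface MinimalContext {
  specialist?: string;
  domainTopic?: string;
  businessName?: string;
  communicationStyle?: string;
  openAIConfig?: { api_key?: string; model?: string };
  pineconeConfig?: { apiKey?: string; indexName?: string };
}

function missingContextFields(ctx: MinimalContext): string[] {
  const missing: string[] = [];
  // Top-level required string fields
  for (const key of ['specialist', 'domainTopic', 'businessName', 'communicationStyle'] as const) {
    if (!ctx[key]) missing.push(key);
  }
  // Required nested credentials
  if (!ctx.openAIConfig?.api_key) missing.push('openAIConfig.api_key');
  if (!ctx.pineconeConfig?.apiKey) missing.push('pineconeConfig.apiKey');
  return missing;
}
```

Running the check before startup turns a vague runtime failure into an explicit list of missing configuration keys.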
`AIAssistantProcessContext` is the main class for handling assistant interactions.

Key methods:

- `question(body: IMessageFromQueue): Promise<void>` - Process a user question
- `transcription(audio: Buffer): Promise<string>` - Transcribe audio to text
- `queue(message: string, assistantId: string, channelId: string, product: string): Promise<any>` - Queue a message for processing
- `streamHandler(): (chunk: any) => Promise<void>` - Handle streaming responses
- `answer(): (chunk: any) => Promise<void>` - Handle final responses
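`streamHandler()` returns a per-chunk callback, a common shape for consuming LLM token streams. The toy accumulator below illustrates that factory-returning-a-callback pattern; the real implementation inside `AIAssistantProcessContext` will differ, and the `content`/`done` chunk fields here are assumptions, not the library's actual chunk format:

```typescript
// Illustrative only: mirrors the streamHandler() shape documented above.
function makeStreamHandler(onDone: (full: string) => void) {
  let buffer = '';
  return async (chunk: { content?: string; done?: boolean }): Promise<void> => {
    if (chunk.content) buffer += chunk.content; // accumulate streamed deltas
    if (chunk.done) onDone(buffer);             // flush once the stream ends
  };
}

// Feed chunks as they arrive, then signal completion.
(async () => {
  const handler = makeStreamHandler((full) => console.log('final:', full));
  await handler({ content: 'Hello, ' });
  await handler({ content: 'world' });
  await handler({ done: true }); // prints "final: Hello, world"
})();
```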
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
For support, please open an issue in the GitHub repository.