The Ollama Provider for the Vercel AI SDK contains language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.
This provider requires Ollama 0.5.0 or newer.
The Ollama provider is available in the `ollama-ai-provider` module. You can install it with:

```bash
npm i ollama-ai-provider
```
You can import the default provider instance `ollama` from `ollama-ai-provider`:

```ts
import { ollama } from 'ollama-ai-provider';
```
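If you need a customized setup, the package also exports a `createOllama` factory for creating your own provider instance. A minimal sketch, assuming an Ollama server running at the default local address:

```ts
import { createOllama } from 'ollama-ai-provider';

// Create a customized provider instance. `baseURL` points at your
// Ollama server's API endpoint (the default is shown here).
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
});
```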
For example, you can generate text with the AI SDK's `generateText` function:

```ts
import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
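Since the provider also supports the Ollama embeddings API, embedding models can be used in the same way with the AI SDK's `embed` function. A minimal sketch, assuming a local Ollama server with the `nomic-embed-text` model already pulled (`ollama pull nomic-embed-text`):

```ts
import { ollama } from 'ollama-ai-provider';
import { embed } from 'ai';

// NOTE: assumes a running Ollama server and that the
// `nomic-embed-text` model has been pulled locally.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
});

// `embedding` is a number[] vector you can store or compare.
```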
See the Ollama provider documentation for more information.