ollama-ai-provider

Ollama Provider for the Vercel AI SDK

The Ollama Provider for the Vercel AI SDK contains language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.

Requirements

This provider requires Ollama >= 0.5.0.

Setup

The Ollama provider is available in the ollama-ai-provider module. You can install it with:

npm i ollama-ai-provider

Provider Instance

You can import the default provider instance ollama from ollama-ai-provider:

import { ollama } from 'ollama-ai-provider';
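
If your Ollama server is not running at the default address, you can create a customized provider instance instead. The sketch below assumes the package's createOllama factory and a baseURL option; the URL shown is only an example value.

import { createOllama } from 'ollama-ai-provider';

// Sketch: point the provider at a non-default Ollama server.
// 'http://localhost:11434/api' is an example URL, not a required setting.
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
});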

Example

import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
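
Because the provider also covers the Ollama embeddings API, a minimal embedding sketch could look like the following. It assumes ollama.embedding as the embedding model factory and uses nomic-embed-text purely as an example model name.

import { ollama } from 'ollama-ai-provider';
import { embed } from 'ai';

// Sketch: embed a single value with a locally running Ollama embedding model.
// 'nomic-embed-text' is an example; any embedding model pulled into Ollama works.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
});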

Documentation

Please check out the Ollama provider documentation for more information.
