@skadefro/litellm

0.5.1 • Public • Published • Includes built-in TypeScript type declarations

🚅 LiteLLM.js

JavaScript implementation of LiteLLM.

Usage

npm install litellm

import { completion } from 'litellm';
process.env['OPENAI_API_KEY'] = 'your-openai-key';

const response = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
});

// or stream the results
const stream = await completion({
  model: "gpt-3.5-turbo",
  messages: [{ content: "Hello, how are you?", role: "user" }],
  stream: true
});

for await (const part of stream) {
  process.stdout.write(part.choices[0]?.delta?.content || "");
}
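
The streaming loop above prints each delta as it arrives; often you also want the full message once the stream ends. A minimal sketch of accumulating deltas (the async generator here is a stand-in for the stream returned by `completion`, assuming the OpenAI-style chunk shape the example above already relies on):

```typescript
// Stand-in for the stream returned by completion({ ..., stream: true });
// each part mimics the { choices: [{ delta: { content } }] } chunk shape.
async function* fakeStream() {
  for (const chunk of ['Hello', ', ', 'world!']) {
    yield { choices: [{ delta: { content: chunk } }] };
  }
}

// Print deltas as they arrive and also return the accumulated message.
async function collect(
  stream: AsyncIterable<{ choices: { delta?: { content?: string } }[] }>
): Promise<string> {
  let full = '';
  for await (const part of stream) {
    const delta = part.choices[0]?.delta?.content ?? '';
    process.stdout.write(delta);
    full += delta;
  }
  return full;
}

const text = await collect(fakeStream());
// text now holds the complete assistant message
```

The same `collect` helper works unchanged on the real stream from `completion`.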

Features

We aim to support all features that the LiteLLM Python package supports.

  • Standardised completions ✅
  • Standardised embeddings ✅
  • Caching ❌
  • Proxy ❌
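
This README shows no embeddings example. Assuming the package mirrors the Python LiteLLM call `embedding({ model, input })` returning `{ data: [{ embedding }] }` (an assumption, not confirmed here), usage might look like the commented sketch below, with a cosine-similarity helper for comparing the returned vectors:

```typescript
// Hypothetical usage, assuming an `embedding` export that mirrors the
// Python LiteLLM API (not confirmed by this README):
//
// import { embedding } from 'litellm';
// const res = await embedding({
//   model: 'text-embedding-ada-002',
//   input: ['Hello, how are you?'],
// });
// const vector: number[] = res.data[0].embedding;

// Cosine similarity is a common way to compare embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // 1 (identical direction)
console.log(cosineSimilarity([1, 0], [0, 1])); // 0 (orthogonal)
```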

Supported Providers

  • openai
  • cohere
  • anthropic
  • ollama
  • ai21
  • replicate
  • huggingface
  • together_ai
  • openrouter
  • vertex_ai
  • palm
  • baseten
  • azure
  • sagemaker
  • bedrock
  • vllm
  • nlp_cloud
  • aleph_alpha
  • petals
  • deepinfra
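
In the Python LiteLLM package, non-OpenAI providers are usually selected with a provider prefix in the model string (e.g. 'ollama/llama2'), with bare model names defaulting to OpenAI. Assuming this port follows the same convention (an assumption, not confirmed by this README), routing could be sketched like this:

```typescript
// Assumed convention, borrowed from the Python LiteLLM package:
// 'provider/model' selects a provider; bare names default to OpenAI.
function parseModel(model: string): { provider: string; model: string } {
  const idx = model.indexOf('/');
  if (idx === -1) return { provider: 'openai', model }; // bare name -> OpenAI
  return { provider: model.slice(0, idx), model: model.slice(idx + 1) };
}

console.log(parseModel('ollama/llama2'));
// { provider: 'ollama', model: 'llama2' }
console.log(parseModel('gpt-3.5-turbo'));
// { provider: 'openai', model: 'gpt-3.5-turbo' }
```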

Development

Clone the repo

git clone https://github.com/zya/litellmjs.git

Install dependencies

npm install

Run unit tests

npm t

Run E2E tests

First, copy the example env file:

cp .example.env .env

Then fill in the variables with your API keys:

OPENAI_API_KEY=<Your OpenAI API key>
....

Then run the tests with the command below:

npm run test:e2e


    Package Sidebar

    Install: npm i @skadefro/litellm

    Weekly Downloads: 1

    Version: 0.5.1

    License: ISC

    Unpacked Size: 112 kB

    Total Files: 72

    Collaborators: skadefro