A lightweight and beautiful React component library for interacting with locally hosted Ollama models. Built with Vite, Tailwind CSS, and Lucide React icons.
⚡ Rapidly build AI chat interfaces using minimal setup.
- 🧩 Prebuilt chat UI components
- 💨 Tailwind CSS styling out of the box
- 🧠 Designed for Ollama model integrations
- 🧑‍🎨 Lucide icons for a modern look
- ⚛️ React 18+ support
Make sure you have `react`, `react-dom`, `axios`, `tailwindcss`, and `lucide-react` installed in your project.
```shell
npm install ollama-web-ui
```
Or with Yarn:
```shell
yarn add ollama-web-ui
```
💡 This package relies on peer dependencies. Install them if you haven't:
```shell
npm install react react-dom axios tailwindcss @tailwindcss/vite lucide-react
```
```jsx
import React from 'react';
import OllamaUi from 'ollama-web-ui';

function App() {
  return (
    <div className="p-4">
      <OllamaUi />
    </div>
  );
}

export default App;
```
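The `App` component still needs to be mounted. A minimal sketch of a Vite entry point, assuming the standard Vite React template layout (`src/main.jsx`, a `#root` element in `index.html`, and Tailwind styles in `index.css`):

```jsx
// src/main.jsx — assumed standard Vite entry point
import React from 'react';
import { createRoot } from 'react-dom/client';
import App from './App';
import './index.css'; // Tailwind styles

// Mount the app into the #root element from index.html
createRoot(document.getElementById('root')).render(
  <React.StrictMode>
    <App />
  </React.StrictMode>
);
```

Adjust the file names and paths to match your own project structure.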
Ensure Tailwind is configured in your project. Minimal setup:
```shell
npm install tailwindcss @tailwindcss/vite
```

With the `@tailwindcss/vite` plugin (Tailwind v4), no `tailwind.config.js` is required, so there is no `init` step.
Add this to your `vite.config.js`:
```js
import { defineConfig } from 'vite'
import tailwindcss from '@tailwindcss/vite'

export default defineConfig({
  plugins: [
    tailwindcss(),
  ],
})
```
Then include the Tailwind styles in your main `index.css` or `App.css`:

```css
@import "tailwindcss";
```
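The UI talks to a locally running Ollama server; by default, Ollama's HTTP API listens on `http://localhost:11434`. A quick way to check your local setup (the model name `llama3` is only an example):

```shell
# Start the Ollama server if it isn't already running
ollama serve

# Pull a model to chat with (llama3 is just an example)
ollama pull llama3

# Verify the HTTP API responds — this lists installed models
curl http://localhost:11434/api/tags
```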
If you're developing locally:

```shell
npm run dev
```

Build for production:

```shell
npm run build
```
Feel free to open issues or pull requests to improve or extend the library!
MIT © Obuh Daniel
This project is inspired by the need for a lightweight, easy-to-use chat interface for locally running LLMs like those powered by Ollama.