
Installation & Setup

This guide walks you through everything needed to run Orbit Classroom locally for development or evaluation.

Prerequisites

Docker & Docker Compose

Orbit runs as a set of containerized services. Docker Desktop (which includes Docker Compose) is required.

| Platform | Instructions |
| --- | --- |
| Linux | Install Docker Engine and the Compose plugin via your distribution's package manager, or follow the official guide. |
| macOS | Download and install Docker Desktop for Mac. Both Intel and Apple Silicon are supported. |
| Windows | Install Docker Desktop for Windows with the WSL 2 backend enabled. |

Git

| Platform | Instructions |
| --- | --- |
| Linux | `sudo apt install git` (Debian/Ubuntu) or `sudo dnf install git` (Fedora). |
| macOS | Included with the Xcode Command Line Tools. Run `xcode-select --install` if not already present. |
| Windows | Download from git-scm.com or install via `winget install Git.Git`. |

Python 3.13+

The helper script (scripts/dev.py) requires Python 3.13 or later.

| Platform | Instructions |
| --- | --- |
| Linux | Use your package manager or pyenv: `pyenv install 3.13`. |
| macOS | Use Homebrew: `brew install python@3.13`, or use pyenv. |
| Windows | Download from python.org or install via `winget install Python.Python.3.13`. |

Ollama (Optional)

If you want to run LLM inference locally instead of using a cloud API, install Ollama.

| Platform | Instructions |
| --- | --- |
| Linux | `curl -fsSL https://ollama.com/install.sh \| sh` |
| macOS | `brew install ollama`, or download from ollama.com. |
| Windows | Download from ollama.com. |
tip

Ollama is completely optional. You can use Anthropic or OpenAI API keys instead. Ollama is useful for offline development or when you want to avoid API costs.

Quick Start

1. Clone the repository

```shell
git clone git@github.com:MifralTech/orbit-ui.git && cd orbit-ui
```

2. (Optional) Start Ollama and pull a model

If you plan to use a local model, make sure the Ollama server is running (the desktop app and the Linux install script start it as a background service; otherwise run `ollama serve`), then pull a model:

```shell
ollama pull smollm2
```
info

The smollm2 model is small and fast, good for testing. You can configure a different model later through the Admin settings panel.

3. Generate secrets

```shell
python scripts/dev.py secrets
```

This creates the .env file with generated JWT secrets, database credentials, and other required values.
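The generated values are typically random URL-safe strings. As a rough sketch of what such generation looks like using only Python's standard library (the variable names below are illustrative assumptions, not necessarily the exact keys `scripts/dev.py` writes):

```python
import secrets


def generate_dev_secrets():
    """Generate random values of the kind a development .env needs.

    NOTE: the key names here are illustrative, not necessarily the
    names scripts/dev.py actually writes.
    """
    return {
        "JWT_SECRET": secrets.token_urlsafe(48),
        "POSTGRES_PASSWORD": secrets.token_urlsafe(24),
    }


# Render the values as .env-style lines.
env_lines = [f"{key}={value}" for key, value in generate_dev_secrets().items()]
print("\n".join(env_lines))
```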

4. Start the development stack

```shell
python scripts/dev.py up
```

This pulls the required Docker images and starts all services. The first run may take several minutes while images download and build.

warning

Make sure Docker Desktop is running before executing this command. The script will fail if the Docker daemon is not available.
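If you want to verify the daemon programmatically before starting the stack, a minimal check can be sketched with the standard library (`docker info` exits non-zero when the daemon is unreachable):

```python
import shutil
import subprocess


def docker_daemon_available() -> bool:
    """Return True if the docker CLI exists and the daemon responds.

    `docker info` exits with a non-zero status when the daemon
    is not running or not reachable.
    """
    if shutil.which("docker") is None:
        return False
    try:
        result = subprocess.run(
            ["docker", "info"], capture_output=True, timeout=15
        )
    except (OSError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0
```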

Access URLs

Once the stack is running, the following services are available:

| Service | URL | Description |
| --- | --- | --- |
| Frontend | http://localhost:5173 | The Orbit Classroom web application. |
| Backend API | http://localhost:8000 | The FastAPI backend. Swagger docs are at `/docs`. |
| Email Inbox | http://localhost:9000 | Inbucket, which captures all outgoing emails (signup confirmations, password resets). |
tip

When you sign up for the first time, the confirmation email will appear in the Inbucket inbox at http://localhost:9000. Check there if you do not receive an email.
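The services may take a moment to come up after `dev.py up` returns. A small polling sketch using only the standard library (the URL in the comment is the backend address listed above):

```python
import time
import urllib.error
import urllib.request


def wait_for(url: str, attempts: int = 30, delay: float = 2.0) -> bool:
    """Poll `url` until it answers over HTTP or attempts run out."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except urllib.error.HTTPError:
            # The server responded, even if with an error status: it's up.
            return True
        except (urllib.error.URLError, OSError):
            time.sleep(delay)
    return False


# Example: wait for the backend API before making requests.
# wait_for("http://localhost:8000/docs")
```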

Configuration

Environment Variables

The python scripts/dev.py secrets command generates a .env file at the project root. This file contains all necessary configuration. Key variables you may want to modify:

| Variable | Purpose |
| --- | --- |
| `ANTHROPIC_API_KEY` | API key for Anthropic (Claude) models. Set this to use Claude as your LLM provider. |
| `OPENAI_API_KEY` | API key for OpenAI models. Set this to use GPT models as your LLM provider. |
| `OLLAMA_BASE_URL` | URL of your Ollama instance. Defaults to `http://host.docker.internal:11434` so Docker containers can reach Ollama running on the host machine. |
info

You do not need all three providers. Configure at least one — Ollama for local inference, or an API key for cloud inference. You can switch providers at any time through the Admin settings panel in the UI.
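The "at least one provider" rule can be pictured as simple first-configured-wins logic. This is only an illustration of the precedence described above, not Orbit's actual selection code (which is driven by the Admin settings panel):

```python
def pick_llm_provider(env: dict) -> str:
    """Return the first configured provider from environment-style settings.

    Illustrative only: cloud keys take precedence here, with Ollama as
    the local fallback. Orbit's real selection happens in the Admin UI.
    """
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    # No cloud key set: fall back to local inference via Ollama.
    return "ollama"
```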

Docker Compose Services

The development stack consists of the following services:

| Service | Description |
| --- | --- |
| `db` | PostgreSQL database with the pgvector extension for vector similarity search. |
| `supabase-auth` | Supabase GoTrue authentication server handling signup, login, and JWT tokens. |
| `inbucket` | Local email server that captures all outgoing mail for development. |
| `api` | FastAPI backend serving REST endpoints and managing business logic. |
| `worker` | Background task worker for document ingestion, embedding generation, and other async jobs. |
| `redis` | Redis instance used as the message broker and result backend for background tasks. |
| `frontend` | SvelteKit development server serving the Orbit Classroom UI. |

Next Steps

Once the stack is running, head to the Quick Start Guide to create your first account, configure an LLM, and send your first chat message.