🧩 Run Open WebUI with FastFlowLM (Windows, YAML Method)
This guide walks you through using `docker-compose.yaml` to run Open WebUI connected to a local FastFlowLM instance on Windows.
✅ Prerequisites
- Docker Desktop for Windows
  - During installation, enable the WSL2 backend
  - Reboot if prompted
- FastFlowLM
📁 Step 1: Create Project Folder
Open PowerShell and run:

```
mkdir open-webui && cd open-webui
```

(In Windows PowerShell 5.1, `&&` is not supported; use `mkdir open-webui; cd open-webui` instead.)
This creates a clean workspace for your Docker setup.
📝 Step 2: Create docker-compose.yaml
Launch Notepad:

```
notepad docker-compose.yaml
```

Paste the following:
```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"
    volumes:
      - open-webui-data:/app/backend/data
    environment:
      - OPENAI_API_BASE_URL=http://host.docker.internal:11434
      - WEBUI_AUTH=false
      - WEBUI_SECRET_KEY=dummysecretkey
      - ENABLE_TITLE_GENERATION=false
      - ENABLE_FOLLOW_UP_GENERATION=false
      - ENABLE_TAGS_GENERATION=false
      - ENABLE_RETRIEVAL_QUERY_GENERATION=false
      - ENABLE_IMAGE_PROMPT_GENERATION=false
      - ENABLE_WEB_SEARCH=false
      - ENABLE_SEARCH_QUERY_GENERATION=false
    restart: unless-stopped

volumes:
  open-webui-data:
```
- `OPENAI_API_BASE_URL=http://host.docker.internal:11434` connects Open WebUI to the local FastFlowLM instance.
- `WEBUI_AUTH=false` disables login (optional).
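Before launching, you can let Compose parse the file; `docker compose config` validates the YAML (indentation errors surface here) and prints the resolved configuration:

```shell
# Parse and validate docker-compose.yaml in the current folder;
# prints the resolved configuration, or a parse error if the YAML is malformed.
docker compose config
```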
▶️ Step 3: Launch the Open WebUI Container (in PowerShell)

```
docker compose up -d
```

This starts the container in detached mode. It may take up to a minute before Open WebUI is reachable.
You can check the logs with:

```
docker logs -f open-webui
```
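Before opening the browser, you can also confirm that FastFlowLM is reachable from the host. The exact path is an assumption: `/v1/models` follows the OpenAI API convention implied by the `OPENAI_API_BASE_URL` setting above and may differ in your FastFlowLM version. In Windows PowerShell, use `curl.exe` (plain `curl` is an alias for `Invoke-WebRequest`):

```shell
# Assumes FastFlowLM exposes an OpenAI-compatible endpoint on port 11434;
# a JSON model list indicates the server is up and reachable.
curl.exe http://localhost:11434/v1/models
```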
🌐 Step 4: Access the WebUI (in Browser)
Open a browser and go to:

```
http://localhost:3000
```

You should now see the Open WebUI interface.
🧪 Step 5: Serve a Model with FastFlowLM

```
flm serve llama3.2:1b
```

You can now use FastFlowLM directly in Open WebUI.
When switching models, it may take a while for the new model to be loaded into memory.
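To check that the served model answers outside the UI, you can send an OpenAI-style chat completion request from PowerShell. The `/v1/chat/completions` path and request body follow the OpenAI convention and are an assumption about FastFlowLM's API surface; adjust if your version differs:

```powershell
# Hypothetical smoke test: POST an OpenAI-style chat completion
# to the assumed /v1/chat/completions endpoint.
$body = @{
  model    = "llama3.2:1b"
  messages = @(@{ role = "user"; content = "Hello!" })
} | ConvertTo-Json -Depth 3
Invoke-RestMethod -Uri "http://localhost:11434/v1/chat/completions" `
  -Method Post -ContentType "application/json" -Body $body
```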
🧼 Step 6: Stop or Clean Up (in PowerShell)

```
docker compose stop
```

To remove it completely:

```
docker compose down
```

This removes the container but keeps the persistent volume data.

or

```
docker compose down -v
```

This removes both the container and the persistent volume data.
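To confirm what was kept or removed, you can list the matching containers and volumes; note that Compose prefixes named volumes with the project (folder) name, e.g. `open-webui_open-webui-data`:

```shell
# Container should be gone after `docker compose down`:
docker ps -a --filter "name=open-webui"
# The named volume persists unless `-v` was used:
docker volume ls
```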
🧠 Notes
- Want login? Set `WEBUI_AUTH=true`.
- Keep the FastFlowLM server running; `http://host.docker.internal:11434` bridges from the Docker container to the FastFlowLM API on your native Windows host.
- For persistent chat history, the `open-webui-data` volume stores user data.
Note (when using Open WebUI):
Open WebUI sends multiple background requests to the server.
To improve stability and performance, disable these in Settings → Chat:
- Title Auto-Generation
- Follow-Up Auto-Generation
- Chat Tags Auto-Generation
Toggle them off, then refresh the page.