Open WebUI + Ollama + DeepSeek-R1-Distill-Qwen-1.5B

gistfile1.txt
# HW Requirements
# I recommend 8 CPU cores and 16 GB of RAM on the system or VM if using the deepseek-r1:1.5b model
# 1. install docker
# e.g. see: https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-rocky-linux-9
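# a minimal install sketch for Rocky Linux 9 (follows the tutorial above; adapt the repo/package names for your distro):
`sudo dnf config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo`
`sudo dnf install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin`
`sudo systemctl enable --now docker`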
# 2a. assuming an Nvidia GPU is available, run:
`docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama`
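# note: `--gpus=all` requires the NVIDIA Container Toolkit on the host; once the container is up,
# you can optionally confirm the GPU is visible inside it with:
`docker exec open-webui nvidia-smi`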
# 2b. if an Nvidia GPU is not available, run:
`docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama`
# 2c. alternate commands and docs here: https://docs.openwebui.com/
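# whichever variant you chose, verify the container is running and follow its startup logs:
`docker ps --filter name=open-webui`
`docker logs -f open-webui`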
# 3. access the web UI
# open `http://localhost:3000/` in a browser; the first account you create becomes the admin account
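# if the page does not load, check from the host that the published port is answering:
`curl -I http://localhost:3000/`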
# 4. click the model drop-down and choose to add a model
# type in `deepseek-r1:1.5b` and select the option to pull/download it from Ollama
# 5. once the download finishes, select the model and query away!
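# alternatively, since this image bundles Ollama, you can pull and test the model from the CLI
# instead of the web UI (a sketch, assuming the bundled `ollama` binary is on the container's PATH
# and the container name matches the `--name` used above):
`docker exec -it open-webui ollama pull deepseek-r1:1.5b`
`docker exec -it open-webui ollama run deepseek-r1:1.5b "Why is the sky blue?"`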
