Run Private AI on Your Raspberry Pi! (Ollama + Llama 3.2)
Author: Jeffs Pi in the Sky
Uploaded: 2025-11-02
Views: 7,507
Run your own AI — privately, securely, and locally — right on a Raspberry Pi!
In this video, I’ll walk you through how to host Ollama on a Raspberry Pi to run the Llama 3.2 1B model completely offline. You’ll learn how to install Ollama, set up Open WebUI as a simple web interface, and connect the two so you can chat with AI without relying on the cloud.
We’ll cover:
✅ Installing Ollama on Raspberry Pi
✅ Downloading the Llama 3.2 1B model
✅ Running queries from the command line (see the API sketch after this list)
✅ Installing and integrating Open WebUI
✅ Running AI queries in your browser
⚡ Acceleration options – why Hailo-8L won’t work, and what might in the future
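If you'd like a feel for what a query looks like outside the chat window, here's a minimal sketch that sends a prompt to the local Ollama REST API on its default port 11434. The model tag `llama3.2:1b` and the example prompt are assumptions for illustration; this only works once Ollama is installed and the model has been pulled.

```python
# Minimal sketch: send one prompt to a locally running Ollama server.
# Assumes Ollama is listening on its default port (11434) and that the
# "llama3.2:1b" model has already been pulled with `ollama pull`.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "llama3.2:1b") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete reply instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Why is the sky blue?"))
```

Open WebUI talks to this same local Ollama API, which is essentially what the integration step in the video configures.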
🌐 *Links mentioned:*
🔗 Ollama: [https://ollama.com](https://ollama.com)
🔗 Open WebUI: [https://openwebui.com](https://openwebui.com)
Chapters:
00:00 Intro
05:26 Install Ollama and the Llama 3.2 model
08:41 Running a prompt from the terminal
10:55 Install Docker and Open WebUI container
15:38 Integrating Open WebUI with Ollama
17:06 Running a prompt from Open WebUI
18:00 Acceleration considerations
20:23 Outro
This setup is perfect if you care about privacy, control, and open-source tools — all powered by your Raspberry Pi.
#RaspberryPi #AI #Ollama #Llama3