Install Ollama and Run AI Locally on Your PC (Offline, Free & Easy)
Author: Kodar
Uploaded: 2025-10-15
Views: 889
Run powerful AI models offline right on your computer! 💻
In this full step-by-step tutorial, I’ll show you how to download and install Ollama, set up DeepSeek-R1:8B and LLaMA 2:7B models, and use them completely offline — no API keys or internet required once installed!
🧠 What You’ll Learn:
How to install and run Ollama on Windows / macOS / Linux
How to download and use models like DeepSeek-R1 and LLaMA 2
How to run AI models offline with no API or internet
Hardware requirements for each model (RAM, VRAM, CPU)
How to manage, list, and remove models easily
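The steps above boil down to a handful of terminal commands. A minimal sketch of the full workflow, using llama2:7b as the example (swap in any tag from the Ollama library):

```shell
# Check that Ollama is installed and on your PATH
ollama --version

# Download a model's weights (one-time step, needs internet)
ollama pull llama2:7b

# Start an interactive chat (works fully offline once the pull is done)
ollama run llama2:7b

# See which models are installed locally
ollama list

# Delete a model to free up disk space
ollama rm llama2:7b
```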
💡 Minimum System Requirements:
8GB RAM (for smaller models like Phi-3 Mini)
16GB RAM (recommended for LLaMA 2 / DeepSeek)
6–8GB VRAM for GPU acceleration (optional)
Around 10GB of free storage
🔥 Models Covered:
DeepSeek-R1 8B: ollama run deepseek-r1:8b
LLaMA 2 7B: ollama run llama2:7b
Check the Ollama version: ollama --version
List installed models: ollama list
Type /bye or press Ctrl+D to exit the chat
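Beyond running a chat, the same CLI manages installed models; a few extra commands worth knowing (ollama ps and ollama stop are available in recent Ollama releases):

```shell
# Show models currently loaded in memory
ollama ps

# Unload a running model without deleting its downloaded weights
ollama stop llama2:7b

# Remove a model's weights from disk entirely
ollama rm deepseek-r1:8b
```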
📦 Download Ollama:
👉 https://ollama.com/download
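On Linux, the download page's official install script can be run straight from the terminal (macOS and Windows use the graphical installers from the same page):

```shell
# Official Ollama install script for Linux, from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# Verify the install worked
ollama --version
```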
📘 Learn More About Ollama Models:
👉 https://ollama.com/library
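While a model is running, Ollama also serves a local REST API on http://localhost:11434, so other programs on your PC can use it without any cloud keys. A minimal Python sketch of the documented /api/generate endpoint (the helper names here are ours, not part of Ollama, and it assumes Ollama is running with llama2:7b pulled):

```python
import json
import urllib.request

def build_generate_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON reply instead of a token stream
    }).encode("utf-8")

def ask_ollama(prompt: str, model: str = "llama2:7b") -> str:
    """Send a prompt to the local Ollama server and return its reply text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Why is the sky blue? Answer in one sentence."))
```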
💬 Subscribe for More AI Projects & Tutorials!
/ @kodar-d9i
#Ollama #DeepSeek #LLaMA2 #AIModels #OfflineAI #LocalAI #DeepSeekR1 #OllamaTutorial #AIonPC #RunAIWithoutInternet #AIMadeEasy #OllamaSetup #AIProjects #AIVideo #OllamaInstallation #LLaMA2Tutorial