Build Your Own AI File Assistant with Blazor & Local LLMs (Like Claude Code Cowork)
Author: CodeFilez
Uploaded: 2026-01-20
Views: 9
Build an AI-powered file management assistant that runs entirely on your local machine! In this video, I'll
show you TemuCowork Agent - a Blazor application that connects to Ollama for local LLM inference and uses
Microsoft Semantic Kernel to give AI the ability to read, write, search, and manage your files through
natural conversation.
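Before diving in, here's the rough shape of the wiring the video builds, as I'd sketch it (this assumes the Microsoft.SemanticKernel.Connectors.Ollama connector package; the plugin name and methods are illustrative, not the repo's exact code):

```csharp
using System;
using System.ComponentModel;
using System.IO;
using Microsoft.SemanticKernel;

// Point Semantic Kernel at the local Ollama server and register a file plugin
// so the model can call these functions during the conversation.
var builder = Kernel.CreateBuilder();
builder.AddOllamaChatCompletion(
    modelId: "qwen2.5:14b",                       // any model you've pulled into Ollama
    endpoint: new Uri("http://localhost:11434")); // Ollama's default local endpoint
builder.Plugins.AddFromType<FileSystemPlugin>();
Kernel kernel = builder.Build();

// Illustrative plugin: each [KernelFunction] becomes a tool the LLM can invoke.
public class FileSystemPlugin
{
    [KernelFunction, Description("Reads the contents of a text file at the given path.")]
    public string ReadFile(string path) => File.ReadAllText(path);

    [KernelFunction, Description("Lists the files in a directory.")]
    public string[] ListFiles(string directory) => Directory.GetFiles(directory);
}
```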
📦 GET THE CODE:
https://github.com/DevMando/temuCowork
INSTALL OLLAMA
• Ollama, How to Install & Run LLMs on Wind...
INSTALL CLAUDE CODE
• How to Install Claude Code (for beginners)
🔥 FEATURES:
• Chat with AI to manage files using natural language
• @ mention files with autocomplete suggestions
• Read PDFs, Excel files, code, and 15+ formats
• Real-time streaming responses with status indicators (see the streaming sketch after this list)
• Persistent chat sessions with LiteDB
• Cyberpunk UI theme with neon glow effects
• 100% local - no API keys, no cloud, complete privacy
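For the curious, this is roughly how the streaming piece works with Semantic Kernel's chat API, a sketch assuming the kernel built in the earlier snippet (not the repo's exact code):

```csharp
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Ask the chat service for a streaming reply; 'kernel' is the one built earlier.
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("Summarize @notes.txt for me.");

var settings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto() // let the model call the file plugin
};

await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(history, settings, kernel))
{
    // Each chunk is a small piece of the response; in Blazor you'd append it to the
    // current message and call StateHasChanged() so the chat bubble grows in real time.
    Console.Write(chunk.Content);
}
```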
🛠️ TECH STACK:
• .NET 8 / Blazor Server
• Microsoft Semantic Kernel
• Ollama (Local LLM)
• Radzen Blazor Components
• LiteDB
• EPPlus & PdfPig
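As a taste of the document-reading side, here's roughly how PdfPig pulls text out of a PDF so it can be handed to the model as context (the helper name is mine, not the repo's; EPPlus plays the analogous role for Excel files):

```csharp
using System.Text;
using UglyToad.PdfPig;

// Extract plain text from every page of a PDF.
static string ReadPdfText(string path)
{
    var sb = new StringBuilder();
    using var document = PdfDocument.Open(path);
    foreach (var page in document.GetPages())
        sb.AppendLine(page.Text); // page.Text is the page's extracted text
    return sb.ToString();
}
```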
⏱️ TIMESTAMPS:
0:00 - Demo & Introduction
2:30 - Project Architecture Overview
5:00 - Setting Up Ollama
8:00 - Semantic Kernel Integration
15:00 - Building the File System Plugin
22:00 - Creating the Chat UI
32:00 - @ Mention Autocomplete Feature
40:00 - Session Persistence with LiteDB
48:00 - Cyberpunk Styling & Polish
55:00 - Final Demo & Wrap Up
📋 REQUIREMENTS:
• .NET 8 SDK
• Ollama installed locally
• Any compatible LLM (I use qwen2.5:14b)
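If you want to sanity-check your setup before running the app, a quick call to Ollama's local API will tell you whether the model is pulled (my own helper, not part of the project):

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Text.Json;

// Query Ollama's local model list and check for the model named above.
using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };
string json = await http.GetStringAsync("/api/tags"); // lists locally pulled models
using var doc = JsonDocument.Parse(json);
bool hasModel = doc.RootElement.GetProperty("models")
    .EnumerateArray()
    .Any(m => m.GetProperty("name").GetString()!.StartsWith("qwen2.5"));
Console.WriteLine(hasModel ? "Model ready." : "Missing model; run: ollama pull qwen2.5:14b");
```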
💬 Let me know in the comments what features you'd like to see added!
#dotnet #blazor #ai #ollama #semantickernel #csharp #localllm #aiagent #coding #programming #aichatbot