Ollama on Windows | Run LLMs locally 🔥
Author: Rishab in Cloud
Uploaded: 2024-02-23
Views: 54513
Ollama lets you run LLMs locally on your machine and is now available on Windows. In this video I share what Ollama is, how to run Large Language Models locally, and how you can integrate it with LangChain.
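As a quick sketch of what "running locally" means: Ollama serves models over a local REST API (by default on port 11434), and tools like LangChain talk to that endpoint. The snippet below builds a request body for Ollama's `/api/generate` endpoint; the model name `llama2` and the prompt are placeholder examples, and actually sending the request requires a running `ollama serve` instance with that model pulled.

```python
import json

# Ollama listens on http://localhost:11434 by default.
# A minimal generate request targets POST /api/generate.
payload = {
    "model": "llama2",              # any model you've pulled with `ollama pull`
    "prompt": "Why is the sky blue?",
    "stream": False,                # one JSON response instead of a token stream
}

# Serialize the body you would POST to the endpoint.
body = json.dumps(payload)
print(body)
```

With the server running, you could send `body` to `http://localhost:11434/api/generate` using any HTTP client; LangChain's Ollama integration wraps this same API for you.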
Join this channel to get access to perks:
/ @rishabincloud
Resources:
Ollama - https://ollama.com/
LangChain - https://python.langchain.com/
Find me on GitHub - https://github.com/rishabkumar7
Connect with me:
https://rishabkumar.com
Twitter → / rishabincloud
LinkedIn → / rishabkumar7
Instagram → / rishabincloud