Power Each AI Agent With A Different LOCAL LLM (AutoGen + Ollama Tutorial)
Author: Matthew Berman
Uploaded: Nov 29, 2023
Views: 107,545
In this video, I show you how to power AutoGen AI agents with a separate open-source model for each agent. This is going to be the future AI tech stack for running AI agents locally. The models are served by Ollama, and their APIs are exposed through LiteLLM.
Enjoy :)
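For reference, here is a minimal sketch of the kind of setup shown in the video: each AutoGen agent gets its own llm_config pointing at a different LiteLLM proxy, and each proxy fronts a different Ollama model. Ports, model names, and the exact config key ("base_url" in newer pyautogen releases, "api_base" in older ones) are assumptions; follow the linked gist for the exact commands used in the video.

```python
# Sketch only, assuming: Ollama is installed, pyautogen >= 0.2 (OpenAI v1 style
# "base_url" config key), and two LiteLLM proxies running locally. Model names
# and ports below are illustrative assumptions.
#
# Assumed terminal setup (one command per terminal):
#   ollama run mistral
#   ollama run codellama
#   litellm --model ollama/mistral   --port 8000
#   litellm --model ollama/codellama --port 8001

import autogen

# Each config list points at a different LiteLLM proxy, i.e. a different local model.
mistral_config = {
    "config_list": [
        {
            "model": "ollama/mistral",            # model id exposed by the first proxy
            "base_url": "http://localhost:8000",  # LiteLLM proxy for Mistral
            "api_key": "not-needed",              # local proxy; key is ignored
        }
    ]
}

codellama_config = {
    "config_list": [
        {
            "model": "ollama/codellama",          # model id exposed by the second proxy
            "base_url": "http://localhost:8001",  # LiteLLM proxy for CodeLlama
            "api_key": "not-needed",
        }
    ]
}

# One assistant per model: a general agent on Mistral, a coding agent on CodeLlama.
assistant = autogen.AssistantAgent(name="assistant", llm_config=mistral_config)
coder = autogen.AssistantAgent(name="coder", llm_config=codellama_config)

# The user proxy executes any code the agents produce.
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# Put all agents in a group chat; the manager itself also needs an LLM config.
groupchat = autogen.GroupChat(agents=[user_proxy, assistant, coder], messages=[], max_round=10)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=mistral_config)

user_proxy.initiate_chat(manager, message="Write a Python script that prints the first 10 prime numbers.")
```

Because LiteLLM speaks the OpenAI API format, AutoGen never needs to know it is talking to Ollama; swapping models is just a matter of pointing a config entry at a different proxy.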
Join My Newsletter for Regular AI Updates 👇🏼
https://forwardfuture.ai/
My Links 🔗
👉🏻 Subscribe: / @matthew_berman
👉🏻 Twitter: / matthewberman
👉🏻 Discord: / discord
👉🏻 Patreon: / matthewberman
Media/Sponsorship Inquiries 📈
https://bit.ly/44TC45V
Links:
Instructions - https://gist.github.com/mberman84/ea2...
Ollama - https://ollama.ai
LiteLLM - https://litellm.ai/
AutoGen - https://github.com/microsoft/autogen
• AutoGen Agents with Unlimited Memory ...
• AutoGen Advanced Tutorial - Build Inc...
• Use AutoGen with ANY Open-Source Mode...
• How To Use AutoGen With ANY Open-Sour...
• AutoGen FULL Tutorial with Python (St...
• AutoGen Tutorial 🚀 Create Custom AI A...
