Replacing my job at RunPod with agents: Can Mastra really do it?
Author: Tim Pietrusky
Uploaded: Apr 9, 2025
Views: 322
I share exactly how I’m automating my day-to-day tasks at RunPod using AI agents built with Mastra. I have 43 repositories (workers) to maintain, validate, and update; doing it manually would take forever. But with Mastra agents, the entire process becomes automated, efficient, and incredibly scalable.
You'll learn how I structured an agent system using the models ToolACE-2-Llama-3.1-8B and claude-3.5-sonnet, enabling tasks like code validation, automatic bug fixes, and even creating GitHub pull requests, all without me doing anything.
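To make the architecture concrete, here is a minimal sketch of a Mastra agent with one of the tools covered in the chapters (docker_validation). This is not the actual janitor code: it assumes Mastra's Agent/createTool API together with the @ai-sdk/anthropic provider, and the tool body is only a placeholder.

import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { anthropic } from "@ai-sdk/anthropic";
import { z } from "zod";

// Placeholder tool: in the real janitor, docker_validation builds and runs
// a worker's Docker image to check that it still works.
const dockerValidation = createTool({
  id: "docker_validation",
  description: "Build and run a worker's Docker image to validate it",
  inputSchema: z.object({ repoPath: z.string() }),
  outputSchema: z.object({ success: z.boolean(), log: z.string() }),
  execute: async ({ context }) => {
    // ...build the image, run the container, collect the logs...
    return { success: true, log: `validated ${context.repoPath}` };
  },
});

// The agent that orchestrates git_checkout, docker_validation, repair, and pull_request.
export const janitor = new Agent({
  name: "janitor",
  instructions:
    "Check out a RunPod worker repository, validate its Docker image, repair failures, and open a pull request.",
  model: anthropic("claude-3-5-sonnet-20241022"),
  tools: { dockerValidation },
});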
We’ll dive into:
My Mastra agent architecture setup
How agents collaborate to manage and fix repositories
Why I chose ToolACE-2-Llama-3.1-8B and claude-3.5-sonnet
Challenges and learnings from deploying agents with vLLM on RunPod
Resources
RunPod workers: https://github.com/runpod-workers
Serverless in Real Life: Use Cases & Insights: Tim Pietrusky, DevRel Engineer, RunPo...
Janitor: https://github.com/runpod/janitor
mastra: https://mastra.ai
ToolACE-2-Llama-3.1-8B: https://huggingface.co/Team-ACE/ToolA...
claude-3-5-sonnet: https://www.anthropic.com/news/claude...
function calling leaderboard: https://gorilla.cs.berkeley.edu/leade...
worker-basic: https://github.com/TimPietrusky/worke...
vLLM worker: https://github.com/runpod-workers/wor...
RunPod: https://runpod.io
The Docker image that contains the fix for the llama3_json parser: runpod/worker-v1-vllm:v2.2.0stable-cuda12.1.0-fix-llama3-json (usage sketch after this list)
watt-tool-70b: https://huggingface.co/watt-ai/watt-t...
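Once the vLLM worker (with the fixed image above) is serving ToolACE-2-Llama-3.1-8B on a RunPod serverless endpoint, any OpenAI client can talk to it through the worker's OpenAI-compatible API. A minimal sketch, assuming RunPod's documented /openai/v1 route; the endpoint ID is a placeholder.

import OpenAI from "openai";

// Placeholder endpoint ID; the base URL pattern is RunPod's OpenAI-compatible
// route for serverless vLLM endpoints (assumption based on the RunPod docs).
const client = new OpenAI({
  apiKey: process.env.RUNPOD_API_KEY,
  baseURL: "https://api.runpod.ai/v2/<endpoint_id>/openai/v1",
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "Team-ACE/ToolACE-2-Llama-3.1-8B",
    messages: [
      { role: "user", content: "List the steps to validate a RunPod worker repo." },
    ],
  });
  console.log(completion.choices[0].message.content);
}

main();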
Chapters
00:00 replacing myself with agents
01:28 janitor
02:58 worker-basic
03:25 git_checkout
03:48 docker_validation
05:15 repair
07:05 pull_request
08:34 mastra web ui
10:24 janitor live demo
15:32 ToolACE-2-Llama-3.1-8B on RunPod
17:00 bug in vLLM fixed
18:20 openai-compatible api
18:42 open source
18:57 watt-tool-70b
