DIY 4x Nvidia P40 Homeserver for AI with 96gb VRAM!
Author: AI HOMELAB
Uploaded: 2024-07-15
Views: 170006
We've built a home server for AI experiments featuring 96 GB of VRAM, 448 GB of RAM, and an AMD EPYC 7551P processor. We test our Tesla P40 GPUs on various LLMs and CNNs to explore their performance, and we also share our approach to cooling these GPUs effectively. This video is for fellow enthusiasts interested in experimenting with high-performance AI setups at home.
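For reference, here is a minimal sketch of how a large model might be spread across the four P40s. The video doesn't specify the exact software stack, so the use of Hugging Face Transformers, the model name, and the settings below are assumptions, not the setup shown on screen:

```python
# Minimal sketch: load an LLM sharded across all visible GPUs.
# Assumes the transformers and accelerate packages are installed;
# the model name is a placeholder, not the one used in the video.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-13b-hf"  # placeholder model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # FP16 halves VRAM use (P40 FP16 compute is slow, though)
    device_map="auto",          # shard layers across the 4x P40 (96 GB total)
)

prompt = "Explain what Above 4G Decoding does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```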
Important note: Enable "Above 4G Decoding" in the BIOS before installing a GPU with more than 16 GB of VRAM. Otherwise the machine won't POST.
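Once the system boots with the setting enabled, a quick sanity check is to confirm that all four cards show up with their full memory. A small sketch of that check, assuming the NVIDIA driver and nvidia-smi are installed:

```python
# Sanity check after enabling "Above 4G Decoding": all four P40s
# should be visible, each with roughly 24 GB (96 GB total).
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=index,name,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout

print(out)
# Expected output, roughly:
# 0, Tesla P40, 24576 MiB
# 1, Tesla P40, 24576 MiB
# 2, Tesla P40, 24576 MiB
# 3, Tesla P40, 24576 MiB
```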
You'll find the STL file for the cooling shroud here:
https://drive.google.com/file/d/1vk0G...
Hashtags:
#ai #machinelearning #artificialintelligence