
All You Need To Know About Running LLMs Locally

Author: bycloud

Uploaded: 2024-02-26

Views: 246671

Description:

RTX4080 SUPER giveaway!
Sign up for NVIDIA's GTC2024: https://nvda.ws/48s4tmc
Giveaway participation link: https://forms.gle/2w5fQoMjjNfXSRqf7
Please read all the rules & steps carefully!!
1. Sign up for NVIDIA's Virtual GTC2024 session between Mar 18 - 21st
2. Participate in the giveaway DURING Mar 18 - 21st
3. ???
4. Profit


TensorRT LLM
[Code] https://github.com/NVIDIA/TensorRT-LLM
[Getting Started Blog] https://nvda.ws/3O7f8up
[Dev Blog] https://nvda.ws/490uadi

Chat with RTX
[Download] https://nvda.ws/3OHPRHE
[Blog] https://nvda.ws/3whKZTb

Links:
[Oobabooga] https://github.com/oobabooga/text-gen...
[SillyTavern] https://github.com/SillyTavern/SillyT...
[LM Studio] https://lmstudio.ai/
[Axolotl] https://github.com/OpenAccess-AI-Coll...
[Llama Factory] https://github.com/hiyouga/LLaMA-Factory
[HuggingFace] https://huggingface.co/models
[AWQ] https://github.com/mit-han-lab/llm-awq
[ExLlamav2] https://github.com/turboderp/exllamav2
[GGUF] https://github.com/ggerganov/ggml/blo...
[GPTQ] https://github.com/IST-DASLab/gptq
[LlamaCpp] https://github.com/ggerganov/llama.cpp
[vllm] https://github.com/vllm-project/vllm
[TensorRT LLM] https://github.com/NVIDIA/TensorRT-LLM
[Chat with RTX] https://www.nvidia.com/en-us/ai-on-rt...
[LlamaIndex] https://github.com/run-llama/llama_index
[Continue.dev] https://continue.dev/

Model recommendations (I know you are here after DeepSeek):
[All DeepSeek Models] https://huggingface.co/collections/de...
[Easily Download with Ollama] https://ollama.com/library/deepseek-r1
Here's the rule of thumb to know if you can run a model:
If your VRAM is larger than the model's size in GB * 1.2, then you can run that model locally.
E.g. DeepSeek-7B is 4.7 GB, and 4.7 * 1.2 = 5.64, so if your GPU has 8 GB of VRAM, since 8 GB is more than 5.64 GB, you can run DeepSeek-7B.
Check out my latest video on DeepSeek-R1 to understand the context better!
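A minimal Python sketch of that rule of thumb, if you want to sanity-check a model before downloading it (the 1.2 overhead factor and the sizes are just the numbers from the example above, not exact requirements):

def fits_in_vram(model_size_gb: float, vram_gb: float, overhead: float = 1.2) -> bool:
    # Rule of thumb from the description: a model fits if your VRAM
    # exceeds its file size in GB times roughly 1.2.
    return vram_gb > model_size_gb * overhead

# Numbers from the example: DeepSeek-7B is ~4.7 GB on disk, the GPU has 8 GB of VRAM.
print(fits_in_vram(4.7, 8.0))   # True: 4.7 * 1.2 = 5.64, and 8 > 5.64
print(fits_in_vram(4.7, 4.0))   # False: a 4 GB card falls short of 5.64 GB

If the check passes, the Ollama link above can pull and run the model in one command, e.g. "ollama run deepseek-r1:7b" (tag as listed on the Ollama library page).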

(the following are all outdated)
Just use Llama-3.1 instead for everything.
[Llama-3.1] https://huggingface.co/collections/me...
For translation, you can try Aya 23
[Aya 23] https://huggingface.co/CohereForAI/ay...

(the following are all outdated)
[Nous-Hermes-llama-2-7b] https://huggingface.co/NousResearch/N...
[Openchat-3.5-0106] https://huggingface.co/openchat/openc...
[SOLAR-10.7B-Instruct-v1.0] https://huggingface.co/upstage/SOLAR-...
[Google Gemma] https://huggingface.co/google/gemma-7b
[Mixtral-8x7B-Instruct-v0.1] https://huggingface.co/mistralai/Mixt...
[Deepseek-coder-33b-instruct] https://huggingface.co/deepseek-ai/de...
[Colbertv2.0] https://huggingface.co/colbert-ir/col...


This video is supported by the kind Patrons & YouTube Members:
🙏Andrew Lescelius, alex j, Chris LeDoux, Alex Maurice, Miguilim, Deagan, FiFaŁ, Daddy Wen, Tony Jimenez, Panther Modern, Jake Disco, Demilson Quintao, Shuhong Chen, Hongbo Men, happi nyuu nyaa, Carol Lo, Mose Sakashita, Miguel, Bandera, Gennaro Schiano, gunwoo, Ravid Freedman, Mert Seftali, Mrityunjay, Richárd Nagyfi, Timo Steiner, Henrik G Sundt, projectAnthony, Brigham Hall, Kyle Hudson, Kalila, Jef Come, Jvari Williams, Tien Tien, BIll Mangrum, owned, Janne Kytölä, SO, Richárd Nagyfi, Hector, Drexon

[Discord]   / discord  
[Twitter]   / bycloudai  
[Patreon]   / bycloud  

[Music] massobeats - magic carousel
[Profile & Banner Art]   / pygm7  
[Video Editor] maikadihaika


Related videos:

  • Why LLMs get dumb (Context Windows Explained) (NetworkChuck, 2 months ago)
  • ПОВЫШЕНИЕ ШТРАФОВ В 8 - 40 РАЗ: НОВЫЕ штрафы, ловушка в ОСАГО, секретные отметки в правах (Александр Шумский, 6 hours ago)
  • Learn Ollama in 15 Minutes - Run LLM Models Locally for FREE (Tech With Tim, 5 months ago)
  • 4 levels of LLMs (on the go) (Alex Ziskind, 3 months ago)
  • Радиация в Иране: Кавказ под угрозой. Трамп: Всем эвакуироваться. Ожаровский о риске ядерной угрозы (Ходорковский LIVE, 4 hours ago)
  • How To Run Private & Uncensored LLMs Offline | Dolphin Llama 3 (Global Science Network, 3 months ago)
  • Извините, что разочаровался в Xbox (JumboMax, 5 hours ago)
  • Deepseek R1 671b Running LOCAL AI LLM is a ChatGPT Killer! (Digital Spaceport, 4 months ago)
  • Native MoE Multimodal LLM Will Be The Next AI Frontier (bycloud, 1 day ago)
  • I'm running my LLMs locally now! (Maximilian Schwarzmüller, 1 month ago)
