How to install Llama.cpp on Linux with GPU support
Author: Primitive Finance
Uploaded: 2026-01-01
Views: 216
How to install llama.cpp on Linux with Vulkan support for your GPU.
Sorry for the lower resolution in some sections; I am still getting used to recording on Linux.
-------------------------------------------------------------------------
Commands used in this video:
1. Running the server from the installation folder:
./llama-server -m "YOUR_MODEL_HERE" --host 127.0.0.1 --port 8080 -c 2048 -ngl -1 -t 1
2. Editing the .bashrc file with nano:
nano ~/.bashrc
3. Running the server globally:
llama-server -m "YOUR_MODEL_HERE" --host 127.0.0.1 --port 8080 -c 2048 -ngl -1 -t 1
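The commands above assume llama.cpp is already built with the Vulkan backend. A minimal sketch of the build steps, assuming a fresh clone into the home directory and the Vulkan development packages (e.g. libvulkan-dev and glslc on Debian/Ubuntu) already installed; the `-DGGML_VULKAN=ON` CMake flag follows the upstream build docs:

```shell
# Clone the upstream repository (assumed destination: ~/llama.cpp)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Configure with the Vulkan backend enabled, then build in Release mode
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Binaries such as llama-server land in build/bin/
```

After this, `./llama-server` in step 1 is run from the `build/bin` directory (or wherever your build placed the binaries).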
0:00 - Installing llama.cpp
2:00 - Installing LLM
2:42 - Testing server within installation directory
5:16 - Editing .bashrc file
6:30 - Testing server outside of installation directory
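The .bashrc edit at 5:16 makes `llama-server` resolvable from any directory. A minimal sketch of the line typically appended, assuming llama.cpp was built under ~/llama.cpp/build/bin (adjust to your actual build path):

```shell
# Hypothetical ~/.bashrc addition: prepend the llama.cpp build output to PATH
# so "llama-server" works globally, as in command 3 above.
export PATH="$HOME/llama.cpp/build/bin:$PATH"
```

Reload the shell configuration with `source ~/.bashrc` (or open a new terminal) before testing.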
#ai #llama #llamacpp #llm #linux