21 sats \ 1 reply \ @SimpleStacker 29 Jul 2024 \ on: How to Run Llama 3.1 405B on Home Devices? Build AI Cluster! alter_native
Eh, not really a home solution. To run Llama 3.1 405B, which requires >800GB of RAM, you'd still need devices at home totaling more than that (e.g., over 10 devices with 64GB each). Not exactly something people are gonna have "at home".
"with some optimization, we can run it on 192 gigabytes using 8x4090 GPUs"
Still not a home solution but closer.
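For anyone curious where those numbers come from, here's the back-of-the-envelope math (a rough sketch that counts only the weights, ignoring KV cache and activation overhead, and uses 1 GB = 10^9 bytes):

```python
# Approximate memory footprint of Llama 3.1 405B weights at various precisions.
# Rough sketch: weights only, no KV cache or activation overhead.
PARAMS = 405e9  # 405 billion parameters

def weight_gb(bits_per_param: float) -> float:
    """Weight footprint in gigabytes at the given precision."""
    return PARAMS * bits_per_param / 8 / 1e9

print(f"fp16: {weight_gb(16):.1f} GB")  # 810.0 GB -> the ">800GB of RAM" figure
print(f"int8: {weight_gb(8):.1f} GB")   # 405.0 GB
print(f"int4: {weight_gb(4):.1f} GB")   # 202.5 GB -> close to 8x4090 = 192 GB of VRAM
```

So the "192 gigabytes using 8x4090" claim roughly matches ~4-bit quantization (24GB VRAM per card x 8 cards), though even that is a bit tight without further optimization.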