Eh, not really a home solution. To run Llama 3.1 405B, which requires more than 800 GB of RAM, you'd still need devices at home totalling more than that (over a dozen 64 GB machines, for example). Not exactly something people are going to have "at home".
"with some optimization, we can run it on 192 gigabytes using 8x4090 GPUs"
Still not a home solution, but closer.
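For what it's worth, both figures roughly check out as weights-only math (a back-of-envelope sketch, assuming ~405B parameters and ignoring KV cache and activations):

```python
# Rough weights-only memory math for Llama 3.1 405B (assumed 405e9 params;
# KV cache and activation memory are not counted here).
PARAMS = 405e9

def model_gb(bits_per_param: float) -> float:
    """Weights footprint in GB at a given precision."""
    return PARAMS * bits_per_param / 8 / 1e9

print(f"fp16: {model_gb(16):.1f} GB")  # ~810 GB, matching the ">800 GB" figure
print(f"int4: {model_gb(4):.1f} GB")   # ~202 GB, near the 192 GB of 8x4090 (24 GB each)
```

So the 192 GB figure implies roughly 4-bit quantization (192 GB / 405B params ≈ 3.8 bits per weight), i.e. the "some optimization" is mostly quantization.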