"It's on my todo to build my own stack of apple minis and run Llama on my own little cluster on my desk." - @garrytan
Quick AI homelab tutorial:
- Install @exolabs on each mini (open source; install steps are in the README)
- Make sure all the minis are on the same WiFi network, or, for faster and more reliable results, connect them over Thunderbolt or Ethernet. Devices auto-discover each other and auto-shard the LLM across the cluster.
- Now you can chat with your cluster using your preferred LLM GUI (@__tinygrad__ tinychat ships with exo)
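Besides the GUI, exo also exposes a ChatGPT-compatible HTTP API, so you can script against the cluster too. A minimal sketch, assuming the default local endpoint and a placeholder model name (check your exo startup logs for the actual address and available models):

```python
import json
import urllib.request

# Assumed endpoint: exo serves a ChatGPT-compatible API locally.
# The port and model name below are placeholders -- confirm them
# against your own exo startup logs.
ENDPOINT = "http://localhost:52415/v1/chat/completions"

# Standard chat-completions payload; exo shards the model across the minis.
payload = {
    "model": "llama-3.2-3b",  # placeholder: any model your cluster serves
    "messages": [{"role": "user", "content": "Hello from my desk cluster!"}],
    "temperature": 0.7,
}

def chat(endpoint: str = ENDPOINT) -> dict:
    """Send one chat request to the cluster and return the parsed JSON response."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the cluster running, `chat()` returns the usual chat-completions shape, with the reply under `choices[0]["message"]["content"]`.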