Will we see people being paid to host small single-GPU servers in their homes? I guess that would require redesigning the training system, since data transfer between workers would be much slower and latency much higher. Maybe that is not even compatible with LLM training?
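One way around slow, high-latency links (a hypothetical sketch, not something the comment proposes) is "local SGD" style training: each home server takes many local optimizer steps on its own data shard and only occasionally synchronizes by averaging models, so communication happens once per round instead of once per step. The toy simulation below (all names and data invented for illustration) shows the pattern on a one-parameter regression:

```python
import random

random.seed(0)

def grad(w, x, y):
    # Gradient of the squared error (w*x - y)^2 with respect to w.
    return 2 * (w * x - y) * x

def make_shard(n=50):
    # Toy data: y = 3x plus a little noise; each worker keeps its own shard.
    return [(x, 3 * x + random.gauss(0, 0.1))
            for x in (random.uniform(-1, 1) for _ in range(n))]

shards = [make_shard() for _ in range(4)]   # 4 simulated home servers
weights = [0.0] * 4                         # each worker's local model copy
lr, local_steps, rounds = 0.1, 20, 10

for _ in range(rounds):
    # Many cheap local steps per worker, with no communication at all...
    for i, shard in enumerate(shards):
        for _ in range(local_steps):
            x, y = random.choice(shard)
            weights[i] -= lr * grad(weights[i], x, y)
    # ...then one expensive synchronization: average the local models.
    avg = sum(weights) / len(weights)
    weights = [avg] * 4

print(weights[0])  # ends up close to the true slope 3
```

Communication volume drops by a factor of `local_steps` compared with synchronizing every step, which is exactly the trade-off that matters when workers sit behind residential connections. Whether this scales to full LLM pretraining is the open question the comment raises.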