The newest AI boom pitch: Host a mini data center at your home

Ars Technica

Such a distributed computing network makes sense in that “computation for AI inference can and should be distributed at the ‘edge,’ deployed on smaller platforms closer to population centers and users,” said Benjamin Lee, a computer architect and engineer at the University of Pennsylvania, in correspondence with Ars. “The strategy could impose much smaller impacts on the grid because inference requires a few GPUs, unlike training, which requires thousands of them working in concert,” he said. …

Original source: Ars Technica
