[...]As the 'terminal experiment' of the 70s and 80s proved, there is a good reason why you need localized hardware: breakthroughs in silicon advance much faster than breakthroughs in telecommunications. There is a reason it failed then, and there is a reason it will fail now: Moore's law simply does not apply to telecommunications.
If it did, even if we take as a starting point the v.34 modems of 1994, when most of the kinks had been ironed out, at 28.8 kbit/s (nominally bauds, but let's roughly translate them to bits), we'd all be sporting 120 Mbit/s connections now. For WiFi the stats are even grimmer: in 2000 we had 802.11b at 11 Mbit/s, so if Moore's law applied we'd all have 45 Gbit/s WiFi connections now.

And I'm not even accounting for availability and general demand, which make deploying such networks, especially wireless ones with their limited frequencies and interference, next to impossible. To move everything to the cloud and still have wireless access to it, we'd need to move the frequencies a couple of notches up, toward the X-ray spectrum, because radio waves simply cannot pack that much data. It's a physical limitation, not a technological one. And then you have the issue of building a server (farm) fast enough to serve all those terminals at such high demand, which means you'd have to build a couple of nuclear power plants next to it just to supply it with juice, and probably put it in the Arctic, as there's just no way to cool off that much processing at one centralized location... No matter how optimized the solution might be, it just cannot work... [...]
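For what it's worth, the back-of-the-envelope extrapolation above is easy to reproduce. A minimal sketch, with my own assumptions filled in: I take 2018 as "now" and a doubling period as the free parameter, since neither is stated in the original (a 2-year period reproduces the ~120 Mbit/s modem figure; the 45 Gbit/s WiFi figure corresponds to an 18-month period):

```python
def moores_law_rate(base_rate_bps, start_year, now_year, doubling_years=2.0):
    """Extrapolate a data rate as if it doubled every `doubling_years`,
    Moore's-law style. Purely illustrative."""
    doublings = (now_year - start_year) / doubling_years
    return base_rate_bps * 2 ** doublings

# v.34 modem: 28.8 kbit/s in 1994, doubling every 2 years until 2018
modem = moores_law_rate(28_800, 1994, 2018)        # ~118 Mbit/s

# 802.11b: 11 Mbit/s in 2000, doubling every 18 months until 2018
wifi = moores_law_rate(11e6, 2000, 2018, 1.5)      # ~45 Gbit/s

print(f"modem: {modem / 1e6:.0f} Mbit/s, wifi: {wifi / 1e9:.0f} Gbit/s")
```

Actual links did nothing of the sort, of course, which is precisely the point being made.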
A strong conviction that something must be done is the parent of many bad measures. - Daniel Webster