Years ago, before I had fiber to my home, my DSL got 30 ms pings (round trip) to cities 500 miles away.
One ping, sure. Now ping that city 500 times in a minute. Still great? Now ping it 500 times every minute for the next 10 minutes.
You're going to see spikes. It's that simple. There is no such thing as a consistent path on the internet. Do a simple tracert to a destination server, and you'll be lucky to get the same route three times in a row.
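To make the point concrete, here's a toy simulation (all numbers are assumptions for illustration: a ~30 ms baseline with occasional routing spikes) showing how a connection that looks great on a single ping still has ugly tail latency over 5,000 pings:

```python
import random

random.seed(0)

def simulate_pings(n, base_ms=30.0, spike_prob=0.02, spike_ms=150.0):
    """Illustrative model: mostly ~30 ms round trips, with a small
    chance of a route hiccup adding a large spike (assumed values)."""
    samples = []
    for _ in range(n):
        rtt = random.gauss(base_ms, 3.0)       # normal jitter around the baseline
        if random.random() < spike_prob:       # occasional re-route / congestion spike
            rtt += spike_ms
        samples.append(max(rtt, 1.0))
    return samples

samples = sorted(simulate_pings(500 * 10))     # 500 pings a minute for 10 minutes
p50 = samples[len(samples) // 2]
p99 = samples[int(len(samples) * 0.99)]
print(f"median: {p50:.0f} ms, 99th percentile: {p99:.0f} ms")
```

The median looks fine; the 99th percentile is several frames' worth of lag, and at 60 fps you hit that "rare" case multiple times a minute.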
When you're watching movies and your connection is getting maxed out, these hitches lead to buffering. Annoying, but manageable. If your connection isn't maxed out, the player just buffers ahead behind the scenes and you never notice.
This system, by its very definition, cannot buffer, because it doesn't know what you, the player, are going to do. It has to wait for your input to see where it needs to go and what to do. It could probably do some limited prediction (although I imagine games would have to be substantially rewritten to allow for server-side prediction, since there is no client that can predict for itself), but fundamentally it has to wait for you to make your move.
A single bumped hop in your route will lead to either hitches or graphical artifacts. In today's HD age, how long will people put up with that before they say, "Why the fuck am I playing this when I have a console sitting RIGHT HERE?"

And all the reporters and devs who have actually seen this in the field attest that latency is not a problem with their system.
Their "in the field" test has been to servers 50 miles away with 3 concurrent connections. That is not an in-the-field test. That's a severely optimized tech demo.
Also, not all of the reporters agree that it's so wonderful.

"So please, enough with the 'latency' problems. It just shows that you don't know what you are talking about."
Okay... You're a doctor in psychophysical analysis, I take it? I'm sure the Internet is massively excited by the fact that you just declared its one major problem null and void. Can you please do the same for the economy?