There are too many insurmountable technical issues with this kind of service for it to enter the mainstream any time soon.
Bandwidth usage is a biggie. Streaming HD video at any useful quality eats a lot of bandwidth, and most gamers spend far more time playing than movie buffs spend streaming movies. Heavy gameplay on OnLive, combined with your normal internet downloads and usage, would quickly burn through a 300GB Comcast cap, to say nothing of the less generous caps of other ISPs.
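A quick back-of-envelope sketch of the cap math, assuming (these figures are my assumptions, not OnLive's published numbers) a 5 Mbit/s stream and three hours of play a day:

```python
# Back-of-envelope bandwidth estimate for streamed HD gameplay.
# All figures are assumptions for illustration, not measured values.
STREAM_MBPS = 5        # assumed encode bitrate for roughly-720p quality
HOURS_PER_DAY = 3      # assumed heavy-gamer play time
DAYS_PER_MONTH = 30

gb_per_hour = STREAM_MBPS / 8 * 3600 / 1000   # Mbit/s -> GB per hour
monthly_gb = gb_per_hour * HOURS_PER_DAY * DAYS_PER_MONTH

print(f"{gb_per_hour:.2f} GB/hour, {monthly_gb:.1f} GB/month")
# ~2.25 GB/hour, ~202.5 GB/month -- before any other household traffic
```

Even at a modest bitrate, gaming alone eats two-thirds of a 300GB cap; bump the bitrate or the hours and you blow past it.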
Latency is another problem. To keep it low, they have to have datacenters all over, and even then, network latency combined with video encode latency is just too much for anyone trying to land a headshot. Even slower-paced console shooters like Halo can be really annoying on a high-latency setup, whether the lag comes from the network or from input latency (OnLive itself, a slow HDTV with heavy filtering/scaling, or whatever).
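To see why the delays stack up so badly, here is a rough end-to-end latency budget. Every number below is an assumed ballpark figure for illustration, not a measurement of OnLive:

```python
# Rough end-to-end latency budget for a streamed shooter.
# Every figure is an assumed ballpark, purely for illustration.
budget_ms = {
    "input capture + uplink":   15,  # controller press -> datacenter
    "game simulation + render": 16,  # about one frame at 60 fps
    "video encode":             15,
    "downlink":                 15,
    "video decode + display":   20,  # includes HDTV processing/scaling
}

total = sum(budget_ms.values())
print(f"total added latency: {total} ms")
# 81 ms -- and that's on top of the game's own multiplayer netcode lag
```

Note that only the two network legs shrink when you move datacenters closer; the encode, decode, and display terms are fixed overhead a local console never pays.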
The many spread-out datacenters also raise another problem: community fragmentation. In many scenarios you simply might not be allowed to play against people connected to a different streaming hub, because the latency differential between the two would be too great.
This technology relies on best-case bandwidth and latency, both of which are mostly out of their hands. They can optimize their video encoding and try some predictive algorithms for input, but at the end of the day they are at the mercy of ISP routing and the simple factor of distance.
On top of all that, they can't charge very much without becoming more expensive than simply keeping a mid-range gaming rig up to date. ($500 a year, roughly $42 a month, can keep you perpetually in the mid-range.)
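The $42 figure is just the yearly upgrade budget spread over twelve months; the hypothetical subscription price in the comparison is my own placeholder:

```python
# The PC-upgrade figure is from the argument above; the subscription
# price is a hypothetical placeholder for comparison.
PC_UPGRADE_PER_YEAR = 500           # keeps a rig perpetually mid-range
monthly_equivalent = PC_UPGRADE_PER_YEAR / 12

HYPOTHETICAL_SUB = 15               # assumed monthly streaming fee
print(f"PC: ${monthly_equivalent:.2f}/mo vs service: ${HYPOTHETICAL_SUB}/mo")
# The service only wins if its fee stays well under ~$42/month --
# and that's before adding per-game purchase or rental costs.
```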