I built a $1500 machine almost 3 years ago: 7900 GTX, 2 GB of PC3200, and an A64 4000+ (non-X2, socket 939). It's practically worthless for new games (Crysis, BioShock, DiRT, etc.) if I want any sort of decent framerate at anything above 1024x768 with at least a little AA, AF, and V-sync. Nothing takes me out of a game more than the tearing you get without V-sync, but turning it on drains so much performance.
I actually had a very similar rig a few months ago: X1900 XTX, 2 gigs of DDR2 RAM, and an Athlon 64 3200+. Was it completely worthless for new games? Hardly. I played through BioShock at the highest detail settings, 16x AF, 1280x1024 and got an average of 30-40 fps; the 360 version is capped at 30 fps. Crysis ran like utter crap, of course, but no machine can run it well at the highest settings. DiRT also ran like crap, but that's because it was a half-assed console port that wasn't optimized for the PC.
Then I upgraded my system: a new P35 mobo for $100, a new Q6600 for $250, and a new 8800 GT for $270 (which you can now get for under $200, dammit). That's $620 total. The last time I upgraded my mobo and CPU was almost 5 years ago; the last time I upgraded my video card was two years ago. I don't think spending around $600 once every 4-5 years is going to break the bank, and you don't even have to do that unless you like to play at relatively high resolutions and the highest detail settings like I do.
Also, I'm not really sure what the issue is with V-sync. V-sync just limits your framerate to your refresh rate, which, in my case, is 100 Hz. There aren't too many new games where I get 100+ fps, so playing without V-sync isn't really an issue. Now, if you have a crappy old LCD monitor stuck at 60 Hz, sucks for you.
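For what it's worth, here's a rough back-of-the-envelope sketch (Python, made-up frame times, assuming plain double-buffered V-sync with no triple buffering) of why both of you can be right: the cap only matters if your card can outrun the refresh rate, but a frame that just misses a refresh gets held for a whole extra interval, which is where the "drains so much" feeling comes from.

```python
import math

REFRESH_HZ = 100                      # e.g. a 100 Hz CRT; a typical LCD would be 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ   # seconds per refresh

def vsync_fps(render_ms: float) -> float:
    """Effective frame rate with double-buffered V-sync: a finished frame
    is held until the next refresh boundary before it can be displayed."""
    refreshes_needed = math.ceil((render_ms / 1000.0) / REFRESH_INTERVAL)
    return REFRESH_HZ / max(refreshes_needed, 1)

# A card that could push ~150 fps (6.7 ms/frame) just gets capped at the refresh rate:
print(vsync_fps(6.7))    # 100.0

# A card that barely misses a refresh (11 ms/frame, ~90 fps uncapped) snaps
# down to the next divisor of the refresh rate:
print(vsync_fps(11.0))   # 50.0
```

Triple buffering (or a high-refresh CRT) mostly hides that snap-to-a-divisor effect, which is probably why one of you feels the hit and the other doesn't.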

You should have recognized the gaming superiority of CRT monitors.
BTW: Does the 360 even support 1080i/p? I mean, do the actual games run at that resolution, or does your TV just upscale 720p to 1080? If it's the latter, that's not really 1080.