"Especially when all it would take to remedy the entire situation is a few dollars extra to the sticker price."

That would not fix the problem. Whatever Intel's baseline graphics solution is, it will always pale in comparison to the speed of more expensive video cards, which happen to be the very same video cards developers use when they develop their games. Game developers not only quickly ignore or abandon integrated graphics, they also abandon or ignore entry-level graphics cards. The problem here is a combination of the graphics bar being set too high by developers and the forced frequent upgrade cycle.
"because developers need tech to sell copies (it's impossible to get media attention without tech,"

Why is technology oh so necessary to get attention for a PC game, and yet it isn't for a console game? Is that simply a misconception by developers, or is the PC gaming press biased against all but the latest graphics?
"While I'd love my computer to last an extra year, I value advances in technology and speedy release (as opposed to devs having to implement DirectX 7, 8, 9, 10 versions to include a wider audience) much more. Because the games I am looking forward to simply wouldn't exist if they had to aim at lower systems (I'm thinking particularly of Crysis here)."

The problem I have with the state of PC game development right now is that it is almost exclusively tilted toward the high end, especially in genres like action and FPS games. Sequels to PC games generally won't run on the same PC as their predecessor, especially if two or more years have elapsed between releases, whereas on video game consoles the sequels do run (at least when the sequel is released for the same console).
"What you are saying is the same as if Sony puts out an 8-track player and cons 75% of the market into buying it, promising it plays CD's. There's no difference."

Your analogy is specious. Intel does not claim that its various integrated graphics solutions have features which they do not have. The problem is that game developers require features in their games which Intel's solutions do not provide. That doesn't mean, however, that game developers couldn't make games with graphics that are adequate for a lot of players and that would run on Intel's hardware. For example, some of Intel's current and older integrated graphics solutions support Dot3 bump mapping. However, most game developers who use bump mapping implement it with pixel shaders because it looks a little better, or they have moved on to normal mapping and other more advanced rendering effects. The point, though, is that Intel's graphics and other older graphics cards have many features which never got fully exploited by a lot of games and which could be used to provide decent visuals without requiring a new video card. Sure, these games won't look like Unreal Engine 3 or even Doom 3, but they don't have to in order to look alright, have a lot of features, and be fun. If console game players don't mind games which run on five-year-old hardware, why must PC game players?
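For the curious, here is a minimal sketch of what fixed-function Dot3 bump mapping looks like in OpenGL 1.3, the kind of no-shaders path that older integrated parts expose. This is illustrative only: the function name and the normal_map_tex parameter are made up for the example, the tangent-space light vector is assumed to have been packed into the per-vertex color elsewhere, and on Windows these post-1.1 entry points would come in through the extension-loading mechanism rather than plain GL/gl.h.

    /* Sketch: fixed-function Dot3 bump mapping (OpenGL 1.3, no pixel shaders).
     * Assumes normal_map_tex holds tangent-space normals packed into 0..1 RGB,
     * and the primary vertex color carries the light vector, also packed 0..1. */
    #include <GL/gl.h>

    void setup_dot3_bumpmap(GLuint normal_map_tex)
    {
        glActiveTexture(GL_TEXTURE0);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, normal_map_tex);

        /* Switch this texture unit to the combine environment and ask the
         * hardware for a per-pixel dot product of the two arguments. */
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
        glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);

        /* Arg0: the normal map texel. */
        glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);
        glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);

        /* Arg1: the interpolated vertex color carrying the light vector. */
        glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);
        glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
    }

A second texture stage would typically modulate the result by the diffuse base texture. The point is simply that per-pixel lighting of this sort was available on fixed-function hardware years before programmable shaders became the default.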
"Did Sony just screw over their PS1 user base by introducing the PS2, and soon the PS3?"

Each of Sony's two released consoles has gotten at least five years of attention from game developers, whereas even a three-year-old PC which hasn't been substantially upgraded is persona non grata when it comes to newly released PC games.
"Again... for the hundredth time. This isn't about developers supporting old hardware. It's about a hardware maker that creates NEW hardware that's already outdated."

It's only outdated because not enough developers support it. Integrated graphics have never been cutting edge. Even the upcoming iterations designed for Vista's Aero still trail dedicated graphics cards in performance. The point, though, is that integrated graphics and older graphics cards can still run games with decent visuals at a playable framerate. I have run plenty of games from 2002 and earlier, including FPS games, on PC's with Intel's integrated graphics like 845G's and 865G's. The problem is that PC game developers abandon PC's with older graphics technology far too quickly, when commercially viable games could still be made for those PC's.
"Really this fits just as easily into consoles."

No, it doesn't. Consoles have a significantly longer "shelf life" than PC's when it comes to games (at least the successful consoles do), and that disparity should change. If developers can make good, commercially viable games on a four- or five-year-old piece of console hardware, then they could do the same for a four-year-old PC. More developers should target the large installed base of older PC's instead of always pushing hardware requirements so far forward.
Soon SLI will become standard, and then only a rich few will be able to game on PC.
"I think Marc is right that players won't go back to a game many months down the road if it is essentially the same experience with reused content as the previous one."

We'll see. I bet HL:Ep2 and Ep3 will sell quite nicely. And be as dismissive as you like about the "sycophants" who buy these products; they're buying them because they enjoy them.
"What Marc is saying is that because these episodic titles are competing against full length games for the consumer's attention, they don't stand a chance."

Again, Valve's sales figures would disagree. What is true is that poor episodic games will not succeed. But that's true of most things.
In reality, graphics will eventually peak (probably not for a while) and then subside. That's when gaming will get really good, because tech will no longer be an excuse to sell a game (unless the tech is some wacky peripheral, which I'm sure will happen more and more as we near this peak).
Most hardware-knowledgeable folks have realized there is no need for graphics capabilities separate from what a modern CPU should be able to render.