That's not Mark's argument. His argument is that people who aren't hardcore gamers are buying computers, and those machines all come with integrated graphics that aren't anywhere close to a decent Nvidia/ATI card. As a result, the installed base is that integrated-graphics crap, and publishers push developers to design to those low requirements (by and large, anyway; the Quakes and Crysises of the world are exceptions), which in turn stifles creativity into a downward spiral, and THAT is what is killing PC gaming.
I don't agree, but that's his point.
Well I'm glad you don't agree with Mark, because he's wrong.
Intel isn't in the graphics business (yet). And I'm talking the ATI/Nvidia level.
Intel's integrated chipsets are mainly intended for businesses, which throw away / donate their old machines every few years and buy new ones. Businesses have ZERO need for even GeForce 6200-level hardware, and there's a heck of a lot more market there than there are gamers. It only makes good economic sense for Intel to keep costs down as much as possible, and profit margins as wide as possible, even if integrated graphics are way "outdated" as far as gamers are concerned.
Now... Dell (and others) keep selling these cheap $399 computers (no monitor... $599 with one) to people who don't know any better. These machines usually don't have a graphics slot (on purpose... cost reasons). Now how the hell is it Intel's fault that there are millions of morons out there who don't know anything about modern gaming requirements? People like this SHOULDN'T have to know any better... and any software they buy SHOULD just work. PERIOD. That means an application supporting the lowest common denominator (the budget, throwaway business machine).
It's up to the Publishers and Developers to SEE what the market is, and adjust accordingly for it.
Intel is NOT going to increase the cost of budget machines just to appease game developers. That would be bad business as far as THEIR market is concerned... which is way more than just gamers.
Get your games from GOG DAMMIT!