Mark Rein Knocks Intel, Episodes

Epic's Mark Rein: Intel is killing PC Gaming on Joystiq reflects the opinion the Epic VP expressed at the Develop Conference in Brighton that moves by Intel, in particular the promotion of integrated graphics on motherboards, are hurting PC gaming. Epic VP: Intel is killing PC gaming! Ars: Not really is an article offering an alternate opinion, and it is probably not the last editorial that will be written on this topic. During his keynote address Rein also had an episode about episodic gaming, calling it "a broken business." According to the report, Rein's remarks met with some heckling from the audience, with accusations that he was being a "dinosaur" and "self-serving."
86 Replies. 5 pages. Viewing page 2.
Newer [  1  2  3  4  5  ] Older
66.
Re: Huh?
Jul 14, 2006, 23:31
"Why is technology oh so necessary to get attention for a PC game, and yet it isn't for a console game? Is that simply a misconception by developers, or is the PC gaming press biased against all but the latest graphics?"

It's important for all gamers except the average Blue's News reader. Do you have a little brother? A nephew, maybe? Any male around the age of 16. Tell him to pick out a game. I'll put down a hundred bucks that he doesn't pick the latest tile-based strategy game, and instead picks out the latest and greatest graphically intensive game. The platform doesn't matter.

This comment was edited on Jul 14, 23:36.
65.
Re: Huh?
Jul 14, 2006, 22:34
Especially when all it would take to remedy the entire situation is a few dollars extra to the sticker price
That would not fix the problem. Whatever Intel's baseline graphics solution is, it will always pale in comparison to the speed of more expensive video cards, which happen to be the very same video cards which developers use when they develop their games. Game developers not only ignore or abandon integrated graphics quickly, they also abandon or ignore entry-level graphics cards. The problem here is a combination of the graphics bar being set too high by developers and the forced frequent upgrade cycle.

because developers need tech to sell copies ( it's impossible to get media attention without tech,
Why is technology oh so necessary to get attention for a PC game, and yet it isn't for a console game? Is that simply a misconception by developers, or is the PC gaming press biased against all but the latest graphics?


64.
Re: Huh?
Jul 14, 2006, 20:40
It seems apparent after this much dialogue, at least to me, that the issue boils down to this question:

Should Intel be allowed to singlehandedly slow the curve of technology, and determine what developers can and can't do with their games, because they have market share?

I don't believe they should. Especially when all it would take to remedy the entire situation is a few dollars extra to the sticker price, or just being honest with the customers and telling them that this chip is not going to play modern games.

Because they aren't doing this, and because developers need tech to sell copies ( it's impossible to get media attention without tech, and no one is going to buy a crappy looking game without first looking in the media for things like previews and review scores ), PC gaming is diminishing.

This comment was edited on Jul 14, 22:18.
63.
Re: Huh?
Jul 14, 2006, 20:19
While I'd love my computer to last an extra year, I value advances in technology and speedy release (as opposed to devs having to implement DirectX 7,8,9,10 versions to include a wider audience) much more. Because the games I am looking forward to simply wouldn't exist if they had to aim at lower systems (I'm thinking particularly of Crysis here).
The problem I have with the state of PC game development right now is that it is almost exclusively tilted to the high end, especially in some game genres like action and FPS games. Sequels to PC games generally won't run on the same PC as their predecessor, especially if two or more years have elapsed between releases, whereas on video game consoles the sequels do run (at least when the sequel is released for the same console).

62.
Re: Huh?
Jul 14, 2006, 20:09
What you are saying is the same as if Sony put out an 8-track player and conned 75% of the market into buying it, promising it plays CDs. There's no difference.
Your analogy is specious. Intel does not claim that its various integrated graphics solutions have features which they do not have. The problem is that game developers are requiring features in their games which Intel's solutions do not provide. However, that doesn't mean that game developers couldn't make games with graphics that are adequate for a lot of players which would run on Intel's graphics.

For example, some of Intel's current and older integrated graphics solutions support Dot3 bumpmapping. However, most game developers which use bumpmapping use pixel shaders to implement it because it looks a little better, or they have moved on to normal mapping and other more advanced rendering effects. The point, though, is that Intel's graphics and other older graphics cards have many features which never really got fully exploited by a lot of games and could be fully used to provide some decent visuals without requiring a new video card. Sure, these games won't look like Unreal Engine 3 or even Doom 3, but they don't have to in order to still look alright, have a lot of features, and be fun. If console game players don't mind games which run on five-year-old hardware, why must PC game players?
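(Aside: for readers unfamiliar with the technique, Dot3 bumpmapping boils down to a per-pixel dot product between a normal fetched from a normal map and the light direction, which fixed-function hardware could evaluate in a texture-combiner stage without any pixel shader. A minimal sketch in Python rather than shader code; the function names and values here are illustrative, not from any real graphics API:)

```python
# Minimal sketch of Dot3 bumpmapping: per-pixel diffuse lighting computed
# as the dot product of a tangent-space normal (from a normal map) and the
# light direction. Fixed-function parts could do this in a DOT3 combiner
# stage; pixel shaders just made the same idea more flexible.

def normalize(v):
    """Scale a 3-component vector to unit length."""
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot3_lighting(normal_texel, light_dir):
    """Return the diffuse intensity for one pixel.

    normal_texel: tangent-space normal from a normal map, as (x, y, z)
                  components in [-1, 1] (already decoded from RGB).
    light_dir:    light direction in the same tangent space.
    """
    n = normalize(normal_texel)
    l = normalize(light_dir)
    # Clamp to zero so surfaces facing away from the light are unlit.
    return max(0.0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2])

# A flat texel (normal straight up) lit from directly above is fully lit:
print(dot3_lighting((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0
```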

This comment was edited on Jul 14, 20:23.
61.
Re: Huh?
Jul 14, 2006, 19:39
Did Sony just screw over their PS1 user base by introducing the PS2, and soon the PS3?
Each of Sony's two released consoles has gotten at least five years of attention from game developers, whereas even a three-year-old PC which hasn't been substantially upgraded is persona non grata when it comes to newly released PC games.

PC game developers make PC's become obsolete as a game platform much more quickly than game consoles do, and there is really no need for it other than that game developers like to play with the latest toys, and video card manufacturers love to sell new products, so they seed PC game developers with the latest and greatest cards to get them to exploit those features at the expense of the larger market. There is nothing wrong with having some games which require cutting-edge hardware, but the problem is that there are not enough games for sale, especially across all genres, that cater to the wide audience of older PC's. And that is compounded by the fact that the PC game industry does such an abysmal job as a whole of keeping its older titles available for sale on store shelves.

This comment was edited on Jul 14, 19:49.
60.
Re: Huh?
Jul 14, 2006, 19:28
I know we had a similar debate recently, but speaking selfishly I'm quite happy with the fact that modern games are built to fit more or less on the 'technology curve'. While I'd love my computer to last an extra year, I value advances in technology and speedy release (as opposed to devs having to implement DirectX 7,8,9,10 versions to include a wider audience) much more. Because the games I am looking forward to simply wouldn't exist if they had to aim at lower systems (I'm thinking particularly of Crysis here).

59.
Re: Huh?
Jul 14, 2006, 18:51
"It's only outdated because not enough developers support it. Integrated graphics have never been cutting edge. Even the upcoming iterations designed for Vista's Aero still trail dedicated graphics cards in terms of performance. The point though is that integrated graphics and older graphics cards can still be used to run games with decent visuals at a playable framerate. I have run plenty of games from 2002 and earlier including FPS games on PC's with Intel's integrated graphics like 845G's and 865G's. The problem is that PC game developers abandon PC's with older graphics technology far too quickly when commercially viable games could still be made for those PC's. "

What you are saying is the same as if Sony put out an 8-track player and conned 75% of the market into buying it, promising it plays CDs. There's no difference.

58.
Re: Huh?
Jul 14, 2006, 17:36
So if that's the case, how come Sony doesn't make Gran Turismo for the PS1 anymore? Or shouldn't EA still be making NHL/NFL for the PS1 too?
Did Sony just screw over their PS1 user base by introducing the PS2, and soon the PS3?


This comment was edited on Jul 14, 17:38.
57.
Re: Huh?
Jul 14, 2006, 15:10
Again... for the hundredth time. This isn't about developers supporting old hardware. It's about a hardware maker that creates NEW hardware that's already outdated.
It's only outdated because not enough developers support it. Integrated graphics have never been cutting edge. Even the upcoming iterations designed for Vista's Aero still trail dedicated graphics cards in terms of performance. The point though is that integrated graphics and older graphics cards can still be used to run games with decent visuals at a playable framerate. I have run plenty of games from 2002 and earlier including FPS games on PC's with Intel's integrated graphics like 845G's and 865G's. The problem is that PC game developers abandon PC's with older graphics technology far too quickly when commercially viable games could still be made for those PC's.

This comment was edited on Jul 14, 15:15.
56.
Re: Huh?
Jul 14, 2006, 13:12
"More developers should target the large installed base of older PC's instead of always moving hardware requirements so far forward. "

Again... for the hundredth time. This isn't about developers supporting old hardware. It's about a hardware maker that creates NEW hardware that's already outdated.

55.
Re: Huh?
Jul 14, 2006, 09:16
Really this fits just as easily into consoles.
No it doesn't. Consoles have a significantly longer "shelf life" than PC's when it comes to games (at least the successful consoles do), and that should change. If developers can make good, commercially viable games on a four or five year old piece of console hardware, then they also could do the same for a four year old PC. More developers should target the large installed base of older PC's instead of always moving hardware requirements so far forward.

This comment was edited on Jul 14, 09:17.
54.
Re: Huh?
Jul 14, 2006, 09:10
nin
Soon SLI will become a standard and then only the rich few will be able to game on PC.

OK, someone check him for a crack pipe...



--------------------------------------------------------------
GW: Nilaar Madalla, lvl 20 R/Mo / Tolyl Nor, lvl 20 E/Mo / Xylos Gath, lvl 16 W/Mo

http://www.muse.mu/
53.
Re: Huh?
Jul 14, 2006, 07:21
I think Mark is right that players won't go back to a game many months down the road if it is essentially the same experience with reused content as the previous one.
We'll see. I bet HL:Ep2 and Ep3 will sell quite nicely. And be as dismissive as you like about the "sycophants" who buy these products; they're buying them because they enjoy them.

But that being said, I think that's more because Valve are actually quite good at making fun games that follow on from one another. For a series such as SiN, I think you're probably right - people won't pay for 9 episodes of that tedium.

What Mark is saying is that because these episodic titles are competing against full-length games for the consumer's attention, they don't stand a chance.
Again, Valve's sales figures would disagree. What is true is that poor episodic games will not succeed. But that's true of most things.

52.
Re: Huh?
Jul 14, 2006, 00:21
If you want to point blame at someone over the decline of PC gaming, take a quick glance at Nvidia. Nvidia, with their quick product cycle, is what's driving up the cost of PC gaming. You buy a $300 card one day only to have it obsolete the next, with hordes of fanboys screaming for games that support the $500 card they just bought. But wait, there's more: now that we have the hardcore gamers hooked and driving the market at $500 a card every 6 months, let's make a solution that is just out of every normal person's reach, and we shall call it SLI. Soon SLI will become a standard and then only the rich few will be able to game on PC. Shortly followed by game devs going totally console and dropping PC because they can't sell enough PC games.

There you have it, bad grammar and all: the fall of PC gaming. I will now resume my console shopping.

51.
Re: Huh?
Jul 13, 2006, 23:54
In reality, graphics will eventually peak ( probably not for a while ) and subside; that's when gaming will get really good, because tech is no longer an excuse to sell a game ( unless the tech is some wacky peripheral, which I'm sure will happen more and more as we get near this peak ).

That would be such an awesome day... imagine seeing a game preview and not wondering "how much will it cost me to upgrade to play this?" or "when will I have a machine that can actually handle this game?"

50.
Re: Huh?
Jul 13, 2006, 23:27
Alright, let's generalize this to hell.

1) Companies make lots of different kinds of product A, which plays games.
2) Standard technology evolves; everyone makes and improves on product A's design and usage, eventually evolving into product B, and then product C.
3) New Company makes new product D. It's cheaper, and claims to play games.
4) Product D only plays Product A games.
5) Consumers cry out, "Product D looks like Product C, why doesn't it play all games?"
6) Consumers cry, get tired of things, and no longer want to deal with games on Product D.

Really this fits just as easily into consoles. Are you also upset your PS2 games don't play on a PS1? You need to be, in order for your argument to stand.

In reality, graphics will eventually peak ( probably not for a while ) and subside; that's when gaming will get really good, because tech is no longer an excuse to sell a game ( unless the tech is some wacky peripheral, which I'm sure will happen more and more as we get near this peak ).

49.
Re: Huh?
Jul 13, 2006, 15:31
It IS Intel to blame, because they are the ones claiming their garbage chipsets play modern games. The manufacturers take these claims, feed them to the salesmen, and put them on the box.

Sorry... I disagree.

Intel Integrated Graphics can handle anything of Quake 3 or RTCW graphics level. That's PLENTY of eye candy for the non-enthusiast. PERIOD

If developers are stupid enough to not support the largest install base that's known to be out there... REQUIRING non-enthusiasts (that buy games at Wal-mart and Costco) to upgrade (whether their shitty budget Dell, HP, or Gateway actually has a graphics card slot or not is a different matter)... then they have effectively slashed THEIR OWN potential market.

Non enthusiasts are NOT going to spend 2K+ on a machine, nor buy a new video card every fucking year for $200+.

If anything... the asshole elitist epeen gaming enthusiasts that need games to have the "latest bleeding edge crap", are also helping destroy the PC gaming industry. Those people are totally outnumbered by a ton of pc-ignorant consumers that only care about having a cheap basic PC that can play some games too.

It also makes game companies sink so much development effort into the game's stupid graphics... that most of the time the gameplay sucks and is buggy and incomplete.

There definitely needs to be a line drawn, so gameplay doesn't suffer and we aren't constantly forced into upgrades. Don't get me wrong... there still needs to be a reason for upgrading... but the yearly shit (for the most part) has to stop.

Get your games from GOG DAMMIT!
48.
Re: What a fucking hypocrite.
Jul 13, 2006, 14:53
Most hardware-knowledgeable folks have realized there is no need for separate graphics capabilities from what a modern cpu should be able to render.

Crazyfool, you picked a poor username. With such an incredible display of ignorance, you should have picked "Cluelessmoron" as your username.

47.
Re: Huh?
Jul 13, 2006, 13:30
Straight off the Intel website:

"Intel Extreme Graphics 2 supports the latest 2D and 3D APIs, delivering real-life environment and character effects. A 256-bit internal path enables up to four textures per pixel on a single pass for super light maps, atmospheric effects, and more realistic surface details. Flexible display capabilities enhance the personal computing experience, offering significant benefits for applications requiring 32 bpp and higher display resolution."

It IS Intel to blame, because they are the ones claiming their garbage chipsets play modern games. The manufacturers take these claims, feed them to the salesmen, and put them on the box.

Regardless, the main argument against Intel is that for a few dollars more, they could put in a chip that will play most games released in the last three years. They are intentionally misleading consumers to make a buck.

This comment was edited on Jul 13, 13:35.