Rigs wrote on Oct 1, 2015, 14:24:
Way to generalize, there. Most current gen consoles have absolutely NO problem giving 60fps at 720p. It's when they get up to 1080p that they have trouble, and not because the APU can't handle it, but rather, in the Xbone's case anyway, because of the way they have video RAM (or ESRAM) implemented. A paltry 512 megs of it ain't gonna get you far. The APU in the PS4 has no problem with games, assuming the dev actually knows what the hell they're doing! As for PC-based APUs, I have an A6 (which has an equivalent HD 6350) with 4GB of DDR3, and it has no trouble playing 1080p video, or playing anything up to 'AAA' graphics-heavy games in 1080p. The Witcher 3 and GTA V have some trouble without pulling the options down a tad, but then again, my 7850 has trouble with them, too, at 1680x1050. And have I owned nVidia? Would you like to see the box of nVidia GeForce 256s, GF2 MXs, GF3s, TNT2s and RIVAs I have sitting next to me? Aside from them, I have two laptops with 670Ms. So yes, I've used nVidia, and I don't like them one bit.
I don't find this to be true, and I own current gen consoles. Many games struggle to hit stable framerates, and almost all of them make sacrifices in detail settings. The GPU in the consoles is two generations old already and frankly wasn't that impressive to begin with. It's a testament to optimization and lower-level access to the hardware that devs get the results they do. The A6s are pieces of crap. Last time I used one, it couldn't even manage a stable 30fps at 720p with low details in Alien: Isolation. They're fine for playing little indie games, but 1080p AAA gaming? Nope, and AnandTech agrees.
The Xbox One has 32MB of ESRAM, by the way, not 512MB. It's essentially a cache that is theoretically faster but requires more grunt work from devs to use well. It also has a slightly weaker GPU than the PS4. I'm not even sure why people bring up consoles when discussing video cards. Supplying those parts has more to do with contract negotiation than anything else; it has no bearing on who makes better hardware. Intel, Nvidia, and AMD have all supplied consoles in the past.
AMD gutted its driver division and only seems to ship WHQL releases about once a year at this stage. I don't mind installing betas, but even those take too long to arrive compared to Nvidia's cadence. Their latest cards run hot as hell and eat power; the only saving grace is some clever cooling designs from third-party board partners. I have no real confidence in buying them right now. The only compelling card in their lineup is the R9 Nano, for people doing living room builds. What they have going for them is price, but if they keep channel dumping they're just going to confuse the market.
I want a better AMD so that Nvidia is forced to compete, but they're not investing much back into the business and don't seem to have a plan for the future. I wouldn't be shocked if they're gone within a few years at this rate.
Playing: Risk of Rain 2, Jedi Fallen Order, Last of Us II
Watching: Tenet, Peninsula, The Pale Door