Verno wrote on Oct 2, 2015, 09:48:
I don't find this to be true and I own current-gen consoles. Many games struggle to maintain stable framerates, and almost all of them make sacrifices in detail settings. The GPU in the consoles is two generations old already and frankly wasn't that impressive to begin with. It's just a testament to optimization and lower-level access to the hardware that devs get the results they do. The A6s are pieces of crap. Last time I used one, it couldn't even manage a stable 30fps at 720p with low details in Alien: Isolation. They're fine for playing little indie games, but 1080p AAA? Nope, and Anandtech agrees.
While the A6 isn't top of the line (it was never meant to be in the first place), I've given it a very thorough workout over the two years or so that I've had it, and I can promise you that if your system is configured correctly, and you're not trying to stream 1080p video or keep 50 tabs open in Chrome in the background or anything stressful like that, it will play most games out there at playable frame rates. Of course you're not going to get 1080p and full detail; I already admitted to this. I'm not stupid, nor am I clueless, contrary to apparent popular opinion. I know what the hell I'm talking about. Obviously an A6 isn't going to win any awards, and when I referenced the A6, I was talking more about the HD6350 inside, not the CPU side of the APU. But it's not a slouch if you know how to use it right and don't mind knocking the sliders back a little...or playing a little under 1080p, which, my LCD being 1680x1050, is no issue. Again, I've admitted all of this, so why does it keep coming back as a point of debate in this argument/discussion?
The Xbox One has 32MB of ESRAM by the way, not 512MB. It's essentially a cache that is theoretically faster but requires more grunt work from devs. It also has a slightly weaker GPU than the PS4. I'm not even sure why people are talking about consoles when discussing video cards. Supplying those has more to do with contractual negotiation than anything else; it has no bearing on who has better parts. Intel, Nvidia and AMD have all previously supplied consoles.
You're absolutely correct, the Xbone has 32MB of ESRAM (hey, I got the ESRAM part right, though!). I'm not sure where I got the 512MB from; maybe just fatigue from arguing all day...
AMD gutted its driver division and only seems to send out WHQL releases once a year at this stage. I don't mind installing betas, but even those seem to take too long vs Nvidia. Their latest cards run hot as hell and eat power; the only saving grace is some clever cooling designs from third parties. I have no real confidence in buying them right now; to me the only reason to bother is price. The only compelling card in their lineup is the R9 Nano for people doing living room builds. What they have going for them is price, but if they keep channel dumping they're just going to confuse the market.
My only contention here is the driver release times. They released 12 drivers last year, as the last one was 14.12, and I'm sure you're aware that the numbering goes YEAR.MONTH. So I'm not sure where you're getting this one-WHQL-driver-a-year thing from. I'm not saying they always release a driver every month (they don't), but they try to get them out as soon as possible. They've had four or five driver releases this year already, and not all of them were betas, though the majority were. Still, the latest was 15.9, which just got a hotfix, and while that's not WHQL-certified, the July 15.7 release is. Personally, I don't see the big deal with it being certified or not; they work just the same.
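For what it's worth, here's a quick Python sketch of how that Catalyst numbering reads; the version list is just the releases mentioned above, not a complete release history:

```python
# Illustrative only: decoding AMD Catalyst "YY.M" version strings
# into calendar months. The versions listed are just the ones
# cited in this thread (14.12, 15.7, 15.9), not a full history.

CATALYST_RELEASES = ["14.12", "15.7", "15.9"]

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def release_month(version: str) -> str:
    """Map a Catalyst 'YY.M' version string to a calendar month."""
    year, month = version.split(".")
    return f"{MONTHS[int(month) - 1]} 20{year}"

for v in CATALYST_RELEASES:
    print(v, "->", release_month(v))
# 14.12 -> Dec 2014
# 15.7 -> Jul 2015
# 15.9 -> Sep 2015
```

So 14.12 being the last release of 2014 is exactly what tells you it shipped in December.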
I want a better AMD so that Nvidia is forced to compete, but they're not investing a lot back into the business and don't seem to have a plan for the future. I wouldn't be shocked if they're gone within a few years at this rate.
To be bluntly honest, while I say I'm an AMD fanboy, it's really just the ATI side (if you can call it that, or the video card side, let's say) that I'm partial to. If AMD's CPU business goes tits up, so be it. Aside from some notable CPUs in the past such as the Thunderbirds, Durons, Semprons and Athlons, they've never really been a contender against Intel, and I don't think they're really giving them much competition, which is probably why you still see the latest Core i7s come out at a $999 price point. I don't pretend to know the specifics; I don't have enough time in the day to keep track of their mobile line or SSD performance or whatever it was Lansbury and Tacosalad were talking about. That doesn't make my take on things any less valid, and I'm sure if I had the time to staple my eyelids open and do some research, I could certainly jump into the deep end of that discussion. But I'm not going to embarrass myself by arguing about something I know nothing about. If AMD weren't to spin off the video card side (the ATI side, so to speak) before going under, then that would definitely be a sad day for PC gaming...much like it was when 3dfx went under and was gobbled up by nVidia...
=-Rigs-=