
NVIDIA Relaunches Cloud Gaming

NVIDIA announces their GRID cloud gaming service is now known as GeForce NOW, claiming this is the first service to offer 1080p, 60 FPS gaming over the internet. Word is: "GeForce NOW is your game library in the sky. See all the titles we offer from a game series at a glance — like the LEGO collection or the Batman Arkham series — for binge gaming. Use voice search to find your favorite games. Search through games by category. Or quickly browse the latest and most popular games." Here's more:
Next-Gen Quality

GeForce NOW is the first cloud-gaming service to stream at full high-definition 1080p quality and at 60 frames per second. Membership costs just $7.99 a month with the first three months free. GeForce NOW arrives on Oct. 1 in North America, the European Union and Japan with more than 50 popular games included. And it offers members the option to buy and play many more in an instant.

Get the Latest GeForce Gaming Technology with a Single Click

GeForce NOW is all about instant gratification. But it took us a decade to invent the technology behind the service that streams GeForce GTX-quality graphics to SHIELD devices.

GeForce NOW is a service designed, built and operated all by NVIDIA. We’ve optimized every piece of the technology behind GeForce NOW for gaming. That includes the low-latency game controller we built for SHIELD. The Tegra processors we built for our SHIELD devices. The advanced game engine we built into our GeForce GTX-powered servers.

So you’ll enjoy games at their best. You’ll enjoy GameWorks enhancements for special effects. And your experience will get better every year as we build more powerful GPUs and gaming technologies into GeForce NOW.

Think of GeForce NOW as your gaming supercomputer in the cloud. Next-gen gaming is now easy and instant.
43 Replies. 3 pages. Viewing page 1.


43. Re: NVIDIA Relaunches Cloud Gaming Oct 5, 2015, 15:03 Burrito of Peace
 
Rigs wrote on Oct 5, 2015, 11:03:
I appreciate that you've explained all of that to me. I've had so much going on in the last two months that I have not had but a few minutes here and there to keep current on things related to gaming and computing. I would like nothing more than to sit down for an hour or two and go over it bit by bit but I just don't have the time. But that doesn't mean I'm not interested.

You are quite welcome. I'd rather inform than get into an internet slapfight over an "us vs. them" mentality. I also understand not having the time to pursue all the areas that interest you. Hell, if knowing this stuff wasn't part of my job, I'd probably not know half as much as I do.

I realize tone is a very hard thing to project in online communication; however, I assure you that I was not frothing at the mouth, nor was I stamping my feet like a mad little boy. I will apologize, though, as I broke one of my cardinal rules when commenting here at Blues: I don't post unless I have the time to fully explain my reasons (and any replies afterward), and I don't post if I don't know what I'm talking about. My other rule is that I don't post when I'm angry, tired, drunk, or hyper. As a writer, I tend to suffer from 'writer's block' on occasion, but for the last few months I've been having a really hard time transferring my thoughts to words. Even when I'm talking this has become a serious problem: the words I want to say are there in my head, but I just can't get them out. I know what I want to say; I just don't know how. And it's been getting worse. I should have kept my 'mouth' shut in this instance instead of going full-bore fanboy and putting 'MSW' on blast. I stepped into an argument I wasn't equipped to handle, and I was more annoyed at my own inability to get out what I meant than at what MSW was saying, and that shows in my responses.

I think for now, I'm just going to silently lurk and read until I have the ability to adequately and respectfully discuss something. Again, I appreciate you breaking down the details, I wish sites would do that more often for those of us that don't have hours to devote to comparing stats and benchmarks...


=-Rigs-=


I absolutely understand. Normally I try to be calm and rational, with a side of lighthearted snark here and there. There are some days, though, when I can't articulate what I want to say even if what I wanted to say was "The sky is blue". For me, that usually starts on the 8th or 9th day of an insomnia streak, and then it's like I'm a lunatic off his meds. I hope you get it worked out and get your "voice" back, so to speak.

Again, you are quite welcome. If at any time I can be of service like that, send me an email through the site and I'll get back to you as soon as I am able.

 
"No matter where you go, there you are." Buckaroo Banzai
 

42. Re: NVIDIA Relaunches Cloud Gaming Oct 5, 2015, 11:03 Rigs
 
Burrito of Peace wrote on Oct 3, 2015, 23:33:
Rigs wrote on Oct 2, 2015, 13:27:
...I don't have enough time in the day to keep track of their mobile line or SSD performance or whatever it was Lansbury and Tacosalad were talking about...

That's an interesting sobriquet you've chosen for me, Rigs. I'm not entirely sure what I, personally, have done to earn it but thank you.

A very good friend of mine's nickname is actually Tacosalad, so I guess you might call it a term of endearment?

You should make the time because SoCs are going to be the mainline source of income for any chip manufacturer in the very near future. For some companies, that is already a fact. Without that mainline source of income for a chip manufacturer, niche products like video cards are going to be less and less important until they disappear entirely.

I'll give you an example. Intel, the largest hardware technology company ever, is focusing more and more on SoCs. They've flat-out stopped making motherboards that aren't server- or workstation-class boards, and even then those boards are in limited designs and quantity. They are focused, in the consumer space, on the NUC design because it allows them to showcase their SoC/embedded/micro design and manufacturing capabilities. They've even taken it into the enterprise sector with the Xeon D, which you can read about here. That's all on 14nm, by the way. What's AMD down to now? 28nm?

Iris Pro has always been targeted at the embedded/SoC/AIO/micro markets, with the small benefit of being able to integrate it into consumer-oriented products as/if needed. Is it going to be performance-gaming oriented? Hell no, but then most of the world's users aren't performance-gaming oriented. They want to be able to play low-stress games like Candy Crush, Minecraft, and other titles like them. Iris Pro will do that easily, in tablet form or laptop form or even cell form, and it will do it with some pretty low power consumption. What does AMD have to counter that? All I hear is crickets.

Even if you wanted to narrowly compare AMD's ATi offerings against Nvidia's offerings, AMD is getting their ass handed to them there as well. Nvidia is refining and boosting their Tegra line (as I mentioned upthread with their X1), which is going into Google's latest line of tablets. "But I don't buy Google tablets because..." doesn't matter. Other people do, or other second-tier manufacturers use them as reference designs for their own products if they aren't already going with Intel's solutions. The Surface Pro line is arguably the most powerful tablet-cum-laptop design in the world right now, and it uses Intel's solution. If you buy a non-Pro, you're getting an Nvidia solution with the Tegra line. What does AMD have to compete with the entire tablet market? Nothing.

"B..b..but VIDEO CARDS!" OK, let's take a look at them. Hereis a comparison between a R390 and a 970. The R390 looks pretty good, right? But the 970 isn't the top of the line for Nvidia is it? Nope, that'd be the 980Ti. So let's bump the Radeon down to an equivalent class. That would be the 380. Here is that comparison. Uh oh, looks like the 970 destroys the R380 in absolute performance. Ouch.

Let's do top of the line: 390X versus 980 GTX Ti. Well, here are those results. Pretty decent, but AMD knew what the 980 series was going to be capable of because they had nearly a year between the launch of the 980 GTX (non-Ti) and when they released the 390X. Again, ouch. The R390 is comparable to a one-year-old card, and they released it this year.

I appreciate that you've explained all of that to me. I've had so much going on in the last two months that I have not had but a few minutes here and there to keep current on things related to gaming and computing. I would like nothing more than to sit down for an hour or two and go over it bit by bit but I just don't have the time. But that doesn't mean I'm not interested.

You've admitted that you're a fanboy and that's OK. I'm a Raiders fan and have been for years. However good the Raiders might look this year, it doesn't prevent me from honestly admitting that the Raiders have sucked for a very, very long time. I'm honestly not trying to antagonize you, but you come off as pretty shrill and militant in your support of AMD. It makes you come off as a frothing, glazed-eyed hype monkey. I've seen many of your posts over the years and, normally, you come across as a fairly rational person, but in this thread...not so much.

I realize tone is a very hard thing to project in online communication; however, I assure you that I was not frothing at the mouth, nor was I stamping my feet like a mad little boy. I will apologize, though, as I broke one of my cardinal rules when commenting here at Blues: I don't post unless I have the time to fully explain my reasons (and any replies afterward), and I don't post if I don't know what I'm talking about. My other rule is that I don't post when I'm angry, tired, drunk, or hyper. As a writer, I tend to suffer from 'writer's block' on occasion, but for the last few months I've been having a really hard time transferring my thoughts to words. Even when I'm talking this has become a serious problem: the words I want to say are there in my head, but I just can't get them out. I know what I want to say; I just don't know how. And it's been getting worse. I should have kept my 'mouth' shut in this instance instead of going full-bore fanboy and putting 'MSW' on blast. I stepped into an argument I wasn't equipped to handle, and I was more annoyed at my own inability to get out what I meant than at what MSW was saying, and that shows in my responses.

I think for now, I'm just going to silently lurk and read until I have the ability to adequately and respectfully discuss something. Again, I appreciate you breaking down the details, I wish sites would do that more often for those of us that don't have hours to devote to comparing stats and benchmarks...


=-Rigs-=
 
 

41. Re: NVIDIA Relaunches Cloud Gaming Oct 3, 2015, 23:33 Burrito of Peace
 
Rigs wrote on Oct 2, 2015, 13:27:
...I don't have enough time in the day to keep track of their mobile line or SSD performance or whatever it was Lansbury and Tacosalad were talking about...

That's an interesting sobriquet you've chosen for me, Rigs. I'm not entirely sure what I, personally, have done to earn it but thank you.

You should make the time because SoCs are going to be the mainline source of income for any chip manufacturer in the very near future. For some companies, that is already a fact. Without that mainline source of income for a chip manufacturer, niche products like video cards are going to be less and less important until they disappear entirely.

I'll give you an example. Intel, the largest hardware technology company ever, is focusing more and more on SoCs. They've flat-out stopped making motherboards that aren't server- or workstation-class boards, and even then those boards are in limited designs and quantity. They are focused, in the consumer space, on the NUC design because it allows them to showcase their SoC/embedded/micro design and manufacturing capabilities. They've even taken it into the enterprise sector with the Xeon D, which you can read about here. That's all on 14nm, by the way. What's AMD down to now? 28nm?

Iris Pro has always been targeted at the embedded/SoC/AIO/micro markets, with the small benefit of being able to integrate it into consumer-oriented products as/if needed. Is it going to be performance-gaming oriented? Hell no, but then most of the world's users aren't performance-gaming oriented. They want to be able to play low-stress games like Candy Crush, Minecraft, and other titles like them. Iris Pro will do that easily, in tablet form or laptop form or even cell form, and it will do it with some pretty low power consumption. What does AMD have to counter that? All I hear is crickets.

Even if you wanted to narrowly compare AMD's ATi offerings against Nvidia's offerings, AMD is getting their ass handed to them there as well. Nvidia is refining and boosting their Tegra line (as I mentioned upthread with their X1), which is going into Google's latest line of tablets. "But I don't buy Google tablets because..." doesn't matter. Other people do, or other second-tier manufacturers use them as reference designs for their own products if they aren't already going with Intel's solutions. The Surface Pro line is arguably the most powerful tablet-cum-laptop design in the world right now, and it uses Intel's solution. If you buy a non-Pro, you're getting an Nvidia solution with the Tegra line. What does AMD have to compete with the entire tablet market? Nothing.

"B..b..but VIDEO CARDS!" OK, let's take a look at them. Hereis a comparison between a R390 and a 970. The R390 looks pretty good, right? But the 970 isn't the top of the line for Nvidia is it? Nope, that'd be the 980Ti. So let's bump the Radeon down to an equivalent class. That would be the 380. Here is that comparison. Uh oh, looks like the 970 destroys the R380 in absolute performance. Ouch.

Let's do top of the line: 390X versus 980 GTX Ti. Well, here are those results. Pretty decent, but AMD knew what the 980 series was going to be capable of because they had nearly a year between the launch of the 980 GTX (non-Ti) and when they released the 390X. Again, ouch. The R390 is comparable to a one-year-old card, and they released it this year.

You've admitted that you're a fanboy and that's OK. I'm a Raiders fan and have been for years. However good the Raiders might look this year, it doesn't prevent me from honestly admitting that the Raiders have sucked for a very, very long time. I'm honestly not trying to antagonize you, but you come off as pretty shrill and militant in your support of AMD. It makes you come off as a frothing, glazed-eyed hype monkey. I've seen many of your posts over the years and, normally, you come across as a fairly rational person, but in this thread...not so much.

 
"No matter where you go, there you are." Buckaroo Banzai
 

40. Re: NVIDIA Relaunches Cloud Gaming Oct 2, 2015, 15:15 Verno
 
While the A6 isn't top of the line (it was never meant to be in the first place), I've given it a very thorough workout over the last two years or so that I've had it, and I can promise you that if your system is configured correctly and you're not trying to stream 1080p video, keep 50 tabs open in Chrome in the background, or do anything else stressful like that, it will play most games out there at playable frame rates. Of course you're not going to get 1080p and full detail. I already admitted to this. I'm not stupid, nor am I clueless, contrary to apparent popular opinion. I know what the hell I'm talking about. Obviously an A6 isn't going to win any awards, and when I referenced the A6, I was talking more about the HD6350 inside, not the CPU side of the APU. But it's not a slouch if you know how to use it right and don't mind knocking the sliders back a little...or playing a little under 1080p, which, my LCD being 1680x1050, is no issue.

I'm not debating per se, just disagreeing and pointing out what I felt was an exaggeration. The A6 can barely manage 720p at 30fps at the lowest detail settings in many popular games; the benchmarks are pretty clear. The A10-7870K is a better bet for budget gaming, but even that isn't going to do much with modern titles, and people might as well get a previous-gen dedicated card at that stage. Nvidia makes better laptop GPUs, and they are often paired with Intel CPUs for an added bonus. AMD's APU products were never really that impressive from anything other than a cost standpoint. Conceptually they were supposed to help AMD make inroads with the laptop market and give them ammo to fight Intel, whose weak IGP performance was a sticking point for a long time. Unfortunately that market got gobbled up by hybrid solutions from Intel/NV or moved on to cellphones and tablets.

Not a big deal, I was just perusing the thread and wanted to comment on that.

My only contention here is the driver release times. They released 12 drivers last year, as the last one was 14.12, and I'm sure you're aware that the numbering goes YEAR.MONTH. So I'm not sure where you're getting this one-WHQL-driver-a-year thing from. I'm not saying they always release a driver every month, they don't, but they try to get them out as soon as possible. They've had four or five driver releases this year already, and not all of them were betas, though the majority were. Still, the latest was 15.9, which just got a hotfix, and while that's not WHQL-certified, the July 15.7 release is. Personally, I don't see the big deal with it being certified or not; they work just the same.

I want better quality, and if they can't do better quality then faster releases. I've been hit by several major AMD driver bugs and game compatibility issues over the years. I can count on game-ready drivers for most new titles within a day of release from NV. I remember one of the devs who worked at a porting house commented on here about how difficult it was to get anything out of AMD, while Nvidia answered emails the same day and even assigned dedicated reps for big projects. I used to give them a pass on this until I found out they gutted the driver and dev outreach teams as a cost-saving measure a while ago.
 
Playing: RDR2 PC, The Outer Worlds, Control
Watching: The Last Kingdom, Twin Peaks, Red Dwarf
 

39. Re: NVIDIA Relaunches Cloud Gaming Oct 2, 2015, 13:27 Rigs
 
Verno wrote on Oct 2, 2015, 09:48:
I don't find this to be true, and I own current-gen consoles. Many games struggle to meet stable framerates, and almost all of them make sacrifices in detail toggles. The GPU in the consoles is two generations old already and frankly wasn't that impressive to begin with. It's just a testament to optimization and lower-level access to the hardware that devs get the results they do. The A6s are pieces of crap. Last time I used one it couldn't even manage a stable 30fps at 720p with low details in Alien: Isolation. They're fine for playing little indie games, but 1080p AAA? Nope, and Anandtech agrees.

While the A6 isn't top of the line (it was never meant to be in the first place), I've given it a very thorough workout over the last two years or so that I've had it, and I can promise you that if your system is configured correctly and you're not trying to stream 1080p video, keep 50 tabs open in Chrome in the background, or do anything else stressful like that, it will play most games out there at playable frame rates. Of course you're not going to get 1080p and full detail. I already admitted to this. I'm not stupid, nor am I clueless, contrary to apparent popular opinion. I know what the hell I'm talking about. Obviously an A6 isn't going to win any awards, and when I referenced the A6, I was talking more about the HD6350 inside, not the CPU side of the APU. But it's not a slouch if you know how to use it right and don't mind knocking the sliders back a little...or playing a little under 1080p, which, my LCD being 1680x1050, is no issue. Again, I've admitted all of this, so why does it keep coming back as a point of debate in this argument/discussion?

The Xbox One has 32MB of ESRAM by the way, not 512MB. It's essentially a cache that is theoretically faster but requires more grunt work from devs. It also has a slightly weaker GPU than the PS4. I'm not even sure why people are talking about consoles when discussing videocards. Supplying those has more to do with contractual negotiation than anything else, it has no bearing on who has better parts. Intel, Nvidia and AMD have all previously supplied consoles.

You're absolutely correct; the Xbone has 32MB of ESRAM (hey, I got the ESRAM part right, though!). I'm not sure where I got the 512MB from, maybe just fatigue from arguing all day...

AMD gutted its driver division and only seems to send out WHQL releases once a year at this stage. I don't mind installing betas but even those seem to take too long vs Nvidia. Their latest cards run hot as hell and eat power. The only saving grace is some clever cooling designs from third parties. I have no real confidence in buying them right now, to me the only reason to bother is price. The only compelling card in their lineup is the R9 Nano for people doing living room builds. What they have going for them is price but if they keep channel dumping they're just going to confuse the market.

My only contention here is the driver release times. They released 12 drivers last year, as the last one was 14.12, and I'm sure you're aware that the numbering goes YEAR.MONTH. So I'm not sure where you're getting this one-WHQL-driver-a-year thing from. I'm not saying they always release a driver every month, they don't, but they try to get them out as soon as possible. They've had four or five driver releases this year already, and not all of them were betas, though the majority were. Still, the latest was 15.9, which just got a hotfix, and while that's not WHQL-certified, the July 15.7 release is. Personally, I don't see the big deal with it being certified or not; they work just the same.

I want a better AMD so that Nvidia is forced to compete but they're not investing a lot back into the business and don't seem to have a plan for the future. I wouldn't be shocked if they're gone within a few years at this rate.

To be bluntly honest, while I say I'm an AMD fanboy, it's really just the ATI side (if you can call it that, or the video card side, let's say) that I'm partial to. If AMD's CPU business goes tits up, so be it. Aside from some notable CPUs in the past such as the Thunderbirds, Durons, Semprons and Athlons, they've never really been a contender against Intel, and I don't think they're giving them much competition now, which is probably why you still see the latest Core i7s come out at a $999 price point. I don't pretend to know the specifics; I don't have enough time in the day to keep track of their mobile line or SSD performance or whatever it was Lansbury and Tacosalad were talking about. That doesn't make my take on things any less valid, and I'm sure if I had the time to staple my eyelids open and do some research, I could certainly jump into the deep end of that discussion. But I'm not going to embarrass myself by arguing about something I know nothing about. If AMD weren't to spin off the video card side (the ATI side, so to speak) before they went under, then that would definitely be a sad day for PC gaming...much like it was when 3dfx went under and was gobbled up by nVidia...


=-Rigs-=

This comment was edited on Oct 2, 2015, 16:34.
 
 

38. Re: NVIDIA Relaunches Cloud Gaming Oct 2, 2015, 09:48 Verno
 
Rigs wrote on Oct 1, 2015, 14:24:
Way to generalize, there. Most current-gen consoles have absolutely NO problem giving 60fps at 720p. It's when they get up to 1080p that they have trouble, and not because the APU can't handle it, but rather, in the Xbone's case anyway, because of the way they have video RAM (or ESRAM) implemented. A paltry 512 megs of it ain't gonna get you far. The APU in the PS4 has no problem with games, assuming the dev actually knows what the hell they're doing! As for PC-based APUs, I have an A6 (which has an equivalent HD6350) with 4GB of DDR3, and it has no trouble playing 1080p video, or playing anything up to 'AAA' graphics-heavy games in 1080p. The Witcher 3 and GTA5 have some trouble without pulling the options down a tad, but then again, my 7850 has trouble with them, too, at 1680x1050. And have I owned nVidia? Would you like to see the box of nVidia GeForce 256s, GF2 MXs, GF3s, TNT2s and RIVAs I have sitting next to me? Aside from them, I have two laptops with 670Ms. So yes, I've used nVidia, and I don't like them one bit.

I don't find this to be true and I own current gen consoles. Many games struggle to meet stable framerates and almost all of them make sacrifices in detail toggles. The GPU in the consoles is 2 generations old already and frankly wasn't that impressive to begin with. It's just a testament to optimization and lower level access to the hardware that devs get the results they do. The A6s are pieces of crap. Last time I used one it couldn't even manage a stable 30fps at 720p with low details in Alien: Isolation. They're fine for playing little indie games but 1080p AAA nope and Anandtech agrees.

The Xbox One has 32MB of ESRAM by the way, not 512MB. It's essentially a cache that is theoretically faster but requires more grunt work from devs. It also has a slightly weaker GPU than the PS4. I'm not even sure why people are talking about consoles when discussing videocards. Supplying those has more to do with contractual negotiation than anything else, it has no bearing on who has better parts. Intel, Nvidia and AMD have all previously supplied consoles.

AMD gutted its driver division and only seems to send out WHQL releases once a year at this stage. I don't mind installing betas but even those seem to take too long vs Nvidia. Their latest cards run hot as hell and eat power. The only saving grace is some clever cooling designs from third parties. I have no real confidence in buying them right now, to me the only reason to bother is price. The only compelling card in their lineup is the R9 Nano for people doing living room builds. What they have going for them is price but if they keep channel dumping they're just going to confuse the market.

I want a better AMD so that Nvidia is forced to compete but they're not investing a lot back into the business and don't seem to have a plan for the future. I wouldn't be shocked if they're gone within a few years at this rate.
 
Playing: RDR2 PC, The Outer Worlds, Control
Watching: The Last Kingdom, Twin Peaks, Red Dwarf
 

37. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 23:54 Megalodon
 
I agree. I found AMD video cards this generation were a disappointment after the significant jump we saw with Maxwell. Beyond keeping Nvidia honest, I'm not sure what they're going to be good for in the future. Their market share is pretty low even in the enthusiast space, and their mainstream product lines are all toast. As a company they seem to be circling the toilet, losing money and investor faith.

They completely missed the boat on tablets and phones, which could've propped them up in the face of their ailing CPU lines. They really need a miracle; they lost something like 400 mil last year, and I don't think they have any more fabs to sell off.
 
 

36. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 23:16 Burrito of Peace
 
Murder She Wrote wrote on Oct 1, 2015, 20:03:
They don't have a viable mobile SoC.

This, I think more than anything, is what is going to kill AMD in the long run. Right now, they've got some money rolling in from being an OEM supplier to Sony and Microsoft, but that's not going to last forever. Intel is investing billions in mobile/embedded/micro technology, Nvidia is doing the same (and their new Tegra X1 looks pretty decent), ARM is currently the king with no end in sight to their reign, Samsung is plowing billions into their SoCs, and let's not forget Qualcomm, who may currently be on the ropes but is not out yet. AMD, on the other hand, is waving their hands and saying "Guys? Guys! Check out our APUs!" APUs are all well and good, but when I can't hold your entire system including proc, memory, storage, video, and networking (both physical and wireless) in the palm of my hand, then it's utterly useless for the direction the market is trending.

Unless Samsung sees some value in the patents or manufacturing that AMD holds that no one else does, I don't think Samsung is likely to purchase what is largely a sinking ship. Why bother? They have their own massive R&D, they're a household name in everything from SSDs, to displays and on down the line to refrigerators.
 
"No matter where you go, there you are." Buckaroo Banzai
 

35. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 22:45 jdreyer
 
Timmeh wrote on Oct 1, 2015, 16:58:
I'm pretty sure Nvidia was started by Satan.

But it might just have been Beelzebub instead.

You're thinking of ATI > AMD. Why else would their cards throw off so much heat?
 
The land in Minecraft is flat, Minecraft simulates the Earth, ergo the Earth is flat.
 

34. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 22:16 Rigs
 
Murder She Wrote wrote on Oct 1, 2015, 19:55:
That is quite the meltdown. When you announce yourself a fanboy and declare loyalty for a brand you discredit yourself, don't blame me for your own actions. Stomping your feet and swearing at me doesn't exactly make anyone take you more seriously, I'm not a 13 year old who is impressed by such things. All I said was that AMD has made some shitty video cards over the years and several people agreed. I didn't say they were all bad but you apparently decided to take it that way. I've owned several previous gen Nvidia and AMD cards too, like I'm sure most people here have. We currently have an AMD based HTPC and an Nvidia gaming PC even. I always buy products based on their own merits, not based out of some misplaced loyalty or susceptibility to marketing/branding.

Dude, seriously, get over yourself! I'm not 'stomping my feet'...you just don't fucking listen whatsoever. But that's fine. Now we're playing the 'oh, I'm an adult and I don't have pissing contests with 12-year-olds' when in fact I'm most likely older than you are! And this isn't a pissing contest...so put your dick away and put the ruler up. Go play your games and when you want to have an adult discussion like I was trying to have, then we'll talk. You're just rehashing everything you've said over and over and I've given an answer to everything you said, yet you still seem to not understand...


=-Rigs-=
 
 

33. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 20:03 Megalodon
 
Burrito of Peace wrote on Oct 1, 2015, 18:47:
I think it's interesting that, when discussing AMD/ATi, the products most focused on are graphics cards. I agree with Murder She Wrote's statement of "Every arch can't be a winner and both companies have made notable stinkers in the past." The FX5800 and its noise level come to mind from Nvidia.

However, what I mostly focus on is CPUs and AMD hasn't released a decent CPU that is competitive in power consumption, performance, and heat generation with Intel in a very, very long time. The closest thing that comes to mind is the Athlon XP (Palomino) way back in 2000. Yes, AMD is less expensive than Intel's offerings but, then, they have to be as that is the only area that they can be competitive with Intel in.

My own anecdotal experience with AMD is not a very particularly positive one. I had an Athlon XP (Thunderbird) with an Asus A7V board. The CPU dropped dead within 11 months. The next AMD procs I had were the Athlon MPs in a cluster of servers that were running off of Tyan Tiger S2460 motherboards. Every second core dropped dead within 3 months of deployment, all within about the span of two weeks. After working with Tyan and AMD, it was discovered that the problem was not with the Tyan motherboards, but a voltage regulation problem with the SMP of the Athlon MPs. The AMD engineer promised it would be fixed with a microcode update but I had a business to run. Swapped them out for Xeons and different motherboards and never had another problem.

Just for grins, I went to CPUBoss and pitted the Xeon E3-1275v2 that I have in my current gaming machine against an Opteron of the same vintage with double the cores. The E3 won rather handily. You can see it here. Before any AMD fanboy whines that it's just a single-core performance comparison with no multicore sampling, I'd point out that in everyday use, most of what you run is single-threaded.

Is my Xeon a bit of overkill for home use? For most people, probably so, but then most people don't have 6 to 8 VMs running in the background while playing Dying Light, either.

I don't think anyone will dispute their CPUs being trash for a long time now. Hot and power hungry, terrible IPC, and their lack of a real certification program (relying on AMD guideline boards instead) means their partners have been phoning in QA for a long time. That's a big part of why their motherboards have been so dubious, in my opinion. Their R&D got gutted and they're losing money like crazy; I'm not sure where they have left to go. Intel is killing them in laptops, which was almost a strong market for them once upon a time. They don't have a viable mobile SoC. Their video cards this gen barely kept parity with Nvidia, which has a lot more R&D muscle. I'm glad I'm not an AMD shareholder; they're in a tough spot.

Hopefully they get bought out by a bigger player who can bankroll them. I think there was a Samsung rumor a while ago.
 
 

32. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 19:55 Megalodon
 
That is quite the meltdown. When you announce yourself a fanboy and declare loyalty for a brand you discredit yourself, don't blame me for your own actions. Stomping your feet and swearing at me doesn't exactly make anyone take you more seriously, I'm not a 13 year old who is impressed by such things. All I said was that AMD has made some shitty video cards over the years and several people agreed. I didn't say they were all bad but you apparently decided to take it that way. I've owned several previous gen Nvidia and AMD cards too, like I'm sure most people here have. We currently have an AMD based HTPC and an Nvidia gaming PC even. I always buy products based on their own merits, not based out of some misplaced loyalty or susceptibility to marketing/branding.  
 

31. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 19:25 Rigs
 
Murder She Wrote wrote on Oct 1, 2015, 15:53:
Rigs wrote on Oct 1, 2015, 14:24:
Did I not just say I was an AMD/ATI fanboy? Why am I 'so defensive and hostile about others being balanced and buying products based on their merits'? Oh, 'bruh'? It's so funny how you can sit there in previous posts and bash AMD and then wonder why someone that actually LIKES AMD would come in and defend it. For the same reason you talk shit about it is why I come in here and defend it

Admitting your own bias doesn't make your comments valid; if anything, it's more the opposite. It's funny you think I was bashing AMD by stating they had made some shitty products in that timespan; it's just a fact. Every arch can't be a winner, and both companies have made notable stinkers in the past. People shouldn't have to preface every potentially negative comment just because you're a sensitive little girl about graphics cards. I'm not sure why you're so emotionally invested in a random corporation that makes video cards, but it's apparent you want to argue to the death or something, so I'm going to go play Borderlands. And laugh about the thought of you fuming behind your keyboard whenever anyone dares to insult our lord and savior Advanced Micro Devices.

Listen, Lansbury, like my 'own bias doesn't make my comments valid', your opinion doesn't make this fact...

they had made some shitty products in that timespan

That's an opinion, not a truth. Do you know the difference? Do you understand the definition of 'fanboy'? Do you not like certain products? I would find it extremely difficult to believe that you don't prefer a certain kind of car, food or electronics. I like AMD...well, actually, I like ATI, which is why I don't really argue about AMD's rather shitty record with CPUs as much as video cards. Am I 'fuming behind my keyboard whenever anyone dares to insult our lord and savior AMD'? Well, first, I'm glad that you've decided to take AMD into your heart as your lord and savior; second, you should see a psychologist, because that's not natural or in any way healthy. Third, I'm actually laughing because you're the one having a hard time with this concept, not me. It's like you can't understand why someone would have a vested interest in something that they use every single day. I don't have a fucking prayer altar with an AMD poster hanging in front, giving sacrificial RAM sticks to the AMD god. It just pisses me off when uninformed people talk shit about something they don't know about and don't have personal experience with. You don't have an AMD card, so why talk shit? THIS is what I'm getting at...


=-Rigs-=
 
 

30. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 18:47 Burrito of Peace
 
I think it's interesting that, when discussing AMD/ATi, the products most focused on are graphics cards. I agree with Murder She Wrote's statement of "Every arch can't be a winner and both companies have made notable stinkers in the past." The FX5800 and its noise level come to mind from Nvidia.

However, what I mostly focus on is CPUs and AMD hasn't released a decent CPU that is competitive in power consumption, performance, and heat generation with Intel in a very, very long time. The closest thing that comes to mind is the Athlon XP (Palomino) way back in 2000. Yes, AMD is less expensive than Intel's offerings but, then, they have to be as that is the only area that they can be competitive with Intel in.

My own anecdotal experience with AMD is not a very particularly positive one. I had an Athlon XP (Thunderbird) with an Asus A7V board. The CPU dropped dead within 11 months. The next AMD procs I had were the Athlon MPs in a cluster of servers that were running off of Tyan Tiger S2460 motherboards. Every second core dropped dead within 3 months of deployment, all within about the span of two weeks. After working with Tyan and AMD, it was discovered that the problem was not with the Tyan motherboards, but a voltage regulation problem with the SMP of the Athlon MPs. The AMD engineer promised it would be fixed with a microcode update but I had a business to run. Swapped them out for Xeons and different motherboards and never had another problem.

Just for grins, I went to CPUBoss and pitted the Xeon E3-1275v2 that I have in my current gaming machine against an Opteron of the same vintage with double the cores. The E3 won rather handily. You can see it here. Before any AMD fanboy whines that it's just a single-core performance comparison with no multicore sampling, I'd point out that in everyday use, most of what you run is single-threaded.

Is my Xeon a bit of overkill for home use? For most people, probably so, but then most people don't have 6 to 8 VMs running in the background while playing Dying Light, either.
 
"No matter where you go, there you are." Buckaroo Banzai
 

29. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 17:01 nin
 
Timmeh wrote on Oct 1, 2015, 10:30:
Nvidia could announce they are shitting diamonds and I still wouldn't care.

And yet you care enough to keep posting about them...

 
 

28. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 16:58 Timmeh
 
I'm pretty sure Nvidia was started by Satan.

But it might just have been Beelzebub instead.
 
 

27. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 15:53 Megalodon
 
Rigs wrote on Oct 1, 2015, 14:24:
Did I not just say I was an AMD/ATI fanboy? Why am I 'so defensive and hostile about others being balanced and buying products based on their merits'? Oh, 'bruh'? It's so funny how you can sit there in previous posts and bash AMD and then wonder why someone that actually LIKES AMD would come in and defend it. For the same reason you talk shit about it is why I come in here and defend it

Admitting your own bias doesn't make your comments valid; if anything, it's more the opposite. It's funny you think I was bashing AMD by stating they had made some shitty products in that timespan; it's just a fact. Every arch can't be a winner, and both companies have made notable stinkers in the past. People shouldn't have to preface every potentially negative comment just because you're a sensitive little girl about graphics cards. I'm not sure why you're so emotionally invested in a random corporation that makes video cards, but it's apparent you want to argue to the death or something, so I'm going to go play Borderlands. And laugh about the thought of you fuming behind your keyboard whenever anyone dares to insult our lord and savior Advanced Micro Devices.

This comment was edited on Oct 1, 2015, 15:59.
 
 

26. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 15:21 jdreyer
 
Desalus wrote on Oct 1, 2015, 09:32:
I'd be much more interested in this if I didn't have to spend another $200 upfront on yet another device I don't need. For those who already have a PS3/PS4/360/XBONE what incentive do they have to shell out that kind of money when their console already allows them to stream movies and do so much more?

Dude, DUDE. You can play Witcher 3 on ultra settings WHILE TAKING A DUMP. That's worth the money right there. No more poopsocking.
 
The land in Minecraft is flat, Minecraft simulates the Earth, ergo the Earth is flat.
 

25. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 15:16 swaaye
 
I feel the need to point out that the only reason a particular console game doesn't run at 60 fps is that the developer made the call that prettier pixels were more important than a 60 fps frame rate.

 
 

24. Re: NVIDIA Relaunches Cloud Gaming Oct 1, 2015, 14:24 Rigs
 
Murder She Wrote wrote on Oct 1, 2015, 10:40:
Rigs wrote on Oct 1, 2015, 09:15:
Bizarre and defensive, eh? So bringing up facts that support my argument, even if it isn't strictly gaming related, is bizarre now? Yes, I'm defensive. I'm an AMD/ATI fanboy...duh. Everyone knows that. Get with the program, dude!

Bitcoin mining and the weak console APUs which can barely manage 60fps in most 720p games aren't exactly facts that support any argument here. Buying a brand instead of a product just means you're buying into marketing. AMD has put out dog shit products in the past and I didn't buy them. Same with Nvidia. My comment is as valid as yours, maybe you've never used an Nvidia product and are lying. I guess if I was weird I could just assume that but I gave you the benefit of the doubt. I don't know why you're so defensive and hostile about others being balanced and buying products based on their merits "bruh". Getting that emotionally invested in a brand is really strange, it reminds me of the Apple cult.

Getting back to things that matter, I dislike the idea of GeForce Now. The latency will never be there for many of the games I care about and I don't like having no local access to the product.

Did I not just say I was an AMD/ATI fanboy? Why am I 'so defensive and hostile about others being balanced and buying products based on their merits'? Oh, 'bruh'? It's so funny how you can sit there in previous posts and bash AMD and then wonder why someone that actually LIKES AMD would come in and defend it. For the same reason you talk shit about it is why I come in here and defend it, because it was the subject of the discussion. And for the record, I call bullshit on your 'weak console APUs that can barely manage 60fps in most 720p games'. Way to generalize, there. Most current-gen consoles have absolutely NO problem giving 60fps at 720p. It's when they get up to 1080p that they have trouble, and not because the APU can't handle it, but rather, in the Xbone's case anyway, because of the way they have video RAM (or ESRAM) implemented. A paltry 512 megs of it ain't gonna get you far. The APU in the PS4 has no problem with games, assuming the dev actually knows what the hell they're doing! As for PC-based APUs, I have an A6 (which has an equivalent HD6350) with 4GB of DDR3, and it has no trouble playing 1080p video, or playing anything up to 'AAA' graphics-heavy games in 1080p. The Witcher 3 and GTA5 have some trouble without pulling the options down a tad, but then again, my 7850 has trouble with them, too, at 1680x1050. And have I owned nVidia? Would you like to see the box of nVidia GeForce 256s, GF2 MXs, GF3s, TNT2s and RIVAs I have sitting next to me? Aside from them, I have two laptops with 670Ms. So yes, I've used nVidia, and I don't like them one bit.


=-Rigs-=
 
 