Morning Tech Bits


17. Re: Morning Tech Bits Sep 29, 2017, 22:07 Scheherazade
 
MeanJim wrote on Sep 29, 2017, 13:12:
Scheherazade wrote on Sep 28, 2017, 11:53:
Moore's law has been a thing of the past for a while now in CPU land.

My 6-core CPU from ~2011 performs indistinguishably from the latest offerings in real-world applications. It's been over half a decade.
(Barring esoteric stuff like video transcoding, in which case it's only /most/ of the speed of the latest offerings.)

I'm not that old, but I remember when speeds would double every year, and that was so normal it was expected, the way people expect the sun to rise the next morning.

Moore's law isn't about speed; it was an observation that the number of transistors on a chip doubles roughly every two years.

That hasn't played out in CPU land. Browse the various CPUs: transistor counts are rather stagnant. Minor variance, with some series even losing transistors (e.g. the 3930 -> 4930 step lost quite a few).
https://www.techpowerup.com/cpudb/

More importantly, the relevance of Moore's law was that doubling transistors meant shrinking feature size by ~30%. The smaller size meant better voltages, better thermals, and higher clocks (back then, when things like tunneling weren't an issue), which meant better performance.
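To see where that ~30% figure comes from: transistor area scales with the square of the linear feature size, so doubling density means scaling the linear dimensions by 1/sqrt(2), about 0.71. A quick illustrative sanity check in Python (just the arithmetic spelled out, nothing more):

```python
import math

# Doubling transistor count at constant die area halves the area per
# transistor. Area goes with the square of the linear feature size, so
# the linear dimensions scale by 1/sqrt(2).
linear_scale = 1 / math.sqrt(2)    # ~0.707
shrink = (1 - linear_scale) * 100  # ~29.3%, the familiar "~30% shrink"

print(f"linear scale factor: {linear_scale:.3f}")
print(f"feature-size shrink: {shrink:.1f}%")
```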

Regarding a single core, you can't just throw more transistors at it to make it faster; at some point, you can't think of anything to use the extra transistors for.

In any case, short of applications that are primarily width-limited, like GPUs or specialized vector processors, there is no need to increase transistor count. What you really need is for the transistors you already have to switch faster. Since shrinking feature size isn't getting that done anymore, things are rather stagnant.

-scheherazade
 
 
16. Re: Morning Tech Bits Sep 29, 2017, 13:12 MeanJim
 
Scheherazade wrote on Sep 28, 2017, 11:53:
Moore's law has been a thing of the past for a while now in CPU land.

My 6-core CPU from ~2011 performs indistinguishably from the latest offerings in real-world applications. It's been over half a decade.
(Barring esoteric stuff like video transcoding, in which case it's only /most/ of the speed of the latest offerings.)

I'm not that old, but I remember when speeds would double every year, and that was so normal it was expected, the way people expect the sun to rise the next morning.

Moore's law isn't about speed; it was an observation that the number of transistors on a chip doubles roughly every two years.
 
 
MeanJim on Steam
 
15. Re: Morning Mobilization Sep 28, 2017, 19:36 Scottish Martial Arts
 
DrSquick wrote on Sep 28, 2017, 16:39:
Can GPUs replace CPUs? My limited understanding was that GPUs were highly specialized and could only do a few things, but ultra fast and with the equivalent of thousands of threads at once, whereas CPUs were more all-purpose.

Yeah, that's basically correct. The headline doesn't really capture what the article says, to be honest: Huang seems to be suggesting that GPUs will become far more prevalent in the machine learning and cloud computing spaces, which is definitely an in-progress trend.

Deep Learning in particular is amenable to GPU acceleration, as Deep Learning models are essentially large computational graphs with many layers of computational nodes that can be run in parallel as the input traverses the graph. In fact, a new class of hardware, TPUs or Tensor Processing Units, is increasingly being added to data centers. A TPU is essentially a GPU that's been optimized for processing tensors, i.e. vectors or matrices of arbitrary dimensionality, which is typically how a Deep Learning computational graph is implemented.
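As a rough illustration of why those models parallelize so well (a minimal NumPy sketch with made-up shapes, not anything from the article): a single dense layer is one big matrix multiply, and every output element is an independent dot product, which is exactly the kind of width a GPU or TPU eats up.

```python
import numpy as np

# One dense layer of a deep learning model: relu(inputs @ weights + bias).
# Each of the 512 output units below is an independent dot product, so a
# GPU (or a TPU's matrix unit) can compute them all simultaneously instead
# of looping over them one at a time the way a single CPU core would.
rng = np.random.default_rng(0)
inputs = rng.standard_normal((64, 1024))    # a batch of 64 input vectors
weights = rng.standard_normal((1024, 512))  # the layer's parameters
bias = rng.standard_normal(512)

hidden = np.maximum(inputs @ weights + bias, 0.0)  # ReLU activation
print(hidden.shape)  # (64, 512): batch x units, all parallelizable work
```

Stack dozens of layers like this and you get the large computational graph described above.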
 
 
14. Re: Morning Tech Bits Sep 28, 2017, 18:55 Scheherazade
 
CJ_Parker wrote on Sep 28, 2017, 18:14:
HoSpanky wrote on Sep 28, 2017, 14:21:
The Half Elf wrote on Sep 28, 2017, 12:23:
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.

While the performance leap (and low power usage) of the 1000 series was indeed a big, awesome surprise, nothing comes even close to the mind-boggling difference the first 3dfx card offered.

While I agree that the introduction of 3dfx Glide was a pivotal moment in graphics rendering at the time, I would say we are also in hyperbole land with a blanket statement like that ("nothing comes even close to the mind-boggling difference the first 3dfx card offered").
For example, when hardware T&L made its way onto graphics cards (the first or second GeForce, IIRC) and we got our first pixel-shaded water in games (like in Morrowind), it definitely came close to the first 3dfx experiences in terms of jaw-dropping.
Or the release of the awesome, almost legendary 8800 GTX/GTS cards, which also introduced all-new levels of image quality and efficiency.

The GeForce 256 had T&L. I believe T&L is what put the "Ge" (geometry) in GeForce.
The Riva series that preceded it handled mostly texture workloads.
However, at that time the texture workload was still by far the main problem, and hardware vertex transformation made little impact (nearly indistinguishable) over software vertex transformation. Geometry complexity was just so low back then.

The 8800s were really good. You could also enable Quadro features in their BIOS for cheap CAD acceleration.

-scheherazade
 
 
13. Re: Morning Tech Bits Sep 28, 2017, 18:14 CJ_Parker
 
HoSpanky wrote on Sep 28, 2017, 14:21:
The Half Elf wrote on Sep 28, 2017, 12:23:
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.

While the performance leap (and low power usage) of the 1000 series was indeed a big, awesome surprise, nothing comes even close to the mind-boggling difference the first 3dfx card offered.

While I agree that the introduction of 3dfx Glide was a pivotal moment in graphics rendering at the time, I would say we are also in hyperbole land with a blanket statement like that ("nothing comes even close to the mind-boggling difference the first 3dfx card offered").
For example, when hardware T&L made its way onto graphics cards (the first or second GeForce, IIRC) and we got our first pixel-shaded water in games (like in Morrowind), it definitely came close to the first 3dfx experiences in terms of jaw-dropping.
Or the release of the awesome, almost legendary 8800 GTX/GTS cards, which also introduced all-new levels of image quality and efficiency.
 
 
12. Re: Morning Tech Bits Sep 28, 2017, 17:44 Scheherazade
 
Beamer wrote on Sep 28, 2017, 17:25:
[...]

Yes, but...

For me, the Holy Shit moment was Doom. We went to a Super Bowl party where a kid had Doom. I'd actually purchased it the week prior, but the 3.5" floppies I bought were corrupt, and since it was from the mall I hadn't had time to go back and exchange it. But I didn't know what Doom was; I was only buying it because it had the name id on it. I had no clue what to expect. Literally entering with zero expectations.

So were most of the kids at the party. Some knew Wolfenstein. Maybe a third. The rest just knew Mario and Sonic. The kid said Doom was running around killing things. Great, so was Bionic Commando; big deal.

Yeah, it was a big deal. Jaws dropped. Minds blown. We ran around for a while just punching stuff, amazed at how cool punching stuff looked.

To date, no video game experience compares to that. None come close. Yeah, the first time I saw a 3D-accelerated game it was awesome, but not that awesome. Maybe because I had a crappy S3 ViRGE card before going 3dfx, so there was something bridging the gap. The next steps were still cool: I remember when hardware T&L was a big deal, and when colored lighting was a big deal. I remember spending time just firing the blaster down a hallway in Quake 2 and watching the lighting move down the hall, or watching the flies buzz around a dead Strogg. All those were big moments, but nothing blew minds quite like Doom.

For me, Wolfenstein and Spear of Destiny were my transition toward Doom, which made Doom less of a shocker.

BBS multiplayer was what made Doom amazing for me (rather than the game's initial impression). Same for Warcraft 2 (pre-Battle.net).

Once I started playing multiplayer over BBSes, single-player games just about stopped mattering (barring some amazing ones).

-scheherazade
 
 
11. Re: Morning Tech Bits Sep 28, 2017, 17:25 Beamer
 
Scheherazade wrote on Sep 28, 2017, 16:50:
HoSpanky wrote on Sep 28, 2017, 14:21:
The Half Elf wrote on Sep 28, 2017, 12:23:
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.

While the performance leap (and low power usage) of the 1000 series was indeed a big, awesome surprise, nothing comes even close to the mind-boggling difference the first 3dfx card offered. The comparison screenshots of Tomb Raider and MechWarrior 2 were all I needed to see to convince me I needed to save up for a 3dfx card. There wasn't any YouTube back then, so I was unaware that it wouldn't only LOOK better, it'd also run at a MUCH higher framerate. No one even paid attention to framerate back then, outside of "it's unplayably slow".

3dfx changed everything. It's a shame they're gone, but they held onto the Glide API for too long. Nvidia's GeForce came along with 32-bit color and support for high-poly curves in Quake 3, and it was over.

Gosh.

LAN party, playing Quake.

We're all playing at 320x240 or 640x480. The fast computers are hitting 15 fps.

One dude had a Voodoo. 800x600 at 30 fps. With _dithering_ (never seen before; until then every texel showed up as if you were playing Mario).

Minds exploded. There was always a group of people standing around that machine, looking at it as if it had teleported there from the future.

Nothing in computing has ever replicated that moment.

It was "holy shit!" in a PC component.

It's like a country bringing F-16s to WW2.

It wasn't just a proverbial game changer; it was an entirely new game.

Imagine someone dropping a GPU today that can run today's AAA titles at 8K resolution at a steady 120+ fps, with free damn-near-perfect anti-aliasing, 40-bit color, and 10 bits per channel of color output, and that lets you run two of them together for 240 fps.
... Then 2 years later drops another GPU that lets you do the same at 16K res. That's what 3dfx did.

-scheherazade

Yes, but...

For me, the Holy Shit moment was Doom. We went to a Super Bowl party where a kid had Doom. I'd actually purchased it the week prior, but the 3.5" floppies I bought were corrupt, and since it was from the mall I hadn't had time to go back and exchange it. But I didn't know what Doom was; I was only buying it because it had the name id on it. I had no clue what to expect. Literally entering with zero expectations.

So were most of the kids at the party. Some knew Wolfenstein. Maybe a third. The rest just knew Mario and Sonic. The kid said Doom was running around killing things. Great, so was Bionic Commando; big deal.

Yeah, it was a big deal. Jaws dropped. Minds blown. We ran around for a while just punching stuff, amazed at how cool punching stuff looked.

To date, no video game experience compares to that. None come close. Yeah, the first time I saw a 3D-accelerated game it was awesome, but not that awesome. Maybe because I had a crappy S3 ViRGE card before going 3dfx, so there was something bridging the gap. The next steps were still cool: I remember when hardware T&L was a big deal, and when colored lighting was a big deal. I remember spending time just firing the blaster down a hallway in Quake 2 and watching the lighting move down the hall, or watching the flies buzz around a dead Strogg. All those were big moments, but nothing blew minds quite like Doom.
 
-------------
Music for the discerning:
http://www.deathwishinc.com
http://www.hydrahead.com
http://www.painkillerrecords.com
 
10. Re: Morning Tech Bits Sep 28, 2017, 16:51 Hoop
 
http://tdfx.de/eng/pure3d.shtml
1996 or '97, imported from the States specifically for Quake 2.

Wow, I still remember running it for the first time with my friend, who had also gotten the same card.

http://fabiensanglard.net/quake2/quake2_opengl_renderer.php


This comment was edited on Sep 28, 2017, 22:47.
 
 
Um .. Behind you...
 
9. Re: Morning Tech Bits Sep 28, 2017, 16:50 Scheherazade
 
HoSpanky wrote on Sep 28, 2017, 14:21:
The Half Elf wrote on Sep 28, 2017, 12:23:
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.

While the performance leap (and low power usage) of the 1000 series was indeed a big, awesome surprise, nothing comes even close to the mind-boggling difference the first 3dfx card offered. The comparison screenshots of Tomb Raider and MechWarrior 2 were all I needed to see to convince me I needed to save up for a 3dfx card. There wasn't any YouTube back then, so I was unaware that it wouldn't only LOOK better, it'd also run at a MUCH higher framerate. No one even paid attention to framerate back then, outside of "it's unplayably slow".

3dfx changed everything. It's a shame they're gone, but they held onto the Glide API for too long. Nvidia's GeForce came along with 32-bit color and support for high-poly curves in Quake 3, and it was over.

Gosh.

LAN party, playing Quake.

We're all playing at 320x240 or 640x480. The fast computers are hitting 15 fps.

One dude had a Voodoo. 800x600 at 30 fps. With _dithering_ (never seen before; until then every texel showed up as if you were playing Mario).

Minds exploded. There was always a group of people standing around that machine, looking at it as if it had teleported there from the future.

Nothing in computing has ever replicated that moment.

It was "holy shit!" in a PC component.

It's like a country bringing F-16s to WW2.

It wasn't just a proverbial game changer; it was an entirely new game.

Imagine someone dropping a GPU today that can run today's AAA titles at 8K resolution at a steady 120+ fps, with free damn-near-perfect anti-aliasing, 40-bit color, and 10 bits per channel of color output, and that lets you run two of them together for 240 fps.
... Then 2 years later drops another GPU that lets you do the same at 16K res. That's what 3dfx did.

-scheherazade

This comment was edited on Sep 28, 2017, 17:13.
 
 
8. Re: Out of the Blue Sep 28, 2017, 16:39 DrSquick
 
Can GPUs replace CPUs? My limited understanding was that GPUs were highly specialized and could only do a few things, but ultra fast and with the equivalent of thousands of threads at once, whereas CPUs were more all-purpose.
 
7. Re: Morning Tech Bits Sep 28, 2017, 16:11 The Half Elf
 
HoSpanky wrote on Sep 28, 2017, 14:21:
The Half Elf wrote on Sep 28, 2017, 12:23:
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.

While the performance leap (and low power usage) of the 1000 series was indeed a big, awesome surprise, nothing comes even close to the mind-boggling difference the first 3dfx card offered. The comparison screenshots of Tomb Raider and MechWarrior 2 were all I needed to see to convince me I needed to save up for a 3dfx card. There wasn't any YouTube back then, so I was unaware that it wouldn't only LOOK better, it'd also run at a MUCH higher framerate. No one even paid attention to framerate back then, outside of "it's unplayably slow".

3dfx changed everything. It's a shame they're gone, but they held onto the Glide API for too long. Nvidia's GeForce came along with 32-bit color and support for high-poly curves in Quake 3, and it was over.

Would you like me to post pictures of my 3dfx cards? (Yes, I still have them, and the MechWarrior 2 CDs that came with them.)
 
 
Using a steering wheel on a Burnout game is like using the Space Shuttle controls to fly a kite.
 
6. Re: Morning Tech Bits Sep 28, 2017, 14:21 HoSpanky
 
The Half Elf wrote on Sep 28, 2017, 12:23:
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.

While the performance leap (and low power usage) of the 1000 series was indeed a big, awesome surprise, nothing comes even close to the mind-boggling difference the first 3dfx card offered. The comparison screenshots of Tomb Raider and MechWarrior 2 were all I needed to see to convince me I needed to save up for a 3dfx card. There wasn't any YouTube back then, so I was unaware that it wouldn't only LOOK better, it'd also run at a MUCH higher framerate. No one even paid attention to framerate back then, outside of "it's unplayably slow".

3dfx changed everything. It's a shame they're gone, but they held onto the Glide API for too long. Nvidia's GeForce came along with 32-bit color and support for high-poly curves in Quake 3, and it was over.
 
 
5. Re: Morning Tech Bits Sep 28, 2017, 12:23 The Half Elf
 
I thought we got that with the 1060-1080 series from Nvidia, and that was just within the last 2 years.
 
Using a steering wheel on a Burnout game is like using the Space Shuttle controls to fly a kite.
 
4. Re: Morning Tech Bits Sep 28, 2017, 11:58 Luke
 
Yup, we need a new 3dfx wow moment... the only true one in PC history.
 
3. Re: Morning Tech Bits Sep 28, 2017, 11:53 Scheherazade
 
Moore's law has been a thing of the past for a while now in CPU land.

My 6-core CPU from ~2011 performs indistinguishably from the latest offerings in real-world applications. It's been over half a decade.
(Barring esoteric stuff like video transcoding, in which case it's only /most/ of the speed of the latest offerings.)

I'm not that old, but I remember when speeds would double every year, and that was so normal it was expected, the way people expect the sun to rise the next morning.

Per-core performance has stagnated for a while now.

Clock speeds are running up against quantum-mechanical issues that effectively cap them under 'normal' conditions, and the same issues are restricting feature size and voltage. Most per-core performance improvements come from clever computation-design trickery (of which there is only so much you can think of, and the returns are limited), not from raw speed increases.

Adding cores to a CPU does only a little for overall performance in most cases. Most tasks are sequential and can't be divided up and run in parallel. Adding more cores also adds heat, which drives down per-core clock speeds.

GPUs scale almost linearly with core count. Their workload is 'embarrassingly parallel', so they can bump performance by bumping core count, i.e. by getting physically bigger (ignoring whether or not it makes economic sense to do so; it's simply 'possible').
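That sequential-vs-parallel split is basically Amdahl's law: if only a fraction P of a task can run in parallel, N cores give at most 1 / ((1 - P) + P/N) speedup. A quick illustrative sketch in Python (the fractions are made up for illustration):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of a task parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A desktop task that is only 50% parallelizable barely benefits from
# more cores; a ~99% parallel GPU-style workload keeps scaling.
for p in (0.50, 0.99):
    for n in (4, 16, 1024):
        print(f"P={p:.2f}, cores={n:4d}: {amdahl_speedup(p, n):6.1f}x")
# P=0.50 tops out below 2x no matter how many cores are added;
# P=0.99 approaches its 100x ceiling as core count grows.
```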

Basically, with the materials, methods, and architectures we have today, CPU single-core performance has 'arrived'. There won't be anything more than small incremental improvements until there is a move to a replacement technology.

-scheherazade
 
 
2. Re: Morning Tech Bits Sep 28, 2017, 10:29 Pigeon
 
Terrible time to be in the market for a new GPU, as cryptocurrency miners push prices over MSRP. AMD and Nvidia have to be happy about this; they can't make cards fast enough.
 
1. Re: Morning Tech Bits Sep 28, 2017, 10:16 RedEye9
 
To get your free credit lock you have to absolve Equifax of all evils past, present, and future.
What could possibly go wrong?

Whoops, posted the above in the wrong thread.

The death of Moore's law is declared more frequently than Mark Twain's.

This comment was edited on Sep 28, 2017, 10:41.
 
 
https://www.newyorker.com/contributors/andy-borowitz
 