Oh good lord you're making my brain hurt.
Will all caps make you understand?
READ THE CNET QUOTE:
""Larrabee silicon and software development are behind where we hoped to be at this point in the project," Intel spokesman Nick Knupffer said Friday. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product," he said. "
FIRST PRODUCT. THEY HAVE SHELVED THE FIRST PRODUCT. THEY HAVE NOT KILLED THE TECHNOLOGY. THEY HAVE NOT ENDED THE PRODUCT LINE. I SWEAR TO GOD YOU MUST UNDERSTAND THIS AND ARE SIMPLY TROLLING. EVERY SINGLE LINK I POSTED SPELLS THIS OUT. AND, CONTRARY TO WHAT YOU'VE SAID, THAT IS THE FIRST ATTACK ON YOU.
HERE WAS FROM THE ANANDTECH LINK:
"As of today, the first Larrabee chip’s retail release has been canceled. "
NOTE THE WORD "FIRST" THERE, IMPLYING THAT PLANS FOR THE SECOND AND THIRD RETAIL RELEASES HAVE NOT YET BEEN CHANGED.
HERE IS MORE FROM ANANDTECH:
"Next, this brings us to the future of Larrabee. Larrabee Prime may be canceled, but the Larrabee project is not. As Intel puts it, Larrabee is a “complex multi-year project” and development will be continuing. Intel still wants a piece of the HPC/GPGPU pie (least NVIDIA and AMD get it all to themselves) and they still want in to the video card space given the collision between those markets. For Intel, their plans have just been delayed."
NOTE THE LAST LINE. THE PLANS HAVE NOT BEEN CHANGED, JUST DELAYED.
Are the caps working? I'll keep with them, just in case. Clearly rational typing has not worked.
MORE FROM ANANDTECH:
"For the immediate future, as we mentioned earlier Larrabee Prime is still going to be used by Intel for R&D purposes, as a software development platform. This is a very good use of the hardware (however troubled it may be) as it allows Intel to bootstrap the software side of Larrabee so that developers can get started programming for real hardware while Intel works on the next iteration of Larrabee."
HE SAYS IT'S A GOOD THING THE CURRENT CHIP WAS RELEGATED TO R&D, AS THIS GIVES DEVELOPERS A CHANCE TO LEARN TO USE IT BEFORE IT LAUNCHES TO CONSUMERS.
LAST BIT FROM ANANDTECH:
"For that matter, Since the Larrabee project was not killed, it’s a safe assumption that any future Larrabee chips are going to be based on the same architectural design. The vibe from Intel is that the problem is Larrabee Prime and not the Larrabee architecture itself. The idea of an x86 many-cores GPU is still alive and well."
"SINCE THE PROJECT WAS NOT KILLED" and "THE IDEA OF AN x86 MANY-CORES GPU IS STILL ALIVE AND WELL."
Still don't believe it? Then there's no hope for you. You're usually not a dense poster. I don't understand why this is so hard for you to grasp.
I'll break it down, then: Intel was not getting the results they wanted from Larrabee. The project was delayed to the point that it was no longer significantly better than standard GPUs on the market, yet would cost more. Furthermore, yields were nowhere near what they needed to be. Rather than push out a half-assed product when the tech wasn't quite ready, Intel put the first generation on the shelf. No need to release it. They have deep pockets and they desperately want a chunk of the GPU market, so there's no point in releasing something that isn't right - you only get one first impression.
Gee, sounds like Fermi, doesn't it? Delayed to the point that, though competitive (and likely the fastest single-board solution), its dollars/performance ratio isn't quite right. The fabrication process also isn't quite right. Yet, unlike Intel, Nvidia is pushing Fermi through. Intel, wiser and with deeper pockets, is waiting a generation (or two) before bringing this to consumers.