Alienware's Video Array

Alienware makes its big E3 announcements: X2, an upcoming PCI Express motherboard, and Video Array, a way of using multiple video cards in parallel, saying: "The Alienware Video Array and X2 motherboard will debut in Q3/Q4, exclusively through Alienware’s new ALX brand, a family of extreme performance systems catering to the demands of the most hardcore PC enthusiasts. ALX systems will be sold only in the United States, directly through www.alienware.com/alx or 1-800-ALIENWARE." There's a Q&A on the announcement on HomeLAN Fed, and here's more:
Video Array is an accelerated graphics processing subsystem that will allow users to add multiple, off-the-shelf video cards to their Alienware computer systems and have both cards process graphic commands in parallel. Understanding the wide-ranging wants and needs of its customers, Alienware designed its solution so that it is not tied to any one specific video card. This design will allow users to take full advantage of the fastest video card on the market for a significant performance increase.

Alienware’s exclusive Video Array combined with X2, an Alienware designed motherboard which is currently based on Intel Corporation’s next-generation chipset and will include dual PCI-Express high performance graphics card slots, will deliver significant performance gains over current graphic solutions. The new Video Array Technology and X2 motherboard will enable users to run graphics intensive applications flawlessly at maximized settings, render 3D visuals in record time, and much more.
View : : :
51 Replies. 3 pages. Viewing page 1.
51. Alienware ALX - May 18, 2004, 11:47
We are very excited about the overwhelming response we have received concerning our new Video Array technology. However, in order to quell the quickly growing storm of misinformation, we are providing a link to our Video Array FAQ. We hope that this FAQ will help clear the confusion surrounding the new technology.

This post is not meant to incite debate, nor does it mark the beginning of a Q&A session. It is merely being provided in an effort to prevent any false rumors from circulating.

http://www.alienware.com/alx_pages/main_content.aspx

Thank you all and have a nice day.

Joshua Spatz
Alienware Corporation

This comment was edited on May 20, 11:45.
50. Re: No subject - May 15, 2004, 22:56
MM
49. Re: Another question.... - May 13, 2004, 16:19
I think it's pretty obvious that this would only work with two of the same card. You're not going to have an Nvidia and an ATI card working at the same time.

~Steve

48. Re: Another question.... - May 13, 2004, 10:28
No, it scales exponentially. You don't see the effect so much with only two processors, but stick four or eight processors in for massive parallel computing and you've got the start of a supercomputer.

Dual-processor systems aren't so limited by the overhead of instruction dispatching, but by the nature of modern PC software, which (for the most part) doesn't use multi-threading the way it could.

If all PCs were multiproc and all software were written for multiproc, then you'd see more of a difference. At the moment, one proc will usually take one application thread and the other picks up the other threads, like OS services, etc.


This comment was edited on May 13, 10:32.
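[Ed. note: the scaling claim in the comment above is easiest to sanity-check with Amdahl's law, the standard back-of-the-envelope model for multiprocessor speedup. The 60% parallel fraction below is purely illustrative, not a measured figure for any game.]

```python
def amdahl_speedup(parallel_fraction: float, n_procs: int) -> float:
    """Amdahl's-law estimate: the serial portion caps overall speedup."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_procs)

# Suppose 60% of an application's work can run in parallel (illustrative):
for n in (1, 2, 4, 8):
    print(f"{n} procs -> {amdahl_speedup(0.6, n):.2f}x")
```

Note the speedup is bounded (here by 1/0.4 = 2.5x no matter how many processors are added), which is why software not written for multiproc shows diminishing returns rather than exponential gains.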
47. Re: Dual. AGP....uh no. - May 13, 2004, 10:22
Enahs
Perhaps instead of simply stating this, you should have been looking (or listening) for the response, to which you'd discover that the AGP architecture is fundamentally limited to a single port.

Yes, I have known about this and whined about it before on more than one occasion.
But people can want something that isn't easily feasible, or possible at all.

And it is still a good idea for what I said, as dual screens rock, rock on!


_____
Enahs If you can read this - congratulations - You have won!!! Just e-mail with your CC# to claim your prize!
I am free of all prejudice. I hate everyone equally.
- W. C. Fields
46. Aye... - May 13, 2004, 04:57
...this does all seem a bit wasteful. Realistically, how much will these cost? What technical hurdles will exist? Are not Alienware systems already high end? Even if you get 1337 performance, how long until your basic graphic functions/abilities are outdated? Are not video cards already ridiculously overpriced?

Mayhaps I am missing something, but it seems more like a marketing tactic than anything truly amazing.

I was always a bit skeptical of this so-called announcement, but I was mildly hopeful Alienware would come up with something along the lines of a new cooling system or trying to make/shape a new market standard.

Thinking an adventurer is me!,
Ray

--------------------------------
http://www.thief3.org/
http://users.ign.com/collection/RayMarden
http://www.guzzlefish.com/collection.php?username=ray_marden
I love you, mom.
Everything is awesome!!!
http://www.kindafunny.com/
45. No subject - May 13, 2004, 02:39
So... Would I be able to run an X800 and a 6800 in parallel? That way I could get the advantages of both! LOL

Yeah, right.. I'd never spend that much money for that. I'm sure the X2 mobo will be expensive enough by itself.

This space is available for rent
44. $6000 Guinea Pig - May 12, 2004, 22:33
What people have to remember also is that the 1st generation tech will be the buggiest. Does anyone really want to spend quite a chunk of change to be a beta tester when the benefits aren't exactly clear? Even if there are benchmarks released I'd still be skeptical (lies, damn lies, statistics and all that...)

I don't care if I can play Far Cry at 3000 frames per second with this. I want to be able to walk up to a wall and not have it dissolve into a pixelated mess. I want to see wheels that are totally round, every damn grill bar and lug nut and piece of moulding modeled on a car, with no banding or other graphical anomalies. And THEN I want it to run at 3000 frames per second!

---------------------------------------------
"There's so much comedy on television. Does that cause comedy in the streets?" -- Dick Cavett
mocking the TV-violence debate
--------
BOOBIES Filter Greasemonkey script:
http://camaro76.web.aplus.net/BOOBIES_filter.user.js
Punk Buster (Ignore Trolls) Greasemonkey script:
http://camaro76.web.aplus.net/punkbuster.user.js
43. This is pure marketing! - May 12, 2004, 21:47
I think this is marketing speak more than anything else. What it sounds like they are doing is putting two (or more) 16x PCI-E slots onto their mobo. That's really all it is. Otherwise they will need to work with (insert your fav graphics card developer here) and design something that can link the multiple cards together. The graphics card has to be specially designed for this very purpose. But you may say, well, they don't have to link the cards together; they could put two cards in there and do some magic to get a ~2x performance increase?

In order for this to happen, they would have to put their solution between the game and the graphics card driver. OK, so they create a DirectX/OpenGL wrapper that somehow tells one graphics card to draw one frame and the other card to draw the next. This is similar to ATI's frame-interleaving technique from around the Voodoo2 days. That seems to work. Then you look at the screen and it's flashing? Why? Well, you only have one monitor, and it can only connect to one graphics card.

Then you might say, well, maybe they have a merging box that takes the output of card A and B and puts them together (like, externally). Well, that is doable. But then how do you sync up the outputs of the cards? I could go on and on. This doesn't sound like much of a solution...

My bet is it's a two-16x-slot mobo that can support a graphics card that does Voodoo-like SLI. The only company that has a patent on SLI is NVIDIA (since they bought 3dfx). This could mean NVIDIA will bring it back (which, btw, isn't the frame-interleaving solution I mentioned before).

anyways, I'll stop with my technical ramblings.....
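[Ed. note: the frame-interleaving wrapper described in the comment above can be sketched in miniature. `Card` here is a toy stand-in for a driver, not any real API.]

```python
# Toy sketch of alternate-frame dispatch: a layer between the game and the
# drivers hands even-numbered frames to one card and odd-numbered frames to
# the other, round-robin style.
class Card:
    def __init__(self, name: str):
        self.name = name
        self.frames = []          # frame ids this card was asked to render

    def render(self, frame_id: int) -> None:
        self.frames.append(frame_id)

cards = [Card("A"), Card("B")]

def dispatch(frame_id: int) -> None:
    # Frame N goes to card N mod 2.
    cards[frame_id % len(cards)].render(frame_id)

for f in range(6):
    dispatch(f)

print(cards[0].frames)  # [0, 2, 4]
print(cards[1].frames)  # [1, 3, 5]
```

As the comment notes, dispatching is the easy part; merging and syncing both outputs onto a single monitor is where such a scheme gets hard.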

42. Dual. AGP....uh no. - May 12, 2004, 21:15
I have been stating they should do this for a while, with AGP.

Perhaps instead of simply stating this, you should have been looking (or listening) for the response, to which you'd discover that the AGP architecture is fundamentally limited to a single port.

Dual 16x (lane) PCI-Express could be nice. I suppose the most important thing to notice here is that a gaming hardware company is dictating to the computer hardware developers to create what they need.

Also a good sign that PCI-Express will be around for the long-run (10 years+)


41. No subject - May 12, 2004, 21:07
This is lame. The only reason people went for dual Voodoo cards was the jump in resolution. Nowadays, everybody is already playing at a very high resolution with stuff ramped all the way up.

All this technology exists for is to get press people into the Alienware booth - nothing more.

40. Re: I don't get it - May 12, 2004, 20:57
Enahs
I have been stating they should do this for a while, with AGP.
Not to make two of them work together, but to have full 3D power on a dual monitor setup.


_____
Enahs If you can read this - congratulations - You have won!!! Just e-mail with your CC# to claim your prize!
I am free of all prejudice. I hate everyone equally.
- W. C. Fields
39. I don't get it - May 12, 2004, 20:15
So you link up multiple video cards to, presumably, get a nice performance increase (although as the poster below me so aptly points out, current-day cards are already being held back by the CPU being too slow), but what about image quality / drivers / differences in handling function calls, etc.?

While I could see how linking two identical cards MIGHT somehow work, I get the idea that Alienware is saying you can just add any two cards together and it will work? I'm very skeptical about that.

Also, this isn't like the Voodoo2's SLI capability, since that was hardware integrated into the card. This all happens on a software/driver level, and that means processing cycles (and I'm thinking quite a few of them) are needed. Since no games out there that I can think of really even support multi-threading, I honestly fail to see what the gain of this would be.

While I guess theoretically you could get the performance of, say, a Radeon 9700 by hooking up a couple of cheaper cards (Radeon 9000s or so), the expense of getting a system from Alienware that supports this Video Array thing would most likely be far, far higher than just getting a 9700 in the first place.

I think it's a good idea from Alienware, but as long as CPUs and memory bandwidth are the bottlenecks in current-day PCs, adding graphics card power really isn't going to give a "70% or more" increase.

The ONLY thing I can see this doing is giving you free resolution upgrades and free 6x (or maybe even 12x?) FSAA and, what, 32x AF? Free as in, it won't cost you any framerates. Then again, ATI's cards have ALREADY been giving us that since the 9700, so again I really fail to see the point.

Creston


This comment was edited on May 12, 20:23.
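[Ed. note: the "free resolution/FSAA" intuition in the comment above is just pixel arithmetic: if each card rasterizes half the screen, the total pixel count can double before per-card load grows. A minimal sketch, with illustrative numbers only:]

```python
def per_card_pixels(width: int, height: int, num_cards: int) -> int:
    """Pixels each card must fill if the screen is split evenly between cards."""
    return width * height // num_cards

one_card  = per_card_pixels(1024, 768, 1)   # single card at 1024x768
two_cards = per_card_pixels(1024, 1536, 2)  # double the rows, split across two
print(one_card, two_cards, one_card == two_cards)
```

The same accounting applies to supersampling FSAA, which multiplies the rendered pixel count rather than the display resolution.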
38. Re: No subject - May 12, 2004, 19:58
As quite a few benchmarks of the high-end cards showed, most graphics cards today are being held back by the CPU, not their own processing power. Buying a system like this would be a monumental waste of money and graphics power. Save the $500 on an extra card, or put it into some other component (maybe a set of Logitech Z-680 speakers, and a Logitech Cordless Desktop MX? It's not that I'm Logitech's bitch, it's just they've put out so much good stuff). Maybe in a few months, when CPU speeds have continued to ramp up but graphics cards are still awaiting further updates, this will be a more attractive powergaming option. But, until then, don't bother.

37. Re: No subject - May 12, 2004, 19:47
Yeah, like I'm gonna run out and spend a grand on video cards hahahaha! I can afford it and I still won't. What happens when your video cards don't support the latest version of DirectX, or new effects/features? No thanks, I'll just buy one card at a time and keep up with the technology.

This comment was edited on May 12, 19:50.
--
He cut the possum's face off then cut around the eye socket. In the center of the belt buckle, where the possum's eye would be, he has placed a small piece of wood from his old '52 Ford's home made railroad tie bumper. Damn, he misses that truck.
36. Re: No subject - May 12, 2004, 19:30
Exactly!

Because most hardcore geeks/gamers build their own rigs

35. No subject - May 12, 2004, 19:24
This is obviously targeted at all the rich kids and rich geeks who have the disposable income available, and have to have the absolute best gaming rig around...

It's not for the average person, the average gamer, the hardcore gamer, or the hardcore geek...

IT'S FOR RICH PEOPLE

_____________________________________________
Give me slack. Or kill me.
______________________________________________
"When the bomb drops it'll be a bank holiday
Everybody happy in their tents and caravans
Everybody happy in their ignorance and apathy
No one realizes until the television breaks down..."

- SUBHUMANS
34. Re: No subject - May 12, 2004, 19:19
They do. They're called "The higher end cards from either Nvidia or ATI".

33. Re: No subject - May 12, 2004, 19:13
It's called supply and demand. There are enough people out there with more money than sense who will pay those ridiculous amounts for video cards. That's the only reason they cost so much. If the majority of the gaming community voted with their wallets when they try to stick us with $400-500 cards, you can bet the prices would go down faster.

32. No subject - May 12, 2004, 19:09
Why doesn't some video card company make cards with 2, 3, or 4+ GPUs on them, from either Nvidia or ATI?

Cards are overpriced. Just look at all the markup on them. Example: new cards are ~$400 right now. Give it some time and those SAME cards will be selling for less than $99 and be considered "low end" cards.

Same card: $99 or $400?

Markup?
