
External Thunderbolt Graphics Card On Its Way

timothy posted more than 3 years ago | from the hook-to-the-big-monitor-on-the-wall dept.

Graphics 207

An anonymous reader writes "Last week, as the result of a straw poll on Facebook, Village Instruments agreed to begin development of an external Thunderbolt-connected graphics card enclosure. Village Instruments already has experience with its ExpressCard-connected ViDock graphics card chassis, which provides extra GPU juice for Windows and Mac laptops, and the Thunderbolt version is expected to be the same kind of thing — but faster. The only problem is, Thunderbolt is only 4x PCIe 2.0, so you won't be using this to connect modern, desktop-class GPUs to your laptop — and more importantly you need to carry around a second monitor to actually use a ViDock. So why not just buy a proper gaming laptop?"


HP dv7 (3, Informative)

Anonymous Coward | more than 3 years ago | (#37030570)

So why not just buy a proper gaming laptop?

It's not exactly a gaming laptop... but it does have a 2 GHz Core i7 CPU, a Radeon HD 6770M with 1GB, 8GB of RAM, and a 17.3" LCD... Oh, and for when I get bored of gaming, it also came with a BD-ROM.

Costco has them for $999 and I bought two :)

Re:HP dv7 (1)

Joce640k (829181) | more than 3 years ago | (#37031668)

You were ripped off.

Thunderbolt = dead in two years. (0, Insightful)

Anonymous Coward | more than 3 years ago | (#37030576)

Anybody else think thunderbolt is a technology looking for a solution?

USB is cheaper, almost as fast, and ubiquitous. There are probably literally millions of USB devices that work with a USB port.

Thunderbolt has one RAID box you can buy, and now VI is really stretching the bounds of credulity to come up with another use for it.

I'd bet a month's pay that Apple will start removing Thunderbolt ports from Macs in 2014.

Re:Thunderbolt = dead in two years. (5, Funny)

TheRaven64 (641858) | more than 3 years ago | (#37030610)

Totally agree. I mean, a single connector that can drive a monitor, external disks, and a range of peripherals and is small enough to fit on something like a mobile phone? What possible use case is there for that?

Please use the correct syntax (1)

AwaxSlashdot (600672) | more than 3 years ago | (#37030632)

You forgot the tag.

Re:Please use the correct syntax (1)

AwaxSlashdot (600672) | more than 3 years ago | (#37030640)

It originally said: "you forgot the <sarcasm> tag" but /. considered it a real XML tag and stripped it.
I didn't use HTML entities.

I should have used the preview button.

Redundant (2)

CountBrass (590228) | more than 3 years ago | (#37030734)

For hundreds of years written English managed perfectly well without having to sign-post everything with tags.

I certainly got the sarcasm inherent in the gp's post, indeed it was more effective without the silly sign-posting.

Re:Redundant (1)

Joce640k (829181) | more than 3 years ago | (#37030898)

For hundreds of years the English language wasn't exposed to people who could respond to writers instantly in public forums.

Re:Redundant (0)

Anonymous Coward | more than 3 years ago | (#37030954)

No tag means no sarcasm. I don't know what you guys find funny about this.

Re:Redundant (0)

Anonymous Coward | more than 3 years ago | (#37031082)

For hundreds of years, English men have exposed themselves in public forums to writers.

Re:Please use the correct syntax (0)

Anonymous Coward | more than 3 years ago | (#37031126)

<sarcasm>Only on Slashdot are people stupid enough to not get it without a stupid tag</sarcasm>

Did I get it right?

Re:Please use the correct syntax (1)

DJRumpy (1345787) | more than 3 years ago | (#37031230)

I'm confused. Are you being sarcastic, or instructing us in the finer use of non-compliant HTML tags?

Re:Thunderbolt = dead in two years. (3, Insightful)

RyuuzakiTetsuya (195424) | more than 3 years ago | (#37030622)

Thunderbolt is protocol agnostic. It's not meant to compete with USB, but with ExpressCard. In fact, you can run USB devices over Thunderbolt.

Re:Thunderbolt = dead in two years. (3, Informative)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#37030780)

It isn't exactly protocol agnostic; it's essentially an external 4x PCIe cable. Assuming the device in question doesn't flip out at finding itself a bit further than usual from the PCIe controller, there is certainly a lot of stuff you can plug into it (with the addition of a case and one of those fancy custom Intel converter chips); but it isn't "agnostic"...

Re:Thunderbolt = dead in two years. (1)

serviscope_minor (664417) | more than 3 years ago | (#37031640)

Thunderbolt is protocol agnostic.

No, it carries DisplayPort and PCIe data. The protocol is very well defined.

In fact you can run USB devices over thunderbolt.

Provided that you wire in a PCIe -> USB chip on the other end. But it seems you can bridge any generic transport to any other, given sufficient will. Did you know you can get a USB to ISA bridge, for instance? It's even supported under Linux!

Re:Thunderbolt = dead in two years. (2)

SimonTheSoundMan (1012395) | more than 3 years ago | (#37030792)

One RAID box? There are several now; LaCie released one a couple of weeks back. Apple also have their Thunderbolt display [apple.com], you might want to look at what it does.

Thunderbolt is not competing against USB either.

Re:Thunderbolt = dead in two years. (2)

ricky-road-flats (770129) | more than 3 years ago | (#37030914)

As someone else has already pointed out, it is not a competitor to USB.

As to the RAID box, well, something has to be first. But there are already three others I'm aware of:

There is already also a Sony laptop with a Thunderbolt connection to a docking station that has an optical drive, a graphics chip, *and* USB 2.0 and 3.0 sockets. The newer Apple monitors, as well as the new iMacs, use it for USB and DisplayPort. Laptops with it can use a powered-down iMac as a monitor. You can't do a lot of that with USB.

As usual with technologies like this, as soon as it's integrated into chipsets and/or standard motherboards, the products will follow. Just the fact that Apple are selling hundreds of thousands of units with this integrated will help stimulate companies to produce more products that use it...

Re:Thunderbolt = dead in two years. (1)

GordonBX (1059078) | more than 3 years ago | (#37030942)

Um...

USB 2.0, even at its theoretical maximum, is 20x slower than Thunderbolt. USB 3.0 at its theoretical maximum is 2x slower. The Thunderbolt architecture means you get a full 10 Gb/sec in both directions, unlike USB, which has so much processor overhead that you never get anywhere near its theoretical maximum. So no, USB is not "almost as fast".

RS-232 serial ports used to be ubiquitous.

I'll take that bet, if only because they are still including FireWire ports, and those have been useless for years and years and years. You must not earn very much to make such a bet.
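For anyone who wants to sanity-check the multipliers above, here's the arithmetic on the published signalling rates (theoretical maxima; real-world USB throughput is lower still, as the parent notes):

```python
# Published theoretical link rates, in Gbit/s per direction.
rates = {
    "USB 2.0": 0.48,       # 480 Mbit/s
    "USB 3.0": 5.0,
    "Thunderbolt": 10.0,   # per channel, per direction
}

for name, rate in rates.items():
    ratio = rates["Thunderbolt"] / rate
    print(f"{name}: {rate} Gbit/s -> Thunderbolt is {ratio:.1f}x faster")
```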

Re:Thunderbolt = dead in two years. (1)

drsmithy (35869) | more than 3 years ago | (#37031060)

Anybody else think thunderbolt is a technology looking for a solution?

Thunderbolt is a PCIe bus on a cable. USB isn't even playing the same game, let alone in the same league.

Re:Thunderbolt = dead in two years. (1)

Bengie (1121981) | more than 3 years ago | (#37031240)

"Anybody else think thunderbolt is a technology looking for a solution?
USB is cheaper, almost as fast, and ubiquitous. There are probably literally millions of USB devices that work with a USB port."

I don't think you understand the difference between TB and USB. TB is meant to replace PCIe, HDMI, SATA, etc. Just wait for the teamable 40 Gbit optical version that's coming out in a few years.

Consider this hypothetical situation: imagine a computer 10 years from now that has no PCIe slots or anything like that. All it has is a bunch of USB ports, and you connect your 64-teraflop video card to your motherboard via USB, your 4 GB/sec SSD via USB, and your 2100p 36-bit monitor via USB.

People would say this is completely stupid, because USB isn't meant for that situation and would be horrible. Well, TB *IS* meant for this, will do this, and will do it better than anything else that exists or has been announced.

I'm not sure this will happen, but I could see it: TB will eventually switch to a fiber PHY. Once it does, it will have a cheap fiber connection that is good for 40/100 Gbit. Could you imagine if NICs adopted the connector? I'm not saying they should also adopt the TB protocol, just make use of a cheaply mass-produced 100 Gbit, 100-meter fiber PHY.

Re:Thunderbolt = dead in two years. (1, Interesting)

Luckyo (1726890) | more than 3 years ago | (#37031494)

It would be stupid because there are more than enough cables in a desktop already. If anything, we need FEWER cable solutions and more slots. Look at what's missing inside laptops in comparison to desktops. Yeah.

Grandparent is exactly correct. This is a technology looking for a solution, or more precisely for a problem to fix, because there simply isn't one at the moment.

Re:Thunderbolt = dead in two years. (1)

deains (1726012) | more than 3 years ago | (#37031484)

Thunderbolt seems to get a lot of hate because it's marketed by Apple. Techies are quick to forget that their main partner in this, the company that designed the technology, is Intel. I'm sure if Intel had released the technology off its own bat it would have had the full support of the tech community, and would also have died within 6 months thanks to lack of use.

By going for the consumers via Apple first, this technology has the chance to thrive. It has a chance to gain some ground in the peripherals market, which it wouldn't have done if it was only found on high-end gaming PCs (see: eSata). Probably Intel will start putting TB ports on their motherboards soon-ish, and once those start appearing, the other mobo manufacturers will almost certainly follow.

So basically, odds are TB will still be going strong in two years. It could still fail of course, since nothing is really certain in this market (who's to say ATi won't come up with a competing standard?). But right now there's no reason to doubt Thunderbolt is a growing force in the world of plugging stuff in.

What's the use case? (1)

TheRaven64 (641858) | more than 3 years ago | (#37030596)

Any machine with Thunderbolt already has a modern GPU, because it's integrated with the DisplayPort. Or they've added a Thunderbolt card to an old machine, but if they can add expansion cards then they can add a new GPU.

The new MacBook Pro already supports chaining two displays from the port, and I doubt this will be a very unusual feature for devices with Thunderbolt. I suppose this might be useful for adding a third one, but then you're really pushing the available bandwidth.

Woosh! (2, Interesting)

CountBrass (590228) | more than 3 years ago | (#37030638)

You've completely missed the point. Don't think MacBook Pro; think the new Thunderbolt-equipped MacBook Airs that lack a decent built-in graphics card.

And to answer the summary's closing question: because it means I can carry an ultra-portable (MacBook Air) when I travel and plug it in at home to give it a much needed graphics boost for use at home.

Re:Woosh! (0)

TheRaven64 (641858) | more than 3 years ago | (#37030658)

The latest Intel graphics cards aren't that bad. They're not great, but I'd imagine that they'd stack up quite well against something that's crippled by the bandwidth of Thunderbolt. Modern GPUs use 16x PCIe cards. Even with PCIe 1, this is 3 times the bandwidth that this device can use. With PCIe 3, it's 12 times as much. A slightly weaker GPU on a fast interface is going to beat a fast one that's spending 90% of its time waiting for data over the bus.
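A back-of-envelope check of those multiples, using approximate per-lane payload rates after line encoding (PCIe 1.x/2.x use 8b/10b; PCIe 3.0's 128b/130b is rounded here) against Thunderbolt's 10 Gbit/s channel:

```python
tb_gbyte = 10 / 8  # one 10 Gbit/s Thunderbolt channel, in GB/s (raw line rate)

# Approximate per-lane payload rates in GB/s after encoding overhead.
lane_gbyte = {"PCIe 1.x": 0.25, "PCIe 2.x": 0.5, "PCIe 3.0": 0.985}

for gen, per_lane in lane_gbyte.items():
    x16 = per_lane * 16
    print(f"{gen} x16: {x16:.1f} GB/s, about {x16 / tb_gbyte:.1f}x Thunderbolt")
```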

Re:Woosh! (2)

CountBrass (590228) | more than 3 years ago | (#37030670)

All the benchmarks show the Intel integrated graphics are significantly worse than the NVIDIA discrete graphics chip in the previous generation of MacBook Air.

So no, by modern standards, the Intel graphics solution that's built into the CPU is still pretty dreadful. Intel have never managed to make a competitive GPU, and that's still the case today.

As the graphics card will be at the far end of the Thunderbolt connector (i.e. the display end), I don't see the problem.

Re:Woosh! (1)

Billly Gates (198444) | more than 3 years ago | (#37030722)

The AMD ones have integrated ATI 6800-series graphics that is actually as good as a dedicated card from 3 years ago. The Intel ones seem 10 years behind, and it is ridiculous that games from 2002 barely play on a Core i5 in this day and age because the Intel graphics are so horrible. They are so terrible they are ruining the PC as a gaming platform.

Anyway, you can play WoW with an AMD Llano on medium settings and give Intel a run for the money. IE 9 and Flash at 1080p fly on it too with hardware acceleration, unlike on an Intel one, so they are becoming important for more than just video games.

The Llano shares the memory controller with the CPU, so it is not as bottlenecked. It still is somewhat, but nothing like the horrible solutions we have seen in the last 5 years.

Re:Woosh! (1)

DJRumpy (1345787) | more than 3 years ago | (#37031378)

Actually, the benchmarks show that the 2011s are pretty much on par with the 2010s, within 3-4 FPS. I wouldn't call that significant.

http://lowendmac.com/bookrev/11br/0805.html [lowendmac.com]

http://arstechnica.com/apple/news/2011/07/intel-integrated-graphics-not-hampering-sandy-bridge-macs.ars [arstechnica.com]

All the benchmarks show the Intel integrated graphics are significantly worse than the NVIDIA discrete graphics chip in the previous generation of MacBook Air.

Re:Woosh! (0)

Anonymous Coward | more than 3 years ago | (#37031438)

This.

Going all the way back to the i740 in the Pentium-II days, Intel's GPU solutions have always been a step or two behind every other competitor on the market.

Posting anonymous to preserve well-deserved moderation in this thread.

Re:Woosh! (1)

Anonymous Coward | more than 3 years ago | (#37031490)

There's one significant problem: interface bandwidth. PCI Express x16 v3.0 runs at 128 Gbit/sec. Thunderbolt runs at 10 Gbit on 2 channels (20 total). If Thunderbolt moved data at 64 Gbit/sec, the performance hit might be worth the mobility, but 20 Gbit is anemic for graphics. Expect up to an 80% performance hit for I/O-intensive operations (texture/geometry loading). Once everything's on the card, things should go smoothly.
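To put that in transfer-time terms: a hypothetical 512 MB batch of textures (an arbitrary number, purely for illustration) moved over each link at its theoretical per-direction rate would take roughly:

```python
payload_mb = 512  # hypothetical texture/geometry batch, in MB

# Theoretical per-direction rates, in Gbit/s.
links_gbit = {
    "PCIe 3.0 x16": 128.0,
    "PCIe 2.0 x4": 16.0,
    "Thunderbolt channel": 10.0,
}

for name, gbit in links_gbit.items():
    seconds = (payload_mb * 8 / 1000) / gbit  # MB -> Gbit, then divide by rate
    print(f"{name}: about {seconds * 1000:.0f} ms")
```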

Re:Woosh! (0)

Anonymous Coward | more than 3 years ago | (#37030796)

Modern GPUs are *connected* by 16x PCIe lanes. They don't use all of this bandwidth during normal use.

All this means is there will be a very, very marginal and completely unnoticeable delay when feeding VBOs and textures to the card from system memory.

Re:Woosh! (3, Interesting)

drsmithy (35869) | more than 3 years ago | (#37031070)

The latest Intel graphics cards aren't that bad. They're not great, but I'd imagine that they'd stack up quite well against something that's crippled by the bandwidth of Thunderbolt. Modern GPUs use 16x PCIe cards. Even with PCIe 1, this is 3 times the bandwidth that this device can use. With PCIe 3, it's 12 times as much. A slightly weaker GPU on a fast interface is going to beat a fast one that's spending 90% of its time waiting for data over the bus.

The vast majority of stuff you might want a GPU to do, is not bandwidth-limited. Numerous tech sites have shown that in most cases, the difference in performance between a GPU on a x16 PCIe bus and a x4 PCIe bus is nothing, and even a x1 PCIe bus doesn't suffer much.

Re:Woosh! (0)

Anonymous Coward | more than 3 years ago | (#37031676)

Thus the reason that SLI / CrossFire works. I'm not aware of any non-server motherboard that has multiple x16 slots on it.

The "anonymous poster" who submitted this doesn't know much about that which he writes. Also, could he load the thing up with his opinion any more? Most people don't want to lug around a 10-pound "gaming" laptop when they only use it for gaming occasionally; they want to carry a light machine (Lenovo X1?), and still be able to use it with a large display (I believe Apple just put out a 27" with Thunderbolt built in) and gaming power when they want to. Not hard to understand, but he seems to be having difficulty.

Re:Woosh! (3, Informative)

bemymonkey (1244086) | more than 3 years ago | (#37030694)

"And to answer the summary's closing question: because it means I can carry an ultra-portable (MacBook Air) when I travel and plug it in at home to give it a much needed graphics boost for use at home."

Sure, that would be great - but Apple crippled the MBA with a downsized Thunderbolt port. http://www.slashgear.com/macbook-air-gets-half-power-thunderbolt-29168292/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+slashgear+(SlashGear) [slashgear.com]

If the thing can't even handle two external screens, I doubt it'll handle an external screen and an external graphics card...

Your argument makes no sense. (1)

CountBrass (590228) | more than 3 years ago | (#37030716)

I'm not really sure how you manage to get from "[it] can't even handle two external screens" to "It [won't] handle ... an external graphics card".

I don't believe the former could lead you to conclude the latter.

Re:Your argument makes no sense. (1)

bemymonkey (1244086) | more than 3 years ago | (#37030746)

Hmmm, I suppose you're right. Just saying, with Apple's track record it wouldn't surprise me if there were some magical compatibility issue...

Re:Woosh! (1)

Kjella (173770) | more than 3 years ago | (#37030828)

That doesn't seem to be directly related. The main issue is that a 16x PCIe 2.1 slot - what you'll find on most motherboards - can transfer 8 GB/s (64 Gbit/s). Thunderbolt can do 10 Gbit/s, and that probably includes the 8b/10b encoding, so the comparable number is 8 Gbit/s. Any real high-end graphics card will probably starve. As for the outputs, wouldn't you then naturally use the additional outputs on the card? I don't see much sense in sending anything but the laptop screen - if in use at all - back to the laptop.
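Kjella's encoding point in numbers (assuming, as the comment does, that the 10 Gbit/s figure includes the 8b/10b overhead; the 500 MB/s-per-lane PCIe 2.x payload rate is the standard figure):

```python
line_rate_gbit = 10.0                   # Thunderbolt, per channel/direction
payload_gbit = line_rate_gbit * 8 / 10  # 8b/10b: 8 data bits per 10 line bits
payload_gbyte = payload_gbit / 8        # convert Gbit/s -> GB/s

pcie2_x16_gbyte = 16 * 0.5              # PCIe 2.x: 0.5 GB/s per lane after encoding

print(f"Thunderbolt payload: {payload_gbyte:.0f} GB/s")
print(f"PCIe 2.x x16 payload: {pcie2_x16_gbyte:.0f} GB/s")
print(f"Gap: {pcie2_x16_gbyte / payload_gbyte:.0f}x")
```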

Re:Woosh! (2)

jpapon (1877296) | more than 3 years ago | (#37030872)

Any real high-end graphics card will probably starve.

Actually, afaik GPUs are very rarely limited by the bandwidth of a 4x PCIe slot, never mind 8x or 16x. You have to be doing some very specific things to actually take advantage of a 16x PCIe slot.

You very rarely need to transfer data on the order of 8 GB/s to/from the GPU... most of what goes across the PCIe bus is just commands, not data. That's why your DVI/HDMI/DisplayPort is on the back of the graphics card and not on the mainboard; your CPU doesn't need to know much about the results of the GPU's calculations.

Re:Woosh! (1)

bemymonkey (1244086) | more than 3 years ago | (#37030904)

"As for the outputs, wouldn't you then naturally use the additional outputs on the card? I don't see much sense in sending anything but the laptop screen - if in use at all - back to the laptop."

True, there's not much sense in it - but that's a good question for someone in the know: Do outputs on an external Thunderbolt graphics card require any additional bandwidth on the "root" Thunderbolt port?

Re:Woosh! (1)

jpapon (1877296) | more than 3 years ago | (#37031162)

Do outputs on an external Thunderbolt graphics card require any additional bandwidth on the "root" Thunderbolt port?

That's kind of an odd question. If you didn't put the outputs on the external card, then you would have to send your display data back over the "root" port, which would consume a huge amount of bandwidth. So compared to that, using the external ports consumes far less bandwidth.

Are you asking whether it uses more of the port's bandwidth than an external graphics card which has no external outputs and doesn't send output back to the host (i.e. it has no output at all)? In that case... presumably yes, external ports would require some small bandwidth overhead... but then your GPU would be a pretty useless coprocessor.

Re:Woosh! (2)

itsdapead (734413) | more than 3 years ago | (#37030916)

If the thing can't even handle two external screens, I doubt it'll handle an external screen and an external graphics card...

The "lite" thunderbolt chip on the Airs has zero practical consequences: The limitation on external screens ultimately comes from the on-CPU Intel HD Graphics which only support one DisplayPort and a maximum of two displays (including the built-in screen). The 13" MB Pro has the same limitation for the same reason.

The full-fat Thunderbolt chip supports a second physical Thunderbolt port (but only the iMac actually uses this) and can carry a second DisplayPort signal (only useful on the machines with Radeon graphics like the 15" and 17" pros). It would be completely pointless in an Air.

Re:Woosh! (1)

bemymonkey (1244086) | more than 3 years ago | (#37030952)

"The limitation on external screens ultimately comes from the on-CPU Intel HD Graphics which only support one DisplayPort and a maximum of two displays (including the built-in screen)."

Do you have a source for this? Even Intel's 4500MHD (back from the last Core2Duo mobile generation) was already capable of pushing two screens of 1080p, or possibly even one at 2560x1600 and another at 1080p at the same time (need to get my hands on one of the bigger screens to find out). I doubt that the latest gen is incapable of driving two external displays, at least at 1080p... 2x2560x1600 seems much more likely.

Re:Woosh! (1)

drsmithy (35869) | more than 3 years ago | (#37031078)

The "lite" thunderbolt chip on the Airs has zero practical consequences: The limitation on external screens ultimately comes from the on-CPU Intel HD Graphics which only support one DisplayPort and a maximum of two displays (including the built-in screen). The 13" MB Pro has the same limitation for the same reason.

Given Intel integrated GPUs in laptops were capable of driving a couple of 27" LCDs over displayport 2-3 years ago, I find that difficult to believe.

Re:What's the use case? (1)

GNious (953874) | more than 3 years ago | (#37030644)

Doesn't the MacBook Air now come with some Intel CPU-based graphics? Then I should think pretty much anything Village Instruments can come up with would improve matters.

Re:What's the use case? (1)

beelsebob (529313) | more than 3 years ago | (#37030842)

Modern, yes. Fast, not necessarily. The 13" MacBook Pro uses only the Intel integrated graphics, which... while much improved in Sandy Bridge, don't hold a candle to a real graphics card.

For me, a laptop that's thin, light, portable, but can be used to play games on and/or to do graphics work when I get home would be perfect. This dock (combined with a MacBook Pro 13 or MacBook Air) seems to fit the bill dead on.

Re:What's the use case? (1)

tepples (727027) | more than 3 years ago | (#37031526)

For me, a laptop that's thin, light, portable, but can be used to play games on

If it's games you want, any laptop will run an NES emulator, even an Atom netbook.

So why not just buy a proper gaming laptop? (0)

Anonymous Coward | more than 3 years ago | (#37030606)

Well, you don't. Clearly the use case here is as an addition to a notebook like the MacBook Air. While on the road its graphics performance may be sufficient, at home/work you may want a little more punch, which doesn't always justify an extra PC or Mac.

"So why not just buy a proper gaming laptop?" (4, Interesting)

macklin01 (760841) | more than 3 years ago | (#37030652)

"So why not just buy a proper gaming laptop?"

For docking stations and such. Plenty of us plop our laptop onto a docking station or a USB hub + monitor + speakers + keyboard + mouse anyway.

It beats the hell out of hauling an overpriced 10-pound beast to the same office desk every day, when you can just keep better equipment (with better ergonomics) neatly arranged and haul a lighter machine to/from work.

Re:"So why not just buy a proper gaming laptop?" (0)

Anonymous Coward | more than 3 years ago | (#37030666)

That's it. Also, just a thought, but at the moment the penalty between PCIe 4x and 16x is something like 3 frames a second. That's not really a problem when you're getting 50+ frames a second.

Re:"So why not just buy a proper gaming laptop?" (1)

Anonymous Coward | more than 3 years ago | (#37030688)

Some pretty graphs: http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/25.html

Re:"So why not just buy a proper gaming laptop?" (1)

Billly Gates (198444) | more than 3 years ago | (#37030742)

My ex-wife had one of those fancy Toshiba Qosmio gaming laptops. Let's look at it this way... she will never buy one again. It is bulky, hot, and needs a USB fan underneath so it won't catch on fire. They draw power, are highly unreliable, and black-screen all the time.

Get an AMD Llano if you want OK graphics at low to medium settings, as it has an ATI 68000 inside the CPU that shares its RAM controller, so it does not have latency waiting for the chipset to access the RAM. The AMD Bulldozer should come out in a month or two; it will offer higher bandwidth and will be competitive with dedicated-GPU systems. They fit in regular laptops, so you do not need a beast.

Windows 8 with accelerated HTML5 and Flash will fly on these. I can't wait; it will give Intel a run for its money if you care about graphics performance rather than just CPU benchmarks that ignore the video.

68000: What's old is new again (1)

tepples (727027) | more than 3 years ago | (#37031542)

it has an ATI 68000 inside the CPU

My Sega Genesis also has a 68000 inside the CPU [wikipedia.org] . What's old is new again.

Re:"So why not just buy a proper gaming laptop?" (0)

Anonymous Coward | more than 3 years ago | (#37030822)

You might want to consider one of them 'desktops'.
Try it, they're pretty neat.

Re:"So why not just buy a proper gaming laptop?" (1)

teh kurisu (701097) | more than 3 years ago | (#37030994)

This is a little bit off-topic, but does anybody know if there's a Thunderbolt docking station in the pipeline, from any manufacturer?

The MacBook Air + Thunderbolt Display combination has piqued my interest, because it provides a relatively full array of ports when plugged in. But I don't need an external monitor, just the docking station, and I've got no desire to splash £899 on a monitor that I don't want or need.

No bandwidth limiting yet (4, Insightful)

EdZ (755139) | more than 3 years ago | (#37030730)

Thunderbolt is only 4x PCIe 2.0, so you won't be using this to connect modern, desktop-class GPUs to your laptop

For multi-GPU systems in current desktops at least, there's little to no performance penalty going from 16x to 4x [googleusercontent.com] .

Re:No bandwidth limiting yet (1)

AHuxley (892839) | more than 3 years ago | (#37031182)

Will be great for a Mac mini. Energy use is low when working in OS X; then enjoy a Windows game at OK-ish fps for a while in Boot Camp.

Re:No bandwidth limiting yet (1)

StoneyMahoney (1488261) | more than 3 years ago | (#37031256)

I saw an article on Tom's Hardware Guide a few years ago where this very topic was examined by reducing the number of PCIe lanes available to what was, at the time, a high-end graphics card. There were drops in frame rate, but they were surprisingly small, with the difference between 4 and 16 lanes being about 10% IIRC. However, times have changed: games have been tailored to higher resolutions with larger textures, and modern graphics cards expect much higher available bandwidth than when the test was originally carried out. I think it's time to redo it.

Move the GPU into the monitor (1)

simonecaldana (561857) | more than 3 years ago | (#37030758)

Apple should simply integrate a modern GPU in their monitors: there's room to cool it properly, and the monitor becomes the definitive docking station. It would also justify the price, even if it's slightly increased.

Re:Move the GPU into the monitor (0)

Anonymous Coward | more than 3 years ago | (#37030922)

Need a video card upgrade?

Throw the monitor away!

You know Apple would make it non-replaceable...

Re:Move the GPU into the monitor (1)

simonecaldana (561857) | more than 3 years ago | (#37030960)

Need a computer upgrade? Throw the old one away.

That's Apple's policy on upgrades, and it always has been. That's why they don't sell low-end CPUs: the point is to make machines last just long enough for consumers to swallow this pill and buy another Mac.

Re:Move the GPU into the monitor (1)

itsdapead (734413) | more than 3 years ago | (#37030946)

Apple should simply integrate a modern GPU in their monitors: there's room to properly cool down and the monitor becomes the definitve docking station. It would also justify the price, even if it's slightly increased.

They do: it's called an iMac. They throw in a CPU and a hard drive, too.

Re:Move the GPU into the monitor (1)

simonecaldana (561857) | more than 3 years ago | (#37031342)

It lacks an iBicycle to power it up while you're not within reach of a wall socket.

Mac SCSI display (1)

dltaylor (7510) | more than 3 years ago | (#37030760)

I remember it (for the really early Macs), Wikipedia mentions it (no footnote), and I have an old SCSI spec' for it (and SCSI Ethernet) around somewhere.

Sounds like more of the same: connect a general-purpose interface to a box with some limited resource (no I/O slots in the original Mac; only a few dedicated mass-storage slots in most current portables) and someone will use the GP interface to run a display.

Not a bad idea, just not terribly original.

Re:Mac SCSI display (0)

Anonymous Coward | more than 3 years ago | (#37031582)

Sorta. The big difference here is that Thunderbolt is made from the ground up to speak other protocols. When someone finally creates a USB endpoint for this, it will speak USB over the Thunderbolt wire. Same for FireWire, et al.
SCSI cables can (as far as I know) only speak the SCSI protocol and can't efficiently embed other protocols.

This has been done before - AMD XGP (1)

Skowronek (795408) | more than 3 years ago | (#37030768)

AMD released this kind of product before - http://en.wikipedia.org/wiki/ATI_XGP [wikipedia.org] . It was generally considered a failure, partly because the software support was imperfect, and partly because people just didn't want to lug a dock/GPU box around. The hardware bandwidth was better, though, at 8 PCIe lanes rather than Thunderbolt's 4.

I find it amusing that the same ideas return and return in this industry, presented as an innovation every single time.

Re:This has been done before - AMD XGP (1)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#37030882)

There's actually an entire (rather obscure) industry of external PCI and PCIe connector products: ATI had their stab with the XGP, which unfortunately foundered because it only ever appeared in a few systems of no particular interest. Nvidia, to this day, has an external PCIe connector card for their higher-end Tesla products, for their legacy D870 enclosure or their current S2050 rackmount (both of which used a 16x PCIe interface card plus a proprietary cable to allow a normal desktop/workstation to connect to an external enclosure containing up to 4 Tesla cards). Various smaller outfits have had external PCI backplanes available for users who absolutely needed to run specialized ADC cards or similar industry-specific stuff on SFF workstations or laptops in the field. The Magma [magma.com] guys have been at it for a while now...

The question is not of Apple being innovative, which they wouldn't be; but whether Apple will, through force of just deciding that this is the new baseline, create a market large enough that such expansion gear won't be obscure, expensive specialty hardware...

Re:This has been done before - AMD XGP (1)

UnknowingFool (672806) | more than 3 years ago | (#37031528)

The question is not of Apple being innovative, which they wouldn't be; but whether Apple will, through force of just deciding that this is the new baseline, create a market large enough that such expansion gear won't be obscure, expensive specialty hardware...

Since Thunderbolt is an Intel technology, I would say it isn't being driven by Apple exclusively. That was the problem the ATI and nVidia products had in getting traction. I think there is now a Sony Vaio Z laptop that has Thunderbolt and a dock. As for expensive: initial hardware will be, until there is more competition.

Gaming + laptop = contradiction (4, Insightful)

Stormwatch (703920) | more than 3 years ago | (#37030844)

I just don't understand the purpose of a high end gaming laptop. It's always quite a bit more expensive than the equivalent desktop; and ultimately you're playing on a small screen, with a cramped keyboard and an imprecise pointing device, in a far less comfortable way... unless you plug the laptop into an external screen, keyboard, and mouse, so what was the point of a portable anyway?

Re:Gaming + laptop = contradiction (1)

lawyer boy (152954) | more than 3 years ago | (#37030862)

I play World of Warcraft on a MBP over WiFi while reclining on the couch. PVP is a challenge because mouse turns are hard to do with a trackpad, but I can raid without difficulties. I assume that most people play games on a desktop, but I can't imagine sitting in a chair for hours like that. YMMV.

Re:Gaming + laptop = contradiction (1)

misexistentialist (1537887) | more than 3 years ago | (#37031012)

Could use a monitor arm.

[Posted from my bed]

Re:Gaming + laptop = contradiction (1)

myspys (204685) | more than 3 years ago | (#37030912)

Most of the time my laptop moves between the office and my home. Once in a while, I want to play a game (or two) and for quite obvious reasons, I don't want a gaming rig (ugliness and space).

I'd rather have a small device I connect between my laptop and my screen (external screen both at home and at work) for when I need to play.

Point of a portable? Being able to use it as a portable computer MOST of the time.

Re:Gaming + laptop = contradiction (1)

Chris Mattern (191822) | more than 3 years ago | (#37030938)

A laptop is much more convenient to take to a LAN party than a desktop.

Re:Gaming + laptop = contradiction (1)

iMouse (963104) | more than 3 years ago | (#37031038)

What is this "LAN party" you speak of? Gamers getting together in one place? Sounds so 90's...

Re:Gaming + laptop = contradiction (1)

tepples (727027) | more than 3 years ago | (#37031576)

What is this "LAN party" you speak of? Gamers getting together in one place? Sounds so 90's...

I'd hate to live in a world that's so isolating that friends and family members never visit one another.

Re:Gaming + laptop = contradiction (3, Interesting)

ledow (319597) | more than 3 years ago | (#37030950)

I used to think this. Then, by chance, my workplace bought me one. I'd specified nothing more than "Must have Intel chip, more than one core, and an nVidia graphics card" - for convenience, compatibility with my existing disk images, etc. and to suit all the tasks I do during the average work day.

I ended up with an MSI gaming laptop - my workplace didn't even realise that the rucksack and mouse it came with were anything more than "freebies" even though the mouse was one of those stupidly expensive ones that has multiple DPI modes, weights for balance and all sorts of other shite (but, hell, it's a very good mouse).

They didn't even care that the WASD keys were highlighted or that it had all sorts of gaming features like a touch-button to overclock both processor and graphics (2 year warranty not applicable...). Apparently it was a super-cheap deal and even now I can't get the same laptop or any equivalent for even half the price they paid for it.

I have to say - it's been wonderful. I've always had a dedicated "games" machine in the past and never had the money for this sort of laptop and probably would never have bought it for myself. I threw 300 Steam games at it and it laughed at every single one (I've always played the defaults that games offer but now I can actually ramp up to maximum easily).

It has a huge screen that, even as up close as being laptop-range, you can really appreciate every pixel. It does HD video like I was asking it to add 2+2. The processor laughs at my Eclipse platform and compiles take no time at all. I've never NEEDED to press the overclock button for any reason, ever, at all. It has all the usual gadgets (webcam, bluetooth, wifi, even an "eco" button) and some more unusual (e.g. an external wifi antenna port!).

It has a huge (full) keyboard that's ideal for typing AND gaming. It has a solid aluminium construction that has so far suffered more and survived better than any other laptop I've ever seen in my life (and has a custom-designed backpack to carry it in that holds more weight than I ever thought a backpack like that could). The sound is amazing and the first full 7.1 setup I've ever owned (hell, I've never bothered with anything but stereo before - and this is WITHOUT having to plug any speakers in) and it's the LOUDEST laptop I've ever heard (you can easily watch a DVD on a crowded noisy airplane, or in a room with the TV on, and hear every word - and the positional audio still works in those circumstances).

I would never have touched this laptop in a million years, much preferring two or three more ordinary ones instead. But now I'm trying to find this EXACT laptop again for myself at a decent price. It's really changed the way I used my computer and I use the laptop exclusively now. There's nothing better than having a machine that you can use all day at work for menial tasks and then have that same machine at home to play anything you throw at it, and take the same machine with you on holiday and have it do everything you need/want while you're away too.

Plus, gaming laptops have huge advantages in terms of some basic specifications - big GPUs that you just don't get on business laptops, great for video encoding - large amounts of RAM, big screens, every port imaginable, full keyboards that you can get to every key easily, and a lot of money spent on making it feel "right" and solid. I can type on this laptop all day long, go home and type on it for hours, and then take it on the road and type on it for even longer and not fatigue. Even the mouse is the most comfortable that I've ever used.

I would never pay what I see as the gaming premium (similar to the wedding premium - a £5 cake suddenly costs £50) but a single gaming laptop changed it for me. It's not like this is even a model that *pretends* to be gaming while actually being general purpose - the WASD are marked and everything about it says "gaming laptop". But it laughs at everything you throw at it because, compared to a top-of-the-line game, you are asking nothing from it.

Hell, if I could afford it I'd buy ten of these things to keep me going for the next few decades. The point of a good gaming laptop is not to be a substandard gaming device, it's to be something that laughs at every job you throw at it and make you feel comfortable at all times even if you're on it for hours. You don't have a small screen (because you're so much closer), you don't have a cramped keyboard, you don't have an imprecise pointing device (external gaming mouse aside, the touchpad on this thing is the toughest and nicest thing I've ever used), and it's infinitely more comfortable AND does everything else you do. A gaming desktop can't do laptop things. A non-gaming laptop can't do gaming things. But a gaming laptop, although technically slightly "inferior" to a top-of-the-line gaming desktop, is the best compromise you'll ever see and does everything.

In all the years I've been doing IT, this is the one machine that I absolutely will NOT let out of my hands because the single feature of having one machine that does EVERYTHING to a good standard (and lets you game hard at home, or browse lightly on the move) is worth its weight in gold. I have an agreement with my employer that the laptop will become mine should I leave for any reason (I will pay what they consider to be the correct portion of their purchase price for it).

Re:Gaming + laptop = contradiction (1)

del_diablo (1747634) | more than 3 years ago | (#37030988)

Somebody mod up.
This is not 2002 anymore: Laptop CPUs and laptop GPUs have gotten quite decent over the years.
And besides: Nothing beats sitting down comfortably in the couch, and game hard.

Re:Gaming + laptop = contradiction (1)

mlk (18543) | more than 3 years ago | (#37030976)

I loved my 17" screen on my old Alienware laptop. It was the perfect size and was not crippled by the current HD fad. I'd happily swap my two current 19" "HD" screens for 2 17" 1920x1200(ish) screens.

> cramped keyboard

Take a look at a high end Alienware (ignore the ugliness for a moment; it is hard, I know). That is not a "cramped" keyboard.

> imprecise pointing device

Agreed, but have you seen the size of a mouse? They are tiny. Most laptop bags even include a little pocket just the right size for a mouse.

> what was the point of a portable anyway

When I bought my Alienware I lived & worked in two places - the UK and Greece. So it made sense to me at the time to be portable. Not portable in the sense that a phone is portable (I can whip it out anywhere and "work") but in that I could put my life into two bags and move. Something you cannot do with a desktop; you can with a laptop and a mouse. Playing video games is a big part of my life, so it should be covered by what goes into my bag. You might not have a use case for a gaming laptop, but that does not mean they do not exist.

Re:Gaming + laptop = contradiction (0)

Anonymous Coward | more than 3 years ago | (#37031048)

I just don't understand the purpose of a high end gaming laptop.

If you're a CAD/CAM person and you need to go to a customer's office to demo the designs you have to date for feedback, you need the power to run the software. (Assuming they can't come to your office.)

Bigger screen than a DS (1)

tepples (727027) | more than 3 years ago | (#37031598)

I just don't understand the purpose of a high end gaming laptop.

I can see two reasons. For one thing, unlike a desktop PC, a laptop can play video games while on an airplane or a Greyhound bus. For another, Chris Mattern mentioned LAN parties. (These wouldn't be quite as necessary if more PC games supported split-screen co-op, but that's a discussion for a different day.)

and ultimately you're playing with a small screen

It's far bigger than both screens of a Nintendo DS put together, or even a 3DS.

Re:Gaming + laptop = contradiction (1)

gilesjuk (604902) | more than 3 years ago | (#37031606)

People who are fanatical gamers but also have a job where they are on the road a lot?

So why not just buy a proper gaming laptop? (3, Insightful)

mlk (18543) | more than 3 years ago | (#37030860)

They are bloody heavy and expensive. And when you drop it in an airport... :sob: (ex-Alienware laptop owner).

This seems like an interesting idea: get a mid-range laptop (£500 will get you an i5 with a smallish screen) and then add this and a nice big monitor for home use. That way I can get in the odd game of TF2 and get my work done while out and about, but get home and play something a little more taxing.

Why to buy gaming laptop at all? (0)

Anonymous Coward | more than 3 years ago | (#37030886)

If you are so addicted to gaming that you need a portable gaming machine, then better stay at home. DUDE

CUDA (1)

Anonymous Coward | more than 3 years ago | (#37030906)

More interesting is whether CUDA would run across this interface. Running a Tesla board (or just a Fermi based GPU) from a laptop would be a major benefit for scientific research for which there is lots of CUDA accelerated software.

Re:CUDA (0)

Anonymous Coward | more than 3 years ago | (#37030966)

Or OpenCL, for the obligatory bitcoin post!

Re:CUDA (1)

TheRaven64 (641858) | more than 3 years ago | (#37031254)

I R'd TFA, and it seems that calling it an external graphics card is actually a bit misleading. It's basically a breakout box that turns the PCIe channel in Thunderbolt into a dedicated PCIe slot. This is a lot more interesting, because it means that you can plug any PCIe card into it, not just a graphics card. Thunderbolt's PCIe looks just like normal PCIe to the rest of the kernel, so you should be able to use any card that you have drivers for. If you're transferring a lot of data, you'll notice the bandwidth limit a bit, but if you're doing something that fits in VRAM and then does a lot of processing on it, a Tesla board should be very fast. You could alternatively plug in an FPGA dev board.

This is particularly interesting, because it means that one of the big reasons for using a desktop over a laptop - being able to plug in that one expansion card that only comes in desktop form factors that your business needs - no longer exists.

Mythtv low res app? (1)

vlm (69642) | more than 3 years ago | (#37030982)

The only problem is, Thunderbolt is only 4x PCIe 2.0, so you won't be using this to connect modern, desktop-class GPUs to your laptop

My recent interest is hardware mpeg decoding to low resolutions like 1080 HDTV (I haven't owned a computer monitor smaller than 1600x1200 since the 90s, so HDTV does seem low res to me, both absolute res and especially by DPI).

I'm curious if "something like this" would have enough horsepower to be a mythtv frontend. My gut level guess is, "probably yeah". I love my mythtv system...

Re:Mythtv low res app? (1)

StoneyMahoney (1488261) | more than 3 years ago | (#37031314)

It'll quietly piss all over something as simple as MythTV.

Also, 16x PCIe 1.0 was shown by Tom's Hardware Guide to be pretty massive overkill a few years ago. Even taking into account increased texture resolutions pumping larger quantities of data over the bus, I suspect the performance hit will be pretty minimal - I'd estimate about 5% based on Tom's old figures but I'd still like to re-run the test on up-to-date hardware to find out.

Re:Mythtv low res app? (1)

UnknowingFool (672806) | more than 3 years ago | (#37031600)

I'm curious if "something like this" would have enough horsepower to be a mythtv frontend. My gut level guess is, "probably yeah". I love my mythtv system...

Well, the MythTV wiki seems to think that the Intel HD 2000 and 3000 on the new Core i-series are sufficient [mythtv.org] for it. That is without VAAPI support, which is currently scheduled for 0.25.

Thunderbolt == Docking port (1)

Trevelyan (535381) | more than 3 years ago | (#37031024)

Apple hasn't marketed it as such, at least not in this neck of the woods, but Thunderbolt is clearly a docking port. The first one ever on a MacBook!! (That I know of)

Take a look at their new Thunderbolt display. With one cable connection, your MacBook gets network, sound, firewire, USB and power(!), all via your external Display. No need to attach a second cable.

Considering that Thunderbolt already carries a DisplayPort connection, I don't see the benefit of connecting a second graphics card over the PCIe channel. Some say it's to have a more powerful card, over 4x PCIe 2.0, for games. However, lots of suppliers have had ExpressCard-attached GFX cards (also for MacBooks), but none work with Macs because Apple won't play fair with regard to GFX drivers in OS X.

In the end, to be honest, I find it far more exciting that I can finally replace the 8 cables that I have to plug into my MacBook with just one.

Re:Thunderbolt == Docking port (1)

iMouse (963104) | more than 3 years ago | (#37031088)

I believe the last time Apple had any type of docking port was in the early PowerBook / PowerBook Duo days (DuoDock). You could probably think of the MacBook Air as a modern PowerBook Duo. The Duo was designed to stay light and slim by leaving all the bulky I/O and modular drives in the docking station. It would be really nice to see Apple get back to this with Thunderbolt/MacBook Air.

If people are worried about the graphics performance of a card in the external PCIe/Thunderbolt enclosure off of a MBP, it will be even worse with the MacBook Air. The Thunderbolt controller is about 1/2 the performance of that in the iMac, MacBook, MacBook Pro and I assume the mini.

Re:Thunderbolt == Docking port (1)

Amarantine (1100187) | more than 3 years ago | (#37031514)

With one cable connection, your MacBook gets network, sound, firewire, USB and power(!), all via your external Display. No need to attach a second cable.

No power. That is supplied through, eh, a second cable.

What does Intel GMA stand for? (1)

tepples (727027) | more than 3 years ago | (#37031618)

I don't see the benefit of connection a second graphics card over the PCI-e connection.

If you've ever tried to game on an Intel "Graphics My Ass", you would.


Add these to the external displays (1)

franciscohs (1003004) | more than 3 years ago | (#37031046)

Apple got it right with the new displays that act as a docking station, providing USB ports, gigabit ethernet, another thunderbolt port, etc. Add a graphics card to it and you have the perfect docking station.

Idiot (0)

Anonymous Coward | more than 3 years ago | (#37031106)

Yeah, because no one moves around with their laptop and then docks it to use on a desk.

64 lane traffic on a 10 lane highway? (1)

mastermind7373 (1932626) | more than 3 years ago | (#37031450)

10 Gb/s? Only 10 Gb/s? They want to drive a low-latency GPU on only 10 Gb/s? I think they forgot the whole "computer thing" when they cooked up this piece of crap. Unless the GPU is a 7600GT, this is a useless idea. Any card within the last 2 generations consumes most of the 16x PCIe 2.0 bandwidth in texture and physics memory swaps (not transfers but swaps; latency makes a huge difference here). CUDA eats through that bandwidth like a starved bear. 10 Gb/s? Freakin' useless, and not just because of the speed, but also the increased latencies.
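The raw numbers behind this complaint are easy to sanity-check. A minimal back-of-the-envelope sketch (per-lane rate and the 8b/10b encoding factor are the PCIe 2.0 figures; real-world throughput is lower still because this ignores protocol overhead and, as the comment stresses, latency):

```python
# Rough comparison of raw link bandwidth. Ignores protocol overhead
# and latency, both of which matter for GPU workloads.

GBIT = 1e9  # bits per second

links = {
    "Thunderbolt (PCIe channel)": 10 * GBIT,            # 10 Gb/s per direction
    "PCIe 2.0 x4": 4 * 5 * GBIT * 0.8,                  # 5 GT/s/lane, 8b/10b -> 80% usable
    "PCIe 2.0 x16 (desktop GPU)": 16 * 5 * GBIT * 0.8,
}

for name, bps in links.items():
    print(f"{name}: {bps / 8 / 1e9:.2f} GB/s")

ratio = links["PCIe 2.0 x16 (desktop GPU)"] / links["Thunderbolt (PCIe channel)"]
print(f"A desktop x16 slot has {ratio:.1f}x the raw bandwidth of one Thunderbolt channel")
```

By this crude measure a desktop x16 slot offers several times the raw bandwidth of a single 10 Gb/s Thunderbolt channel, which is the gap the comment is pointing at.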

MACs Making Gaming Machines? (0)

Anonymous Coward | more than 3 years ago | (#37031496)

MACs have games?

can vs. should (1)

ChemGeek4501 (1044544) | more than 3 years ago | (#37031622)

It sounds to me as though this is a case of "Just because something CAN be done, doesn't mean it SHOULD be done"
