
Wireless PCIe To Enable Remote Graphics Cards

timothy posted more than 4 years ago | from the no-cat-involved dept.

Displays 181

J. Dzhugashvili writes "If you read Slashdot, odds are you already know about WiGig and the 7Gbps wireless networking it promises. The people at Atheros and Wilocity are now working on an interesting application for the spec: wireless PCI Express. In a nutshell, wPCIe enables a PCI Express switch with local and remote components linked by a 60GHz connection. The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor. wPCIe works transparently to the operating system, which only sees additional devices connected over PCI Express. And as icing on the cake, wPCIe controllers will let you connect to standard Wi-Fi networks, too."


I must admit... (1)

magsol (1406749) | more than 4 years ago | (#32906974)

That is coooooooooooool.

Re:I must admit... (1)

parlancex (1322105) | more than 4 years ago | (#32907046)

Although interesting, I imagine it will be impractical for many devices that would be adversely affected by the latency, even if the bandwidth were suitable. Additionally, wireless links come with a certain amount of packet loss. Even if they encapsulate the PCI bus protocol over a reliable stream with buffers large enough to cover retransmissions, the resulting latency spikes would most certainly cause strange behavior for many devices, since modern OSes probably assume that if a PCI device does not respond quickly, something horrible has gone wrong.

Re:I must admit... (5, Informative)

iamhassi (659463) | more than 4 years ago | (#32908036)

"I imagine it will be impractical for many devices"

You're right, and the summary is wrong and the article's a bit misleading.

"... will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor."

Sorta... PCIe 16x is 16 GB/s [wikipedia.org], that's with a big B for bytes. They're hoping for 7Gbps, or 875 MB/s [techreport.com]: "the spec should move "quickly" to 7Gbps (875MB/s)." That's roughly 1/20th the speed of 16x PCIe. They might be able to do PCIe x1, but that's it.

If they had read the whitepaper, it's all explained [wilocity.com]:

"A reliable wPCIe connection can be maintained with a relatively low data rate channel. However, to achieve meaningful performance between local and remote devices, the data rate needs to be on the order of 2 Gbps, or that of a single lane of PCIe. The only practical wireless channel that can support this capacity is 60 GHz."

So basically this can transfer wirelessly at ~500+ MB/s, so you can have a wireless BD-ROM, wireless hard drives, and yes, even wireless displays, since it's fast enough to transfer 1080i without any compression. [ttop.com] But I'm sorry to dash the hopes of anyone who thought they could someday upgrade their laptop's video card by simply buying a wireless external Radeon HD 5970 or GeForce GTX 480: you will still need a GPU connected by 16x PCIe to process the video and then stream it, similar to what the OnLive Remote Gaming Service offers now. [slashdot.org]

Re:I must admit... (4, Funny)

Peach Rings (1782482) | more than 4 years ago | (#32907050)

You'd better get used to your computer experience looking like thaaaaaaaaat if your display has to be sennnnnnnnt over a wireless linnnnnk.

Re:I must admit... (4, Informative)

inKubus (199753) | more than 4 years ago | (#32907246)

Not to mention security. I mean, you thought Tempest [wikipedia.org] was bad before, now I can wirelessly sniff and alter PCI traffic, which is a direct conduit into the RAM.

Re:I must admit... (1)

Peach Rings (1782482) | more than 4 years ago | (#32907396)

Does the PCI bus really work that way? Are you sure that the device controls where the data goes into memory? I would have thought that the destination is safely set up in software to point somewhere harmless like a raw data buffer, and then the device dumps into that spot.

Re:I must admit... (5, Interesting)

JesseMcDonald (536341) | more than 4 years ago | (#32907688)

Some recent systems have IOMMUs, which provide privilege separation between hardware devices much like normal MMUs govern software. However, unless this sort of IOMMU is active, PCI and PCIe hardware is generally capable of transferring data to or from any other connected device, including any area of system RAM. Sometimes this even extends to external interfaces; for example, people have been known to take advantage of the DMA capabilities of the Firewire protocol to read the contents of RAM on an active system.

In general, non-hotpluggable hardware has been granted the same level of trust as the OS kernel, so no one worried very much about it. IOMMUs were more about protecting against faulty or corrupted software (device drivers) than malicious hardware. However, more and more hardware is hotpluggable these days. Also, some software interfaces are becoming too complex to really trust—consider, for example, the interface to a modern GPU, which must transfer data to and from RAM, and perhaps other GPUs, under the control of code provided by user-level application software (shaders, GPGPU). Without an IOMMU it is up to the driver software to prove that such code is perfectly safe, which is an inherently hard problem.
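The protection an IOMMU adds can be sketched in a few lines. This is a deliberate simplification with hypothetical names (one contiguous window per device); real IOMMUs such as Intel VT-d or AMD-Vi enforce this per device through page tables:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical sketch: the check an IOMMU applies to every DMA request.
 * A device may only touch the region its driver mapped for it; anything
 * else is rejected instead of silently hitting arbitrary RAM. */
struct dma_window {
    uint64_t base;  /* first bus address the device may access */
    uint64_t len;   /* size of the permitted region, in bytes  */
};

static bool dma_allowed(const struct dma_window *w,
                        uint64_t addr, uint64_t size)
{
    if (size == 0 || addr + size < addr)   /* reject empty and wrapping */
        return false;
    return addr >= w->base && addr + size <= w->base + w->len;
}
```

Without a check like this (the pre-IOMMU situation described above), a bus-mastering device can issue reads and writes anywhere, which is exactly what the Firewire DMA attacks exploited.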

Re:I must admit... (1)

tatsu69 (59184) | more than 4 years ago | (#32907740)

Yes it really does work that way. Any PCI device in your system can read/write to any location it can address. If the device only has 32-bit PCI then it is limited to the lower 4G of memory space, if it is 64-bit PCI then it can go anywhere.

There is an IOMMU (http://en.wikipedia.org/wiki/IOMMU) but I am not very familiar with it. More modern machines than I was working with would probably implement this for protection from the device.

Re:I must admit... (1)

SpazmodeusG (1334705) | more than 4 years ago | (#32908090)

Yes. All low level devices are wired to the CPU by the memory bus. Writing data to a PCI card is simply a matter of writing to a certain memory address. The PCI card will see a specific address on the memory address bus and know the data on the data bus is intended for it.
It's not like x86 CPUs have one bus for devices and one for memory.

Re:I must admit... (4, Insightful)

blueg3 (192743) | more than 4 years ago | (#32908010)

You're unlikely to be able to *alter* PCI traffic, though you could perhaps *insert* PCI traffic.

Still, people figured out properly encrypting wireless links some time ago. Tempest is primarily interesting because the signals you're looking at are unintentional (and often unknown) side effects and they often deal with links that are impossible or unreasonable to encrypt.

Re:I must admit... (1)

grcumb (781340) | more than 4 years ago | (#32908176)

Not to mention security. I mean, you thought Tempest [wikipedia.org] was bad before, now I can wirelessly sniff and alter PCI traffic, which is a direct conduit into the RAM.

Yep. Can't wait. Endless fun and games at the next Powerpoint presentation by Corporate.

Re:I must admit... (1)

pckl300 (1525891) | more than 4 years ago | (#32907248)

Can't they develop a really high-bandwidth external graphics port designed for this sort of thing? I'm thinking along the lines of eSATA, but for graphics. Maybe ePCIe?

Re:I must admit... (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#32907296)

What would that other e stand for?
Express PCI Express?

Re:I must admit... (1)

pckl300 (1525891) | more than 4 years ago | (#32907404)

External PCI Express. Like I said, similar to eSATA

Re:I must admit... (1)

kevinmenzel (1403457) | more than 4 years ago | (#32907416)

External PCI express... I know I know ... "woosh" or something.

Re:I must admit... (2, Informative)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#32907364)

There have been various ad-hoc solutions to the problem, but nothing standardized has yet hit the field, though the PCI-SIG has an initial standard [pcisig.com]. These guys [magma.com] are representative enough of the sort of products actually available, usually break-out boxes to allow laptops or undersized desktops to run a few more cards. A few more specialized instances, for the laptop market, have consisted of basically your usual docking station, but with a cable that plugs into an ExpressCard port rather than a proprietary connector.

Re:I must admit... (1)

sexconker (1179573) | more than 4 years ago | (#32907544)

They did [google.com] .
And no one used it.

Re:I must admit... (4, Informative)

Entropius (188861) | more than 4 years ago | (#32907692)

It's not that bad -- I've done it before.

X Windows over plain old wifi.

Re:I must admit... (1)

morcego (260031) | more than 4 years ago | (#32907462)

Is it? From the specification, you can read:

Support for beamforming, enabling robust communication at distances beyond 10 meters

So, the standard range is less than 10 meters? This is anything but awe-inspiring.

Maybe the text is misleading, and it is not a standard 10m range. But that is the impression I get...

Re:I must admit... (1)

Rivalz (1431453) | more than 4 years ago | (#32907596)

I just hope Apple isn't behind this technology.

You'll get a degraded signal if you hold or sit next to your computer. And if someone is using a microwave within 100 meters, you lose the signal completely. But that is a feature, so you know when your food is done cooking.

Re:I must admit... (1)

h4rr4r (612664) | more than 4 years ago | (#32907626)

What did you expect at 60GHz?
This signal will not penetrate a sheet of paper.

Re:I must admit... (1)

peacefinder (469349) | more than 4 years ago | (#32908006)

That was my first thought as well.

My second thought was "I wonder how they're going to handle security and authentication?" Which rather took the shine off my first thought, I'm afraid.

Finally! (1)

binarylarry (1338699) | more than 4 years ago | (#32906982)

A solution to all of OnLive's problems! Now they'll be able to put an access point in every neighborhood!

like cable vod? wait having it at the headend soun (1)

Joe The Dragon (967727) | more than 4 years ago | (#32907420)

Like cable VOD? Wait, having it at the headend sounds like a better idea than what they have now.

Re:Finally! (1)

h4rr4r (612664) | more than 4 years ago | (#32907636)

At 60GHz those homes better be very close together and very small. Is your house wider than 10m?

hm (0)

Anonymous Coward | more than 4 years ago | (#32906984)

what if I don't read Slashdot?

Re:hm (3, Funny)

Anonymous Coward | more than 4 years ago | (#32907048)

Then it won't make a sound.

Question (3, Insightful)

The Clockwork Troll (655321) | more than 4 years ago | (#32906994)

To those in the know, why will this succeed where UWB/wireless USB failed in the market?

Remote graphics seems like an even more esoteric need than the remote mass storage, printing, and cameras that UWB would have offered.

Re:Question (3, Insightful)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#32907490)

I have no idea whether it will go anywhere; but I'd assume that the one major strike in its favor is that, unlike wireless USB, wireless PCIe addresses use cases that basic boring ethernet/wifi do not.

The performance of early wireless USB hardware was pretty shit, and it was uncommon and ill standardized, so you usually still had to plug a dongle in, just to get performance worse than plugging in a cable. Plus, basic NAS/print server boxes had become really cheap and fairly easy to use. Anybody who wasn't a technophobe or living in a box(and thus not the target market for pricey and sometimes flakey wireless USB) already had his mass storage and printers shared over a network, wired or wireless, and his human interface devices wireless via bluetooth or proprietary RF, if he cared about that. Wireless USB didn't really enable any novel use cases that anybody cared about.

On the other hand, there is basically no way of plugging in "internal" expansion cards over a network (in the home context; I'm sure that some quite clever things have been done with I/O virtualization over infiniband, or whatever). Particularly with the rise of various "switchable graphics" technologies, I assume that the use case is basically this: User has a nice thin, light, long-running laptop. They come home, sit within a dozen meters of a little box (containing a graphics card or two, with one head connected to their TV), and suddenly their laptop has a screaming gamer's graphics card supplementing the onboard card, either usable on the built-in screen, or via the second head connected to the TV, or both. (Analogs could be imagined for professional mobile workstation applications, where simply sitting near your desk connects you to a quartet of CUDA cards and an SAS controller with 4Us worth of drives hanging off it.)

Will the market care enough to bring the volume up and the price down? I have no idea. However, it at least has the advantage of allowing things not otherwise possible, unlike wireless USB, which pretty much covered the same ground as a mixture of bluetooth peripherals and resource-sharing protocols over TCP/IP, but years later and without the standardization.

Re:Question (1)

Urza9814 (883915) | more than 4 years ago | (#32907514)

Yes, but remote graphics is much more difficult to do over WiFi. When you already have a router, why buy wireless USB or UWB devices, which you need a special dongle or card for, when you can just buy one with WiFi and be done with it? Plus, if you're doing wireless, you're likely using it for multiple PCs, which is even more reason to go with something you already have. Who's going to buy a separate $20-$50 dongle for every computer they want to print from, for example, when they don't need to? 802.11g offers plenty of speed for printing and cameras, and while it's a far cry from USB 2.0 for remote storage, I would imagine that 54Mb/s is good enough for a large portion of users.

Re:Question (1)

johnhp (1807490) | more than 4 years ago | (#32908004)

I disagree that the need for this kind of thing is esoteric.

I live in a house where there are three computers and three people who use them to play games. As it is, if we're all going to play the newest games (which we don't) we would need to keep three computers upgraded. If the graphics processing was shared by all three from some kind of household graphics server, it would certainly be cheaper to meet everyone's needs.

Hardware graphics acceleration is definitely catching on for general use. Windows 7 and Ubuntu both use it in their regular desktop environment.

Good news, everyone! (4, Insightful)

Drakkenmensch (1255800) | more than 4 years ago | (#32906998)

We'll soon have ONE MORE wireless signal to keep track of, when all those we already have work so well together!

Re:Good news, everyone! (1)

sharkey (16670) | more than 4 years ago | (#32907528)

Even better, no more holding your balls in the microwave to get cancer! Getting that medicinal marijuana prescription will be easier than ever.

Re:Good news, everyone! (1)

Fallingcow (213461) | more than 4 years ago | (#32907718)

Yeah, this'll be awesome for early adopters and then start to suck as soon as their neighbors get it, kind of like how 802.11g sucks but N is still OK (for now).

Also, people getting pwned in online games will stop saying "fucking lag!" and start saying "goddamn microwave!"

Watch out! (1)

beaviz (314065) | more than 4 years ago | (#32907012)

This could take rickrolling to a whole new level.

"Band"-aid (1, Interesting)

Ostracus (1354233) | more than 4 years ago | (#32907014)

Nice, but what's the range, and is the spectrum licensed, or will we end up dealing with a "tragedy of the commons" much like the 2.4 GHz band?

Re:"Band"-aid (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32907150)

Wait, are you saying that not requiring a license for 2.4GHz was a bad thing? That's the only reason wifi took off.
What sucks about wifi is the minuscule width of that unlicensed band.

Re:"Band"-aid (1)

Peach Rings (1782482) | more than 4 years ago | (#32907212)

It's unlicensed. If it were wider, wireless phones and stuff would just use the entire wider band. We've seen this before with 802.11n: "Why let different carriers broadcast simultaneously on different bands when we can just take the entire spectrum and make our network super fast?"

Re:"Band"-aid (4, Informative)

balbus000 (1793324) | more than 4 years ago | (#32907220)

Very short. It has a hard time going through air; walls - forget it.

Just skip the video card (0)

Anonymous Coward | more than 4 years ago | (#32907022)

Beam it into my brain.

Want, but... (0)

Anonymous Coward | more than 4 years ago | (#32907026)

For an external graphics card, I have to wonder if its 7Gbps throughput is going to be limited by the time necessary to transform wireless packets into FSB data streams on either end. I suspect the bottlenecks there might be substantial.

That said, if it works as advertised, this could be the new holy grail for laptop gaming: no more would graphics power be limited by soldered-in cards and upgrades requiring a replacement laptop; just add a few more wireless graphics cards.

Re:Want, but... (1)

Peach Rings (1782482) | more than 4 years ago | (#32907234)

Just change a few physical constants to open up some more bandwidth.. emacs has a command for that right?

Great Breakthrough, Limited Performance (1)

Revotron (1115029) | more than 4 years ago | (#32907042)

That sounds like a wonderful idea and the thought of having a wireless graphics card for a laptop is very tempting.

But how much performance can we really squeeze out of it? I mean, for a power user who wants a higher resolution than his integrated card can offer, it's a godsend. But for gaming? No way.

Also, I'll admit I'm not very wise on the technical details of PCIe, but if you're putting all of the above-mentioned devices in contention for 7Gbps of bandwidth, there's really not a lot you can milk from it in terms of real horsepower. One PCIe 2.x graphics card would shut out everything else, or be severely lacking in performance. If you want PCIe 3.x, forget about it.

OTOH, anyone gaming on a laptop and expecting "performance" comparable to a desktop is daft.

Re:Great Breakthrough, Limited Performance (1, Flamebait)

geekoid (135745) | more than 4 years ago | (#32907084)

" But for gaming? No way."

"I'll admit I'm not very wise on the technical details of PCIe,"

well, at least you didn't let your ignorance stop you from making up your mind.

Re:Great Breakthrough, Limited Performance (0, Flamebait)

Revotron (1115029) | more than 4 years ago | (#32907470)

At least your arrogance is actually assisting you in being an asshole.

When I say technical details, I mean highly technical details above even most of the Slashdot crowd. I know bus speeds, versions, bandwidth and what the fuckin' connector looks like. And that combined knowledge is enough to tell me that 7Gbps isn't enough to effectively run a modern PCIe 2.0 graphics card on all 16 lanes at full blast. (which would take 8Gbps, and sometimes even that's not enough, hence PCIe 3.0 running at 16Gbps)

Re:Great Breakthrough, Limited Performance (0, Offtopic)

geekoid (135745) | more than 4 years ago | (#32907912)

It wasn't an asshole statement. Based on YOUR post, you didn't know the technical details of PCIe.

And you made up your mind about the technology YOU said you didn't know.

" I know bus speeds, versions, bandwidth and what the fuckin' connector looks like."
Well, you didn't say that in your post now, did you?

Many games, if not most, don't need it to run at 'full blast'.

It's in no way arrogant to point out that, based on your post, you didn't know about it but still made up your mind.
You are being arrogant in making the assumption that everyone knows you and knows what you know.

Re:Great Breakthrough, Limited Performance (0, Flamebait)

geekoid (135745) | more than 4 years ago | (#32908022)

Since I am still fuming from my flamebait mod and your arrogance in assuming I should have known everything you know, there is more.

PCIe 2.0's actual data transfer is 5G less the 20% encoding overhead. A significant portion of the 3.0 speed increase comes from scrambling, which has much less overhead. So you could get more real data than PCIe 2 using the PCIe 3 overhead model. Meaning, if it uses PCIe 3, you could probably get 6G prior to overhead, which is more than PCIe 2 in terms of usable data delivered. Granted, to get to PCIe 3 speeds, they would need to do some channel magic.

But hey, since you can recognize the connector, I guess you know that... see, NOW I'm being an asshole.

Re:Great Breakthrough, Limited Performance (1)

geekoid (135745) | more than 4 years ago | (#32907110)

I also forgot: you can get laptops, right now, that are comparable to the desktop. I consider them a complete waste, but that's beside the point.

Re:Great Breakthrough, Limited Performance (1)

MozeeToby (1163751) | more than 4 years ago | (#32907278)

I only paid around $1k for my laptop recently and there are no games currently available that I can't run on at least medium settings. And that's even opting for a better screen than graphics card when I made my purchase. Can I game at super hacker leet graphics levels? No, but I can play all modern games with decent settings and a decent framerate.

Re:Great Breakthrough, Limited Performance (0)

Anonymous Coward | more than 4 years ago | (#32907944)

So can a desktop with a $100 card, your point being?

Re:Great Breakthrough, Limited Performance (1)

Fallingcow (213461) | more than 4 years ago | (#32908130)

I bought one of those a couple years ago, mostly because I planned to do video editing on it; gaming was a nice bonus.

It's a piece of shit. Runs so hot that everyone who uses it comments on the heat, but if you throttle it any it feels crazy-slow. The heat's so bad that if I don't make some sort of special arrangement for it to sit up where it can get airflow, it'll overheat and shut down during games (or sometimes just while playing back a video!). The damn rear vent points down and to the back at a 45 degree angle, meaning that it heats any surface it's on until it's no longer cooling efficiently. Worst. Design. Ever.

TL;DR: I'm sticking to netbooks from now on, and leaving the heavy lifting to desktops and consoles.

Hell, these days I could buy a decent netbook and a decent gaming desktop (by which I mean a brain transplant for my existing, old as hell desktop) for the $900 I paid for this thing. $400 netbook, $500 for parts, done.

For the record, it's an HP Pavilion from the dv6000 series.

Re:Great Breakthrough, Limited Performance (1)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#32907526)

I suspect that the main factor is not the price premium associated with powerful laptops (which is much more modest than it used to be, though still nonzero), but the heat/weight/battery life penalty.

A screaming gamer laptop is actually pretty reasonably priced these days, and only a bit slower than the screaming gamer desktop. However, it is still hot, heavy, and loud, and doesn't get thrilling battery life.

The convenience would be being able to buy a thin-and-light with excellent battery life, that can become a screaming beast if plunked down within range of a supplemental card...

Satan! (0)

Anonymous Coward | more than 4 years ago | (#32907054)

Stalin! Vade retro!

Think of the children (0)

Anonymous Coward | more than 4 years ago | (#32907062)

Just wait until the public interest groups find out. Given some of the other uses [wikipedia.org] for the 60 GHz wavelength it's only a matter of time until WiGig gets accused of promoting child porn...

cool (0)

Anonymous Coward | more than 4 years ago | (#32907082)

wi-pi anyone?

Pci-e x1 is to slow for all of that video will suc (2, Informative)

Joe The Dragon (967727) | more than 4 years ago | (#32907092)

PCIe x1 is too slow for all of that; video will suck at that speed, and then you want to add more I/O to it?

what will this buy you? (0)

Anonymous Coward | more than 4 years ago | (#32907094)

render and then encode the video. Many games can already be played like this. It seems like offloading the video card would only add latency to the process.

hmmmm...... (1)

anexkahn (935249) | more than 4 years ago | (#32907156)

Can I use this with my phone?

Re:hmmmm...... (1)

Locke2005 (849178) | more than 4 years ago | (#32907398)

Yes... if you don't mind your battery life being measured in seconds instead of hours.

I must admit... (0)

Anonymous Coward | more than 4 years ago | (#32907174)

...I am terrified by the security implications of that... and would be totally annoyed if my basic computer devices stopped working when the microwave was running.

Very practical (5, Funny)

ultramk (470198) | more than 4 years ago | (#32907178)

The best feature of this proposed standard is that if you place a ceramic mug directly between your CPU and the external graphic processor, it will keep your (coffee, soup or whatever) steaming hot, all day long! Those days of long WoW raids with only cold beverages and snacks are over!

Re:Very practical (1)

sharkey (16670) | more than 4 years ago | (#32907504)

Those days of long WoW raids with only cold beverages and snacks are over!

Um, that's what Mom is for. Can't you just put in an intercom? [youtube.com]

Re:Very practical (1)

Hurricane78 (562437) | more than 4 years ago | (#32907598)

I “read the book” (as they used to say), and that will only work if you put it on a rotating platter. Or just use a spoon.

Hmm, from what I know, this should actually work (using a spoon to make the fluid rotate in the mug in the field). But I doubt you can buy a 800W wireless transmitter in your normal electronics shop. ;)

Something wireless I might not hate? (3, Interesting)

starslab (60014) | more than 4 years ago | (#32907180)

I will admit some incredulity when I read the title. "Wireless *what?!*"

Very cool stuff if it materializes.

Imagine a small, lightweight machine with, say, an ULV i3 or i5 CPU, a small-ish screen, and weak-ass integrated graphics. Place the machine on its docking pad (no connectors to get bent or boards to break) and suddenly it's got (wireless?) juice and access to kick-ass graphics and a big monitor, as well as whatever else is in the base station.

A desktop replacement that remains light and portable for road warriors, with none of the fragility associated with docking connectors. With those transmission speeds I presume this is going to be a point-blank-range affair, so snooping shouldn't be (much?) of a problem.

Re:Something wireless I might not hate? (1)

Gazoogleheimer (1466831) | more than 4 years ago | (#32907412)

Connectors bending and boards breaking are the result of bad design, not an excuse to use a wireless system where a wired system excels. Well-designed interconnects allow higher speed, significantly greater security, lower cost, and less interference. As with wireless charging, WiFi compared to wired Ethernet, or an FM transmitter on your iPod compared to a straight cable, there's no reason to use a wireless system when you don't have to. It's just a waste.

Not exactly... (1)

Ecuador (740021) | more than 4 years ago | (#32907696)

Wilocity told us that wPCIe can push bits at up to 5Gbps (625MB/s), and that the spec should move "quickly" to 7Gbps (875MB/s).

If you consider that PCIe 16x is 16GB/s (128Gbps), this is very underwhelming. Call me a sceptic but I don't see a real-world application of "wireless PCI-E" that is slower than a 1-lane PCI-E. Well, at least a real-world application regarding graphics...

Re:Not exactly... (0)

Anonymous Coward | more than 4 years ago | (#32907876)

You do realize that PCIe 1.0 has more than enough bandwidth for all but the most recent, top-end video cards, right?

Re:Not exactly... (1)

Ecuador (740021) | more than 4 years ago | (#32908162)

You do realize that PCIe 1.0 16x is still 4GB/s, right? The point is, would the integrated graphics of a laptop be slower than a card limited to less than 1GB/s? I bet the answer is no.

Re:Something wireless I might not hate? (1)

w0mprat (1317953) | more than 4 years ago | (#32907730)

Your imagination is not going far enough. Imagine just merely placing wPCIe enabled PC components on a desk, getting power from an inductive pad even. Your rig is a cluster of bits with no connecting wires. wPCIe really means that your system southbridge chip is going to be a kind of wireless access point to whatever devices are a metre or two away.

wPCIe enabled hard drives will completely erase the need for both 'internal' and 'external' HDDs.

You'll have small flat box with a motherboard + CPU and Ram. You'll just have a pile of GPUs, HDDs, SSDs, and other add-in cards heaped up on your desk. That's your system. No technical expertise is needed to reconfigure a computer for whatever needs - anyone will be able to do it.

Much like my desk now, except it'll all be able to work at once.

Me: "Hey Dude could I borrow your nVidia card I just want to try SLI here?"

Dude: "Yeah here you go" *enables wireless*

Me: *wPCIe connected* "Sweet it's working."

So join me in thinking this is truly awesome. They say the PC is dying because of iPads, netbooks, the alignment of the planets, but this kind of thing will bring desktop hardware back into relevance.

Re:Something wireless I might not hate? (1)

Gazoogleheimer (1466831) | more than 4 years ago | (#32907892)

Sadly, as I just mentioned, that's a simple waste of money and electricity.

Re:Something wireless I might not hate? (0)

Anonymous Coward | more than 4 years ago | (#32907998)

Imagine a small lightweight machine with say an ULV i3 or i5 CPU, small-ish screen and weak-ass integrated graphics. Place the machine on it's docking pad (No connectors to get bent or boards to break) and suddenly it's got (wireless?) juice and access to kick-ass graphics, and a big monitor, as well as whatever else is in the base-station.

Wow, so I can now use wireless to go the WHOLE TWO INCHES to the docking station? Fucking retarded.

Light Peak will slap this around so hard bits will fly out.

Re:Something wireless I might not hate? (0)

Anonymous Coward | more than 4 years ago | (#32908170)

or... you could just walk over to your 'base station' and switch on a faster computer. Transfer any data you need to the faster computer.

Yes please (1)

pckl300 (1525891) | more than 4 years ago | (#32907198)

I've been wanting an external graphics card for my laptop for a while now, unknowing that the technology was in development. This is awesome.

Not for me (1)

DigiShaman (671371) | more than 4 years ago | (#32907202)

Seriously?!! With the timing issues and precision required by the GPU to interface with the rest of the system, do we really want it bridging over WiFi (60GHz)? Of all the devices, this is one peripheral I'd want to leave with physical bus access (electron flow). That, and the CPU and RAM.

Re:Not for me (1)

Tapewolf (1639955) | more than 4 years ago | (#32907780)

I think the fun part is when the video card is disconnected or the signal strength drops. Bus dropouts should be especially fun when the GPU is running some kind of program code...

Re:Not for me (1)

geekoid (135745) | more than 4 years ago | (#32908034)

well if you can't see how it would be useful, then clearly it's no good~

Security? (0)

Anonymous Coward | more than 4 years ago | (#32907226)

PCIe over wireless at 60GHz sounds like a security nightmare. Of course, no one seems to think about tha... oh shiny!

wait, what? (0)

Anonymous Coward | more than 4 years ago | (#32907256)

I thought they wanted to bring the GPU *closer* to the CPU for performance's sake.

So this seems counter-intuitive.

Re:wait, what? (1)

Dunbal (464142) | more than 4 years ago | (#32907362)

But can't you smell the per-monitor pricing scheme coming up?

Everyone keeps thinking GPUs (1)

Irick (1842362) | more than 4 years ago | (#32907318)

I'm thinking wireless monitors. There is more than enough bandwidth there to drive some really high-resolution screens. Just pop your laptop near your desk setup and poof, about ten times the functionality. Not to mention popping this sucker into a smartphone would allow another large degree of mobile computing.

Re:Everyone keeps thinking GPUs (1)

tophermeyer (1573841) | more than 4 years ago | (#32907566)

Not to mention popping this sucker into a smartphone would allow another large degree of mobile computing.

Yeah, that was the thought that popped into my head. It would be cool to wirelessly and effortlessly connect my super-powered smartphone to a keyboard, mouse, and monitor. With all the computing power being crammed into smartphones, that would be a really awesome way to set up home and office workstations. I'm not talking about running Crysis, but for web surfing and document editing this would be a cool application.

Just great! (1)

Locke2005 (849178) | more than 4 years ago | (#32907380)

Now hidden cameras will be able to stream up-skirt videos in HD!

This is really surprising! (0)

Anonymous Coward | more than 4 years ago | (#32907424)

Talk about thinking, ahem, out of the box!

Good luck with that.... (0)

Anonymous Coward | more than 4 years ago | (#32907450)

The choice of the PCI bus being wireless is interesting. Essentially they are turning every peripheral into a hot-swap device. Good luck getting ATI/NVidia drivers for that one. The amount of state stored on the GPU is just astronomical (1GB texture memory! not to mention the entire register space, maybe some context switch information). The corner cases on that one would be brutal... Seems like they would be better off using a more stateless interconnect for wireless graphics, such as DVI/TMDS. Overall practical considerations would seem to relegate this to a secondary display, which is useful, but definitely not "killer".

Same concept applies to any PCI peripheral over this sort of interface (well, any PCI peripheral with state on the peripheral itself).

Re:Good luck with that.... (1)

funwithBSD (245349) | more than 4 years ago | (#32907506)

I would bet it would be for a overhead display or something like that, not Crysis.

Neat idea but it'll suck where it needs to shine (2, Insightful)

jtownatpunk.net (245670) | more than 4 years ago | (#32907712)

Let's say I've got even a little building with 50 people who want to use this. Will I be able to pack 50 of these point-to-point units into a building and have all of these systems perform at peak capacity without stepping all over each other? That would be amazing.

And, aside from the technical issues of getting it to work well in a dense environment, there's still one cord that needs to be connected to the laptop. Power. If I have to plug that in, I may as well snap the laptop into a docking station and skip the wireless connection entirely. One connection is one connection and I won't have to worry about interference, security, bandwidth, etc.

Re:Neat idea but it'll suck where it needs to shin (0)

Anonymous Coward | more than 4 years ago | (#32908052)

At 60GHz this will behave more like free space optics than radio waves... just hope nobody holds a sturdy sheet of paper between a pair of endpoints :P
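
A quick back-of-the-envelope sketch of why that's so: at 60GHz the free-space wavelength is only about 5mm, roughly 25 times shorter than 2.4GHz Wi-Fi, so the signal diffracts around obstacles far less readily. (The constants below are standard physics, not figures from TFA.)

```python
# Free-space wavelength: lambda = c / f
C = 299_792_458  # speed of light in m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in metres for a given frequency in Hz."""
    return C / freq_hz

print(f"60 GHz WiGig:  {wavelength_m(60e9) * 1000:.1f} mm")   # ~5 mm
print(f"2.4 GHz Wi-Fi: {wavelength_m(2.4e9) * 100:.1f} cm")   # ~12.5 cm
```

At millimetre wavelengths, links lean heavily on line-of-sight and beamforming, which is why the comparison to free-space optics isn't far off.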

Re:Neat idea but it'll suck where it needs to shin (1)

geekoid (135745) | more than 4 years ago | (#32908074)

what are you doing where all the people need to be pushing 7G constantly across the bus? If that's the case, then it's probably not for that situation. Most people in most offices don't need to be using that kind of data all the time.

You could create a reliable system so you could take your laptop anywhere and have it display on a large screen or projector. So you walk into a meeting room and it links up. You want to display something on your TV? It links up.

Perhaps you have a handheld device and want to share it on a bigger screen, and so on.

Re:Neat idea but it'll suck where it needs to shin (1)

jtownatpunk.net (245670) | more than 4 years ago | (#32908244)

Did you miss the part where they're talking about docking stations with video cards built in, USB3, network, etc.?

"The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor."

I don't know how your company works but, around here, we expect people to show up at roughly the same time every day and...erm...work. Like simultaneously. And, yes, many of our laptop users prefer docking stations to plugging and unplugging power/network/video/keyboard/mouse/monitor every time they come and go. And, yes, quite a few people use their laptops exclusively. In fact, very few people have both a laptop and a desktop. What would be the point?

So, yeah, the scenario described as the first application of this new technology that we can expect to see involves people pushing lots of data across the connection for long periods of time.

RF bath anyone? (1)

synthesizerpatel (1210598) | more than 4 years ago | (#32907846)

As a skeptical person who usually maintains a scientific 'prove your crazy theory if you expect my buy-in' ideology...

I have to say, if you had to bet money on which wireless technology actually WILL cause cancer, and your options are a cell phone, a wireless access point, or wireless PCIe, I think wireless PCIe would win.

What Killer App? (1)

FrankDrebin (238464) | more than 4 years ago | (#32907930)

The diagram shown at TFA indicates a single PCIe lane (x1) is provided. What PCIe devices would benefit from being wireless?

  • Graphics Card? x1 is hardly cutting edge for graphics-card bandwidth. Modern mainboards often have x16 slots for graphics cards. Decent if you must do *wireless* video, but inferior to, say, HDMI from a laptop.
  • Wired NIC? Maybe, but 802.11n has similar bandwidth with actual range.
  • USB host controller? Seems kinda silly given the ubiquitous ultracheap wireless mice and keyboards everyone already has.
  • Wireless NIC? Head assplode.
  • Small niche items don't seem worth the massive investment to develop the technology.
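
For a rough sense of the numbers behind these bullets, here's a sketch comparing theoretical link rates. The PCIe figures assume a PCIe 2.0 link (5 GT/s per lane, 8b/10b encoding), since neither TFA nor the summary specifies a generation:

```python
def pcie_gbps(lanes, gtps=5.0, efficiency=0.8):
    """Usable PCIe bandwidth in Gbps: transfer rate x lanes x encoding
    efficiency. Defaults assume PCIe 2.0 (5 GT/s, 8b/10b = 80%)."""
    return lanes * gtps * efficiency

print(pcie_gbps(1))    # x1 link as in TFA's diagram
print(pcie_gbps(16))   # typical x16 graphics slot

WIGIG_GBPS = 7.0       # WiGig headline rate from the summary
WIFI_N_GBPS = 0.6      # 802.11n max PHY rate (4 spatial streams)
```

So even WiGig's headline 7Gbps sits in the neighbourhood of an x1 link, an order of magnitude short of the x16 slot a desktop graphics card normally gets.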

Graphics isn't so interesting-- storage + network (1)

Animal Farm Pig (1600047) | more than 4 years ago | (#32907968)

I wouldn't be so interested in running a graphics card. I'd be more interested in an enclosure/docking station with hardware raid controller that could accept a couple of 3.5" disks. Give me a couple of PCIe slots, so I could plug-in, for example, a quad port NIC. Built-in battery in the docking station would be nice.

I like this though (1)

sea4ever (1628181) | more than 4 years ago | (#32908032)

Wouldn't this allow huge server-like GPU farms to be wirelessly hooked up to any extremely lightweight machine that's nearby? Of course the wireless latency would be horrendous, but if it could be improved somehow then I imagine this would be amazing!
In other news, why don't they work on using cables first? A fibre-optic link to an external GPU sounds brilliant: since GPUs introduce a lot of heat and size is a problem, this would eliminate those problems.

Could Revitalize Desktop Market (1)

Fartypants (120104) | more than 4 years ago | (#32908082)

Sounds like this has the potential to revitalize the desktop market.

While the wireless docking application for laptops sounds like it has great potential, the promise of using this technology to supercharge low-cost tablets, netbooks and mobile devices when in range of a desktop seems too good to pass up. This would provide a strong incentive for me to buy everything in that ecosystem.

Re:Could Revitalize Desktop Market (1)

tchdab1 (164848) | more than 4 years ago | (#32908186)

Agree. My immediate thought upon reading the intro was "use with a virtually tethered tablet near a base station, like in the house."
The speed of the connection allows for an ultralight tablet with "unlimited" supportive stuff off the tablet and located at the base.
Doesn't appear to do anything I can envision for the true classic desktop itself; perhaps an improved server connection at home? Dunno.

And it will introduce another wireless standard with which to confuse the masses.
And to add to phones.

"Nutshell" (1)

Etcetera (14711) | more than 4 years ago | (#32908116)

In a nutshell, wPCIe enables a PCI Express switch with local and remote components linked by a 60GHz connection. The first applications, which will start sampling next year, will let you connect your laptop to a base station with all kinds of storage controllers, networking controllers, and yes, an external graphics processor.

I don't know about you, but I don't want to have something operating at 60GHz sitting in my lap, thanks... I'll stick to super-long HDMI or DVI cables if I need to route a monitor signal.

Re:"Nutshell" (1)

mx_mx_mx (1625481) | more than 4 years ago | (#32908194)

You do realize that you now sit right next to something that operates at 400-700 THz??

Awesome, I will own you now. (1)

pclminion (145572) | more than 4 years ago | (#32908122)

Wireless PCI Express? Awesome. I'll just walk by with a specially designed device, master the bus, and DMA the entire contents of your RAM over to a laptop. Then I'll change some interesting bytes here and there, and DMA it back.

This sounds like the dumbest attack vector since FireWire came out with physical DMA support.
