
Ubiquitous Multi-Gigabit Wireless Within Three Years

Zonk posted more than 7 years ago | from the now-where-is-my-hud dept.

Networking

Anonymous Howard passed us a link to the Press Escape blog, and a post about the future of ultra-fast wireless connectivity. Georgia Tech researchers unveiled plans to use ultra-high frequency radio transmissions to achieve very high data transmission rates over short distances. In a few years, the article says, we'll have ubiquitous multi-gigabit wireless connectivity, with some significant advances already under their belts. "The GEDC team has already achieved wireless data-transfer rates of 15 gigabits per second (Gbps) at a distance of 1 meter, 10 Gbps at 2 meters and 5 Gbps at 5 meters. 'The goal here is to maximize data throughput to make possible a host of new wireless applications for home and office connectivity,' said Prof. Joy Laskar, GEDC director and lead researcher on the project along with Stephane Pinel. Pinel is confident that very high speed, point-to-point data connections could be available in less than two years. The research could lead to devices such as external hard drives, laptop computers, MP3 players, cell phones and commercial kiosks that could transfer huge amounts of data in seconds, while data centers could install racks of servers without the customary jumble of wires."


152 comments


do not underestimate... (1)

LiquidCoooled (634315) | more than 7 years ago | (#19917491)

do not underestimate the bandwidth of a hard drive being passed to the friend next to you.

Re:do not underestimate... (1)

Iam9376 (1096787) | more than 7 years ago | (#19917771)

These speeds are basically marketing hype. They need to start declaring things like "54 Mbps" as *half-duplex*. We all know marketers love big numbers, but what they don't tell you is that it runs at half-duplex, so you're lucky to even get half of the rated speed. Although, at higher bandwidth rates (and hopefully increased throughput!) this will become less and less of a problem.

Multi-Gigabit [half duplex] FTW!...(ish)

Re:do not underestimate... (0)

Anonymous Coward | more than 7 years ago | (#19917927)

40MB/s writing to the hard drive. 40MB/s reading from the hard drive. Since this can't be done at the same time, that's an average bandwidth of 20MB/s. Not so great if you ask me. Even slower if you're using 2.5" drives or a cheap controller.

Re:do not underestimate... (1)

lostguru (987112) | more than 7 years ago | (#19920127)

but the point is that if you throw a hard drive to a friend, say containing 40 GB of data, and the throw takes 4 seconds, then you have just achieved 10 gigabytes per second of bandwidth, wirelessly!

now if only we could throw faster!
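Checking the arithmetic, with the 40 GB payload and 4-second flight taken from the post above (a quick sketch, not a benchmark):

    # Sneakernet bandwidth: payload size divided by flight time.
    payload_gb = 40       # GB on the thrown drive (poster's figure)
    flight_time_s = 4     # seconds in the air (poster's figure)

    bandwidth_gb_s = payload_gb / flight_time_s
    print(f"{bandwidth_gb_s:.0f} GB/s = {bandwidth_gb_s * 8:.0f} Gbps")
    # 10 GB/s = 80 Gbps, comfortably beating the 15 Gbps lab result.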

Re:do not underestimate... (1)

powerpants (1030280) | more than 7 years ago | (#19920355)

4 seconds would actually be a pretty long hang time for a thrown object, unless you're counting the Rollie Fingers windup. :)

Re:do not underestimate... (4, Funny)

Doctor Memory (6336) | more than 7 years ago | (#19920409)

Maybe your friend's in the back yard, and you're on the second floor. Oh, and the RIAA's at the front door...

Does it use Fullerton's AMAZING WIRELESS TECH? (0, Troll)

posys (1120031) | more than 7 years ago | (#19918029)

Crowding Issues resolved by the AMAZING WIRELESS technology from Larry Fullerton, read this first: http://www.engology.com/eng5fullerton.htm [engology.com] then this http://www.timedomain.com/ [timedomain.com] This is like nothing you have ever heard of, in that it is ULTRA LOWPOWER, ULTRA WIDE BAND not in the typical sense, UNDETECTABLE, and so super scary that the military has embargoed its FULL use precisely because it is undetectable by conventional frequency scanners... Also please check out http://teaminfinity.com/robo_economy_wire [teaminfinity.com] for great news on ROBOTIC FRONT

metric? (0)

Anonymous Coward | more than 7 years ago | (#19917509)

"per" is written /, not p. So people should write Gbit/s or Gb/s.

Re:metric? (4, Informative)

Anonymous Coward | more than 7 years ago | (#19917825)

Says someone who's obviously not old enough to drive a car and see "mph" or "kph" on the dash.

Posted anon cuz you ain't worth the karma. ;-)

A long way to go (1)

mathmatt (851301) | more than 7 years ago | (#19917997)

15 gigabits per second (Gbps) at a distance of 1 meter, 10 Gbps at 2 meters and 5 Gbps at 5 meters
...that means 19 Mbps at 100 meters and 4 bps at 1 km
dialup anyone?
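Those 100 m and 1 km figures read as deliberate pessimism rather than a real fit. For the curious, a naive power-law fit to the three published data points (a rough sketch only; real 60 GHz links fall off much faster once oxygen absorption and obstructions enter the picture) comes out gentler:

    import numpy as np

    # Published points: (distance in meters, rate in Gbps)
    d = np.array([1.0, 2.0, 5.0])
    r = np.array([15.0, 10.0, 5.0])

    # Fit log10(rate) = a*log10(distance) + b, i.e. a power law r = 10**b * d**a
    a, b = np.polyfit(np.log10(d), np.log10(r), 1)

    for dist in (100.0, 1000.0):
        rate_gbps = 10**b * dist**a
        print(f"{dist:6.0f} m -> {rate_gbps:.2f} Gbps (extrapolated)")
    # Roughly 0.65 Gbps at 100 m and 0.13 Gbps at 1 km -- far kinder than
    # 19 Mbps / 4 bps, and far too kind for the real world.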

Channeling Homer Simpson (0, Offtopic)

egyptiankarim (765774) | more than 7 years ago | (#19917521)

Hey, they've got the Internet on computers now!

Microwaving your privates? (2, Funny)

Aqua_boy17 (962670) | more than 7 years ago | (#19917537)

we'll have ubiquitous multi-gigabit wireless connectivity, with some significant advances already under their belts.
If they're running this from laptops for extended periods, that may be the only thing remaining under their belts.

Re:Microwaving your privates? (4, Funny)

Moby Cock (771358) | more than 7 years ago | (#19917719)

Technically speaking, at 60GHz, you'd be millimetrewaving your privates.

...for that matter... (4, Insightful)

drakaan (688386) | more than 7 years ago | (#19917747)

while data centers could install racks of servers without the customary jumble of wires

Somehow I don't see "whole data centers" using a data transmission method where any device can potentially intercept the data going to and coming from any other device. Might make your hosting clients a bit nervous.

Re:...for that matter... (1)

wurp (51446) | more than 7 years ago | (#19919807)

uh, encrypt the traffic...

Re:...for that matter... (1)

Doctor Memory (6336) | more than 7 years ago | (#19920213)

I didn't read TFA, but I find it hard to believe that there's enough spectrum available to permit a dozen or so racks of 1U servers to communicate with UHF signals. Especially if (as is becoming common) they're hooked up to both a SAN and a router. Couple the bandwidth required for both signals with frequency separation requirements (so signals don't interfere with each other) and pretty soon you've got signals spread across more spectrum than one antenna can handle effectively. Then what? Do you install multiple transceivers, each handling a slice of the spectrum? Put multiple antennas on a single box and try to multiplex them (yeah, good luck with that)? Divide your data center into "zones" with some kind of shielding to prevent signal bleed?

This sounds neat for near-field applications (everything on your desk can talk to everything else), but I can't imagine it would scale well enough for something like a data center.

Re:...for that matter... (1, Funny)

Anonymous Coward | more than 7 years ago | (#19920237)

Worry not! We'll also implement lead walls!

Re:Microwaving your privates? (3, Interesting)

BlueParrot (965239) | more than 7 years ago | (#19919301)

It doesn't make any sense to make a network card emit microwaves at intensities similar to a microwave oven: not only would you get huge power consumption, it is also massive overkill unless you plan to search the sky for stealth bombers. The FCC (or local equivalent) would probably have a few things to say about it as well. The scaremongering about radiation from communications equipment is simply unbelievable. You are more likely to get hurt tripping over a Cat5e cable.

Call me a luddite, but... (-1, Troll)

tjstork (137384) | more than 7 years ago | (#19917539)

I have this weird feeling that the pervasive, high-frequency radio needed to make wireless work is going to wind up having some unforeseen bad side effect, the same way every other technology we've used too much has. My concern is that we lack the science to even understand the implications of all this radiation we're creating in our environment. Sure, you can put a frog in a box next to a wireless system and say, "oh, the frog lived", or jack up the energy by 100 times as some sort of proxy for exposure over time and say "the frog did not get cancer", but that's not really the same as knowing what happens when we saturate the biosphere with radiation of our own making.

Why not get a Government grant to research it? (1)

jd (1658) | more than 7 years ago | (#19917671)

The Government spends money on everything else, why not spend it on something useful?

Re:Call me a luddite, but... (4, Insightful)

Anonymous Coward | more than 7 years ago | (#19917737)

Yeah! Cause the biosphere wasn't already inundated with electromagnetic radiation. It's a good thing the rest of the universe doesn't spew loads of it towards the Earth. Oh wait...

Yes, and 99% of all CO2 on the earth is natural (2, Insightful)

tjstork (137384) | more than 7 years ago | (#19918835)

99% of all the CO2 in the atmosphere is natural, and we chalk up a change in climate to our 1% fluctuation, as if that vast lion's share of 99% doesn't fluctuate on its own. So why not worry about radio waves in a radioactive universe?

Re:Yes, and 99% of all CO2 on the earth is natural (0)

Anonymous Coward | more than 7 years ago | (#19919679)

Mod parent insightful.

Re:Yes, and 99% of all CO2 on the earth is natural (0)

Anonymous Coward | more than 7 years ago | (#19920151)

Mod parent "modappeal."

Re:Call me a luddite, but... (3, Interesting)

orclevegam (940336) | more than 7 years ago | (#19917929)

but that's not really the same as saying that we will now saturate the biosphere with radiation of our own making.

As opposed to all that radiation saturating the biosphere not of our own making? You do realise that light is radiation, right? Also, in case you're worried about all the terrible WiFi access points, your average 60 watt bulb puts out far more energy (radiation) than any WiFi AP in use. Now, admittedly, not all radiation has the same effect on everything (UV, for instance), but the key things with EM radiation like light and radio waves are the total power and the distance from the source. Remember, power density falls off with the square of the distance, so unless you're sitting right on top of the transmitter (and even then, if it's relatively low power), you've got more to worry about standing outside on a sunny day. The fact that they're talking about such short distances with this tech leads me to believe this will probably be a very low power device, much the same as Bluetooth and RFID.
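Putting numbers on that: assuming an isotropic radiator at the 10 mW or less that TFA quotes (the sunlight value is a standard ballpark, not from the article), the power density falls off fast:

    import math

    tx_power_w = 0.010     # 10 mW, the figure quoted in the article
    sunlight_w_m2 = 1000   # rough power density of direct sunlight

    for r in (0.1, 1.0, 5.0):  # distance in meters
        density = tx_power_w / (4 * math.pi * r**2)  # isotropic spreading
        print(f"{r:4.1f} m: {density:.2e} W/m^2, "
              f"{density / sunlight_w_m2:.1e} x sunlight")
    # Even at 10 cm the density is ~0.08 W/m^2 -- about 1/12,000th of
    # standing in the sun.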

Re:Call me a luddite, but... (0)

Anonymous Coward | more than 7 years ago | (#19918205)

Remember, power dissipates with the square of the distance, ...

That's only for an isotropic source. If what you said were true for general sources, there would hardly be a point to making spotlights or low-dispersion lasers or directional antennae. Fiber optics would be impossible.

Dispersion lowers your energy density, which lowers your power at a point.

Re:Call me a luddite, but... (-1, Troll)

Urza9814 (883915) | more than 7 years ago | (#19919893)

Great analogy there with the "it's safer than standing outside on a sunny day". Yeah, so it'll burn me and give me skin cancer then?

First of all, with those kinds of speeds, I'm not sure this will be a very low power device. I admit I didn't RTFA and didn't look into how it works, but I'm thinking it's gotta use many frequencies. Sure, one 60 watt bulb might not hurt you, but a large group of them could cause some problems. Plus, if it is low power, that has its own problems. When you get hit by a large dose of radiation, there are many things your body does to protect itself. When you get slowly cooked by a continuous small dose, none of that happens. Your body doesn't protect itself against low levels of radiation.

Basically, I wouldn't want to be around these things. But then, I don't trust my cellphone much either after seeing studies linking cell phones in pockets to testicular cancer.

Oh, and I'm not sure about other countries, but in the US safety regs for radiation are based only on heat produced by the waves, when there are other ways they cause damage. Studies have shown cells can be damaged by waves 60 times weaker than what the government says is safe.

Re:Call me a luddite, but... (2, Insightful)

Anonymous Coward | more than 7 years ago | (#19918035)

My concern is that we lack the science to even understand the implications of all of this radiation we're creating upon our environment. Sure, you can put a frog in a box next to a wireless system and say, "oh, the frog lived", or jack up the energy by 100 times as some sort of a proxy for exposure over time, and say "the frog did not get cancer", but that's not really the same as saying that we will now saturate the biosphere with radiation of our own making.
We understand electromagnetic radiation in great detail; it isn't magical, and just because it is a type of radiation doesn't mean it's going to give you cancer. The main reason some types of radiation make it more likely for an organism to get cancer is that the radiation is high enough in energy to damage or shatter DNA and proteins in the cell. This is the case with UV, X-ray and gamma radiation, but not generally the case with lower-energy electromagnetic radiation. The second thing is that the microwaves/radio used in these wireless connections are vastly less energetic per photon than UV, which means they are essentially incapable of breaking DNA and raising the chances of cancer. It is interesting to note that you have such a fear of wireless connections but have no problem using everything else, some of which emits low levels of the same electromagnetic radiation the wireless connections do. It is also interesting that we live in a time when people live longer than ever recorded in human history, yet somehow the fearmongers want you to believe that we don't know wtf we are doing. Just goes to show that constant irrational fear sells better than the truth.

Re:Call me a luddite, but... (1)

Detritus (11846) | more than 7 years ago | (#19918267)

You've convinced me. We must destroy the Sun!

Re:Call me a luddite, but... (4, Interesting)

physicsnick (1031656) | more than 7 years ago | (#19918623)

Millimeter waves and microwaves (including the 60 GHz band here) cannot cause cancer. The photon energy is not high enough to break chemical bonds in biological tissue.

When a chemical bond is formed (say, in DNA), a certain amount of energy is released. To break that bond (and cause cancer), you need to put that energy back. The catch is, because of quantum mechanics, the energy can't be accumulated. You can't pile in more and more photons until the bond finally snaps; you have to get one big photon to come in and snap it. When you state the frequency of a photon source (e.g. 60 GHz), that fixes the energy of each individual photon (0.00024 eV). Typical bonds in DNA are a few eV, more than ten thousand times larger. It's physically impossible for this radiation to cause cancer.

Even if you put your cat in a microwave oven, it won't get cancer (though it will die a pretty horrible death).

The danger with electromagnetic waves is heat and depth. These waves have far less energy per photon than visible light (~2.5 eV), but they have much greater depth penetration. They go deeper before they collide with your molecules, so they deposit heat deeper into your flesh than visible light or UV radiation does. This is why putting your cat in a microwave is very bad; it essentially gets "cooked from the inside out". But the energy put out by wireless devices is barely enough to cause even measurable changes in the temperature of human flesh. How much heat can you apply to a glass of water with a 1.5 V AA battery? Not much. Now spread that out spherically over a 100 meter radius. Almost zero.

Even then, biological organisms are very good at regulating their temperature; humans live across a wide variety of climates all across Earth, and yet still manage to balance their internal temperature.

Hence, this kind of communication is not dangerous.
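The argument above is just E = hf. A quick sketch, using ballpark textbook bond energies (not figures from TFA):

    # Photon energy E = h*f, compared against a typical chemical bond.
    H_EV = 4.135667e-15   # Planck's constant in eV*s

    sources = {
        "60 GHz link":   60e9,     # this article's band
        "2.4 GHz Wi-Fi": 2.4e9,
        "visible light": 5.4e14,   # ~green
        "UV-C":          1.1e15,
    }
    CC_BOND_EV = 3.6   # rough C-C single bond energy, eV (textbook ballpark)

    for name, f in sources.items():
        e = H_EV * f
        print(f"{name:14s}: {e:.2e} eV  ({e / CC_BOND_EV:.1e} of a C-C bond)")
    # The 60 GHz photon comes out around 2.5e-4 eV, four orders of
    # magnitude short of breaking anything; only the UV photon exceeds
    # the bond energy, matching the comment above.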

Re:Call me a luddite, but... (1)

drinkypoo (153816) | more than 7 years ago | (#19918777)

I have this weird feeling that pervasive, high frequency radio needed to make wireless work is going to wind up with some unforseen bad side effect, the same way every other technology that we used too much had.

Even if for no reason other than security of communications, it would be preferable if data were communicated via fiber-optic cable. Bonus points for creating optical transceivers that don't broadcast their signals all over the RF spectrum as a side effect of operation.

Re:Call me a luddite, but... (1)

Goaway (82658) | more than 7 years ago | (#19919895)

You're a Luddite. And although you might lack it, the rest of the world does have that exact science you've never heard of.

And the default settings will still be nasty (0)

Anonymous Coward | more than 7 years ago | (#19917547)

...and let me guess... these multi-gig wireless routers will STILL come configured by default with WEP
(because it's easy to set up)

Multi-Gigabit? (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19917565)

Burbage dies on pg. 12
Hedwig dies on pg. 56
Mad-Eye dies on pg. 78
Scrimgeour dies on pg. 159
Wormtail dies on pg. 471
Dobby dies on pg. 476
Snape dies on pg. 658
Fred Weasley dies on pg. 637

Harry gets fucked up by Voldemort on pg. 704 but comes back to life on pg. 724

Tonks, Lupin, and Colin Creevy have their deaths confirmed on pg. 743

19 years after the events in the book:

Ron has married Hermione, their two children are named Rose and Hugo

Harry has married Ginny, their three children are named Lily, James, and Albus Severus.

Draco Malfoy has a son named Scorpius

                The epilogue shows all of the children boarding the train for Hogwarts together.

The final lines of the book are: "The scar had not pained Harry for 18 years. All was well."

Plot Spoilers
Part of Voldemort's soul was implanted into Harry when he used Avada Kedavra on him as a baby. Harry then sacrifices himself à la Lily Potter, which allows him to kill Voldemort without killing himself. He also has hacks (a stone to bring him back to life, and an uber wand).

Snape went to the good side (Hogwarts, etc.) because he was all emo that Voldemort killed Lily Potter.

Harry has three kids with Ginny. Ron and Hermione fall in love.

I am a data center manager (5, Insightful)

Travoltus (110240) | more than 7 years ago | (#19917591)

Maybe some lower security data centers might enable wireless, but I doubt it. Being that we're a financial institution (a small one, mind you), there's no way in the h to the e to the double hockey sticks that I'd ever enable any kind of wireless anything in our data center.

I'd rather deal with a network cable gone sentient and whipping around like a snake and attacking people, than go wireless at the data center.

Only an idiot thinks there's a wireless transmission that's invulnerable to interception. Heck, wired communications aren't 100% secure either, but my boss's business is about minimizing risk, and a wireless network, even inside a data center, is not minimizing risk.

Re:I am a data center manager (1)

tedrlord (95173) | more than 7 years ago | (#19917753)

I figure you could put it in a Faraday cage of some sort. Still, I'd prefer a little planning and cable management to several hundred machines and peripherals transmitting wirelessly any day. Especially since I have to spend days on end in there every so often.

Re:I am a data center manager (1)

markov_chain (202465) | more than 7 years ago | (#19918753)

A Faraday cage wouldn't work unless every opening were much smaller than the wavelength; otherwise the currents induced on the inside surface will just leak out through the cracks and re-radiate.
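The usual rule of thumb is that cage openings must be much smaller than the wavelength being contained; a quick sketch of what that means at these frequencies:

    # Wavelength = c / f; a Faraday cage needs openings well below this.
    C = 3e8  # speed of light, m/s

    for name, f in (("Wi-Fi (2.4 GHz)", 2.4e9), ("this article (60 GHz)", 60e9)):
        wavelength_mm = C / f * 1000
        print(f"{name}: wavelength {wavelength_mm:.1f} mm, "
              f"openings should be << {wavelength_mm / 10:.2f} mm")
    # 60 GHz -> 5 mm wavelength, so gaps need to be a fraction of a
    # millimeter: doable with fine mesh, hopeless with rack ventilation.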

It would suck anyway (1)

Wesley Felter (138342) | more than 7 years ago | (#19917773)

Sure you can get 15Gbps, but if you start sharing that bandwidth among dozens of servers it wouldn't be all that fast anyway.

Re:I am a data center manager (3, Insightful)

walt-sjc (145127) | more than 7 years ago | (#19917881)

My little cage at the colo doesn't have 5 servers. It has hundreds. I'm also sharing that datacenter with many many other companies that have cages with hundreds of servers. We deal with SAN / iSCSI, NAS, backups over networks, etc. With the noise and limited bandwidth available in a shared frequency space, I seriously doubt any type of wireless will be very useful in a datacenter - especially since everything is already connected via hard-wired connections.

It also won't be very useful in my home, where wires are already easy to run for the short-distance devices, and where noise and distance rule it out in the cases where I could really use, and WANT, high-speed wireless.

So it does sound like a neat trick, but what is a valid, viable use case for it?

I could REALLY use something much different. I want to get rid of the 20 or so wall-wart power supplies under my desk. I want one larger power supply from which I can run small cables to all the devices. Why can't devices negotiate for how much voltage / current they need?

Re:I am a data center manager (1)

The Cisco Kid (31490) | more than 7 years ago | (#19918507)

I want something different too. I don't want higher speed, I want more range. I want one or two megabits at 30 miles NLOS. Either simple point-to-point, with many different 'channels' for separation, or point-to-multipoint. Of course, the question of whether such a thing is technically possible is irrelevant, because the telcos would kill it in its crib anyway.

Re:I am a data center manager (0)

Anonymous Coward | more than 7 years ago | (#19918813)

> I want one or two megabits at 30 miles NLOS.

I'd be happy with 30 feet! I help manage about 20 data centers, and wireless Ethernet doesn't work in most of them reliably for even 30 feet. There is just too much noise and the available wireless equipment is such garbage that it is not reliable. Having a set of frequencies as low as possible dedicated for wireless Ethernet is the only solution that would help. When analog TV stations go away, those frequencies would give us reliable 100mW communications for 100 feet or more because they're low enough to not require line of sight. Unfortunately, the FCC has already said that they will make it a crime for the public to use that public property. It's going to be given to corporate interests.

Your 30 miles is just unreasonable.

greater range, longer distances (1)

falconwolf (725481) | more than 7 years ago | (#19920263)

I want something different too. I dont want higher speed, I want more range. I want one or two megabits at 30 miles

Thirty miles is an alright start, but I'd like at least a 100 or 200 mile range. I love hiking and photography and would like to be able to upload my photos wirelessly to a server. While it may be possible to do so with a 30 mile range, that would require a lot more tower transceivers.

Falcon

Re:I am a data center manager (1)

phallstrom (69697) | more than 7 years ago | (#19918787)

> So it does sound like a neat trick, but what is a valid, viable use case for it?

Maybe for your AV stuff? No wires between your DVD player, receiver, and TV would be nice.

Re:I am a data center manager (1)

drinkypoo (153816) | more than 7 years ago | (#19918999)

My little cage at the colo doesn't have 5 servers. It has hundreds. I'm also sharing that datacenter with many many other companies that have cages with hundreds of servers. We deal with SAN / iSCSI, NAS, backups over networks, etc. With the noise and limited bandwidth available in a shared frequency space, I seriously doubt any type of wireless will be very useful in a datacenter - especially since everything is already connected via hard-wired connections.

I've seen security rooms inside datacenters that had copper cloth over the windows, etc etc. What if every cage in the colo were a faraday cage? In theory, wouldn't that permit this? Or, how about UWB? Isn't UWB supposed to allow an effectively infinite number of transmitter/receiver pairs to operate together? If the whole building were shielded so that it wouldn't penetrate, it would eliminate interference issues.

I still think that fiber is more desirable. I wish it were cheaper (although it's getting cheaper all the time.) It's not as versatile as copper (what with the cut and polish issues) but it has the potential to be even cheaper - plastic fiber is acceptable for most purposes, like short runs. And you can group connections for longer hauls to cut down on the amount of expensive (and expensively terminated) fiber in use. And it's also not an antenna (twisted pairs aren't everything.)

Re:I am a data center manager (1)

Peter La Casse (3992) | more than 7 years ago | (#19919693)

So it does sound like a neat trick, but what is a valid, viable use case for it?

When my PC converges with my PDA, I want to be able to walk up to any display and use it without hauling a cable around. Better yet: I want HD video in my glasses, connected wirelessly to my PDA PC.

I have an external hard drive for my work laptop. It would be nice to be able to connect to it wirelessly. I'd also like to be able to sync my laptop to a docking station wirelessly.

There are all kinds of nifty things you can do when your network is faster than your hard disk, even if that network is only a few meters across, like keep all your hard disks in the next room or build computers out of all-"external" hardware.

Re:I am a data center manager (1)

Doctor Memory (6336) | more than 7 years ago | (#19920631)

Why can't devices negotiate for how much voltage / current they need?
They do — via the wall wart.

I wouldn't be surprised if someone proposes a standard for low-voltage DC distribution in the home. You'll wind up with dual-socket outlets, with your standard AC socket and two to four 12V sockets. Maybe use a multi-bladed plug to determine how much current the device can sink (each blade signifies 500 mA, so a four-bladed plug can sink 2 A). Somebody else has probably already thought this out in detail, so I'll just wait for someone to post a link to a complete spec...

Re:I am a data center manager (1)

interstellar_donkey (200782) | more than 7 years ago | (#19918325)

That's exactly why, when I first read about this, I thought the appeal of high-speed wireless would mostly be on the consumer end. Most businesses are bound to see the potential security risk of wireless and stay away from it, regardless of how fast it is.

As I don't manage any data centers, I'd love it. Mostly because the wife has forbidden me from running CAT5 through the house, so I'm stuck with 802.11g connections. It's annoying to transfer a large file from the office upstairs to, say, the HTPC downstairs and have it crawl along, especially knowing that both machines are capable of gigabit transfers over a wire.

Granted, most home consumers don't really need that kind of speed over their networks, wireless or otherwise. But it'll play well in the marketing to say "this is faster!"

Re:I am a data center manager (1, Funny)

Anonymous Coward | more than 7 years ago | (#19918457)

I'd rather deal with a network cable gone sentient and whipping around like a snake and attacking people, than go wireless at the data center.
That is IT! I've had it with these muthafuckin' snakes on this muthafuckin' (Ethernet) frame!

Ordered by upper mgt. (0)

Anonymous Coward | more than 7 years ago | (#19918509)

I'm a data center manager too (for a moderate sized govt organization with just under 100 servers and 6 separate network backbones). Posting AC because... well, here's your tax dollars at work :-}

I fought off enabling any wireless on my turf for as many years as I could, but now my superiors have *ORDERED* me to install 802.11 devices all over the place, despite the fact that I consider installing wireless to constitute a deliberate removal of security from our network. Oh well, at least I get to dictate how the wireless will be "secured", so everything will be WPA2/AES encrypted, with EAP-TLS certificate-based authentication wherever feasible and EAP-TTLS wherever client-side certificates are unfeasible. Also, just for grins, I'm going to employ MAC-address filtering; even though that's not really a serious form of security in and of itself, it does at least throw one more time-wasting stumbling block at anyone trying to break in wirelessly. Any wireless client devices that are unable to play by these rules are simply deemed "unsupported" on the WLAN that's being rammed down my throat.

Wireless will just have to be a "managed risk" that I must deal with.

Yeehaw :-/
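For reference, a client-side setup along the lines described above might look roughly like the following wpa_supplicant fragment. This is a sketch only: the SSID, identity, and certificate paths are placeholders, and the MAC filtering mentioned above would live on the AP or controller, not here.

    # Hypothetical wpa_supplicant.conf fragment: WPA2/AES with
    # certificate-based EAP-TLS authentication.
    # SSID, identity, and certificate paths are placeholders.
    network={
        ssid="example-govt-wlan"
        proto=RSN
        key_mgmt=WPA-EAP
        pairwise=CCMP
        eap=TLS
        identity="user@agency.example"
        ca_cert="/etc/certs/ca.pem"
        client_cert="/etc/certs/client.pem"
        private_key="/etc/certs/client.key"
    }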

Quit now (1)

Travoltus (110240) | more than 7 years ago | (#19918895)

before you get caught up in a security breach scandal and the orders they gave you to implement wireless networking get sealed up in one of Dubya's supersecret war on terror files.

Polish up your resume and quit now.

Really, I'm not kidding.

Nope, they pay me too well. (0)

Anonymous Coward | more than 7 years ago | (#19919187)

I work at a state govt outfit, not federal, and they pay me very well... equal to, or slightly better than, what a private company would, and the benefits package is unbeatable. My job is also virtually guaranteed until I retire, and any wireless security breach will not be held against me personally, especially since the directive to install the wireless, and my protests about the security issues, are well documented in memos that are now official records. I will be able to say "I told you so" if a wireless breach happens.

The point I was trying to make, I guess, is that there's not really such a thing as true security in any kind of network anymore, except perhaps one that is unplugged and powered off. It's all a system of "managed risks" now. That's just a part of life in this IT business.

Silly Fears. (1)

twitter (104583) | more than 7 years ago | (#19919211)

there's no way in the h to the e to the double hockey sticks that I'd ever enable any kind of wireless anything in our data center. ... my boss's business is about minimizing risk, and wireless networks even inside a data center is not minimizing risk.

Your network is on the internet. That and any non-free software you have are bigger threats than sftp over wireless.

Re:Silly Fears. (1)

Macthorpe (960048) | more than 7 years ago | (#19919647)

That and any non free software you have are bigger threats than sftp over wireless.
Going to cite anything for this one? Or am I going to be left waiting like I have with this gem? [slashdot.org]

Re:I am a data center manager (1)

ffejie (779512) | more than 7 years ago | (#19920161)

Agreed, but it's less about security and more about speed and troubleshooting, I would think. Sure, my home datacenter (a NAS and an Xbox 360) might like to use wireless, but tell that to a guy trying to get 10-40 Gbps out of his servers. I don't think 15 Gbps is going to do it across his datacenter.

FTFA (2, Interesting)

SighKoPath (956085) | more than 7 years ago | (#19917617)

Pinel is quick to point out that a multi-gigabit wireless system would present no health concerns as the transmitted power is extremely low, in the vicinity of 10 milliwatts or less and the 60 GHz frequency is stopped by human skin and cannot penetrate the body. The team admits that the fact that multi-gigabit transmission is easily stopped means that line-of-sight is essential, and this could be a stumbling block in practical settings.
Doesn't this make it being wireless kinda pointless? It's like a wired connection where you can't step over the cable or drill a hole through the wall!

Re:FTFA (1)

MontyApollo (849862) | more than 7 years ago | (#19917805)

You could wire a transmitter/access point into every room near the lights. You would still have to wire indoors, but you would have untethered movement.

Re:FTFA (1)

MontyApollo (849862) | more than 7 years ago | (#19917999)

Or you could just put relays in line of sight of one another. You wouldn't need too many if each device also could relay.

Re:FTFA (3, Insightful)

Moby Cock (771358) | more than 7 years ago | (#19917807)

Useless? No. But very application specific. However, there is a great appeal in making Personal Area Networks.

That and being able to connect a DVD player to a TV without a cable would be, in a purely geek way, quite elegant.

Re:FTFA (1)

gatzke (2977) | more than 7 years ago | (#19919661)


Until someone turns on a microwave.

Or you live in an apartment and your n nearest neighbors compete for bandwidth.

Or somebody nukes us and the EMP keeps you from watching American Idol.

Re:FTFA (0)

Anonymous Coward | more than 7 years ago | (#19917969)

If these are cheap, I would love to have a couple of tiny APs mounted on the ceiling in each room, running on power-over-Ethernet. For the laptop itself, low power = longer battery life.

Re:FTFA (1)

cpaalman (696554) | more than 7 years ago | (#19918767)

Wireless does not imply distance. Sitting next to my TV are a TiVo, stereo receiver, CD player, DVD player, VHS player, PlayStation 2, Wii, a cable for a video camera, a digital camera... and they all sit within inches of each other. If I could de-clutter the cables to make all this work, maybe I would not have to create a Visio diagram of the cables at my wife's insistence. She's planning my early demise, but having to deal with the cable clutter keeps me around a little bit longer. Five feet of range should do just fine for me, thanks.

Re:FTFA (1)

ityllux (853334) | more than 7 years ago | (#19918887)

...or it's like a snail mail connection where you can't send messages as letters from the Post Office!

Just because it has limitations relative to another method doesn't mean it's pointless -- it just has different uses. And as was pointed out in a recent Scientific American article here [sciam.com], the line-of-sight requirement is probably OK in office settings as long as your signals can bounce around the room.

Remote display and input (3, Interesting)

MontyApollo (849862) | more than 7 years ago | (#19917631)

Could this kind of bandwidth run a remote display?

I always thought it would be cool to have a pad that was nothing more than a screen and input device that you could carry around the home instead of a full-fledged laptop. You would be actually "running" your powerful desktop off basically a second screen that you could carry around with you in the house.

Re:Remote display and input (1)

Moby Cock (771358) | more than 7 years ago | (#19917831)

Didn't Capt Picard have one of those?

Re:Remote display and input (0)

Anonymous Coward | more than 7 years ago | (#19918221)

You mean like this [viewsonic.com] ?

Re:Remote display and input (2, Informative)

Yosho (135835) | more than 7 years ago | (#19919049)

Well, let's do some math. Let's say we've got a 1680x1050 display at 24 bpp and an update rate of 60 Hz. That's 1680*1050*24*60 bits per second -- in other words, about 2.5 Gbps. So, yes, a connection like this could conceivably run a remote display.
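Spelling the arithmetic out (uncompressed RGB assumed, with no blanking intervals or protocol overhead counted):

    # Raw bandwidth of an uncompressed display stream.
    width, height = 1680, 1050
    bits_per_pixel = 24
    refresh_hz = 60

    bps = width * height * bits_per_pixel * refresh_hz
    print(f"{bps:,} bit/s = {bps / 1e9:.2f} Gbps ({bps / 2**30:.2f} Gibit/s)")
    # 2,540,160,000 bit/s: ~2.54 Gbps decimal, ~2.37 in binary gigabits.
    # Either way it fits comfortably inside the 15 Gbps quoted at 1 meter.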

This shit means nothing... (0)

Anonymous Coward | more than 7 years ago | (#19917647)

...if the server sending that mp3 down is doing so at 5KB/sec. Seriously I still can't even max out my year 2000 Linksys router.

Great! (1)

MightyMartian (840721) | more than 7 years ago | (#19917691)

You'll be able to watch pr0n through your neighbors open wireless network *and* fry up a steak by positioning the frying pan between the access point and your notebook. Don't worry, the sunburn should fade in a few weeks.

Not for the data center (3, Insightful)

nincehelser (935936) | more than 7 years ago | (#19917707)

I can't see any real application for this in a data center. They'll always use wires, switches, and routers. One simple reason is that one bad wireless transmitter could jam a whole bunch of nearby servers, which probably wouldn't be good. Wires have their uses. Sometimes it's good to keep your data flow contained and controlled.

Re:Not for the data center (1)

suggsjc (726146) | more than 7 years ago | (#19920253)

Don't get me wrong, I don't see this happening any time soon. But to go so far as to say words like "always" or "never" is just begging your foot to be inserted into your mouth at least sometime down the road.

They'll always use wires, switches, and routers.
Well, two out of the three things you just mentioned (and conceivably even the wires too) are subject to failure. So just because a physical connection makes you feel all warm and fuzzy inside (and rightly so, given the current state of wireless tech) doesn't mean we can't look to alternatives for the future.

It is closed-mindedness like this that can keep good tech from even having a chance. You really don't have to defend your stance with what is currently available, but to say that nothing will ever be good enough to replace those good old-fashioned, tried-and-tested wires is simply ludicrous.

Re:Not for the data center (1)

bob_herrick (784633) | more than 7 years ago | (#19920635)

You really don't have to defend your stance from what is currently available, but to say that nothing will ever be good enough to replace those good old fashioned, tried and tested wires is simply ludicrous.
That buggy out there was good enough for grandad, and, consarn it, it is good enough for me! Now where did that horse get off to?

Still some teething troubles... (0)

Anonymous Coward | more than 7 years ago | (#19917713)

data-transfer rates of 15 gigabits per second (Gbps) at a distance of 1 meter

Three years away because it will currently roast a baby at the same distance.

ubiqutous, multi gigabit pornography (3, Funny)

Anonymous Coward | more than 7 years ago | (#19917735)

great. now ill never have a reason to meet girls

2 ways to increase thruput (4, Informative)

ookabooka (731013) | more than 7 years ago | (#19917793)

There are two ways to increase the amount of data that can be sent: increase the carrier frequency or increase the bandwidth. What these people have done is increase the carrier frequency. Wireless today runs at 2.4 GHz; these devices run up to 60 GHz. What does that mean? Well, it'll take more energy (higher frequency means higher energy), and it attenuates more, meaning shorter range. Not only that, but it will be more readily absorbed by things like bricks, desks, your foot, etc.

The alternative to this is to increase bandwidth, say using 2.1 GHz through 2.6 GHz for one signal. The obvious downside is that you can't run many concurrent streams.
All in all, wireless data transfer has a very real ceiling on the amount of data that can be transferred: lower frequency means longer range and the ability to go through obstacles, at the cost of reduced data-carrying capacity. I guess the point of this post is that there is only so far we can go with wireless data transfer. I don't think it will be able to keep up (over the long run) with the increasing size of traffic and remain a viable alternative to cables for things like computer networking. Anyone have any thoughts on this?

Re:2 ways to increase thruput (1)

Moby Cock (771358) | more than 7 years ago | (#19917945)

The use of multiplexing codes has not been fully exploited, yet. MIMO and others are used extensively in cellular networks (which are, let's face it, wireless networks too) but are less common in 802.11 and similar networks.

Perhaps the next generation of wireless will include UWB/CDMA based transmission.

Re:2 ways to increase thruput (1)

ookabooka (731013) | more than 7 years ago | (#19918175)

Even with multiplexing there is still a very real limit to the throughput at a given frequency. I suppose my point is that there are clever ways to allocate bandwidth to users depending on how much they need, or to combine a bunch of frequencies to get the throughput you need, but it just isn't realistic to think that one day everything can be wireless, sending movies to and from each other no problem. Basically, with wires you can do intelligent switching, but wireless requires you to broadcast and take up the whole frequency. Wires also have a much higher ceiling. The more stuff we try to make wireless, the more problems there are going to be. Also, if you RTFA, you see that this thing basically requires line of sight anyway... I love my wireless keyboard and laptop with WiFi and my cell phone, but I don't think it's necessary to make everything wireless; let's leave the airwaves to the things that really need them.

Disclaimer: I am a ham radio operator :-D

2.45 GHZ, modulate power (1, Funny)

Anonymous Coward | more than 7 years ago | (#19918055)

You can run at 2.45 GHz, and instead of keeping a constant power of a few milliwatts, say, modulate the power output from, you know, 1000 watts to 1.21 gigawatts; you can use the resulting modulation to carry more information per wave. This would be really hot new technology, and really start the economy cookin'.

Re:2 ways to increase thruput (1)

Detritus (11846) | more than 7 years ago | (#19918191)

The relevant parameters are bandwidth and signal-to-noise ratio, not the carrier frequency. See the Shannon-Hartley theorem [wikipedia.org] for details.

Re:2 ways to increase thruput (3, Informative)

rcw-work (30090) | more than 7 years ago | (#19918601)

There are 2 ways to increase the amount of data that can be sent.

There are actually four:

  • Increase the signal strength (using a directional antenna or amplifier)
  • Decrease noise (use higher-quality components, shut off interfering transmitters, use directional antennas)
  • Increase the signal bandwidth
  • Increase signal spectral efficiency (for example use OFDM instead of FSK)

Changing the carrier frequency has no effect, except that there's more room for higher-bandwidth signals at higher frequencies. 2.400-2.422 GHz seems like a smaller chunk than 400-422 MHz, but it can carry the same data.

The formula for how many bits you can send and receive error-free is the Shannon-Hartley theorem [wikipedia.org] , and spectral efficiency is typically stated as a percentage of the theoretical.
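As a sanity check on the published numbers, here's the Shannon-Hartley ceiling for a hypothetical 2 GHz-wide channel at a guessed 20 dB SNR. Both figures are illustrative assumptions, not numbers from TFA; there is simply a lot of spectrum free around 60 GHz.

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_db):
        """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Illustrative guesses: a 2 GHz channel at 20 dB signal-to-noise.
    c = shannon_capacity_bps(2e9, 20)
    print(f"{c / 1e9:.1f} Gbps theoretical ceiling")  # ~13.3 Gbps
    # Which is in the same ballpark as the 15 Gbps reported at 1 meter.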

Re:2 ways to increase thruput (1)

ookabooka (731013) | more than 7 years ago | (#19920031)

Yeah, I was going to include that in my post, but it didn't seem relevant, as I see improving signal-to-noise (directional antennas, shutting off other equipment nearby) as "cheating". For example, assuming my cellphone has the best electronics available, how can I increase the signal-to-noise ratio? Standing at the focal point of a dish aimed at the nearest tower? I just sort of assumed the S/N ratio is going to be essentially constant, or probably worse, in the future. You could increase the number of symbols (OFDM vs. FSK), but if the signal-to-noise ratio is essentially constant (which was my assumption) then you really can't go further; all that's left is more bandwidth (going up in frequency).

Nonetheless your point is quite valid, my post was merely incomplete.

Re:2 ways to increase thruput (0)

Anonymous Coward | more than 7 years ago | (#19919367)

Not quite true. For a given channel, only the bandwidth and signal-to-noise ratio matter, not (directly) the frequency. This is the Shannon-Hartley theorem.

But it is true that more spectrum is available at high frequencies, enabling the use of larger bandwidths. So in practice there can be a correlation between frequency and throughput; it's just indirect.

Now tin foil hat actual function performing! (1)

The Media Mechanic (1084283) | more than 7 years ago | (#19917877)

Tin foil hat act like antenna and capture all of multi-gigabit signal and route all of data direct to cerebral cortex, where corpus medula hippocampus cerebellum act like giant "Google" and put all "byte" into main storage. some time Often cause all of sound to "ears" like bad technical translation Chinese goto English, like bad video game of the cheap PC accessories. when All of signal "Scramble" brainwave, error message to help tech support gets to you responding quickly. Zipping all of signs to Brain cause core dump

Interesting technology (3, Insightful)

retro128 (318602) | more than 7 years ago | (#19917891)

This technology could be used in applications beyond strict data transfer. 15 Gbps should be fast enough to drive a display as well. The proverbial rat's nest behind your computer could completely disappear with this technology. Keyboards, mice, displays, network: just about any cable plugged into the back of your computer could be replaced with wireless this fast.

But if only it were so simple. Of course now the problem we have is with security. Never mind TEMPEST [wikipedia.org]. If you had a big enough antenna and could decrypt (it IS encrypted... heavily... right?) the datastream emanating from this technology at a distance, you could see the display, keystrokes, data transfers, everything. Obviously, strong encryption is very important - but the overhead from strong encryption will reduce the theoretical bandwidth because of the extra baggage on the packets, and increase costs significantly because of the very specialized ASICs that will likely be required to encrypt a stream at that speed. And they'd have to be standard across all devices. AND an exploit had better not be discovered in the algorithm. Then there's the issue of the 60 GHz band: a frequency that high is very unforgiving of obstructions, even at the short ranges we're talking about. If you have a metal desk, forget it. And what about jamming from computers in close proximity? What about a "l33t hax0r" with some time on his hands and an inclination to make trouble?

Re:Interesting technology (0)

Anonymous Coward | more than 7 years ago | (#19918615)

If you understood how frequencies, attenuation, and the associated data loss work, you'd realize that regardless of how big the antenna is, you'd need a repeater at least every 5 meters from these devices in order to get any recognizable signal. Plus, at these ultra-high frequencies the signal will likely be completely stopped by a window, so the only way to log keystrokes from this wireless tech is to have your antenna, regardless of size, within 5 meters or less of the keyboard itself. Even closer for things like hard drives and video signals, as they use higher frequencies for the increased data, and therefore the signal deteriorates much faster.

Besides, wireless keyboards are already here, and have been for a while, and work pretty darn well with similar limitations, only minus the ultra high freq.

This technology should be relatively secure; I mean, if a guy can get close enough to grab the wireless signal, then he can get close enough to plug in via cable anyway. This is an EXTREMELY short-range technology. It can only eliminate cables, essentially, and even then only shorter cables.

Re:Interesting technology (1)

drinkypoo (153816) | more than 7 years ago | (#19919061)

If you had a big enough antenna and you could decrypt (it IS encrypted...heavily...right?) the datastream emanating from this technology from a distance - you could see the display, keystrokes, data transfers, everything. Obviously, strong encryption is very important - But the overhead from strong encryption will reduce the theoretical bandwidth because of the extra baggage on the packets, and increase costs significantly because of the very specialized ASICs that will likely be required to encrypt a stream at that speed.

The real problem here, therefore, is one of cost. You can have as much bandwidth as you can pay for (because this is the kind of problem that responds well to parallelism), and the penalty for that parallelism need not be all that significant. You can have no encryption cheaply, but uh, yeah. Next.

I don't suppose anyone out there knows of any properties of physics that would allow for linked "random" number generating systems that were consistent? :)

Re:Interesting technology (1)

RajivSLK (398494) | more than 7 years ago | (#19919663)

The proverbial rats' nest behind your computer could completely disappear with this technology.

No, the problem you will then have is power. Everything still needs power. Keyboard, monitor, mouse etc.

Re:Interesting technology (1)

retro128 (318602) | more than 7 years ago | (#19919809)

No, the problem you will then have is power. Everything still needs power. Keyboard, monitor, mouse etc.
So they do, so they do. [gizmodo.com]

Bull (1)

Conspiracy_Of_Doves (236787) | more than 7 years ago | (#19917963)

It might be commercially 'possible' in a few years, and I'm sure that countries other than the US will even have it, but the US ISP monopolies will never make it available.

I think the summary went off the deep end.. (2, Insightful)

Kjella (173770) | more than 7 years ago | (#19918111)

...when it said wireless in the data center. Yes, I've heard the theoretical figures for wi-fi. Try dropping a bunch of access points and various clients in tight proximity and see what it's really like. In a datacenter you can run ten 10 Gbps wires right next to each other without problems. Can you do that with wireless? Hell no. I imagine the speeds quoted are ideal, with free line-of-sight and no interference; good luck achieving that in that bunch of wires. Personally, I was fed up with wireless when I realized one AP couldn't even cover the ground floor of my parents' house. It'd take probably three to cover the whole house. Great... not.

How is this really all that useful? (1)

Starteck81 (917280) | more than 7 years ago | (#19918207)

I fail to see how this would be useful in anything but a few specialized applications. Most of the time, if you need that kind of speed, you're not moving around all that much.

Unless there is going to be a huge increase in available bandwidth in the home ISP market, I can't see how having that kind of speed would be useful for the average user. Even the fastest connection that could be considered widely available is Verizon's FiOS, and that's only 50 Mbps.

The only thing I can think of that would make this useful is a directional antenna setup to link two sites together.

Why this is partially a crock (1)

postbigbang (761081) | more than 7 years ago | (#19918269)

The inverse-square law at 60 GHz means that even if the spectrum were available (it's not), you'd need both line of sight (reflections won't help, and will slow the data rate considerably) and the will to gulp content that fast. Of course, a shared fixture like a WiFi access point suffers from duty-cycle problems, and raw bandwidth will help. But we could also use spread-spectrum and/or advanced coding techniques like n-pole modulation to accomplish the same thing.

Therefore, with all due respect to the geeks in Georgia, this is like saying, in some breathless sort of way: hey, wireless is going to be way faster!!!!!

No duh. Now jump over the obstacles. There are huge numbers of them, and only the surface ones are seemingly scratched here.
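The spreading loss described above is usually expressed as free-space path loss, and it is indeed brutal at 60 GHz. A quick sketch of the standard formula (the distances, the 2.4 GHz comparison point, and the oxygen-absorption figure are illustrative, not from TFA):

    import math

    C = 3e8  # speed of light, m/s

    def fspl_db(distance_m, freq_hz):
        """Free-space path loss: 20*log10(4*pi*d*f/c), in dB."""
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

    for f, label in ((2.4e9, "2.4 GHz"), (60e9, "60 GHz")):
        print(f"{label}: {fspl_db(1, f):.0f} dB at 1 m, "
              f"{fspl_db(10, f):.0f} dB at 10 m")
    # 60 GHz loses ~28 dB more than 2.4 GHz at any given distance, before
    # even counting oxygen absorption (roughly 15 dB/km extra in this band).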

Humbug (1)

kaaona (252061) | more than 7 years ago | (#19918285)

I don't mean to be a wet blanket, but all of the advantages of the latest whiz-bang technology don't amount to a bucket of warm spit unless and until the major carriers adopt it. If I live to be a hundred, I'll never see Gigabit data service where I live in the St. Louis MetroEast area of Illinois because no one will force our regulated monopoly (AT&T) to provide it. Until Universal Service is expanded to include broadband, and regulatory bodies set the definition of the term broadband to be 2 Mbits/sec or higher, AT&T will continue to offer only POTS and dial-up service to my established suburban neighborhood.

Not terribly relevant technology (1)

tomkost (944194) | more than 7 years ago | (#19918331)

OK, so I can theoretically get 5-15 Gbps at 5 to 1 meters. That's not an incredibly useful distance for most people. Also, the broadband connection to the home is currently ~1000 times slower. I don't personally see the need for much higher speeds in the home than are available with 802.11n (74 Mbps typical).

I'm more concerned that we have dropped from 4th to 13th in broadband penetration. Let's get a faster pipe TO the home first.

How about using the wired bandwidth first? (1)

Harald Paulsen (621759) | more than 7 years ago | (#19918551)

The research could lead to devices such as external hard drives, laptop computers, MP-3 players, cell phones, commercial kiosks and others could transfer huge amounts of data in seconds
How about enabling my external USB drive to use the 480 Mbps available first? Or what about a NAS that can fill up 1 Gbps Ethernet? Wired isn't slow; it's just not used right.

And yet... (0)

Anonymous Coward | more than 7 years ago | (#19918825)

I have a 100 Mbps Ethernet connection that not many devices use to its full potential.

And we have 10 Gbps OC-192 cables...

Wireless is plain stupid except for laptops. I don't see the point in making everything wireless, just like I don't see Big Iron dying tomorrow.

Marketing lies as usual (1)

gweihir (88907) | more than 7 years ago | (#19919231)

First, a successful lab demonstration of multi-gigabit speeds with mass-market-capable technology is still missing. Call that at least 5 years to a real product. Then deployment. Who needs this stuff enough to deploy it immediately? Right: almost nobody. Also, the first product generation will not really be usable. Call it another 5 years to wide-scale deployment. That gives me an estimate of at least 10 years, but more likely 20 years. The 3 years are a direct lie, plain and simple.

I hope these ethically challenged scumbags get what they deserve.

Useless (1)

hcdejong (561314) | more than 7 years ago | (#19919363)

I expect the line-of-sight requirement is a dealbreaker for 'personal area network' type situations. I've got my computer underneath my desk, and all gadgets that could possibly benefit from high-speed wireless links are above the desk. Reconfiguring my desk to provide LOS for everything (including keeping the desk clean, no stacks of paper between the computer and the gadgets) would be a major PITA. I'll stick with wired connections, thank you.

High-speed wireless could be useful for 'last mile' connections, but I doubt 10+ GHz networks will take off for home or office use.

I don't think most of you understand what this is (0)

Anonymous Coward | more than 7 years ago | (#19919561)

I think a lot of people reading this are confused.

This is NOT wifi! These ultra high frequency devices are NOT for connecting to internet/lan networks!

This is a technology for replacing the wires coming out of the back of your devices: instead of driving your TV through VGA, S-Video, or HDMI, it would be wireless. Or, instead of AV cables between your VCR and cable box, they'd use a signal in the 60 GHz range to communicate wirelessly.

The ONLY way a 60ghz device is going to connect you to the internet is if someone builds an external NIC that sits on top of your computer instead of in it, and then connects via 802.11 or a network cable to the rest of your network.

The uses this technology is designed for will, so long as manufacturers aren't retarded, never require anything like an access point, and it sure as hell isn't ever going to require a router, unless of course it is designed to sit within a few feet of another device and connect wirelessly to it.

It's like replacing the cable from your hard drive to the motherboard, NOT the cable from your NIC to the router. A new type of fiber-optic cable will likely replace that. :)

It's basically to get rid of the growing rat's nest of cables all our devices create. Think setting your iPod on top of your computer and streaming music from it, that kind of stuff.

That's pretty much every type of example I can come up with to make it clear what this is for. The ISPs and cellular carriers will NEVER have anything to do with this; at best it will be the cell phone manufacturers who implement it for connecting to your other devices.

By the way, there is a similar tech for wirelessly transmitting power over very short distances as well, so we may soon just be setting devices near each other and they start working. Talk about making things easy!

sdfg (1)

AppahMan (992506) | more than 7 years ago | (#19919597)

So this goes from FireWire to FireAir? :)

iPhone loads webpage in less than a day? (1)

peter303 (12292) | more than 7 years ago | (#19919747)

OK, not that bad. I'm a little disappointed in AT&T's 2.5G.