
802.11ac 'Gigabit Wi-Fi' Starts To Show Potential, Limits

Soulskill posted about a year ago | from the good-for-streaming-that-4k-video-nobody-makes dept.

Wireless Networking 101

alphadogg writes "Vendor tests and very early 802.11ac customers provide a reality check on 'gigabit Wi-Fi' but also confirm much of its promise. Vendors have been testing their 11ac products for months, yielding data that show how 11ac performs and what variables can affect performance. Some of the tests are under ideal laboratory-style conditions; others involve actual or simulated production networks. Among the results: consistent 400M to 800Mbps throughput for 11ac clients in best-case situations, higher throughput as range increases compared to 11n, more clients serviced by each access point, and a boost in performance for existing 11n clients."


101 comments


802.11ac (5, Funny)

Anonymous Coward | about a year ago | (#45079683)

The anonymous coward version of wi-fi.

Re:802.11ac (-1)

Anonymous Coward | about a year ago | (#45079727)

But how frosty is its piss?

Re:802.11ac (1)

Anonymous Coward | about a year ago | (#45080115)

Does it mean there's another layer of privacy for surfing the net?

Re:802.11ac (-1)

Anonymous Coward | about a year ago | (#45080139)

The anonymous coward version of wi-fi.

does it rant about "niggers" and tadpoles and xkcd?

not exactly gigabit (0)

Anonymous Coward | about a year ago | (#45079699)

Not bad though.

Re: not exactly gigabit (-1)

Anonymous Coward | about a year ago | (#45079745)

Actually it is. You don't really get 1000Mbps on a wired gigabit network. 800 is quite good.

Re: not exactly gigabit (5, Insightful)

Marco Tedaldi (3390641) | about a year ago | (#45079915)

Actually it isn't, by far! 1. On a gigabit wired network you get 1Gbit of transfer speed. There is a very small percentage lost to coding, but you still get well over 100MB/s (up to about 120MB/s) through a Gbit connection. If you get slower speeds and don't know why, then start searching for the bottleneck! 2. The 400Mbit to 800Mbit in a WLAN is the "wire speed". I've never seen transfer rates that are more than 70% of this. So I expect to get maybe 56MB/s (which is already quite good) out of "Gbit WLAN", while I get 120MB/s out of an Ethernet connection almost all the time. Still impressive how they even reach such speeds! That's engineering at it's best!
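The parent's arithmetic can be sketched in a few lines of Python. The ~70% MAC efficiency for Wi-Fi and ~96% for wired GigE are the poster's rules of thumb, not spec values:

```python
def effective_mbytes_per_sec(phy_rate_mbps, efficiency=0.70):
    """Usable MB/s from a nominal link ("wire") rate in Mbit/s."""
    return phy_rate_mbps * efficiency / 8

# 800 Mbit/s 11ac link at ~70% MAC efficiency vs. wired GigE at ~96%:
wifi = effective_mbytes_per_sec(800)          # ~70 MB/s
wired = effective_mbytes_per_sec(1000, 0.96)  # ~120 MB/s
print(wifi, wired)
```

Which is roughly the gap the poster describes between "gigabit" Wi-Fi and an actual gigabit Ethernet link.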

Re: not exactly gigabit (0, Funny)

Anonymous Coward | about a year ago | (#45080173)

That's engineering at it's best!

"That's engineering at it is best" ... what?! Don't leave us hanging!

Re: not exactly gigabit (3, Interesting)

Sarten-X (1102295) | about a year ago | (#45080245)

Reaching far back to my Cisco knowledge from 2003 or so: that's because 802.11 requires a link-layer acknowledgement for every single frame, whereas wired Ethernet has no link-layer ACKs at all; acknowledgements happen up at the TCP layer, where a larger window lets several packets go out before one acknowledgement comes back. I don't know if that's still the case (perhaps a modern network engineer can confirm), but it could be the reason for seeing just about double the transfer speed through a wire: on wireless, you're using almost twice as many frames to move the same data.
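As a toy illustration of that frame-counting argument (the counts are hypothetical; real 802.11 timing adds SIFS/DIFS gaps and contention that this ignores):

```python
def frames_on_air(data_frames, ack_window):
    """Data frames plus one acknowledgement per ack_window data frames."""
    acks = -(-data_frames // ack_window)  # ceiling division
    return data_frames + acks

stop_and_wait = frames_on_air(1000, ack_window=1)  # ACK every frame, 802.11-style
windowed = frames_on_air(1000, ack_window=64)      # one ACK per window, TCP-style
print(stop_and_wait, windowed)                     # 2000 vs. 1016 frames
```

Per-frame acknowledgement nearly doubles the frame count on the medium, matching the "almost twice as many frames" intuition above.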

Re: not exactly gigabit (1)

Anonymous Coward | about a year ago | (#45080763)

AFAIK, the Wi-Fi standard allows you to accumulate bulk transfers, similar to TCP's Nagle algorithm, before they are ACK'd. I believe this dates back to 802.11g, because they (the Wi-Fi Alliance) realized that the per-frame overhead for Wi-Fi is more expensive in certain cases.

Re: not exactly gigabit (2)

MobyDisk (75490) | about a year ago | (#45081393)

That's not the Nagle algorithm. Nagle is about delaying before sending packets, so that lots of small packets are coalesced into one big one.

Re: not exactly gigabit (0)

amorsen (7485) | about a year ago | (#45082737)

Wi-Fi is half duplex; Ethernet is full duplex. ACK packets are not a problem for speed tests on Ethernet, because they flow in the otherwise-unused 1Gbps return path, whereas with Wi-Fi they steal bandwidth from the useful traffic. It gets worse: Wi-Fi has a non-zero turnaround time, so every time the client has to ACK packets, it leaves the spectrum empty for a moment.
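That turnaround cost is easy to model crudely. All the microsecond figures below are made up for illustration, not taken from the 802.11 timing tables:

```python
def half_duplex_efficiency(data_us, ack_us, turnaround_us):
    """Fraction of air time carrying data in one data/ACK cycle,
    with a turnaround gap at each direction change."""
    return data_us / (data_us + ack_us + 2 * turnaround_us)

# e.g. a 200us data frame, 30us ACK, 10us turnaround each way:
print(half_duplex_efficiency(200, 30, 10))  # 0.8: a fifth of the air time is lost
```

On full-duplex Ethernet the ACK and turnaround terms simply don't compete with the data direction, so the equivalent figure stays near 1.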

Re: not exactly gigabit (3, Funny)

coofercat (719737) | about a year ago | (#45080383)

To get those higher speeds outside the lab, you'll need some wifi spray [j-walk.com]

Re: not exactly gigabit (0)

Anonymous Coward | about a year ago | (#45081791)

I am getting 25-32MB per sec on 802.11n with an ASUS AC66U. I don't even have any AC devices yet. That is with two bonded channels at 300Mbps on 5GHz. I get about 15-20MB/s in the 2.4GHz range with the same setup.

Now if I could just keep the router from flaking out and deciding everything should be 7MB/s, and having to reset the thing to get back to good rates...

It took quite a bit of experimenting with different channels and settings to get to that point though. 'Auto' does not work at all in picking good rates.

I want longer range, not faster speeds. (0)

Anonymous Coward | about a year ago | (#45083673)

I want to be able to pull into any McDonalds parking lot and always have my laptop get a solid lock on the Wi-Fi -- as opposed to the current situation, which is rather hit-or-miss.

Re: not exactly gigabit (0)

Anonymous Coward | about a year ago | (#45087481)

My 802.11ac is currently running at 1167Mbps (to a media extender). The box claims that it will do something like 1450Mbps, but I haven't seen it.

Re: not exactly gigabit (1)

Anonymous Coward | about a year ago | (#45080289)

I get 114MB/s over SMB with non-jumbo frames. That is 912Mbps of effective bandwidth. I get about 996Mbps of raw bandwidth when doing an iperf test. 800Mbps on a 1Gbps wired network is HORRIBLE.
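The unit conversion behind those figures, for anyone mixing up MB/s (payload) with Mbit/s (link rate):

```python
def mbytes_to_mbits(mb_per_s):
    """Payload MB/s to effective Mbit/s (8 bits per byte)."""
    return mb_per_s * 8

print(mbytes_to_mbits(114))  # 912, matching the SMB figure above
```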

Re: not exactly gigabit (1)

Mark of the North (19760) | about a year ago | (#45080855)

I can echo this, almost exactly.

Re: not exactly gigabit (1)

zlives (2009072) | about a year ago | (#45083793)

yes but its only half duplex.. o wait

Re: not exactly gigabit (0)

Anonymous Coward | about a year ago | (#45083439)

I get 122MB/s through my Gigabit network (7 workstations, 3 laptops, 2 access points, 4 tablets, 2 smartphones and one NAS).

Of course if I use some crappy Samsung tablets, they get only 1.7MB/s at best, though even 802.11g should give better results of 4-5MB/s. When using an 802.11n-capable router I can get 15-18MB/s with laptops.

Still, it is worth having workstations and stationary laptops cabled to the NAS when working with big data, as over 100MB/s is great when you need to share it.

Ideal situations (2, Interesting)

Anonymous Coward | about a year ago | (#45079747)

"Among (sic) the results: consistent 400M (sic) to 800Mbps throughput for 11ac clients in best-case situations"

Best case being: the only device on the network; inside a Faraday cage; on the dark side of the Moon; 3 centimetres away from the antenna.

BTW: Google.... fuck your dictionary. It IS centimetres.

Re:Ideal situations (-1)

Anonymous Coward | about a year ago | (#45079897)

If you want it to say centimetre, live in a country that contributes anything whatsoever to the internet.

Re:Ideal situations (2)

neokushan (932374) | about a year ago | (#45079975)

The world wide web was invented by a Brit, you ignorant twat.

http://en.wikipedia.org/wiki/Tim_Berners-Lee [wikipedia.org]

Re:Ideal situations (0)

Anonymous Coward | about a year ago | (#45080011)

Weirdly, the parent post appears to have been implying that the US is not "a country that contributes anything whatsoever to the internet" (i.e. it reads as having a go at the Yanks rather than the Brits, but I suspect that's not what they meant!).

http://en.wikipedia.org/wiki/Metre [wikipedia.org]

The metre (International spelling as used by the International Bureau of Weights and Measures) or meter (American spelling)

Re:Ideal situations (0)

Anonymous Coward | about a year ago | (#45080203)

The biggest contribution to the world by the Yanks is the rhetoric employed in... social, industrial, and technological recidivism.

The UK does not need a billion-pound 4G mobile network when a good Wi-Fi network exists. Tracking?

Also, better not to insert a Bezeqint SIM card in a device on a Wi-Fi network... for reasons not yet understood!

Re:Ideal situations (1)

Russ1642 (1087959) | about a year ago | (#45083145)

metre - correct spelling
meter - incorrect spelling

Re:Ideal situations (1)

neokushan (932374) | about a year ago | (#45083951)

Ah yes, I can see how that could be read differently. I fear I may have jumped the gun on this one (apologies to the above AC if that was your intention all along).

Re:Ideal situations (0)

Anonymous Coward | about a year ago | (#45080693)

The world wide web was invented by a Brit, you ignorant twat.

http://en.wikipedia.org/wiki/Tim_Berners-Lee [wikipedia.org]

http://en.wikipedia.org/wiki/Robert_Cailliau

Re:Ideal situations (1)

517714 (762276) | about a year ago | (#45083477)

He said, "Internet," which predates Berners-Lee's contributions by fifteen years [w3.org]. And we all know Al Gore invented it! Although some maintain it was Vint Cerf and Bob Kahn.

Re:Ideal situations (1)

neokushan (932374) | about a year ago | (#45083935)

Yes and I suppose when he said "contributes", he wasn't talking about anything like the most powerful economic and social tool ever invented.

Re:Ideal situations (0)

sandman_eh (620148) | about a year ago | (#45080061)

If you want to know how to spell "metre", perhaps you should adopt the spelling of a country which uses the metric system.

Re:Ideal situations (-1)

Anonymous Coward | about a year ago | (#45080161)

If you want to know how to spell "metre", perhaps you should adopt the spelling of a country which uses the metric system.

Wanting to argue about shit like this is why the jocks fucked the whole cheerleading squad while you fucked your hand for so many Friday nights.

Re:Ideal situations (0)

Anonymous Coward | about a year ago | (#45080465)

I dunno, america's world contributions are now overshadowed completely by the amount of money it owes everyone.

Re:Ideal situations (1)

MightyYar (622222) | about a year ago | (#45081897)

Wait 'till we default. You're coming along for the ride with us... yeeeehaw!

Re:Ideal situations (-1)

Anonymous Coward | about a year ago | (#45080001)

It is not centimetre. Maybe Centimètre, but not without that accented e.
If you want google to correct it, change your language settings to French.

If you want to write a post in English, then the correct spelling is Centimeter.

Re:Ideal situations (1)

jawtheshark (198669) | about a year ago | (#45080033)

"centimeter" is American English. In British English it is indeed "centimetre". If the original poster had bothered to install both dictionaries (which I do, just to avoid this stuff), he would have found out the reason. So, I'll keep writing "centimetre", just like I keep writing "colour", "honour", "programme" and "centralization".

Re:Ideal situations (1)

jenningsthecat (1525947) | about a year ago | (#45080147)

"centimeter" is American English. In British English it is indeed "centimetre". If the original poster had bothered to install both dictionaries (which I do, just to avoid this stuff), he would have found out the reason. So, I'll keep writing "centimetre", just like I keep writing "colour", "honour", "programme" and "centralization".

In Canadian English also, it is "centimetre", although our version of English is a mongrel mix of British and American spellings. We too tend to insist on '-our' endings, yet we usually spell "program" the American way. For me, the most jarring Americanisms are referring to a negotiable instrument as a "check", and the words "nite", "lite", and "thru".

IIRC, "centralization" is the only correct spelling in the States, whereas both spellings are acceptable in the UK, with "centralisation" being the historical favourite.

Re:Ideal situations (1)

Anonymous Coward | about a year ago | (#45080201)

"nite", "lite", and "thru" are slangy colloquial spellings. You usually only see them in advertising. If someone sent me an email with any of those words spelled like that, I'd think they were an idiot.

You are correct about "centrali[sz]ation".

Re:Ideal situations (1)

jawtheshark (198669) | about a year ago | (#45080221)

Thanks for telling me about centralization. I always took the one that felt most wrong to me (I'm obviously not a native speaker). Good to know I can use the one that feels right.

Re:Ideal situations (-1)

Anonymous Coward | about a year ago | (#45080701)

You people really need help.

Re:Ideal situations (1)

mcgrew (92797) | about a year ago | (#45082617)

For me, the most jarring Americanisms are referring to a negotiable instrument as a "check", and the words "nite", "lite", and "thru".

Illinoisan here. "Nite, lite, and thru" are very new Americanisms used only by the young and ignorant, coined by marketers (Miller Lite), and they annoy me, "thru" being the worst. "I thru the ball thru the hoop"? Fucking illiteracy. Threw the ball through the hoop, damn it. Marketers are abysmally ignorant of language; I've seen "carline" in auto commercials. It's pathetic.

Re:Ideal situations (0)

Anonymous Coward | about a year ago | (#45085033)

"I thru the ball thru the hoop"? Fucking illiteracy.

I've never seen someone intentionally replace threw with thru. I use thru as an abbreviated form of through in personal correspondence or when taking notes, which seems pretty common.

Re:Ideal situations (0)

Anonymous Coward | about a year ago | (#45080585)

What? No SI? CIPD states that the correct invocation is centimeter.

Re:Ideal situations (0)

h4rr4r (612664) | about a year ago | (#45080867)

Look at that the brits surrendering to the french, will wonders never cease?

Re:Ideal situations (1)

WillAffleckUW (858324) | about a year ago | (#45083845)

There are subtle variations between British English, Canadian English, Australian English, and the bastardized incorrectly spelled American English.

In most of those, yard is correctly spelled either meter or metre.

Re:Ideal situations (1)

johnw (3725) | about a year ago | (#45081373)

If you want to write a post in American English, then the correct spelling is Centimeter.

FTFY

If you want to write a post in English English, then the correct spelling is Centimetre.

HTH

Re:Ideal situations (2)

ColdWetDog (752185) | about a year ago | (#45082259)

If you want to have a long pointless argument about spelling or grammar, you want to post on Slashdot.

Stability? (0)

Anonymous Coward | about a year ago | (#45079799)

When I switched from g to n (same router, same clients, just activated n mode), pings doubled, signal strength dropped by 10%, and while the bandwidth is higher, so is the number of random disconnects.
At 2 metres' distance from the AP.

How's ac performing in that regard?

Re:Stability? (1)

Anonymous Coward | about a year ago | (#45079909)

Irrelevant. Your experience with N was likely a bad router or user error. Whatever the reason, your results are extremely atypical. It would be irresponsible to compare any sort of standard to a weird oddball experience.

Seriously, if you're experiencing ANY "random disconnects", it's time to update the firmware or flat out get a new router. That should be your first clue that something is wrong with your setup.

Re:Stability? (1)

skids (119237) | about a year ago | (#45081737)

I would not call those results atypical. Signal strength will drop, though in many cases 5GHz will be cleaner in the first place so it makes up for it in quality.

But the client behavior when presented with multiple APs on both 2.4GHz and 5GHz, and when presented with multiple APs some of which are N-capable and some just a/b/g, is generally abysmal. We have lots of clients that students bring from their simple single-AP 2.4GHz home networks that just cannot cut it in a WPA-Enterprise environment with lots of infrastructure APs around. They jump around between APs constantly, often choose APs based on mysterious metrics (probably the worst choice of the available APs), and very often the worst of them manage to trigger themselves to re-ask for credentials despite being told to remember them. I don't know how that got into their codebase, but we've got several users that get constant credential popups. To top it off, the UI on the devices has been dumbed down to the point where there is no user-level control for selecting preferred BSSIDs or tweaking any parameters whatsoever. Most cannot even tell the user what BSSID they are currently connected to.

I'm glad 11ac is going to force device manufacturers to start putting 5GHz antennas in again, but anyone running an enterprise WLAN would be well advised to increase their AP density to full 5GHz coverage, drastically reduce the TX power on their 2.4GHz radios so they look quieter than the 5GHz radios, and not rely on the devices falling back to b/g/n on 2.4GHz reliably. Wi-Fi driver software is apparently written by companies that have invested zero into recreating real-world "BYOD" scenarios for QA purposes.

Re:Stability? (3, Insightful)

_merlin (160982) | about a year ago | (#45079985)

2.4GHz is far too crowded. Switch to 5GHz and you should see an improvement, particularly if you're in the same room as the AP.

Re:Stability? (3, Informative)

wagnerrp (1305589) | about a year ago | (#45080557)

Switch to 5GHz and you should see an improvement

Combined with further reduction in range. With an ASUS N56U, in the middle of nowhere with no interference, 2.4GHz becomes unreliable at around 700ft. 5GHz drops out somewhere around 450ft.

Re:Stability? (1)

amorsen (7485) | about a year ago | (#45082799)

If you are trying to reach an access point 150m away, you are not in a densely populated area. 2.4GHz will work fine for you without interference. The lower range of 5GHz is an advantage, it helps ensure that the band has low interference.

(Although I am sure it will get crowded soon. 60GHz, here we come...)

Re:Stability? (1)

wagnerrp (1305589) | about a year ago | (#45083433)

In a warehouse, actually. The only "interference" is from trying to get a signal through concrete and steel spaceframe. We brought an access point to connect to the equipment we were installing wirelessly, until the customer could get around to installing their own wireless infrastructure. It was a dual-band access point, and the 2.4GHz signal was significantly higher performance at range, for obvious reasons. Strangely enough, when the customer did install their infrastructure, it was 802.11a.

Re:Stability? (1)

denis-The-menace (471988) | about a year ago | (#45081761)

I bet you have a NetGear WNDR3800 with Stock firmware.

Nothing but disconnects and reboots. Especially with N.

Netgear's answer: Discontinue the product.

Now try it in urban neighborhood (1, Interesting)

Anonymous Coward | about a year ago | (#45079823)

With 20+ APs contending for their own slice of a half or a third of the 5GHz band. 802.11ac took the best feature of the 5GHz band, plenty of non-overlapping channels, then turned it back into the quagmire of the 2.4GHz band by allowing 80 and 160MHz spectrum usage. Of course, the router manufacturers are going to enable 160MHz by default even when everyone in the neighborhood is on a 25Mbps cable modem connection.

Re:Now try it in urban neighborhood (5, Interesting)

heypete (60671) | about a year ago | (#45079999)

Fortunately 5GHz penetrates walls very poorly. I have a 6cm thick concrete interior wall (I'm in Switzerland, after all; they love concrete) that separates two rooms. The 5GHz signal in the room without an AP is so bad that my network card (a PCI-Express card for a desktop with three external antennas) essentially refuses to connect. 2.4GHz works fine. This is in an area with exactly zero 5GHz Wi-Fi users within range, a noise floor of about -95dBm, and no other sources of interference.

Channel bonding on 5GHz makes a lot of sense due to its extremely short range.
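The frequency part of that range difference can be put in numbers with the free-space path-loss formula. Walls and rebar attenuate 5GHz far more than free space does, so this only shows the baseline penalty of the higher frequency:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 10 m path at 5 GHz vs. 2.4 GHz:
penalty = fspl_db(0.01, 5000) - fspl_db(0.01, 2400)
print(round(penalty, 1))  # ~6.4 dB extra loss just from the frequency
```

Distance cancels out of the difference, so 5GHz starts roughly 6dB down at any range before materials are even considered.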

Re:Now try it in urban neighborhood (1)

AmiMoJo (196126) | about a year ago | (#45080423)

My experience is that 5GHz penetrates several walls and the floor up to my bedroom only slightly less well than 2.4GHz. YMMV etc., but I can also see a few other people slowly getting onto 5GHz as well (fortunately all on the default channel; stupid Virgin routers).

Re:Now try it in urban neighborhood (1, Insightful)

dinfinity (2300094) | about a year ago | (#45082609)

Conclusion: your house is made of crappy materials.

Re:Now try it in urban neighborhood (1)

damaki (997243) | about a year ago | (#45080747)

Radio signal shaping is used to overcome range limitations. As seen in the French magazine Canard PC Hardware, 5GHz in *pinpoint* mode mostly performs better than raw 2.4GHz, even with walls in the way. Though if you're totally out of 5GHz range, you'll fall back to the 2.4GHz band.

Re:Now try it in urban neighborhood (1)

Anonymous Coward | about a year ago | (#45080903)

not my fault you live in a converted bomb shelter.

Re:Now try it in urban neighborhood (0)

Anonymous Coward | about a year ago | (#45081369)

I have a 6cm thick concrete interior wall (I'm in Switzerland, after all, they love concrete) that separates two rooms.

I would bet that the problem isn't the concrete per se, but the wire mesh that goes into that sort of precast concrete element. The wire mesh is like chicken wire and acts as a small Faraday cage, which screws up the connection.

Thicker concrete elements such as proper concrete slabs, which are frequently well over 15cm thick, tend to be assembled with a grid of 12mm diameter rebars spaced between 10 and 20cm. That's why you can connect to the Wi-Fi hotspot on the 7th floor of a concrete structure when you are sitting at the building's entrance.

Re:Now try it in urban neighborhood (0)

Anonymous Coward | about a year ago | (#45084155)

Fortunately 5GHz penetrates walls very poorly. I have a 6cm thick concrete interior wall (I'm in Switzerland, after all; they love concrete) that separates two rooms. The 5GHz signal in the room without an AP is so bad that my network card (a PCI-Express card for a desktop with three external antennas) essentially refuses to connect. 2.4GHz works fine. This is in an area with exactly zero 5GHz Wi-Fi users within range, a noise floor of about -95dBm, and no other sources of interference.

2.4GHz will not penetrate 6cm of concrete either.

Re:Now try it in urban neighborhood (1)

neokushan (932374) | about a year ago | (#45080003)

The speed of your internet connection is irrelevant to the speed of your home network connection. There is such a thing as a LAN.

Re:Now try it in urban neighborhood (1)

wagnerrp (1305589) | about a year ago | (#45080589)

No. The AC is saying that even though most people are only using their wireless access point for access to their internet connection, and their internet access is only 25Mbps, they will still feel the need to use an entire 160MHz swath (or be too ignorant to configure it to not use that much spectrum).

Re:Now try it in urban neighborhood (1)

jones_supa (887896) | about a year ago | (#45081375)

Many people have only internet-bound traffic and would be fine with the Wi-Fi speed being capped to the nearest matching rate.

Re:Now try it in urban neighborhood (1)

skids (119237) | about a year ago | (#45081845)

At 80MHz in the US there will be 5 non-overlapping channels. This may sound only 66% better than 3, but the topology of the packing problem makes it many, many times better than 3.

I doubt 160MHz will be in use by people that have to actually manage frequencies, except near the offices of the PHB. I could only see that becoming a problem in dense apartment buildings with many individually "administered" OTS systems -- in residential neighborhoods the 5G will be pretty much stopped by walls, at least to the point of not being strong enough to cause much interference.

wow! that's really useful to (0)

Anonymous Coward | about a year ago | (#45079895)

someone like me with a 400kbps up/down Internet connection - wowsers!

Re:wow! that's really useful to (2)

Marco Tedaldi (3390641) | about a year ago | (#45079923)

Yes. You're absolutely right. Because the only use for a WLAN is using the internet...

Re:wow! that's really useful to (0)

Anonymous Coward | about a year ago | (#45080043)

Reality suggests it's the primary use for 99.9997% of users even if it isn't the only use.

Linux Driver support (1)

Anonymous Coward | about a year ago | (#45079963)

Linux driver support for most of the 802.11ac devices is still iffy, which doesn't help.

Why not properly implement 802.11n first? (5, Interesting)

Anonymous Coward | about a year ago | (#45079995)

Hardware manufacturers, I'm pointing my finger at you. The most powerful features of 802.11n are largely unimplemented. Laptop/tablet/phone support for 3 spatial streams is about as rare as rocking horse shit. Support for even 5GHz is spotty at best, and it's hard to find out if whatever piece of hardware you want to consider buying even supports it. Heck, even 2 spatial streams at 2.4GHz is something you're lucky to get unless you spend more than $699 on a laptop. The lowest common denominator for 802.11n, and what "wireless n" Wi-Fi support really means for half the devices on the market, is a single spatial stream at 2.4GHz, which is 65Mbps max. I can buy a mid-range smartphone with 4G support and the Wi-Fi is still a single spatial stream at 2.4GHz. Hardware manufacturers have no incentive to put better implementations of 802.11n in their devices, because most customers aren't savvy enough to tell the difference and demand better. 802.11n is an old specification. There's no excuse why 2 spatial streams can't be the minimum. The silicon to do this is cheap and has been refined for many years.

802.11ac will probably suffer the same fate. The minimum implementation to get the "wireless ac" sticker on the box is going to be what half to three quarters of the devices on the market will support, even 10 years from now.
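For reference, the 65Mbps single-stream baseline mentioned above falls out of the 802.11n OFDM parameters for a 20MHz channel at the top single-stream rate (MCS 7) with the long 800ns guard interval:

```python
# 802.11n, 20 MHz channel, one spatial stream, MCS 7, 800 ns guard interval:
# 52 data subcarriers x 6 bits (64-QAM) x 5/6 coding, one symbol every 4 us.
subcarriers = 52
bits_per_subcarrier = 6  # 64-QAM
data_bits_per_symbol = subcarriers * bits_per_subcarrier * 5 // 6  # rate-5/6 code
symbol_time_us = 4.0     # 3.2 us symbol + 0.8 us guard interval

rate_mbps = data_bits_per_symbol / symbol_time_us
print(rate_mbps)  # 65.0
```

Each extra spatial stream multiplies that figure, which is exactly why single-stream-only devices leave so much of 802.11n on the table.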

Re:Why not properly implement 802.11n first? (0)

Anonymous Coward | about a year ago | (#45080069)

I'd say the trend will only worsen as long as you're looking at markets dumbed down to the common denominator, which is your internet bandwidth, sadly.

In South Korea 2 streams is considered pretty much standard and 3 streams higher end, for both routers and cellphones/tablets. 802.11g is almost unheard of, as it is a thing of the 2000s.

Ironically this is the result of Asian-style central-planning monopoly (both vendors and ISPs), as the government does the right thing: "hey iptime, samsung, kt ... we gave you a monopoly over your segments, but your stuff better be the fastest in the world or something".

Re:Why not properly implement 802.11n first? (0)

Anonymous Coward | about a year ago | (#45080123)

The notebook I bought in 2008 had 5GHz support already. Then I recently upgraded to one with a Sandy Bridge CPU, and that did not have it... Went to eBay and ordered the same old Wi-Fi card my old notebook had; it cost $8 including shipping from China. The manufacturer probably saved 10 cents on each notebook because he decided to choose the inferior type.

Re:Why not properly implement 802.11n first? (1)

fuzzyfuzzyfungus (1223518) | about a year ago | (#45080227)

The notebook I bought in 2008 had 5GHz support already. Then I recently upgraded to one with a Sandy Bridge CPU, and that did not have it... Went to eBay and ordered the same old Wi-Fi card my old notebook had; it cost $8 including shipping from China. The manufacturer probably saved 10 cents on each notebook because he decided to choose the inferior type.

One quirk in the Wi-Fi market (less noticeable in strictly consumer laptops, but more visible in business ones) was the wrinkle introduced by 802.11a more or less entirely dying a horrible death.

For a while, '802.11a/b/g' was sort of the standard boring-business-laptop Wi-Fi card to have; but (I presume) as the volume of consumer wireless increased, and with it shipments of b/g chipsets, 'a' withered. 802.11n, while theoretically doing pretty much everything 'a' did and more if the right boxes are ticked, can also be a 2.4GHz-only, incrementally-better-than-g flavor, and often is. So with the decline in setups with explicit 'a' support, you frequently lost features when b/g or b/g/n (lite edition) cards were swapped in instead.

Because 802.11a-capable APs always cost excessive amounts, it isn't a surprise that 'a' died (thanks to progress, it still tends to cost less to buy a proper 802.11n router and a client card to match than it would to do the same for 'a'); but 'a' did have the advantage of specifying a relatively high baseline, unlike n, which is... flexible.

Re:Why not properly implement 802.11n first? (5, Informative)

girlintraining (1395911) | about a year ago | (#45080125)

802.11ac will probably suffer the same fate. The minimum implementation to get the "wireless ac" sticker on the box is going to be what half to three quarters of the devices on the market will support, even 10 years from now.

Every technology will suffer the same fate. Look, the problem isn't the technology, but noise pollution. The noise floor across the whole of the RF spectrum is rising by an average of 1dB a year, and since 3dB is a doubling of power, that means every three years the 'room' gets twice as loud. Every new technology we roll out, every new device, is just another nail in that coffin. Like every other natural resource, humans just consume and consume, gorging themselves to excess until eventually there's nothing left.

In the 1930s, a single AM broadcast tower could cover most of a region in the US in the evening. Certain frequencies carried worldwide range, albeit due to the unpredictable nature of the ionosphere, you never knew just where in the world your low-power signal would land. They did this using spark-gap radios and shit with vacuum tubes in it. Today, the same feat can only be achieved with DSPs, because the noise floor has come up so much that most of the signal is trash after only a couple hundred miles.

Cell phone companies are continually trying to keep up with ever denser concentrations of towers; And it's not because of data-thirsty hipster iphones... it's because a few hundred milliwatts barely gets you across the street anymore. It's a regulatory nightmare just finding a spot for a new tower and getting it approved... and companies fall farther behind every year on meeting coverage goals.

We aren't just sucking up bandwidth on a per-frequency basis; every radio device contributes to global noise. Our RF spectrum is dying the death of a thousand papercuts. And all of this we can blame on two things: a complete lack of government coordination to share bandwidth and unify technologies using something like SDR across all wireless devices, brought on by competition among various companies to be the last man standing at the auctions; and technology able to "scream" just a little bit louder than the competition through a dizzying array of RF engineering cheats that increase effective broadcast power in a way the FCC can't penalize.

Your tax dollars at work people.
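The "1dB a year means twice as loud every three years" arithmetic checks out, since a power ratio in decibels is 10*log10(ratio) and 3dB is almost exactly a factor of two:

```python
# +1 dB/year compounds to +3 dB after three years; convert back to a power ratio.
ratio_after_3_years = 10 ** (3 / 10)
print(round(ratio_after_3_years, 2))  # 2.0, i.e. roughly double the noise power
```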

Re:Why not properly implement 802.11n first? (3, Interesting)

AmiMoJo (196126) | about a year ago | (#45080499)

It's not quite that bad. The demand for low-power devices that run a long time on batteries is actually reducing transmit power in many applications. It will take time for people to upgrade, and unfortunately certain devices like wifi routers will still be quite shouty as they advertise to non-existent 802.11b clients, but the trend is generally towards lower power and higher data rates (which mean less time with the transmitter turned on). Strategies for sharing available spectrum are also improving, from basic frequency hopping to things like directional signal shaping in 802.11ac.

We are also starting to use spectrum more efficiently. For example switching to digital TV gave us more channels in less spectrum.

Noise floor isn't necessarily all that important for modern devices either. Consider that GPS signals are actually below the thermal noise floor when received on the ground. DSPs are cheap.
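To put numbers on the GPS point (these are the standard textbook figures, not measurements of mine): thermal noise at room temperature is about -174 dBm/Hz, and the GPS L1 C/A signal arrives at roughly -130 dBm spread over about 2 MHz:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 290            # reference temperature, K
B = 2.046e6        # GPS L1 C/A null-to-null bandwidth, Hz

# Thermal noise power in dBm: 10*log10(kTB / 1 mW)
noise_dbm = 10 * math.log10(k * T * B / 1e-3)
signal_dbm = -130  # typical received GPS power at the Earth's surface

print(round(noise_dbm, 1))            # ~-110.9 dBm noise floor in that bandwidth
print(round(signal_dbm - noise_dbm))  # signal sits ~19 dB *below* the noise
# Despreading against the 1023-chip C/A code recovers ~30 dB of processing
# gain (10*log10(1023)), which is how a cheap DSP digs the signal out.
```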

The reason for all the Interference (2)

Ozoner (1406169) | about a year ago | (#45080515)

> The noise floor across the whole of the RF spectrum is rising by an average of 1 dB a year.

You are correct, but not for the reasons you discussed. If the millions of transmitters were clean and well designed, they would not cause RF interference to other users (except where they were sharing common frequencies).

The problem is that much of the electronic junk generates spurious harmonics. Plasma TVs, PCs, BPL, etc. all put out a horrendous range of broadband rubbish.

This is compounded by many manufacturers and importers ignoring the existing EMC standards, and by the corrupt regulatory bodies (FCC, etc.) turning a blind eye to the cheap plastic junk being imported.

Just one specific example: once upon a time, manufacturers used linear-mode power supplies with large transformers. In an effort to reduce costs, they have universally changed to using switch-mode supplies. These supplies are certainly cheaper, but they almost always generate much higher levels of radio interference.

There's a trade-off: Being able to buy cheap electronics means that there's a good chance you will be unable to enjoy it due to the resulting interference levels.

Re:The reason for all the Interference (2)

wagnerrp (1305589) | about a year ago | (#45081041)

In an effort to reduce costs, they have universally changed to using switch-mode supplies.

And here I thought we switched to the far more complex switched mode power supplies because linear ones have terrible efficiency and power factor.

Re:The reason for all the Interference (1)

Kaenneth (82978) | about a year ago | (#45082361)

In an effort to reduce costs, they have universally changed to using switch-mode supplies.

And here I thought we switched to the far more complex switched mode power supplies because linear ones have terrible efficiency and power factor.

power usage is a cost.

Re:The reason for all the Interference (1)

wagnerrp (1305589) | about a year ago | (#45082791)

That's a cost on the customer, not the manufacturer.

Re:Why not properly implement 802.11n first? (4, Informative)

wagnerrp (1305589) | about a year ago | (#45080653)

In the 1930s, a single AM broadcast tower could cover most of a region in the US in the evening.

That's because back in the 1930s, AM stations like WLW were operating at half a megawatt.

Re:Why not properly implement 802.11n first? (1)

fuzzyfuzzyfungus (1223518) | about a year ago | (#45080191)

I'd be more inclined to blame standards bodies (though, depending on how they are structured, that could go right back to hardware manufacturers; probably not the really low-margin box-slinger ones, more the silicon guys).

When it comes to 'features' that customers can see, manufacturers are crazy responsive (sometimes to the point of just making them up, or lying about them; but so it goes...) You care about the sticker price? We shave every penny that doesn't remove a bullet point from the spec sheet. You want megapixels? Here, have so many damn megapixels that the less-quantifiable shitty plastic optics can't even steer photons onto some of them consistently! More megahertz is faster? How about a NetBurst architecture?

Whenever it is difficult to convey the benefits of a given feature, it tends to get chopped in favor of price. Unfortunately, then economies of scale kick in and help mop up the remaining few people who do care. (In specialty markets, like APs designed for campus WiFi deployment, you are dealing with a situation where buyers really do need all the power they can get, and are often even capable of understanding what to ask for; but that's also the land of $400 access points, so it doesn't trickle down to mass market laptops much...)

Since you can get the 802.11n! label by implementing a single stream at 2.4GHz, and most people don't even know that something else is available (the existence of 5GHz is starting to break through, if only because 2.4 is blitzed into total uselessness in denser areas), the manufacturers don't have much incentive to be 'that one that mysteriously costs $10 more than the other ones, and they are all 802.11n'.

Re:Why not properly implement 802.11n first? (0)

Anonymous Coward | about a year ago | (#45080253)

which is 65 Mbps

then i guess it's a good thing that 99% of users don't "need" even half that much bandwidth via wifi. even just 10 mbit is enough for nearly everyone. wanna go faster? run some friggin cat6.

Re:Why not properly implement 802.11n first? (0)

Anonymous Coward | about a year ago | (#45081667)

10 Mb may be enough for you, but my net connection is 20Mb down, and that is the cheapest tier my ISP offers; if I wanted to pay for it I could have 100Mb down. Cat6 is useless for my tablet and phone, and rather limiting for my netbook too. And unless your house is built to make it easy, it is difficult to install without having ugly cables running along the walls.

Re:Why not properly implement 802.11n first? (0)

Anonymous Coward | about a year ago | (#45080345)

Hardware manufacturers have no incentive to put better implementations of 802.11n in their devices because most customers aren't savvy enough to tell the difference and demand better from device manufacturers.

Cell phones have 8-megapixel cameras with full 1080p HD. This is a direct result of the cell phone becoming the dominant camera in the world, and had zero to do with users not being "savvy" enough.

It's not hard to find justification to make wireless connections better and faster. 100 million customers are screaming for it every day, and are blindly willing to pay another $5 per month or another $50 more for the new tech. Once the consumer bitching hits the noise floor, hardware will have no choice but to advance.

On top of that, you're trying to claim there aren't enough savvy customers in the world? If the electronics customer base is increasing in any particular metric, it is certainly that one. By the time a consumer gets to a store to put their hands on a product, they already know how many watts it consumes at startup, what version of OS is on it, and how they can root it, which any 10-year-old who knows how to search YouTube can find out. That's about as "savvy" as you need to get these days. Hacking for Dummies on YouTube has changed that forever.

And if you geeks want a bad-ass wifi phone bad enough, there's always Kickstarter. Good luck on getting your battery to last more than an hour. Oh, you mean there's a valid reason we don't have 100Mb streaming capability to cell phones? Oh, users are constantly bitching about battery performance? and now they want it..thinner, but with a huge screen? Gee, I have no idea how dual spatial streams didn't make it to the top of the priority list...

Re:Why not properly implement 802.11n first? (0)

Anonymous Coward | about a year ago | (#45080551)

Hate to break it to you, but most purchasers of smartphones are barely computer-literate joe six-packs, not leet uber-geeks. Most people buy based on buzzwords and marketing. Contract is up and they want 4G LTE because everyone else's new phone has it. They buy primarily on the device's appearance, size, and the bullet points on the little card the display model is tethered to. The salesman tells them 8 megapixels are better than fewer, and they believe it wholeheartedly; nobody mentions that the lenses are so shitty and wind up so fingerprint-smudged that it doesn't make any difference. The stupidity of most smartphone buyers is why we have phones with non-removable batteries that will be toast before the contract is up, and phones without upgradable microSD storage, because they want us to spend an extra 100 dollars for an extra 8 or 16 GB.

Re:Why not properly implement 802.11n first? (1)

AmiMoJo (196126) | about a year ago | (#45080439)

Two spatial streams means two antennas, which is why ultra-compact phones don't do it. Larger devices are just being cheap: even if the silicon and the antennas are cheap, there are patent licensing fees to pay too.

Re:Why not properly implement 802.11n first? (1)

tlhIngan (30335) | about a year ago | (#45082027)

Hardware manufacturers, I'm pointing my finger at you. The most powerful features of 802.11n are largely unimplemented. Laptop/tablet/phone support for 3 spatial streams is about as rare as rocking horse shit. Support for even 5 GHz is spotty at best, and it's hard to find out if whatever piece of hardware you want to consider buying even supports it. Heck, even 2 spatial streams at 2.4 GHz is something you're lucky to get unless you spend more than $699 on a laptop. The lowest common denominator for 802.11n, and what "wireless n" wifi support really means for half the devices on the market, is single spatial stream 802.11n at 2.4 GHz, which is 65 Mbps max. I can buy a mid-range smartphone with 4G support and the wifi is still single spatial stream at 2.4 GHz. Hardware manufacturers have no incentive to put better implementations of 802.11n in their devices because most customers aren't savvy enough to tell the difference and demand better. 802.11n is an old specification. There's no excuse why 2 spatial streams can't be the minimum. The silicon to do this is cheap and has been refined for many years.

The hardware isn't as cheap as you make it out.

The chips support it, but they require external blocks (each stream requires a T/R switch, a PA, an LNA, and other ancillary RF components). These parts have to be put on impedance-controlled PCBs and run to separate antennas. It's actually a lot of design work in the end, and it's complex enough that most use WiFi modules that have it all (SD/USB/SPI/etc. interface on one end, u.FL connector on the other). Of course, the ones that fit in the available space only have single-stream support.

As for 5GHz support, the best way I've found is remarkably simple - look for 802.11a support. If the package says it supports "802.11b/g/n", then it does NOT support 5GHz. However, if it says "802.11a/b/g/n", then yes, it does support 5GHz. That's all you need to look for.

There are probably exceptions, like devices that support 802.11n on 5GHz but not 802.11a, and devices that support 802.11n on 2.4 only while supporting 802.11a, but they're rare and honestly, if you're going to the trouble of supporting 5GHz, it doesn't hurt to add the support necessary for the missing a or n.

So, if you see "802.11bgn" - no 5GHz. "802.11abgn" yes it supports 5GHz.

Every device I've seen lists it like that, so it's quick and easy.
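That rule of thumb is trivial to encode; a sketch (the label strings are illustrative examples, and this is only the packaging shorthand, not an authoritative capability check):

```python
def supports_5ghz(label):
    """Heuristic from the comment above: an 'a' anywhere in the 802.11
    suffix implies 5 GHz support ('a' and 'ac' are both 5 GHz bands);
    'b/g/n' alone implies 2.4 GHz only."""
    suffix = label.lower().replace("802.11", "").replace("/", "")
    return "a" in suffix

print(supports_5ghz("802.11b/g/n"))    # False -> 2.4 GHz only
print(supports_5ghz("802.11a/b/g/n"))  # True  -> 5 GHz capable
```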

Don't judge too much from early implementations (3, Interesting)

mysidia (191772) | about a year ago | (#45080249)

Have they implemented the full 256-QAM rate-5/6 modulation yet, with full 80+80 MHz bonding (160 MHz of channel bandwidth), using 8 transmit antennas and 8 receive antennas on both AP and wireless clients?

I expect early APs and early chipsets will not yet fully implement all the advantageous features 802.11ac has to offer.

They'll have made compromises to save money.
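For reference, the ceiling that full configuration implies can be worked out from the standard 802.11ac PHY parameters (468 data subcarriers at 160 MHz, 3.6 µs OFDM symbols with the short guard interval):

```python
# Maximum 802.11ac PHY rate: 256-QAM (8 bits/subcarrier), rate-5/6 coding,
# 468 data subcarriers in a 160 MHz channel, 3.6 us symbol (short GI).
data_subcarriers = 468
bits_per_symbol = 8 * 5 / 6     # 256-QAM with rate-5/6 coding
symbol_time = 3.6e-6            # seconds, including short guard interval
streams = 8

rate_per_stream = data_subcarriers * bits_per_symbol / symbol_time
total = rate_per_stream * streams

print(round(rate_per_stream / 1e6, 1))  # ~866.7 Mbps per spatial stream
print(round(total / 1e6))               # ~6933 Mbps: the headline "7 Gbps"
```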

Re:Don't judge too much from early implementations (3, Interesting)

AmiMoJo (196126) | about a year ago | (#45081279)

Realistically few devices will ever implement 16 antennas. Aside from the impracticality of fitting 16 antennas into a portable device, the power consumption of all those LNAs and PAs is going to be considerable, as is the DSP power needed to decode and correlate it all.

Re:Don't judge too much from early implementations (1)

mysidia (191772) | about a year ago | (#45087029)

Realistically few devices will ever implement 16 antennas.

It's 8 antennas. 8 on the AP, and 8 on the client.

I do see APs implementing all 8 antennas, so they can achieve high throughput to other APs and serve a larger number of clients.

There are still plenty of benefits to a 4 antenna device connecting to an 8 antenna AP.

This is awkward (0)

fa2k (881632) | about a year ago | (#45080455)

Seems I can't use speed as an argument for wired ethernet any more, as I'm not consistently getting over 60 MB/s with wired anyway for file transfers. Technology is finally catching up; or alternatively, wired technology and OS-level file transfer efficiency have stagnated for long enough.

Some possible caveats, why wired may still be good: 1) Does this include client-to-client transfers, or are those half the speed? 2) Is there any directionality, such that multiple clients can use more than the corresponding fraction of the bandwidth? 3) What is the range?

Re:This is awkward (0)

Anonymous Coward | about a year ago | (#45081039)

You should be getting over 100MB/s. Check your cables and make sure you don't have a dodgy switch or one of the flaky Realtek chips.

Re:This is awkward and yet... (0)

Anonymous Coward | about a year ago | (#45081123)

and yet ironically in my flat in Oxford earlier this year I couldn't get 802.11abcgn to work through the bloody thick walls!

Cue the 5m ethernet cable my brother lent me, which I strung across the flat (from file server to desktop, you see).

Re:This is awkward (2)

EmagGeek (574360) | about a year ago | (#45081553)

You're probably being limited by the R/W speed of your hard drive and O/S.

I consistently get 100MB/s over my network between two machines that are capable of reading and writing at least that fast to their storage systems, and this is with cheap Realtek gigabit equipment.
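That 100 MB/s figure is close to what gigabit Ethernet can actually deliver once framing overhead is subtracted; a back-of-the-envelope (assuming a standard 1500-byte MTU and plain IPv4/TCP headers):

```python
# Gigabit Ethernet payload throughput, back of the envelope.
line_rate = 125e6  # 1 Gbps = 125 MB/s of raw bits on the wire

mtu = 1500                       # IP packet size
tcp_ip_headers = 20 + 20         # IPv4 + TCP headers, no options
eth_overhead = 14 + 4 + 8 + 12   # Ethernet header + FCS + preamble + inter-frame gap

payload = mtu - tcp_ip_headers           # 1460 bytes of file data per packet
frame_on_wire = mtu + eth_overhead       # 1538 bytes occupied on the wire
throughput = line_rate * payload / frame_on_wire

print(round(throughput / 1e6, 1))  # ~118.7 MB/s of actual file data, max
# So ~100 MB/s measured is disk- or stack-limited, not far off the wire maximum.
```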

Re:This is awkward (1)

Voyager529 (1363959) | about a year ago | (#45082947)

1.) Wired ethernet will give you more than that; as another responder said you're likely to be limited by the write speed of your storage devices. If you really want to test it, create a RAM disk on both machines, and transfer a file from one RAM disk to another - you'll see yourself saturate the line pretty quickly.

2.) Wireless is inherently half-duplex, because you can't transmit and receive at the same time on the same frequency. Dual-band technology was supposed to help out with that, but it only works if everything on the LAN supports it. Else, one band will be more crowded than the other, and messes will still be made. Additionally, the more devices you've got running on your wireless network, the greater a collision domain you've got.

3.) Pursuant to 1 and 2, large numbers of devices are where wired really shines. No one gets a good signal with a hundred other access points and endpoints on a LAN, but if you've got everyone wired to a half-decent switch, everyone can communicate much more efficiently. Additionally, you may be seeing poor speeds if your gigabit switch is low quality and has too little backplane bandwidth.

Re:This is awkward (0)

Anonymous Coward | about a year ago | (#45083861)

What's your ping time and packet loss on wired compared to wireless?

My speed (1)

Dwedit (232252) | about a year ago | (#45082905)

I'm getting 7-9 MB/sec with my 802.11ac adapter, whether on the 2.4GHz band or the 5GHz band. So at least the theoretical speed of wireless G is finally real.
It is much faster than the 2.5MB/sec I was getting on the so-called "a/b/g/n" adapter.

For comparison, on actual gigabit ethernet, I get about 88-100MB/sec.

Worst case Latency? (1)

xtronics (259660) | about a year ago | (#45083693)

No number for worst case latency - Something needed so VoIP actually works.

I suppose it is not very good or they would have mentioned it.
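For what it's worth, measuring worst-case latency yourself is easy enough; a minimal sketch using a UDP echo over localhost (to test a real Wi-Fi link, run the echo end on a machine across that link instead of 127.0.0.1):

```python
import socket, threading, time

def echo_server(sock):
    """Echo datagrams back to the sender until told to stop."""
    while True:
        data, addr = sock.recvfrom(64)
        if data == b"stop":
            break
        sock.sendto(data, addr)

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))                 # let the OS pick a free port
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(1.0)
rtts = []
for _ in range(100):
    t0 = time.perf_counter()
    client.sendto(b"ping", ("127.0.0.1", port))
    client.recvfrom(64)
    rtts.append((time.perf_counter() - t0) * 1000)  # round-trip time in ms
client.sendto(b"stop", ("127.0.0.1", port))

# VoIP degrades noticeably past ~150 ms one-way; the worst case matters more
# than the average, which is exactly why jitter buffers exist.
print(f"avg {sum(rtts)/len(rtts):.2f} ms, worst {max(rtts):.2f} ms")
```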
