Slashdot: News for Nerds


Intel To Include Draft 802.11n In Centrino

kdawson posted more than 7 years ago | from the talk-to-me dept.


filenavigator writes "Intel announced at the Globalcom 2006 Expo that they will be including Draft 802.11n hardware in their Centrino chips. It will be interesting, since they said they will start doing this sometime in the middle of 2007, while the 802.11n standard is not due to be finalized until 2008. Additionally, Draft 802.11n has been dogged by interoperability problems." From the article: "Although the news caused barely a ripple of reaction in the audience of software and hardware engineers, there are industry analysts who have already warned large buyers of wireless technology to resist the temptation to deploy high-speed IEEE 802.11n devices until the standard is ratified."


67 comments

Eh (2, Insightful)

wiz31337 (154231) | more than 7 years ago | (#17120070)

The only major issues I've seen with 802.11n is the decrease in range and the obvious speed differences. If it is backward compatible with 802.11a/b/g then this should be a big issue.

Re:Eh (1)

wiz31337 (154231) | more than 7 years ago | (#17120132)

So much for proof-reading... This shouldn't be a big issue.

Re:Eh (2, Informative)

Sancho (17056) | more than 7 years ago | (#17120686)

The chipsets appear to be backwards compatible with 802.11g. Apple's been shipping draft-n equipment for awhile now, though only marketing it as 802.11g. Seems to work fine on my network.

Re:Eh (1)

foamrotreturns (977576) | more than 7 years ago | (#17133186)

Hold on... A decrease in range? I thought 802.11n was supposed to boost range.
According to this product description [superwarehouse.com] :
"By overlaying the signals of multiple radios, Wireless-N's "Multiple In, Multiple Out" (MIMO) technology multiplies the effective data rate. Unlike ordinary wireless networking technologies that are confused by signal reflections, MIMO actually uses these reflections to increase the range and reduce "dead spots" in the wireless coverage area. The robust signal travels farther, maintaining wireless connections up to 4 times farther than standard Wireless-G."

I'll wait for... (0)

Anonymous Coward | more than 7 years ago | (#17120136)

802.11p

Why so long to finalize the standard? (1)

b0s0z0ku (752509) | more than 7 years ago | (#17120144)

I recall seeing Belkin "pre-N" routers for sale in late 2004/early 2005. (Not that I'm convinced that the average home user needs more than 54Mbps at that time, especially because most broadband connections are still in the 1.5 to 3.0 range. Actually, I'm doing fine with 802.11b still).

-b.

Re:Why so long to finalize the standard? (1)

DCstewieG (824956) | more than 7 years ago | (#17120280)

For normal browsing, yes, 11b is all you need. But its actual throughput in my experience is around 1 or 2 Mbps, which can limit internet connections (I routinely get 4-6 Mbps over cable with BitTorrent) but, more importantly, limits file transfers over the network, like streaming video. And if you use a lot of network storage, even 11g is pretty painful.

Re:Why so long to finalize the standard? (5, Insightful)

GotenXiao (863190) | more than 7 years ago | (#17120860)

Why does everyone always assume that wireless networks are only ever used for internet access? Am I forbidden from running VNC to my desktop from my laptop? Can I not transfer files to my wifi-enabled Archos? Streaming media from my desktop to a TV downstairs?

Re:Why so long to finalize the standard? (1)

fkamogee (619579) | more than 7 years ago | (#17121588)

exactly! mod parent up.

Re:Why so long to finalize the standard? (1)

willy_me (212994) | more than 7 years ago | (#17125286)

Why does everyone always assume that wireless networks are only ever used for internet access?

Because that is exactly what the majority of wireless networks are used for. Not that you don't have a perfectly valid point, but you don't sound like a typical user.

Once more wireless devices become popular (like 802.11 cell phones, streaming media players, printers, etc..), people will start to require faster wireless networks. Right now they aren't required for most users, but here is the catch, they are required for the development/deployment of new wireless devices. So the statements "wireless networks need to be faster" and "wireless networks are only used for internet access" both apply to the majority of users (assuming they're going to want that cool new wireless device in the future.)

Willy

Re:Why so long to finalize the standard? (0)

Anonymous Coward | more than 7 years ago | (#17140344)

Exactly. Are peer to peer networks dead? Why have computers sharing an Internet connection, but not each other?

Am I the only one who writes CDs and DVDs across a LAN (from network drives)? I'd very much like to write a DVD faster than 2x, or a CD faster than 8x - that's about my limit with my old hub. I'd upgrade to a switch and/or gigabit ethernet LONG before I'd consider a wireless router. I move DVD-sized content around my LAN frequently; 54 Mbps (like you ever reach that speed anyway!) just wouldn't do squat for me...

I also run apps from my main pc from the other pcs that I have, mostly using ssh and X forwarding. That's already quite fast overall, but it does get taxed with heavy web page scrolling for example. I need that FASTER, not slower.

Call me when wireless speeds exceed 100 ethernet consistently in real world situations...

Re:Why so long to finalize the standard? (4, Informative)

ArchAbaddon (946568) | more than 7 years ago | (#17121250)

"Pre-N" was just a fancy marketing ploy by Belkin; their "Pre-N" products were made well before even Draft 1 was released. They are proprietary, will probably not be upgradeable to the standard once the 802.11n draft is ratified, and will only interoperate with other wireless devices at 802.11g.

Funny misread (1, Funny)

Anonymous Coward | more than 7 years ago | (#17120350)

"news caused barely a ripple of reaction in the audience"

I misread that as "barely a nipple reaction".

Re:Funny misread (4, Funny)

b0s0z0ku (752509) | more than 7 years ago | (#17120372)

I misread that as "barely a nipple reaction".

aka "802.11n leaves us cold?"

-b.

Re:Funny misread (1)

eclectro (227083) | more than 7 years ago | (#17123526)

aka "802.11n leaves us cold?"

Spoken and moderated like true nerds that don't show their pasty bodies at the pool and really know a nipple. Because, a cold nipple, is a perky nipple [teenadviceonline.org]

Re:Funny misread (1)

b0s0z0ku (752509) | more than 7 years ago | (#17124212)

Spoken and moderated like true nerds that don't show their pasty bodies at the pool and really know a nipple. Because, a cold nipple, is a perky nipple

"Leaves us cold." Hence, no change from the previous cold state and, therefore, no change in nipple erection. If 802.11n didn't leave us cold, our nipples would rapidly be flattening. As far as the nerd thing: We Have Nipples Too y'realise, so we all know how they work...

-b.

Re:Funny misread (1)

Cygfrydd (957180) | more than 7 years ago | (#17120790)

Extensive testing by the FCC and UL has virtually eliminated the possibility of 802.11n-induced biological effects in mammary tissue. The tingling must be something else.

Re:Funny misread (0)

Anonymous Coward | more than 7 years ago | (#17120898)

Comments like this are why I browse with a -1 modifier for "Funny".

Re:Funny misread (1)

stunt_penguin (906223) | more than 7 years ago | (#17121484)

And comments like yours are why I browse with a -1 modifier for Anonymous Cowards

Sign Me Up! (1)

Lord_Slepnir (585350) | more than 7 years ago | (#17120492)

I want to pay extra money for something that probably won't work very well (draft spec) and probably won't work at all with other 802.11n devices once they adhere to the real spec.

Now I'm off to buy some more SCO stock!

Re:Sign Me Up! (1)

Threni (635302) | more than 7 years ago | (#17120542)

I want something that works now, not to wait until some standard is agreed which will make no difference to the kit I've already got. I don't care if my laptop won't work with some stuff in 4 years time, because, like all hardware, it will be laughable in 4 years time - that is, if it's working at all.

Re:Sign Me Up! (1)

cloakable (885764) | more than 7 years ago | (#17129382)

I have a ThinkPad 240 here that would disagree with that.

300MHz, 128MB RAM, and still capable of playing video, music, etc. What more do you need from an ultralight laptop? And if I need more power, I can always use XDMCP to log in to my desktop.

Re:Sign Me Up! (1)

Threni (635302) | more than 7 years ago | (#17137818)

> 300MHz, 128MB RAM, and still capable of playing video, music, etc. What more do you need from an ultralight laptop?

Enough power to develop .net/netbeans stuff. Those IDEs are resource hogs!

looks like i'll have to buy the white album again (1)

teh_chrizzle (963897) | more than 7 years ago | (#17120704)

i just got my router, bridge, and laptop moved over to 54g; it figures that things would change.

Re:looks like i'll have to buy the white album aga (2, Insightful)

tomstdenis (446163) | more than 7 years ago | (#17122512)

Why? Is your 54G stuff not working?

Tom

Re:Sign Me Up! (1)

nonsequitor (893813) | more than 7 years ago | (#17121228)

It's my understanding that the hardware is unlikely to change between Draft-802.11n and the final 802.11n spec. Once the spec is finalized, you'll need to update your firmware. I believe that's what Apple and Intel are counting on. Apple is in bed with Intel at the moment, so I highly doubt the Centrino chipset will be incompatible with the one Apple put in the C2D MacBook Pros.

Time's up - Intel is now the standard (4, Insightful)

hirschma (187820) | more than 7 years ago | (#17120510)

Pretty obvious how this plays out:

* Intel will become, pretty much overnight, what all of these routers have to interoperate with,
* Everyone else tweaks their chipsets to work with Intel,
* Intel's interpretation of the draft standard becomes the standard.

Yeah, I'm quite sure that the IEEE will do something to rock that boat.

Re:Time's up - Intel is now the standard (3, Interesting)

MojoStan (776183) | more than 7 years ago | (#17123154)

* Intel will become, pretty much overnight, what all of these routers have to interoperate with,
* Everyone else tweaks their chipsets to work with Intel,
* Intel's interpretation of the draft standard becomes the standard.
As I said in another comment [slashdot.org] (before reading your "Score:5" comment), "the standard" (draft 2.0, due March 2007) will be set before Intel's chipset (due April 2007) is released. Draft 2.0 will be tested and certified by the Wi-Fi Alliance [arstechnica.com] , so Intel will most likely be tweaking their chipset to work with Draft 2.0. In fact, I bet all of the other wireless equipment makers will release their draft 2.0 gear before Intel.

It's both dangerous and misleading to embed N now (3, Insightful)

postbigbang (761081) | more than 7 years ago | (#17120524)

1) it's still a draft, and anything can change between now and then (ask Synoptics)
2) while backwards compatible with G, N requires special antennas (two of them, in diff-mode, to increase bit-rate); Centrino silicon will be new
3) even though every fab house is trying to get marketshare in N, there's a lot unproven about its future, and about which technologies might eclipse it
4) it thwarts the draft process of the IEEE; but I guess standards will go to those that buy them.

Many tests have demonstrated incompatibility issues and the mistakes that were made. Reserving notebook real estate for a chipset is just a rook move, and nothing more.

Move along, therefore; nothing but PR prattle to see here.

Re:It's both dangerous and misleading to embed N n (1, Funny)

Anonymous Coward | more than 7 years ago | (#17124594)

Gee, you seem to have a firm handle on this. You really should mail those ignoramuses at Intel who have never thought about these kinds of issues, who don't know much about silicon, and who probably aren't good at keeping up with network technology and IEEE standards.



I'm sure your valuable insights could save them hundreds of millions of dollars.

Well let's see... (1)

remmy (102919) | more than 7 years ago | (#17120536)

Obviously there'd be interoperability problems... the standard's not final yet.

802.11n IN the chip? (1)

zaqattack911 (532040) | more than 7 years ago | (#17120552)

I always thought that centrino was the general laptop platform, and the chips were pentium-M? Centrino was just a marketing tech term to imply a laptop with certain capabilities which include wifi.

And if Centrino does literally mean the cpu chip, how the hell do they put a wireless network card IN the chip? Is this just a news report typo, or am I missing something?

Re:802.11n IN the chip? (1)

nukem996 (624036) | more than 7 years ago | (#17120706)

Centrino means the laptop has an Intel Pentium-M, Intel Core Duo, or an Intel Core 2 Duo and an Intel wireless card such as the IPW2100(802.11b), IPW2200(802.11b/g), or the IPW2915(802.11a/b/g).

Re:802.11n IN the chip? (1)

jrockway (229604) | more than 7 years ago | (#17122422)

also ipw3945.

Re:802.11n IN the chip? (2, Informative)

javaxman (705658) | more than 7 years ago | (#17120902)

Your information is starting to get just a tiny bit stale, although you're generally correct. "Centrino" can now be Pentium M or Core Solo, and "Centrino Duo" can be Core Duo or Core 2 Duo. [intel.com]


Actually, they said "chips" not "chip", probably meaning the Centrino platform is made up of a number of ( specified ) chips, and now an 802.11n package is included in the mix. Right now you're still Centrino if you include one of three approved Intel wireless packages [intel.com] ... this probably just means they've announced a fourth option. The real questions are: will OEMs put it in their laptops, will anyone tell buyers that the standard is not approved yet, and how well will it sell... judging by sales of existing "pre-N" stuff, I'm going to guess it's a real standards nightmare already.

Re:802.11n IN the chip? (1)

gad_zuki! (70830) | more than 7 years ago | (#17122282)

Centrino is a marketing term which more or less means an Intel CPU and an Intel wireless chipset. It's an Intel marketing strategy which more or less tells OEMs "We'll advertise this thing, you buy it, there will be demand." And frankly it works. Centrino products tend to be nice little packages.

Regardless, Intel probably sees that it costs little to nothing to build pre-n tech into their newer chipsets. Businesses and home users both want the supposed greater range and bandwidth. Apple has already done this: in their new Macs the wifi chipset is a Broadcom pre-n chip. The worst case scenario is this stuff goes unused (or underused) until the final draft comes up.

Can someone explain this? (3, Informative)

troll -1 (956834) | more than 7 years ago | (#17120662)

The technology will someday scale to 600Mbps, according to Bill McFarland, a member of the IEEE committee, with a range 50 percent greater than available with Wi-Fi now.

In physics there's a measurement called "skin depth", which is the distance a wave travels before its power level drops by 1/e, or about 1/3. The formula is something like (wavelength/2*pi). The FCC regulates the power of 802.11n to something like 1mW per channel. So unless these new chips will have more power than is currently allowed, how can they have a greater range?

Re:Can someone explain this? (4, Informative)

b0s0z0ku (752509) | more than 7 years ago | (#17120858)

So unless these new chips will have more power than is currently allowed, how can they have a greater range?

Better error correction or use of a transmission method that's more robust when faced with a low signal/noise ratio, possibly. With a directional mic and possibly some filtering software, you may be able to hear shouting five miles away, for example.

-b.
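The point about robustness at low signal-to-noise ratio can be made concrete with the Shannon capacity formula, which bounds the achievable error-free data rate for a given bandwidth and SNR. A minimal sketch; the 20 MHz channel width and the two SNR figures are illustrative, not taken from any 802.11 spec:

```python
import math

def shannon_capacity_mbps(bandwidth_hz, snr_db):
    """Shannon limit: the highest error-free data rate (in Mbps) that a
    channel of this bandwidth supports at this signal-to-noise ratio."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# A 20 MHz channel near the access point vs. at the edge of range:
print(shannon_capacity_mbps(20e6, 20))  # roughly 133 Mbps at 20 dB SNR
print(shannon_capacity_mbps(20e6, 3))   # roughly 32 Mbps at 3 dB SNR
```

A modulation and coding scheme that stays closer to this bound at low SNR effectively buys range without any extra transmit power, which is the grandparent's point.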

Re:Can someone explain this? (4, Insightful)

InakiZuloaga (1022349) | more than 7 years ago | (#17121320)

It depends on how well you are able to receive. There's a parameter named receiver sensitivity: the lowest power at which the receiver can still recover the correct data. If you have a receiver circuit with a sensitivity of, say, -90 dBm and that lets you reach 0.1 miles, another device with a sensitivity of -93 dBm will let you reach 0.15 miles without changing the transmit power. Sensitivity depends on how immune your receiver is to noise, and that depends on the radio standard used.
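Those numbers line up with the free-space inverse-square law: received power falls off as 1/d², so 6 dB of extra link budget doubles the range and 3 dB multiplies it by about √2. A quick sketch, reusing the parent's illustrative figures:

```python
def range_gain(extra_link_budget_db):
    """Range multiplier from extra link budget, assuming free-space
    (inverse-square) propagation: power ~ 1/d^2, so d ~ 10^(dB/20)."""
    return 10 ** (extra_link_budget_db / 20)

# 3 dB better sensitivity (-90 dBm -> -93 dBm) at the same transmit power:
print(0.1 * range_gain(3))  # ~0.14 miles, close to the parent's 0.15 estimate
```

Indoors the path-loss exponent is typically higher than 2 (walls, furniture), so the real-world range gain per dB is smaller than this free-space figure.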

Re:Can someone explain this? (1)

Jeff DeMaagd (2015) | more than 7 years ago | (#17121510)

I can't speak for range on its own, but it gets bandwidth in part because it uses the entire width of the band. Instead of three non-overlapping channels, you just get one channel.

It also gets three antennas, and I think there is some sort of RF interferometry or some such bag of tricks to take three signals and get a better signal than they could with just one.

Re:Can someone explain this? (0)

Anonymous Coward | more than 7 years ago | (#17122088)

right. i don't know the details of 802.11n, but it's pretty obvious that with several antennas on the AP you can do all sorts of directional tricks.. the same way a radio tower will use several antennas to beam radio to populated areas, instead of wasting power with isotropic radiation. the power still drops off as an inverse square, but your initial power in the direction of interest is greater. having several antennas on the AP helps with transmitting as well as with receiving (it's a little easier to see how correlating can achieve the effect in this case).

Re:Can someone explain this? (3, Informative)

Andy Dodd (701) | more than 7 years ago | (#17122820)

Ugh, that's the most horrible "let's throw some random terms into my post and make myself look smart" post I've seen in a while.

Skin depth has ABSOLUTELY NOTHING to do with this. Skin depth determines how far an RF signal will penetrate into a conductive or semi-conductive material (usually metal, often used to discuss RF penetration into water). Skin depth is a function of wavelength - The shorter the wavelength, the shallower the skin depth. Remember, this is a term of RF penetration *into a conductive or semi-conductive material* and is usually measured in fractions of a millimeter for most metals. It can be a matter of meters for water though, which is why submarines usually are contacted via VLF or ELF (very low frequency/extremely low frequency) - skin depth of VLF/ELF into water is pretty large due to the long wavelength. Still, in general, as far as Wi-Fi goes, skin depth is irrelevant and meaningless.

Freespace RF propagation follows the inverse square law, just like any other electromagnetic radiation.

That said, indoor wireless is typically NOT free space. The nature of indoor wireless means that a signal can take multiple paths between transmitter and receiver. Unfortunately, these paths can sometimes result in the signals at the receiver interfering destructively with each other, causing a significant reduction in signal strength. The best example you might be familiar with is FM radio - have you ever been sitting at an intersection in your car and the reception of the station you were listening to completely dropped out, only to come back to full strength when you moved your car a few feet? That's classic multipath fading.

One solution to multipath is to use two or more antennas to provide what is called diversity. Usually, if one antenna is in a "dead spot", an antenna a half wavelength or so away (or closer but with a different polarization) won't be. This is why almost all normal 802.11a/b/g routers have dual antennas and most PC cards and built-in WLAN cards have dual antennas. The card (usually) selects the one antenna that gives the best reception and uses it. (This is called selection combining. There are other diversity techniques that are better than selection combining but a bit more complex.) Some newer cards may use other diversity reception methods to improve 802.11a/b/g performance.
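The dead-spot effect, and why a second antenna a half wavelength away rescues you, can be sketched numerically by adding the direct and reflected signals as phasors. The 2.4 GHz wavelength, the 0.8 reflection gain, and the path differences below are illustrative values, not measurements:

```python
import cmath
import math

def two_path_amplitude(path_difference_m, wavelength_m=0.125, reflection_gain=0.8):
    """Combine a direct path (unit amplitude) with a weaker reflected copy
    arriving path_difference_m later, as complex phasors."""
    phase = 2 * math.pi * path_difference_m / wavelength_m
    return abs(1 + reflection_gain * cmath.exp(-1j * phase))

# Half-wavelength path difference: the two paths cancel (a classic dead spot).
fade = two_path_amplitude(0.0625)   # ~0.2
# Antenna moved so the difference is a full wavelength: the paths reinforce.
boost = two_path_amplitude(0.125)   # ~1.8
# Selection combining just listens with whichever antenna hears more.
print(max(fade, boost))             # ~1.8
```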

Now, 802.11n takes diversity to whole new levels. It uses what is commonly called "multiple input multiple output" or MIMO. Fundamentally, MIMO takes multipath and turns it from a disadvantage to an advantage by transmitting different data on each path. Thus, a MIMO system can achieve higher data rates by effectively using multipath to create multiple independent channels.

I have a paper saved somewhere that describes how MIMO works in detail, but the basics are that if you form a matrix with the complex path gains (i.e. both amplitude and phase) between individual transmit and receive antennas (e.g. t1, t2, t3, r1, r2, r3 for a 3x3 MIMO system) of the form
[[gT1R1 gT1R2 gT1R3]
[gT2R1 gT2R2 gT2R3]
[gT3R1 gT3R2 gT3R3]]

(BTW, Malda, LaTeX or MathML please? Octave/Matlab format isn't quite the hottest for representing a matrix in human readable form...)

you can perform operations (I believe a singular value decomposition but my memory could be wrong and it may be another decomposition) on that matrix to form two transformation matrices and a diagonal matrix. The diagonal matrix contains the path gains of three independent pseudochannels (which I believe are either the square root of the matrix eigenvalues, the eigenvalues themselves, or the square of their eigenvalues), and the transformation matrices can be used to take transmissions intended for the three pseudochannels and convert them to actual transmissions/reception on each antenna.

I'm sorry I did such a crap job explaining this, I really need to find that paper as it does a much better job. :)
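The decomposition described above is the singular value decomposition, and its effect is easy to verify numerically. A sketch with numpy, using a random complex matrix as a stand-in for a measured 3x3 channel:

```python
import numpy as np

rng = np.random.default_rng(0)
# Entry (i, j) is the complex path gain (amplitude and phase) from
# transmit antenna j to receive antenna i in a rich-scattering room.
H = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# SVD: H = U @ diag(s) @ Vh, where s holds the gains of three pseudochannels.
U, s, Vh = np.linalg.svd(H)

x = np.array([1 + 0j, -1, 1])   # one symbol per pseudochannel
tx = Vh.conj().T @ x            # precode across the three transmit antennas
rx = U.conj().T @ (H @ tx)      # propagate through H, combine at the receiver
print(np.allclose(rx, s * x))   # True: three independent parallel channels
```

The pseudochannel gains in s are the singular values of H, i.e. the square roots of the eigenvalues of H^H·H, which matches the parent's recollection.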

Beware! Many companies have begun calling anything that uses multiple antennas "MIMO" even when it isn't true MIMO. An example is the Netgear WPN824 which uses an adaptive phased array antenna system from Ruckus Wireless (not to be confused with the Ruckus music service) and calls it "MIMO". Phased array antennas (even adaptive ones) have been around for years - they just aren't truly MIMO. That's not to say that they aren't beneficial - The classical approach to nullifying some multipath issues is to use a directional antenna and point it along the strongest path, and that's what the WPN824 effectively does. The nice thing about this approach is that if added properly, a phased array antenna system does not do anything to make a WLAN transceiver fail 802.11b/g compliance. If you disable the 108 Mbps "turbo" mode of the WPN824 (which is a function of the Atheros chipset that Ruckus' antenna is connected to and not of Ruckus' setup itself), the WPN824 is a 100% 802.11b/g compliant router that provides range benefits from that phased array antenna. (Ruckus happens to sell routers and clients with their antenna system but without any nonstandard features, BTW.) I have a WPN824 myself and love it.

Re:Can someone explain this? (0)

Anonymous Coward | more than 7 years ago | (#17123824)

Skin depth has ABSOLUTELY NOTHING to do with this. Skin depth determines how far an RF signal will penetrate into a conductive or semi-conductive material (usually metal, often used to discuss RF penetration into water).

I think you're wrong. Skin depth relates frequency to power drop off even when conductivity is zero. There doesn't have to be a material medium for this to hold true. See Griffiths, Intro to Electrodynamics 2nd ed.

Does draft 802.11n really work? (1)

heroine (1220) | more than 7 years ago | (#17120832)

Kiwipedia made it sound like 802.11n didn't really work. There was too much interference from all the other 2.4 GHz devices. 802.11n over copper was replacing wireless 802.11n in most applications. It was the same modulation, but over wire, to increase bandwidth. When corporations talk about supporting draft 802.11n, they tend to refer to copper and not wireless.

Re:Does draft 802.11n really work? (3, Funny)

Gospodin (547743) | more than 7 years ago | (#17121202)

Kiwipedia: Your source for all things New Zealandish.

Re:Does draft 802.11n really work? (0)

Anonymous Coward | more than 7 years ago | (#17130792)

Kiwipedia: Your source for all things New Zealandish.

So... it's just a single page about Peter Jackson and the LotR?

Re:Does draft 802.11n really work? (1)

dgatwood (11270) | more than 7 years ago | (#17122406)

Why would someone do 802.11n over copper? Wired ethernet is at 10 Gb/sec speeds, while 802.11n is only a paltry 100-200 Mbps.

Re:Does draft 802.11n really work? (1)

Andy Dodd (701) | more than 7 years ago | (#17122854)

And one of the key fundamental differences between a/b/g and n is the use of multiple antennas and multipath effects to increase data rate by creating multiple independent data channels.

Copper is fundamentally single-path unless you use multiple copper connections, but at that point channel bonding is a hell of a lot easier than MIMO. :)


Apple is shipping pre-n already (2, Insightful)

blackmonday (607916) | more than 7 years ago | (#17120986)

They are not advertising it, but Apple's new laptops have pre-n built in already. There is speculation that pre-n will fuel the iTV and its HD capable HDMI port. Don't you love rumors?

Re:Apple is shipping pre-n already (1)

fkamogee (619579) | more than 7 years ago | (#17121636)

So is Lenovo. (on the T60)

I'll give you interesting... (1)

djupedal (584558) | more than 7 years ago | (#17121064)

Apple has apparently slipped 802.11n into some of the currently shipping machines via buffed up Airport cards - gotta love stealth upgrades, eh? :)

Re:I'll give you interesting... (0)

Anonymous Coward | more than 7 years ago | (#17121838)

Half right -- the hardware they are shipping is supposedly 802.11n capable, but they have not yet shipped any 802.11n software or firmware. Seems like these "incompatibility" problems are meaningless if they can be fixed with a flash upgrade to new firmware once the standard is finalized! In the meantime, it is sufficient that they interoperate with hardware from the same vendor.

Why is the speed so big an issue? (1)

bendodge (998616) | more than 7 years ago | (#17121138)

Why is everyone so hyped up about faster wireless? Next to nobody is using it to move anything more than broadband, and that's about 3 Mbps at most.

Why not spend your money on better antennae? I think most people who complain about speed will continue to be disappointed until they get a high-gain antenna, which is what really matters in wireless.

(Meantime, I'll enjoy my flawless 100 Mbps Ethernet, which I buried after dumping wireless.)

Re:Why is the speed so big an issue? (1)

AHuxley (892839) | more than 7 years ago | (#17121402)

Apple etc. want media streaming.
Huge files, and fast.
Not all end users can wire up homes with Cat 6, e.g. renters.
Instant on, out of the box, will be an easy sell to many users.

Re:Why is the speed so big an issue? (1)

deep_creek (1001191) | more than 7 years ago | (#17123490)

I upgraded a few months ago to a Linksys WRT300N wireless router (802.11n) from a 2Wire (802.11b). I have to say the transfer rates are amazing. My wife and I are able to exchange photos, music, and videos practically as fast as reading them directly off the hard drive. What used to take minutes with 802.11b now takes seconds with 802.11n.

no WiMAX? (0)

Anonymous Coward | more than 7 years ago | (#17121878)

I wonder what ever happened to WiMAX. Intel has been talking about integrating WiMAX into their chipsets for over two years now, and so far nothing has happened. At least 802.11n, just like WiMAX and 802.11b/g, is well suited for wireless mesh networking, which has become a huge trend.

How you do pre-standard (1)

Glasswire (302197) | more than 7 years ago | (#17121974)

There's a certain point in the pre-standard development cycle when you know the range of possible ways the standard can go such that you can implement ANY possible final standard with the same physical radio -but with updated firmware. The trick is not to do it so early that you've guessed wrong and have a radio that cannot be upgraded to the final standard.

One assumes that Intel will have made sure their N implementation is upgradeable to the final IEEE standard.

Taco: Please Start Using Intel's New Logo, Huh? (3, Informative)

Glasswire (302197) | more than 7 years ago | (#17122116)

...It's only been 12 months since they changed it to the new one [intel.com]

Re:Taco: Please Start Using Intel's New Logo, Huh? (1)

Rutulian (171771) | more than 7 years ago | (#17123926)

Ummm...are you kidding? Those icons haven't been updated in years. They're still using the old crappy Gnome 1.x foot logo when the new artwork for 2.x has been around for, let's see, 4 years?!!

Re:Taco: Please Start Using Intel's New Logo, Huh? (1)

Sneakernets (1026296) | more than 7 years ago | (#17124850)

I think the new logo sucks, to be honest with you. It's just another company following the "change how everything looks to be cool" school of thought. Don't believe me? let's take a look at the products/companies/food chains doing this in the last... say.. 8 years:
Intel
m&ms
Captain D's
Atlantic Records
The Microsoft Windows logo
Apple Computer's logo
Pepsi
Visa
Sprint
British Petroleum
ATI
Sears
Federal Express
Burger King

Now, I know that a lot of these aren't at the same time, but I can list 5 off the top of my head that did change logos in the same 6 month period.

Intel using NEXT draft (2.0), not current draft (2, Interesting)

MojoStan (776183) | more than 7 years ago | (#17123014)

First of all, this (802.11n in next Centrino) is very old news [xbitlabs.com] (Feb 2006).

More importantly, Intel will in all likelihood be using draft 2.0 of the 802.11n spec, which is much closer to the final spec than today's crappy "pre-N" stuff (draft 1.0). Draft 2.0 equipment will even be tested and certified by the Wi-Fi Alliance [arstechnica.com] for interoperability.

Draft 2.0 is due to be ratified in March 2007. Next-gen Centrino (Santa Rosa) is due in April 2007. In the unlikely event that draft 2.0 is not ratified, the Wi-Fi Alliance will put together de-facto standards, which will still be much better than today's current draft 1.0. Any respectable article would mention this very important information.

802.11a Adoption (1)

jkloosterman (1017270) | more than 7 years ago | (#17123086)

With the widespread adoption of 802.11 (pre-)n devices, as consumers are likely to bend eventually to the marketing of all the retailers, even more interference will be caused on the 2.4 GHz band (especially since there are only 3 non-overlapping channels for 802.11{b,g} devices). In the times ahead, 802.11a will become more popular as there is less interference and more channels, even though the range is shorter and it has "only" 54 Mbps bandwidth.

Apple's iTV & Wireless N (1)

DECS (891519) | more than 7 years ago | (#17123430)

From RoughlyDrafted:

I got some criticism for writing in How Apple's iTV Media Strategy Works [roughlydrafted.com] that I thought Apple's new iTV was going to incorporate 802.11n, the new and much faster industry standard for wireless networking. Some readers thought that n isn't going to be ready in the timeframe Apple announced for iTV's arrival, while others said 802.11g is plenty fast enough to stream video already.

N: Ready and Willing

Wireless n most certainly is going to be ready, however. Even if the IEEE doesn't get around to filing their papers on the standard, Apple has compelling reasons to deliver n for the iTV, as well as pre-n competition. Belkin, D-Link, Linksys, Netgear and others have shipped pre-n gear since 2004, so the technology isn't just some far off, futuristic and undeliverable crazy talk.

Remember too that Apple introduced Airport Extreme in January of 2003, prior to the official ratification of its underlying 802.11g, which didn't happen until six months later in July. Since final approval of 802.11n is due in July 2007, it won't be a stretch at all for Apple to deliver n in the first quarter of next year. The real problem for existing vendors is that the various pre-n non-standard implementations aren't compatible with each other, and that there hasn't yet been a killer app for n.

iTV: the Killer App for Wireless N [roughlydrafted.com]
How Apple's iTV Media Strategy Works [roughlydrafted.com]

Centrino chips? (1)

bigred85 (1030936) | more than 7 years ago | (#17126318)

Intel announced at the Globalcom 2006 Expo that they will be including Draft 802.11n hardware in their Centrino chips.

Whoa, hold on a minute here.

Correct me if I'm wrong, but I was under the impression that "Centrino" is something of a blanket term encompassing an Intel processor, chipset, and wireless card/chip.
