
Ask Slashdot: Why Does Wireless Gear Degrade Over Time?

timothy posted about 2 years ago | from the indifference-mostly dept.

Wireless Networking 615

acer123 writes "Lately I have replaced several home wireless routers because the signal strength has been found to be degraded. These devices, when new (2+ years ago) would cover an entire house. Over the years, the strength seems to decrease to a point where it might only cover one or two rooms. Of the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that after time, the signal strength decreases. I know that routers are cheap and easy to replace but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or have no signal. I am not an electrical engineer and I can't find the answer online so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"


The Hamsters get tired (3, Funny)

Anonymous Coward | about 2 years ago | (#41723023)

and worn out. Also I think they have pretty short life spans.

Re:The Hamsters get tired (3, Funny)

K. S. Kyosuke (729550) | about 2 years ago | (#41723157)

So, if I had a pair of WiFi devices, couldn't they just reproduce and train replacement hamsters? And do I need to add extra grain for that?

Re:The Hamsters get tired (3, Funny)

jimmetry (1801872) | about 2 years ago | (#41723497)

Uh, yeah, but how do you think reproduction works? The stork (Fedex) will have to drop off your new baby hamster WiFi device before it can be trained.

Signal isn't changing, the noise floor is (5, Insightful)

Anonymous Coward | about 2 years ago | (#41723027)

As all of your neighbors add wireless routers, the noise floor goes up, and the usable signal goes down, even though the signal strength is the same.

Re:Signal isn't changing, the noise floor is (4, Insightful)

mk1004 (2488060) | about 2 years ago | (#41723335)

That wouldn't explain why replacing the router fixes the problem, unless he just happens to be replacing the old router with one that has a stronger transmitter or better antenna. The pessimist in me says the odds of that can't be 100% every time.

Re:Signal isn't changing, the noise floor is (5, Interesting)

Migraineman (632203) | about 2 years ago | (#41723397)

More than likely, the older router was expecting a relatively clean RF environment, and was crippled when all the neighbors deployed APs nearby. The newer APs were designed to handle cluttered environments, and their more-advanced algorithms provide improved performance over the previous generations' products. As old equipment is replaced with new, you'll probably see the same degradation in performance until new countermeasures are developed (in the next gen equipment, of course.) Ref: arms race.

Re:Signal isn't changing, the noise floor is (1)

amorsen (7485) | about 2 years ago | (#41723405)

If he switches from a router without 802.11n to one with it, he is likely to get better reception, even for non-n devices, provided the new router is made by someone halfway competent.

Re:Signal isn't changing, the noise floor is (3, Informative)

ATMD (986401) | about 2 years ago | (#41723465)

If you read the OP carefully, all he says is that he replaced the routers - not that doing so actually fixed the problem.

Deteriorating SNR does seem the most likely explanation.

Re:Signal isn't changing, the noise floor is (3, Funny)

Gorobei (127755) | about 2 years ago | (#41723339)

Damn neighbors - I see 35 wireless networks from my mac: Jes's Awesome Network, Doris Family, Alex, Bellclaire Hotel, I Win, Epsteinland, buckduke, toujoursavectoi, sheilajaffe1, ming. And a bunch of hexcodes. Bit sad to not see HotWorkoutPants on the list today.

Re:Signal isn't changing, the noise floor is (3, Funny)

Gorobei (127755) | about 2 years ago | (#41723361)

Oh, Cliffs of Insanity just showed up - nice name.

Re:Signal isn't changing, the noise floor is (0)

Auroch (1403671) | about 2 years ago | (#41723491)

Simple fix : install custom firmware, turn up your broadcast power.

built in failure (4, Insightful)

Anonymous Coward | about 2 years ago | (#41723031)

built in failure. bow to your corporate masters and go consume.

Did the signal degrade, or the noise increase? (5, Insightful)

SClitheroe (132403) | about 2 years ago | (#41723033)

Over 3 years, I'd imagine a much greater density of Wi-Fi devices sharing the same spectrum has appeared. Perhaps the signal level is the same, but the noise floor has increased substantially, degrading performance.

Re:Did the signal degrade, or the noise increase? (3, Funny)

epSos-de (2741969) | about 2 years ago | (#41723089)

Yes you are correct. Switch the WiFi channel or the transmitter to some number that is not used already. Channel 1, 11 and 6 are the most commonly used ones. Just use channel 2 and you will be 50% better than on channel one already.

Re:Did the signal degrade, or the noise increase? (3, Interesting)

K. S. Kyosuke (729550) | about 2 years ago | (#41723169)

I thought the channels overlap to a significant degree. If there is interference on channel 1 to such a degree that you feel the need to switch it, are you sure that switching to channel 2 would actually help?

Re:Did the signal degrade, or the noise increase? (4, Informative)

Ironhandx (1762146) | about 2 years ago | (#41723331)

There is overlap to a significant degree; channel 2 won't help, but channel 4 or 5 might. There are generally about three channels' worth of overlap between adjacent Wi-Fi channels. If it's only a small amount of interference that's causing the signal to drop, channel 3 might even do it. However, if there's enough interference to cause problems, swapping from channel 1 to channel 2 won't help, because they share about 80% of the same band.
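The overlap the parent describes can be sketched numerically. This is a rough model only, assuming the classic 22 MHz-wide 802.11b channel mask with flat power across it; real OFDM spectra are shaped, so treat the percentages as approximations:

```python
# Rough sketch: fractional spectral overlap between two 2.4 GHz Wi-Fi
# channels, modeling each as a flat 22 MHz-wide block (the classic
# 802.11b DSSS mask). Adjacent channel centers are only 5 MHz apart.

def center_mhz(ch):
    """Center frequency of a 2.4 GHz Wi-Fi channel in MHz."""
    if ch == 14:
        return 2484  # channel 14 is a special case (Japan, DSSS only)
    return 2407 + 5 * ch

def overlap_fraction(ch_a, ch_b, width=22):
    """Fraction of a channel's bandwidth shared with another channel."""
    separation = abs(center_mhz(ch_a) - center_mhz(ch_b))
    return max(0.0, (width - separation) / width)

# Channels 1 and 2 share most of their bandwidth; 1 and 6 (25 MHz
# apart) share none, which is why 1/6/11 is the standard plan.
print(round(overlap_fraction(1, 2), 2))  # ~0.77
print(overlap_fraction(1, 6))            # 0.0
```

The ~77% figure under this flat-mask model is in the same ballpark as the "about 80%" quoted above.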

Re:Did the signal degrade, or the noise increase? (5, Insightful)

pepsikid (2226416) | about 2 years ago | (#41723173)

NO. If you use channel 2, you're straddling channels 1 and 6, so you actually have to compete with more interference. Unless you live somewhere with no other Wi-Fi neighbors, like out in a desert or a 3rd-world country, never use anything but channels 1, 6, 11 or 14!

Re:Did the signal degrade, or the noise increase? (4, Interesting)

NJRoadfan (1254248) | about 2 years ago | (#41723237)

Use of channel 14 isn't permitted in the USA. Routers sold here disable it by default, although you can get the option back by flashing 3rd-party firmware onto the router. I ran a router on channel 14 for a brief period to see if interference was causing connection issues. The problem I ran into is that some wireless devices, like ebook readers, wouldn't work on channel 14 since their radios were region-locked.

Re:Did the signal degrade, or the noise increase? (3, Insightful)

Anonymous Coward | about 2 years ago | (#41723341)

Don't get caught by the FCC; there are some pretty hefty fines for transmitting in that reserved part of the spectrum.

Re:Did the signal degrade, or the noise increase? (3, Interesting)

tehrhart (1713168) | about 2 years ago | (#41723279)

I recall that the IETF meeting in Paris this year had some wifi troubles and they ended up using overlapping channels intentionally. It would seem to me that straddling two of the non-overlapping channels would at least allow you to compete for resources in two areas - i.e. if channel 1 was flooded, you'd still have some bandwidth available in your overlap of 6, but I could be mistaken. Reference article about IETF Paris: []

Elliot noted that France lets Wi-Fi use channels 1-13 in the 2.4 GHz band. "As three channels are very limiting in a very 3D structure, like this hotel, I've chosen to go with 4 channels, using 1, 5, 9, and 13," he said. "This is a layout that is well respected by others, and one [that] we've considered using at the IETF on numerous occasions--and very similar to what we used in Hiroshima. You get a slight bit more of cross-channel interference, but the additional channel is worth it, especially in this hotel's environment."

Re:Did the signal degrade, or the noise increase? (1)

amorsen (7485) | about 2 years ago | (#41723445)

If you have a lot of wireless devices and access points running at the same time, use channels 1, 4, 7, and 11 (or 1, 5, 9, and 13 in Europe or Japan). Yes, you will get a little interference, but it should not be too bad. The extra channel makes placing access points a lot easier and means you can pack them more densely, making up for the interference. The 1/5/9/13 arrangement in particular can work really well.

But of course it is better to just move as much traffic as possible to 5 GHz.
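The two channel plans discussed above can be sanity-checked from the band layout (2.4 GHz channel centers sit at 2407 + 5·ch MHz). A minimal sketch, assuming 20 MHz-wide OFDM channels, so 20 MHz of center spacing is roughly the non-overlap threshold:

```python
# Sketch: minimum center-frequency spacing of the two channel plans.
# The 1/6/11 plan keeps 25 MHz between neighbors (fully clear of the
# 802.11b mask); the 1/5/9/13 plan (legal where channels 12-13 are
# allowed) drops to 20 MHz, trading a sliver of mask overlap for a
# fourth usable channel.

def min_spacing_mhz(plan):
    centers = sorted(2407 + 5 * ch for ch in plan)
    return min(b - a for a, b in zip(centers, centers[1:]))

print(min_spacing_mhz([1, 6, 11]))     # 25
print(min_spacing_mhz([1, 5, 9, 13]))  # 20
```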

Re:Did the signal degrade, or the noise increase? (0)

Anonymous Coward | about 2 years ago | (#41723175)

Actually partial overlap is much worse than full overlap, so stick to the 'standard channels' if you can find some niche with no overlap.

Re:Did the signal degrade, or the noise increase? (4, Insightful)

transporter_ii (986545) | about 2 years ago | (#41723199)

Yeah. Just change it to channel 2 and interfere with everyone using channels 1 and 6, BECAUSE THE ONLY NON-OVERLAPPING CHANNELS on 2.4 GHz ARE 1, 6, and 11.

I'm not an electrical/radio engineer, but I could have designed a better standard...blindfolded.

Re:Did the signal degrade, or the noise increase? (0)

Anonymous Coward | about 2 years ago | (#41723403)

Bandwidth is expensive; they only give us plebes the useless/crappy chunks. The standard was made to make use of such a chunk, but it might have been better to just say there are 3 or 4 channels.

Re:Did the signal degrade, or the noise increase? (4, Insightful)

DJRumpy (1345787) | about 2 years ago | (#41723265)

This used to work, but with Wi-Fi now so common, a scan of your local neighborhood will rarely find a channel with more than one channel separating you from your neighbors (auto channel switching isn't aggressive enough... why is that?). The only way I've found to keep ahead of it is to invest in new frequencies as they become available. I had the 5 GHz spectrum to myself for quite a few years; the first neighbor popped up on my scanner a few weeks ago.

The other factor is the quality of the equipment. I used to use Linksys, then Netgear, then tried Buffalo, and was disappointed with each, either through hardware issues or poor performance. The Linksys gear seemed to go downhill after Cisco bought them, though I always thought Cisco was an industry leader (I'm not in the telecom field, so feel free to chime in). My old 10 Mb switches are still working after a decade, but it seems rare to find gear that lasts that long these days.

I finally ended up with an Apple Time Capsule, which worked well in a mixed environment of Windows and Macs for wireless backup, and since my original printer didn't have Wi-Fi, the built-in print server was ideal. I have a Wi-Fi printer now, and that works as well.

Four years later, I'm still pulling 16 MB/s (granted, with very little competition on the 5 GHz band) in mixed mode (g for the printer, and some older smartphones that can't hit 5 GHz).

Holding out for the newest frequency, after which I'll switch again.

Re:Did the signal degrade, or the noise increase? (1)

prehistoricman5 (1539099) | about 2 years ago | (#41723401)

I'm not surprised your 10 year old switches still work. I'm a member of a team doing network hardware upgrades for my uni and I've been pulling out plenty of gear of that age or older.

Re:Did the signal degrade, or the noise increase? (1)

DJRumpy (1345787) | about 2 years ago | (#41723505)

I have to wonder if heat is a factor, or just the simplicity of the older gear that gives it the longevity.

Re:Did the signal degrade, or the noise increase? (3, Funny)

WilliamGeorge (816305) | about 2 years ago | (#41723287)

How the hell did the parent post get modded funny?

Re:Did the signal degrade, or the noise increase? (1)

theskipper (461997) | about 2 years ago | (#41723461)

"Just use channel 2 and you will be 50% better than on channel one already."

Well, I initially read it as the poster saying channel 2 will be better/more powerful than channel 1 because the number is bigger. Which is kind of funny but obviously not what he meant. So it's probably what made the mod go with funny.

TL;DR: It was kind of funny if you misinterpreted the post.

Re:Did the signal degrade, or the noise increase? (0)

Anonymous Coward | about 2 years ago | (#41723093)

Not OP, but does the radio die over time?

I recall a warning in DD-WRT that overclocking the radio chip kills it off faster?

Re:Did the signal degrade, or the noise increase? (4, Interesting)

afgam28 (48611) | about 2 years ago | (#41723099)

I once had a router where the signal started to go bad over time. I called up the company and the tech support guy told me that most routers "wear out" after around 2 years, and that I'd need to replace it. He struggled to give a logical answer when I asked him how a device with no moving parts could wear out so quickly.

If you're right, and if this is the standard advice being given to everyone, we're in for a huge arms race.

Re:Did the signal degrade, or the noise increase? (1)

similar_name (1164087) | about 2 years ago | (#41723249)

Just out of curiosity why would buying a new router fix the problem as implied by the article?

Re:Did the signal degrade, or the noise increase? (1)

transporter_ii (986545) | about 2 years ago | (#41723267)

More devices, but also less efficient use of the spectrum. In order to offer "Super Duper Turbo" modes, routers combined channels, taking up 40 MHz in order to reach 300 Mbps. While only a minority actually need that speed, everyone else bought these routers because they thought it would make their Internet run faster. ...Only as fast as the weakest link... blah, blah, blah.

Re:Did the signal degrade, or the noise increase? (5, Interesting)

girlintraining (1395911) | about 2 years ago | (#41723285)

...but the noise floor has increased substantially, degrading performance.

Bingo! You hit the nail on the head. Wi-Fi is now commonly found in most homes, and the overwhelming majority are b/g routers. That means that everyone's "last mile" internet is running on only three non-overlapping channels (in the United States), with a maximum capacity of only 54 Mbps for each of those channels. While your effective range decreases, your signal still continues to interfere with others out to its maximum range, which is typically around 300 feet. Beyond that, it's only a decibel or so above the noise floor (about -96 dBm) and is basically ambient. So consider urban density: in a 300-foot hemisphere, how many transmitters will be in that space?
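For a rough sense of that 300-foot figure, here is a back-of-the-envelope link budget using only free-space path loss (the Friis equation). The +20 dBm transmit power is an assumed typical consumer value, not from the post; interior walls add tens of dB on top of this, which is what pushes a distant AP down toward the noise floor:

```python
# Back-of-the-envelope link budget at 300 feet, free space only.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis equation)."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_dbm = 20.0                          # assumed typical consumer AP power
d = 300 * 0.3048                       # 300 feet in meters
rx_dbm = tx_dbm - fspl_db(d, 2.437e9)  # channel 6 center frequency
print(round(rx_dbm, 1))                # roughly -59 dBm in free space
```

Free space alone leaves plenty of margin; it's the walls and floors (easily 30+ dB combined) that bring a signal down near -96 dBm at that range.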

Well, I live in a residential neighborhood that is mostly single-dwelling homes, which is about as ideal as you can get from a low-density city environment. Using a pringles can, I took a neighborhood survey and found about 26 access points within 300 feet of my home. Now, this is a survey that took several days to complete because of the marginal signal integrity, after which I drove my car in circles matching associated clients to those APs. Each access point had approximately 2.25 clients associated with it. So that's about 60 transmitting devices, in an ideal urban environment. And that's just those using wifi.

2.4 GHz is also used by cordless phones, microwaves, wireless "hi-fi" stereo systems, wireless mice, trackballs, and keyboards. So, realistically, I've got at least 100 devices transmitting with a signal high enough to interfere with the RF front end of my Wi-Fi.

Shannon's law stands tall in all of this: as you raise the noise floor, the amount of data you can transmit falls, regardless of encoding scheme or receiver selectivity. Every device added decreases your own devices' performance.
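The Shannon-Hartley relation behind that claim, C = B * log2(1 + SNR), can be sketched directly. The bandwidth and SNR numbers below are illustrative, not measurements:

```python
# Sketch of the Shannon-Hartley capacity limit: raising the noise
# floor shrinks SNR and, with it, the achievable rate, no matter how
# clever the encoding.
import math

def shannon_capacity_mbps(bandwidth_mhz, snr_db):
    snr = 10 ** (snr_db / 10)  # dB -> linear power ratio
    return bandwidth_mhz * math.log2(1 + snr)

# A 20 MHz channel at 25 dB SNR, vs. the same channel after the
# neighborhood raises the noise floor by 15 dB.
print(round(shannon_capacity_mbps(20, 25), 1))  # ~166.2 Mb/s ceiling
print(round(shannon_capacity_mbps(20, 10), 1))  # ~69.2 Mb/s ceiling
```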

I found that by setting my router to 'g' only and then forcing the bitrate down to 24 Mbps, I was able to get a much more reliable and speedy signal. Every Wi-Fi standard is designed to cope with interference by dynamically renegotiating to a higher or lower bitrate. That would be fine if the devices were isolated, but in close proximity each device broadcasts and interferes with the others, which detect this and renegotiate, generating more interference; pretty soon you've got routers constantly in a state of renegotiation, with fluctuating bitrates. Manually force your router to a specific bitrate and don't allow renegotiation, and you'll find that momentary spikes in the noise level won't wash out your signal. Renegotiation takes 10--30 ms, during which you can't send or receive any data; the data burst that caused it is over long before the renegotiation completes.

So in short, it's not your transmitter, it's the environment. Take your transmitter out of its default settings and enable RTS/CTS (if available) and you'll be fine. Another, more sociopathic answer is to get a 100 W 2.4 GHz booster (you'll have to build it), mount it on your roof, tune it to one of the 3 non-overlapping channels (I suggest 1 or 11, since most microwave ovens tend to land in the middle of the band, around channel 6), and then let it run for about 3--5 days. Everyone will bail off that channel, because nothing tuned to it will operate over a distance of even a few feet. Again, very illegal, very sociopathic... but very effective. You'll have to "plow the spectrum" about once every month or two, so count on downtime.

Re Jamming your neighbors (1)

scharkalvin (72228) | about 2 years ago | (#41723429)

That won't work, because most people are too stupid to change channels on their routers. I bet 90% of all routers in residential use are on the same channel, whichever one is selected by default in the firmware. So if YOU change channels, you might be on a clear channel!

Re:Did the signal degrade, or the noise increase? (0)

Anonymous Coward | about 2 years ago | (#41723435)

Instead of an illegal 100 W booster (unless you have a ham license; then I think it's 1400 W PEP), why not some MAC/BSSID/SSID collisions?

Router Troubles (1)

Anonymous Coward | about 2 years ago | (#41723037)

Are you sure that it's the router itself? Have you made any changes to your house in the last 2 years?

It is due to pollution... (1)

Anonymous Coward | about 2 years ago | (#41723041)

I think the problem is the wireless pollution that is spreading. A few years ago, your friends were the only ones with a wireless router in the surroundings. Now more and more devices are connected, and this is limiting the signal propagation of your devices.


signal strength or network speed? (0)

Anonymous Coward | about 2 years ago | (#41723051)

Are you sure it's signal strength (i.e. dBm) and not network bandwidth interpreted as "bars"? What I've noticed over the years is that the more Wi-Fi devices we add, the slower the network gets, which makes sense. It could be cheap components in the amplifier section getting hot and degrading over time, but that's just a WAG.

Neighbours (1)

Anonymous Coward | about 2 years ago | (#41723055)

I'd suggest that what's actually degrading your signal are all the neighbours who are also using the 2.4GHz band. It's not just WiFi, but a whole slew of other wireless gadgets. Move to the 5GHz band whenever possible, there's a lot less congestion.

Re:Neighbours (2)

Tablizer (95088) | about 2 years ago | (#41723209)

Tell their stinkin' signals to get off your lawn!

Re:Neighbours (2)

AliasMarlowe (1042386) | about 2 years ago | (#41723275)

Tell their stinkin' signals to get off your lawn!

Lawn, hell! They've gotten into his house...

Huh? (0)

Anonymous Coward | about 2 years ago | (#41723059)

My wireless router is ~6 years old. Works great.
The one I bought my parents is 7 or 8 years old. Still works fine.

Perhaps your old routers are just dealing with more interference as more of your neighbors are buying overpowered devices that take up more and more channels. And now you've become one of those offenders as well since you've bought yourself an overpowered device that takes up too many channels.

Obligatory (3, Interesting)

SuperMooCow (2739821) | about 2 years ago | (#41723061)

It could be the noise floor going up near your house, or just planned obsolescence [] .

Check if your channel is too crowded (1)

Anonymous Coward | about 2 years ago | (#41723063)

Maybe there are too many people using the same channel as yours.

Check with one of those apps:
Wifi Analyzer on Android
This one is great and works on Windows + Java:

Re:Check if your channel is too crowded (4, Informative)

spongman (182339) | about 2 years ago | (#41723125)

Using the same channel does not increase signal interference. Signal interference comes from APs using neighboring channels in close proximity. If you're looking for greater range, try switching to the same channel as your neighbor. Your bandwidth could be lower, but the interference will be reduced.

Re:Check if your channel is too crowded (2)

amorsen (7485) | about 2 years ago | (#41723493)

Using the same channel does not increase signal interference. Signal interference comes from APs using neighboring channels in close proximity.

Err, that makes zero sense. Wifi access points do not coordinate their transmissions or do any sort of code division multiplexing or anything else that might help with interference. Two transmitters on the same channel will absolutely interfere, worse than if they were on neighbouring channels. If you are lucky, they will interfere enough that the other access point decides to switch channel.

Other Devices (0)

Anonymous Coward | about 2 years ago | (#41723069)

I also wonder if the quality of our devices may just SEEM like they are getting worse, because of new networks that appear in the area as time goes by. It seems that more and more of our neighbors have installed wireless networks after ours was installed.

cheap electrolytic capacitors (5, Informative)

pentabular (2609873) | about 2 years ago | (#41723071)

..have a tendency to degrade and fail over time.

Re:cheap electrolytic capacitors (3, Informative)

TechyImmigrant (175943) | about 2 years ago | (#41723241)

>cheap electrolytic capacitors have a tendency to degrade and fail over time.

Not significantly over 2 years, and you don't use electrolytics in the IF/RF signal path in 2.4 and 5.8 GHz radios.
I don't think electrolytics are it.

I have my suspicions about the noise figure of LNAs changing over time. There are some very highly strung, teeny weeny transistors in LNAs (Low Noise Amplifiers) right in the signal path.

Re:cheap electrolytic capacitors (5, Informative)

Theaetetus (590071) | about 2 years ago | (#41723257)

>cheap electrolytic capacitors have a tendency to degrade and fail over time.

Not significantly over 2 years, and you don't use electrolytics in the IF/RF signal path in 2.4 and 5.8 GHz radios.

True, but you do use them in your cheap switch-mode power supply, and as they degrade, you get additional AC noise on the rails of your amplifiers that are in the IF/RF signal path. Particularly in cheap routers that are operating near the limits of their amplifiers, voltage drops on the rails could cause clipping of the high frequency signal, which will result in dropped packets, required rebroadcasting, etc.
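The clipping mechanism described here can be illustrated with a toy model. The gain and rail voltages below are made-up numbers, and a real power amplifier is far more complex, but the hard-limiting behavior is the point:

```python
# Toy illustration of the failure mode above: an output stage modeled
# as an ideal amplifier hard-limited by its supply rails. When ripple
# from a worn-out supply capacitor drops the rail below the signal
# peaks, the waveform clips (and spectral distortion follows).

def amplified_sample(x, gain, rail_v):
    """Amplify a sample and clip it to the (possibly sagging) rail."""
    y = gain * x
    return max(-rail_v, min(rail_v, y))

peak_in = 0.5
gain = 6.0
print(amplified_sample(peak_in, gain, rail_v=3.3))  # 3.0 -> clean
print(amplified_sample(peak_in, gain, rail_v=2.8))  # 2.8 -> clipped
```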

Re:cheap electrolytic capacitors (1)

Anonymous Coward | about 2 years ago | (#41723455)

Also, the ESR increases, meaning the power supply can't deliver current as quickly to the finals or front end, which might also have an effect on the signal.

Re:cheap electrolytic capacitors (0)

sgt_doom (655561) | about 2 years ago | (#41723369)

Please see sgt_doom's explanation....

It doesn't. End of discussion. (-1, Redundant)

Anonymous Coward | about 2 years ago | (#41723079)


Wireless noise (0, Redundant)

Anonymous Coward | about 2 years ago | (#41723081)

It isn't that the wireless router / base station is degrading. The signal quality is degrading because there is more noise from other Wi-Fi access points and devices in the area. There is a finite amount of spectrum available, and when you have lots of home base stations in the same area/neighborhood, they interfere with each other, leading to lower range for each of them. It is similar to talking in a noisy room: in a large, empty room you can hear someone from across the room; in a noisy room with many people talking, you may only be able to hear someone a few feet away.

When I installed my first Wi-Fi base station in 1998 in my parents house I could walk down the block and get reception a few houses down. Today there are dozens of base stations and potentially hundreds of devices on the street. I struggle to get consistent reception within the house despite improvements in antenna design and b/g/n routers.

Magic Smoke (4, Funny)

lobiusmoop (305328) | about 2 years ago | (#41723085)

Obviously the magic smoke, although not released suddenly, does gradually leak out of the components, leading to loss of performance over time.

Amplifiers/Filters? (2)

rabtech (223758) | about 2 years ago | (#41723087)

Could the analog components of the amplifier/filter circuits be degrading? If capacitors are leaking, etc., that would definitely make the performance decrease, though maybe not enough to stop working completely.

You should consider another option: older equipment may not have firmware as good at dealing with congestion (802.11n helps with this), or maybe the new box has 5 GHz, which has far fewer interference issues. Maybe the real degradation was the neighbors installing access points? You may also have had certain pieces of gear installed that interacted badly with your access point (some of them have really awful firmware or very loose implementations of the standard).

These are just guesses... I haven't personally had any degradation except for interference in the 2.4 GHz band. When I bought this house, devices would only detect my network and maybe one other. Now seven show up. Interference isn't just a problem in apartments anymore.

Capacitors and other parts are not invulnerable (2, Informative)

Anonymous Coward | about 2 years ago | (#41723095)

Probably capacitors degrading, transient spikes making it through the hardware, electrostatic discharge during assembly, just plain overheating as dust coats the internals, more people using the same frequencies (possibly including yourself as you add more wireless devices), and a bevy of other reasons I can't think of at the moment.

Re:Capacitors and other parts are not invulnerable (1)

BluBrick (1924) | about 2 years ago | (#41723293)

Sunspots. You forgot sunspots. Just what kind of BOFH are you, anyway?

Re:Capacitors and other parts are not invulnerable (3, Funny)

Will.Woodhull (1038600) | about 2 years ago | (#41723371)

You are probably on to something here.

I volunteered at Free Geek, a computer recycler/refurbisher, years ago, when "fat caps" on the motherboard were a frequent reason for computers to be sent there. As I recall, the problem was a lot of counterfeit components being sold to reputable manufacturers; several big-name manufacturers were involved. Something like that could be happening in the router market. Going with the lowest bidder for components is still important in low-margin markets.

Another possibility is that a kid in the neighborhood is collecting the innards of smoke detectors as part of his unofficial science project, and is storing them too close to your house. A radioactive environment will shorten the life of capacitors and other components. Have you noticed whether any of your kids glow in the dark?

I'm kidding with that last, of course. Sort of.

analog transistors age (5, Insightful)

jfb2252 (1172123) | about 2 years ago | (#41723103)

This is a hypothesis based on peripheral involvement with analog and digital RF at 0.5 and 1.5 GHz for twenty years.

AFAIK, the output stage of anything broadcasting above about 2 GHz has to be analog, with the lower frequency signal mixed into a carrier at the higher frequency. Digital synthesizers and chips which can deal with 1.5 GHz directly are still very expensive and are unlikely to be used in consumer routers. So the final output stage is likely an analog RF transistor.

Analog transistors change characteristics with age at elevated temperature, where elevated is anything over 20C. Implanted ions diffuse with time and temperature, changing junction characteristics. The small structures required by high frequencies are more sensitive to such things.
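The time-and-temperature dependence described above is commonly modeled with the Arrhenius relation, under which each increment in junction temperature multiplies the aging rate. A minimal sketch; the 0.7 eV activation energy is a generic assumed figure for silicon wear-out mechanisms, not something measured for router transistors:

```python
# Arrhenius acceleration factor: how much faster a thermally activated
# aging mechanism proceeds at a hot junction temperature vs. a cool one.
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_factor(t_cool_c, t_hot_c, ea_ev=0.7):
    """Aging-rate multiplier at t_hot_c relative to t_cool_c (Celsius)."""
    t1 = t_cool_c + 273.15
    t2 = t_hot_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1 / t1 - 1 / t2))

# A device running at a 55 C junction vs. a 25 C baseline:
print(round(arrhenius_factor(25, 55), 1))  # ~12x faster aging
```

Under these assumed numbers, a 30 C rise means roughly an order of magnitude faster parameter drift, which is consistent with the observation that anything over room temperature counts as "elevated."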

Re:analog transistors age (2)

pongo000 (97357) | about 2 years ago | (#41723253)

Analog transistors change characteristics with age at elevated temperature, where elevated is anything over 20C.

Ever notice how hot wireless routers get, especially when they are stacked? I find this to be the most plausible explanation yet posted...

Re:analog transistors age (1)

MangoCats (2757129) | about 2 years ago | (#41723297)

Or, there's another explanation. I've had a number of wireless routers for several years, some over 5, and in my applications they do not degrade over time. In fact, I happened to have my laptop in a corner of my backyard that I hadn't taken it to before (well over 300' from the router, which is inside, with several walls in between), and was surprised to see full Wi-Fi signal strength from my 3-year-old 802.11n router paired with my 3-year-old laptop. My neighborhood is fairly sparse with Wi-Fi signals; I can only see 3 or 4 neighbors' SSID broadcasts at any given time. If you're in a higher-density region, I imagine that is not the case.

Re:analog transistors age (-1, Redundant)

sgt_doom (655561) | about 2 years ago | (#41723383)

Please see sgt_doom's explanation . . .

Re:analog transistors age (1)

Auroch (1403671) | about 2 years ago | (#41723511)

Please see sgt_doom's explanation . . .

but ..., this is sgt_doom's post, and so this is his explanation ... so ... do you propose it's signal degradation due to recursivity?

Maybe more noise (1)

feedayeen (1322473) | about 2 years ago | (#41723107)

The actual transmitter and receiver are nearly indestructible, since they're mostly copper wire wrapped around something inductive. The electronics also operate at a power level where, even though they run rather toasty, they're well below the danger point where silicon might degrade; and that kind of failure would likely brick the device rather than cause a slow decline.

Most likely, the cause is RF interference from your neighbors.

Yes, WiFi radios burnout with age. (-1)

Anonymous Coward | about 2 years ago | (#41723113)

My experience with home and professional-grade wireless devices (such as access points and Wi-Fi-integrated routers) is that they do indeed burn out with age. I have been shutting off radios when they are not needed to combat this issue. Newer home-grade Wi-Fi routers have a "Green" feature that allows the owner to lower the transmit power or shut the radio off at predetermined times. I know that most D-Link devices, Cisco/Linksys devices, and DD-WRT support this feature. Some vendors group it into a "Green" settings page, some don't.

Check the power supply (3, Informative)

Anonymous Coward | about 2 years ago | (#41723117)

Check the power supply. Usually the electrolytic capacitors are already dry.

Designed to fail (3, Funny)

Anonymous Coward | about 2 years ago | (#41723121)

Not sure about wireless gear, but some devices (e.g. printers, light bulbs, fridges) are designed to break after a certain period of time, so that you would buy a new one.

Re:Designed to fail (0)

Anonymous Coward | about 2 years ago | (#41723135)

E.g. some printers (or cartridges) have a counter that will "tell" when it is out of ink. If you reset the counter with a software hack, you can continue printing, even after your printer says no.

It shouldn't. (1)

rhalstead (1864536) | about 2 years ago | (#41723123)

The first thing that comes to mind, other than that they shouldn't lose strength that quickly, is cheap or faulty components, primarily capacitors. A batch of Samsung monitors is famous (or infamous) for capacitors going out in the power supply, and router builders could have ended up with part of that batch as well. Another possibility is low voltage, as they all use "wall warts" AFAIK; try a different PS before pitching the router. QC on the generic wall wart is not exactly up to lab standards. Normally, and I have to emphasize "normally", capacitors last on the order of 20 years or more. In the last 15 years (give or take) the only router failures I've had were due to lightning, plus one that appeared to lose its brains. Most of the "stuff" here is in two buildings, one of which has the entire interior constructed of bonded barn metal, so the placement of NICs or relays is important; i.e. the router is in the basement and the NICs are in the shop (pretty much shielded except for windows) some 130 feet distant. I have to admit that most of the time I use CAT6 instead of wireless, as large backups take far too long on wireless.

Familiar with the problem, and here's how I fix it (4, Interesting)

carlhaagen (1021273) | about 2 years ago | (#41723127)

I've been using wifi instead of ethernet for about 7 years now. Almost all of the NICs/APs I've used have displayed this problem with time. It's as if the equipment somehow develops creeping signal attenuation. My guess is that it's something relating to capacitors gathering a slow overcharge of some sort, causing them to block current in a growing fashion - I seem to recall this being possible from my early days of electronics studies.

Anyhoo, I fix the problem by simply switching the equipment to another channel, say, 3-4 steps away, to make sure the frequency some of the components will be switching at will be notably different. So far it has worked with all equipment I've had this problem show up on. After a while the signal attenuation develops on the new channel as well, upon which I simply switch back to the one I used before. Rinse, repeat.

Semiconductors degrade, and faster when hot. (1)

gweihir (88907) | about 2 years ago | (#41723133)

Semiconductors degrade over time, and the hotter they run, the faster they degrade. At 25C, normal digital semiconductors have a 30-50 year lifetime (I have observed this myself with a batch of 25 network cards); halve that for every 10C. Now, this applies to the individual transistor. Even if just the RF power amplifier runs hot, it degrades at roughly this rate. And as this part is analog, not digital, degradation sets in gradually rather than all at once, and may even progress much faster.
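The halving rule quoted here can be turned into a quick back-of-the-envelope calculation. This is only a sketch: the 40-year baseline is just the midpoint of the 30-50 year range mentioned above, and real failure rates depend on the process and the failure mechanism.

```python
def expected_lifetime_years(temp_c, base_life_years=40.0, base_temp_c=25.0):
    """Rule-of-thumb lifetime estimate: halve the lifetime for every
    10 C above the 25 C baseline (a crude Arrhenius-style model).

    base_life_years is the midpoint of the 30-50 year range quoted
    for digital semiconductors at 25 C.
    """
    return base_life_years / 2 ** ((temp_c - base_temp_c) / 10.0)

# By this rule, a power amplifier running at 65 C lasts
# 1/16th as long as one at 25 C:
print(expected_lifetime_years(65))  # 2.5
```

So a hot RF section could plausibly start drifting out of spec within a few years even though the cool digital logic next to it is fine for decades.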

Power adaptors to blame. (5, Informative)

Anonymous Coward | about 2 years ago | (#41723143)

In my experience, power adaptor degradation is the main culprit. Over time the adaptor will provide lower voltages and a less stable current, which translate into lower signal output and higher noise respectively. I've seen a bad adaptor turn repeaters into signal jammers - trust me, that was not an easy issue to troubleshoot...

Other Observations? (1)

Anonymous Coward | about 2 years ago | (#41723145)

I've never seen nor heard of such a thing. I've got WiFi gear, some of it over ten years old and I have yet to see any evidence of "signal degradation".

Because of this article, I have reexamined several pieces of equipment, some still in production use, others on the shelf for a while; all demonstrate signals that are within original spec.

So, Mr. Electrical Engineer, can you show us any other observations to support your assertion? Can you show us any evidence at all, other than your suspect anecdotes, of signal degradation with age? Finally, please name names. It might be believable if you were using obscure gear, but if you are talking about the usual suspects in consumer WiFi, I call BS.

ROHS issues? (1)

Anonymous Coward | about 2 years ago | (#41723149)

It could be that the routers were made from the 1st generation of RoHS (mainly lead-free) components. There was an issue with integrated circuits where the metal traces inside the chips would migrate, causing thinning of the traces (which changes circuit characteristics) or, in some cases, growing 'whiskers' that would cause shorts or, in RF, radiate power at the wrong point.

My experience says that this seems to be in the past, as manufacturers have adopted new fabrication techniques.

various causes (2, Insightful)

Anonymous Coward | about 2 years ago | (#41723153)

The loss in performance could be due to the solder between components (mostly between the antenna and the circuit board) degrading over time (this happens a lot with industrial devices); other components such as capacitors and resistors could be wearing out too.

Several causes, but a few that spring to mind... (5, Interesting)

Tastecicles (1153671) | about 2 years ago | (#41723181)

1. slow burnout of emitter gear due to thermal degradation (yes, clock chips and transistors get hot, as do solder tracks and joints). Thermal runaway can occur if a solder joint fails and arcs, or overvoltage causes signal tracks to vapourise.
2. ionising radiation, particularly on unshielded components such as antenna conductors (I've seen something like this occur on an externally mounted amateur radio antenna: the sunward side of the antenna completely degraded, the result being that the only signals received (or sent) were on the shadow side).
3. component quality on consumer gear is not as stringent as it could be. Components can and do fail, and considering the number of components in a lot of consumer gear, it's a wonder any of it actually leaves the factory.
4. the noise floor of several years ago was far, far lower than it is now. The ERP of newer gear is (by design or by necessity) higher than older gear as more and more transmitters have to share the band. As a result, the signal quality taking a dive may be at least partly illusory. The equipment may actually be perfectly fine.
5. parasitic structures in semiconductor packages may be the catalyst for failure, either immediate or delayed. Such structures may be as small as a single atom of chlorine embedded in a crystal of germanium - innocuous at first (undetectable, even), but over time and use, that contamination will alter the chemistry of the semiconductor, possibly causing it to bond with the package material and rendering it useless. This might not even be an issue in high powered gear like regulators but in something like a microprocessor, it's a showstopper.

Re:Several causes, but a few that spring to mind.. (1)

TechyImmigrant (175943) | about 2 years ago | (#41723245)

With adaptive rate and power control, I wouldn't suspect the output path first. I would look to the input path first. That's the delicate bit.

Re:Several causes, but a few that spring to mind.. (0)

Anonymous Coward | about 2 years ago | (#41723439)

This is the correct answer. There needs to be a mod rating above 5, because this is the first person to post it.

I haven't seen this (1)

93 Escort Wagon (326346) | about 2 years ago | (#41723195)

It might simply be that the old D-Link 802.11b router I used to have was so slow already that I wouldn't notice... but, in any case, I never noticed it with that old router, nor have I seen this issue with any of my Airport devices (I've still got an older teardrop Extreme chugging along, serving phones and such that can't handle 5GHz 802.11n).

Are you sure it's not simply a matter of more wireless-capable devices accumulating in your home over time?

Re:I haven't seen this (1)

MangoCats (2757129) | about 2 years ago | (#41723321)

My D-Link gear didn't degrade slowly, it started out of the package new running very hot, and failed altogether within 6 months. I tried them twice (different models) - no more for me.

Re:I haven't seen this (0)

Anonymous Coward | about 2 years ago | (#41723399)

I have to agree. I have some really old wireless routers (some b-only) that get used daily and have not suffered from signal degradation. I also have a lot of wireless interference in this area, so I adjust my frequency channel accordingly. I may not have had ideal signal strength in the first place, though the signal/noise ratio is decent. Maybe they don't make them like they used to?

There are also other devices that will not show up as wifi but can cause interference: wireless peripherals (mice, keyboards, etc.), Bluetooth (operates within the same frequency range), microwaves, utility company "smart monitors". Without a spectrum analyzer you cannot rule out other sources of interference. Do a search for "wiki wifi interference"; there is some good info there.

frequency drift? (0)

Anonymous Coward | about 2 years ago | (#41723201)

Analog components drift in value over time, and circuits with poor compensation are cheaper than circuits with good compensation.

Radios do not transmit at a single frequency, they transmit in a narrow bell curve centered around the nominal frequency. So as analog components age and the nominal frequency drifts, the frequency that the receiver is tuned to will no longer align with the peak of the transmitted energy.

On the other hand, receivers should have phase-locked loops that allow them to tune themselves to the peak of the received signal. Problem is that with several radios on the same frequency (assuming you don't have just one access point and one remote device) the access point can't optimize for the frequency spread of several devices, and might have

Use a wifi analyzer and check your channels! (0)

Anonymous Coward | about 2 years ago | (#41723243)

Wifi uses 14 channels; as a rule of thumb, a signal pollutes two channels to either side.
Thus a strong channel 3 wifi signal can block channels 1 through 5.

The wifi routers around me seem to change their broadcast channel every few months but are now focused on 6 and 11.

Use a wifi tool and manually set your channel to the least used one.

As a non-expert, this works well for me, but there must be smarter ways than this.
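For what it's worth, the "two channels to either side" rule of thumb can be sketched as a little script. This is a hypothetical helper, assuming you already have the neighbours' channel numbers from a wifi analyzer:

```python
from collections import Counter

def least_congested_channel(observed_channels, spread=2, channels=range(1, 15)):
    """Score each 2.4 GHz channel by how many observed networks
    overlap it, assuming each signal pollutes `spread` channels to
    either side of its own, then return the least-loaded channel
    (lowest channel number wins ties)."""
    load = Counter()
    for ch in observed_channels:
        for c in range(ch - spread, ch + spread + 1):
            load[c] += 1
    return min(channels, key=lambda c: (load[c], c))

# Neighbours seen on channel 6 once and channel 11 twice:
# channels 1-3 are untouched, so channel 1 wins.
print(least_congested_channel([6, 11, 11]))  # 1
```

A real tool would weight by signal strength rather than just counting SSIDs, but the idea is the same.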

To your question, here's the answer... (1, Insightful)

bogaboga (793279) | about 2 years ago | (#41723259)

It's by design, especially if those devices are marketed for the American market. Wanna know what else is designed to fail after a certain set time?

Well, microwave ovens, and cars, especially those from one once-big American car company that received millions in bailout cash under this president.

In industry, it's called Planned Obsolescence, and Americans are pioneers.

Here's a write-up about it.

Capitalism at its best!

Re:To your question, here's the answer... (0)

Anonymous Coward | about 2 years ago | (#41723451)

Funny! I do the same with my money! ;)

The other side of wifi devices (1)

blue_teeth (83171) | about 2 years ago | (#41723289)

I've a Thinkpad T60 laptop and the WiFi adapter just seems to work fine, even after 6 years of use & abuse.  Yes, I've replaced two WiFi routers.


zrbyte (1666979) | about 2 years ago | (#41723291)

Eating away at the PCB!

Bees. (1)

kallisti5 (1321143) | about 2 years ago | (#41723319)

Obviously Opera's bees have taken residence in the walls of the house and are absorbing gamma radiation from the wifi antennas.

Re:Bees. (1)

kallisti5 (1321143) | about 2 years ago | (#41723327)

Oprah not Opera. Fail troll is fail.

The explanation (1)

sgt_doom (655561) | about 2 years ago | (#41723365)

There's these little tiny critters called electrons, right?

And these tiny boogers, these electrons, go through these here digital circuits, also quite tiny, and their movement at that micron level eventually wears the holy hell out of those pathways they travel, causing pitted chips at the micron level --- easily observable with a high-powered electron microscope and other instruments.

I guess they don't teach science in them thar schools anymore, huh?????


Re:The explanation (0)

Anonymous Coward | about 2 years ago | (#41723509)

Electrons don't travel through "these here circuits", energy does. Unless you're talking about drift current (meters/day), but you weren't.

Don't they teach physics in them thar schools any more?

Bootloader fragmentation (1)

julian67 (1022593) | about 2 years ago | (#41723379)

Flash memory fragmentation is a problem with lots of older routers (and maybe newer ones too, I'm not sure). A fragmented environment can cause all kinds of degradation, from poor wireless performance right up to ethernet ports failing and the entire device failing beyond repair (I have experienced this).


RouterTech is a good site for networking info and also offers a FOSS Linux based replacement firmware for Texas Instruments AR7 based ADSL modem/routers.

"Q. I have heard about fragmented flash memory (or environment) in routers. What is this, and how do I deal with it?
A. Flashing firmwares, saving configuration settings, and doing stuff with environment variables all involve writing to the router's flash chip. Over time, the flash memory (particularly the area holding the configuration information and the router's environment variables - i.e., the first 10kb of the router's mtd3 partition) can become fragmented. If this happens, you can have all sorts of problems. The most common ones include not being able to upgrade your firmware successfully via the web interface, not being able to save your configuration settings successfully, and routers bricking themselves spontaneously. This is a major issue on routers with the Adam2 bootloader, which is seriously broken. In our experience, it is particularly problematic with DLink Adam2-based routers - but all Adam2-based routers suffer from this problem, because the bootloader cannot defragment its environment properly, except manually from the Adam2 bootloader command prompt itself. The PSP bootloader, on the other hand, does the job pretty well by itself. "

Use G not N (0)

Anonymous Coward | about 2 years ago | (#41723381)

N networks seem to be very vulnerable to interference. G might be a bit slower, but it seems a bit more robust.

Old and tired. (3, Informative)

SuperTechnoNerd (964528) | about 2 years ago | (#41723385)

Perhaps it's frequency drift. As components age, their values change slightly, and when dealing with 2.4 GHz and above, tolerances are strict. It's just my guess... take it or leave it.

Maybe slow degradation of the heat (1)

ridgecritter (934252) | about 2 years ago | (#41723407)

dissipation path in RF power amplifiers? I don't know whether common wifi routers have discrete RF power amp devices to feed their antennas, but if they do, they are probably connected by thermal grease to a heat spreader of some kind. Over years of thermal cycling, thermomechanical expansion can create voids in the thermal grease, which will increase the device-to-ambient thermal resistance. The device will run hotter (= lower efficiency, less RF power to the antenna) or fold its power back to stay within thermal limits (same result). This could also happen if there's lots of accumulated dust on the heat sink, which would increase device to ambient thermal resistance. My 2 cents.
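A rough sketch of the thermal argument above (the theta_JA figures here are made up for illustration; real values depend on the package, grease condition, and heatsink):

```python
def junction_temp_c(power_w, theta_ja_c_per_w, ambient_c=25.0):
    """Steady-state junction temperature of a power device:
    T_j = T_ambient + P * theta_JA, where theta_JA is the
    device-to-ambient thermal resistance in C/W."""
    return ambient_c + power_w * theta_ja_c_per_w

# If voids in the thermal grease raise theta_JA from a nominal
# 20 C/W to 35 C/W, a 2 W amplifier's junction climbs 30 degrees:
print(junction_temp_c(2.0, 20.0))  # 65.0
print(junction_temp_c(2.0, 35.0))  # 95.0
```

Once the junction runs that much hotter, the device either loses efficiency or throttles its output power, and either way less RF reaches the antenna.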

Something funny is going on (1)

scharkalvin (72228) | about 2 years ago | (#41723459)

While I don't have too much trouble with my routers (I have two in my house since one can't cover the whole house), I DO have issues with my garage door opener. Most of the time I can open it from the street, but sometimes I have to hold the transmitter right next to the door before it will open. It's almost as if a phantom jamming station goes on the air at specific hours of the day! The batteries in the transmitters are up to snuff, and one transmitter is built into our Dodge Caravan and runs off the car's electrical system. All of the transmitters have the same problem at random times.

Capacitors, Maybe? (0)

Anonymous Coward | about 2 years ago | (#41723467)

It may be that a capacitor on the router itself or inside the power supply went bad.

Cheap antennas and connectors? (1)

s4ltyd0g (452701) | about 2 years ago | (#41723487)

My first guess would be corrosion or some such impeding the transmitted signal. If the connection loosens or becomes corroded, the SWR will increase because of the impedance mismatch. The antenna is no longer resonant, so only a fraction of the full transmit power is actually radiated over the air.
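The SWR-to-radiated-power relationship follows from the standard reflection coefficient formula; a quick sketch, ignoring losses in the feedline:

```python
def radiated_fraction(swr):
    """Fraction of transmit power actually radiated for a given
    voltage standing wave ratio (SWR). The reflection coefficient
    is rho = (SWR - 1) / (SWR + 1); the reflected power fraction
    is rho**2, and the rest is delivered to the antenna."""
    rho = (swr - 1.0) / (swr + 1.0)
    return 1.0 - rho ** 2

# A well-matched antenna (SWR 1.0) radiates everything;
# a corroded connector pushing SWR to 3.0 loses a quarter:
print(radiated_fraction(1.0))  # 1.0
print(radiated_fraction(3.0))  # 0.75
```

So even a fairly bad mismatch only eats a few dB, which is why a gradual signal collapse usually points at something besides the antenna alone.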

vry 73
