
Are Low Refresh Rates Bad for the Eyes?

Cliff posted more than 11 years ago | from the the-flicker-it-hurts dept.

Hardware

suwalski asks: "Often when I go over to someone's house to help them with 'computer stuff' (translation: free support), I notice that many people who don't know better still use 60Hz as their refresh rate. XP seems to automatically tune higher, but for the others, I immediately bump it up, because it hurts my eyes. They say they don't see the difference. Am I right to assume that low refresh rates that make my eyes water are not healthy? If people don't notice the low refresh rate, does it still damage their eyes? Anyone know of any studies or papers?"
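(Aside: the manual "bump it up" step can also be done programmatically on Windows. Below is a minimal sketch, assuming a Win32 box, using the EnumDisplaySettings/ChangeDisplaySettings calls; the 85Hz target is only an example, and a real tool should first check which rates the monitor actually supports.)

/* Minimal sketch: raise the primary display's refresh rate on Windows.
 * The 85Hz target is illustrative only; check the monitor's supported
 * modes before applying anything in practice. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Read the current mode of the primary display. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "Could not read current display settings\n");
        return 1;
    }
    printf("Current: %lux%lu @ %lu Hz\n",
           dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Ask for 85Hz at the same resolution and colour depth. */
    dm.dmDisplayFrequency = 85;
    dm.dmFields = DM_DISPLAYFREQUENCY;

    /* CDS_TEST only checks whether the mode would be accepted. */
    if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "85Hz is not accepted at this resolution\n");
        return 1;
    }
    if (ChangeDisplaySettings(&dm, 0) != DISP_CHANGE_SUCCESSFUL) {
        fprintf(stderr, "Failed to apply the new refresh rate\n");
        return 1;
    }
    printf("Refresh rate set to 85Hz\n");
    return 0;
}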


27 comments


It's fucking Christmas (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#4953336)

and you worry about flicker? I want to say:

GET SOME FUCKING PRIORITIES!

Re:It's fucking Christmas (1)

zbowling (597617) | more than 11 years ago | (#4953730)

Some people shouldn't be allowed to speak.

Yes, yes they are. (1, Informative)

Anonymous Coward | more than 11 years ago | (#4953368)

CRTs flash the image and then rely on the eye to hold the frame between updates. When these updates get too infrequent we see the flicker, and it can be painful to the eyes. When the updates are at about 30 frames/sec we don't see the flicker any more, but people often still get headaches and other eye strain because the flicker is still there regardless of whether we can see it. 30Hz is unacceptable, 60Hz is better, but 3x or 4x that... anything higher is preferable to lessen the flicker. With 60Hz, expect eye fatigue.

Flatscreens are another matter. They hold the colour consistently, so there is no problem with flicker, only blurring.
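(To put numbers on the thresholds above: one frame at a given refresh rate lasts 1000/rate milliseconds. Here is a quick back-of-the-envelope sketch; the 2ms phosphor-persistence figure is an assumption for illustration only, since real phosphors vary widely.)

/* Back-of-the-envelope: frame period at common refresh rates, and how
 * much of each frame is dark. The 2ms persistence value is an assumed,
 * illustrative figure; real phosphors vary widely. */
#include <stdio.h>

int main(void)
{
    const int rates[] = { 30, 60, 72, 85, 100 };
    const double persistence_ms = 2.0; /* assumed short-persistence phosphor */
    size_t i;

    for (i = 0; i < sizeof(rates) / sizeof(rates[0]); i++) {
        double period_ms = 1000.0 / rates[i];
        printf("%3d Hz: frame lasts %5.1f ms, dark for roughly %5.1f ms\n",
               rates[i], period_ms, period_ms - persistence_ms);
    }
    return 0;
}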

First post you mother fucking sp0rks.

Re:Yes, yes they are. (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#4953392)

I LOSE IT.

Re:Yes, yes they are. (2)

Joel Rowbottom (89350) | more than 11 years ago | (#4953591)

When these updates get too infrequent we see the flicker, and it can be painful to the eyes. When the updates are at about 30 frames/sec we don't see the flicker any more, but people often still get headaches and other eye strain because the flicker is still there regardless of whether we can see it. 30Hz is unacceptable, 60Hz is better, but 3x or 4x that... anything higher is preferable to lessen the flicker. With 60Hz, expect eye fatigue.
Yup, and it's a hell of a lot easier to see it if you look out of the corner of your eye, too.

Re:Yes, yes they are. (0)

Anonymous Coward | more than 11 years ago | (#4953824)

Wrong. CRTs rely on the phosphor to keep emitting light after being hit by electrons. That's why there are so many kinds of phosphor with different persistence values: the high-persistence phosphors used in older monitors, or the ultra-fast phosphors used in oscilloscopes.

I don't know why CRT manufacturers don't use some form of variable-persistence technology (like the CRT analog storage tubes in scopes) to emulate the 'hold' that LCD lines have.

I guess it's because they don't teach engineering anymore in universities.

Glad I'm Not The Only One (2)

Joel Rowbottom (89350) | more than 11 years ago | (#4953570)

When I inevitably get the relatives asking about PC upgrades and stuff at Christmas, and I end up reconfiguring their machines for them, I put the refresh rate up.

However, a low refresh rate is worse if there are fluorescent lights in the room - which of course strobe a lot more noticeably than filament bulbs. Perhaps this is contributing to the problem?

Then of course, if you're always staring at a TV, your eyes might become accustomed to it.

And MP3s... (0)

Anonymous Coward | more than 11 years ago | (#4953633)

...will ruin your hearing too.

don't forget... (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#4953651)

premarital sex will blow your legs off

Re:don't forget... (0)

Anonymous Coward | more than 11 years ago | (#4954609)

What's that? I went blind by... doing something... and I couldn't read your post.

It irks me too (2)

Pyromage (19360) | more than 11 years ago | (#4953899)

I have no clinical data to say whether or not it's actually unhealthy, but everyone I know who seriously uses a computer can notice the difference. Every single one of them.

Some people may not notice, but those are usually the non-serious users. The "Computers are cool, but Word scares me" crowd don't usually notice much of anything. I once sat a guy in front of my Dvorak keyboard (IBM Model M, removable key caps!) and he didn't notice for quite a while (minutes, even!)...

Re:It irks me too (1)

BenTheDewpendent (180527) | more than 11 years ago | (#4954134)

I'm a serious user: games, coding, Slashdot, graphics, Photoshop, etc. 60Hz doesn't seem to bother me at all. Sure, now and then I get tired of staring at a screen, but that's usually just because I'm tired in the first place, and no screen is really good to look at by that point.

Re:It irks me too (0)

Anonymous Coward | more than 11 years ago | (#4954550)

you are cool !!!!! computure expert!!!

refresh rates (1)

chunkwhite86 (593696) | more than 11 years ago | (#4954060)

I know my eyes start to sting and water when I'm looking at 60Hz. Even 70 or 75 is noticeably below the 85Hz that I'm used to. I can't say that I can easily make out the difference between 85Hz and 95Hz, so 85 seems like a comfortable sweet spot. Almost every monitor made within the past 5 years or so should be able to support 85Hz at the most common home-user resolutions. It seems like bad OS design to have the drivers default to 60Hz.

Re:refresh rates (1)

neuroticia (557805) | more than 11 years ago | (#4954202)

A lot of people are still using cheaper/older monitors. They've been told that monitors don't need to be upgraded until they die, so they're on old 14-inch 800x600-max monitors that can only refresh up to 65Hz at max resolution, and 70Hz at 640x480. They plug these suckers into their brand-new speedy P4 computers and don't seem to give a damn.

This is why it defaults "lower". Actually, XP does a good job of defaulting higher... My box defaulted to 75Hz at 1024x768 (17-inch monitor), which, of course, I upped to 85Hz @ 1152x864.

While I agree that WinXP should be better at detecting and auto-implementing these things, it's still way ahead of the ballgame in this area. Plug an older or even newer monitor into a Mac, and you've got a 50-50 shot of ending up with an "Out of Sync" message on the screen. Better odds with OS X, but I'm still seeing it after swapping my monitor out for an older monitor that likes being set at 1024x768.

WinXP? Swap it out for a 21-inch monitor set to a scary-high rez, then for a 12-inch LCD screen that can only display 256 colors at 640x480, and it recognizes these limitations and adjusts accordingly. Which is good - because if you can't access the damned computer because the monitor's set too high and the OS isn't dealing with it, then you can't do a whole hell of a lot. If you can access it, and the monitor's set too low, then hey... that's fixable, no?

-Sara

Re:refresh rates (0)

Anonymous Coward | more than 11 years ago | (#4954929)

Plug an older or even newer monitor into a Mac, and you've got a 50-50 shot of ending up with an "Out of Sync" message on the screen.

Trolling for your personal vendetta against Mac again, are we? The above statement is pure FUD. At best, it may be anecdotal based on your limited experience alone.

Older Macs, manufactured before OS X came out, had ADC connections. To use a VGA monitor with one of those, you had to use an adapter, in which case all bets were off. Even so, it tended to default to a very "safe" resolution (640x480, 60Hz) unless otherwise configured. Hell, my 6 year old Mac with 8.6 has worked just fine with three different VGA monitors of varying ages. I bet you'd have more likelihood of a problem going the other way, plugging an ADC monitor into WinXP.

Current Macs are another story. My QuickSilver has recognized every monitor I've plugged into it, old and new. All VGA, not ADC. As you say, OS X is better than OS 9 in this regard - i.e., it's perfect, whereas OS 9 can have problems. X certainly does everything you say XP does. But oh yeah, you haven't actually tried it lately, basing all of your spiteful opinions on a bad experience with much earlier and less mature versions. I almost forgot.

Your comments always contain intelligent discussion along with accurate information, but they are also invariably peppered with anti-Mac FUD and exaggerations because you feel personally slighted by a previous bad experience. It's always amusing, but it's getting old. You'd gain more credibility by sticking to the facts - there are plenty of those to help your cause, since Macs do suck in certain ways.

Cheers!

Yep (4, Informative)

legLess (127550) | more than 11 years ago | (#4954246)

Low refresh rates hurt my eyes badly - 60 is awful after just a couple minutes. I can see a little difference between 85 and 100, but only retroactively (i.e. 85 doesn't bother me). I used to run the network for a large architecture firm, and many of these people - although they drafted in AutoCAD all day long - saw nothing wrong with 60Hz. Some of them noticed that 85 or 100 was better, but I think many of them just acclimated to 60, not realizing anything better was possible.

On another note, Windows users should check out RefreshForce [pagehosting.co.uk], which automatically sets the highest possible refresh rate every time you (or a game, or another app) switch resolutions or color depths in Windows. I run it on a couple of machines with no trouble.
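(For anyone wondering how a tool like that might work under the hood, here is a rough sketch of the core idea - enumerate the display modes, find the highest refresh rate available at the current resolution and colour depth, and apply it. This is my own illustration of the approach, not RefreshForce's actual code, and it skips the part where such a tool watches for mode switches.)

/* Rough sketch of the "pick the highest refresh rate" idea behind tools
 * like RefreshForce (not its actual code): enumerate all display modes,
 * keep the best rate matching the current resolution and colour depth,
 * then apply it. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE cur, mode, best;
    DWORD i;

    ZeroMemory(&cur, sizeof(cur));
    cur.dmSize = sizeof(cur);
    ZeroMemory(&mode, sizeof(mode));
    mode.dmSize = sizeof(mode);

    /* Current mode of the primary display. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &cur))
        return 1;

    best = cur;
    for (i = 0; EnumDisplaySettings(NULL, i, &mode); i++) {
        if (mode.dmPelsWidth  == cur.dmPelsWidth  &&
            mode.dmPelsHeight == cur.dmPelsHeight &&
            mode.dmBitsPerPel == cur.dmBitsPerPel &&
            mode.dmDisplayFrequency > best.dmDisplayFrequency) {
            best = mode; /* remember the fastest matching mode so far */
        }
    }

    printf("Highest rate at %lux%lu: %lu Hz\n",
           best.dmPelsWidth, best.dmPelsHeight, best.dmDisplayFrequency);

    best.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT |
                    DM_BITSPERPEL | DM_DISPLAYFREQUENCY;
    if (ChangeDisplaySettings(&best, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
        ChangeDisplaySettings(&best, 0);
    return 0;
}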

Now it's everything else...... (1)

MegaHamsterX (635632) | more than 11 years ago | (#4954547)

I have always consistently run the highest possible refresh rate on my monitors, for years, and yes, I find it is much easier on my eyes, but now I have another problem. When I watch TV, movies, or anything with a slow refresh, I see nasty stuff like jumpy movement and bad interlace artifacts (on TVs). Is this just me, or does it affect everyone else as well?

Re:Now it's everything else...... (3, Interesting)

spectral (158121) | more than 11 years ago | (#4955266)

I have the exact same problem, though none of my friends seem to. In addition, in almost every movie I've seen recently (yes, EVERY movie I've seen recently - in fact the high-budget ones seem almost worse) that has any sort of high-speed movement, it doesn't look crisp. I don't know if they're doing this intentionally, but the second there's any decent amount of motion, everything gets... well, motion blurred. To the point of being practically one big mess where they're just colored blobs streaking around, and no more identifiable than that. Again, none of my friends seem to notice... :(

Re:Now it's everything else...... (2)

yamla (136560) | more than 11 years ago | (#4955844)

This is something that very much bothers me, too. It is far more noticeable to me when the high-speed movement is against a white (or light) background, and much less so if it is dark. Very few movies, I find, have a lot of this, but when it does happen, I'm with you: the objects are just one big mess.

Re:Now it's everything else...... (0)

Anonymous Coward | more than 11 years ago | (#4957129)

I noticed this in LOTR. I wasn't really impressed with most of the action scenes since they were just a big blurry mess.

My Optitions Advice (2, Informative)

Trevelyan (535381) | more than 11 years ago | (#4954575)

This may seem a strange concept, but I asked my optition this (as opposed to /.) =)
He said to make sure that the refresh is above 75Hz, if not more - the higher the better (well, my current monitor is doing 64.9 =/).
And yes, it is the low refresh that is hurting your eyes. One way to spot exceptionally bad refresh is to look just over the top of the monitor; if you can see it flicker then the refresh is way too low.

The word you are looking for is "optician" (2)

Hell O'World (88678) | more than 11 years ago | (#4955670)

...or at least that's one option. :)

But good idea. Ask a professional.

UK Law (0)

Anonymous Coward | more than 11 years ago | (#4956402)

In the UK, health and safety law requires a minimum refresh rate of 72Hz because studies show that anything lower can damage your eyesight.

I think it depends... (1)

dotmaudot (243236) | more than 11 years ago | (#4957473)

I never had problems even at 60Hz. I can notice that 72Hz is better, but 80Hz gives me nothing more.
What I found very useful is photochromic lenses (I have heavy myopia). It seems that polarizing the light helps my sight!

ciao, .mau.

whoope (1)

tadheckaman (578425) | more than 11 years ago | (#4972290)

I run MY Apple monitor at 200Hz, and XP doesn't have a single problem with it. And my eyes never get watery or anything.

Re:whoope (0)

Anonymous Coward | more than 11 years ago | (#4996408)

200Hz? That's going to kill your monitor! Although higher refresh rates are better for the eyes, the monitor's life span will be seriously shortened at that rate, regardless of whether it can handle it.