30 Hz eyes

alexoprice

Since our eyes only see things at 30 Hz, why would we ever need anything to go faster than 30 fps? There's got to be more to it; does anyone know? We shouldn't be able to tell the difference with anything above 30 fps.
 
It depends on the person, but you can tell the difference between 30 and 60 fps. And you can tell a slight difference between 60 and 120.
 
Usually if you're getting 60-70 fps you shouldn't worry about getting any higher, because you're not going to notice a difference. When people say they can notice a difference between 70 fps and something like 170 fps, it's most likely because they knew which one was 70 fps and which one was 170 fps when they compared. If you looked at both without knowing which was which, you're not going to see any real difference. Plus, you're going to have a hard time getting your monitor to refresh at 170 Hz. Top LCDs are only 16 ms, which equates to around 65 Hz. That means if you are getting 200 fps in UT2003 on your souped-up rig, it doesn't mean much, considering you're actually viewing around 65-70 fps.
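For reference, the "16 ms equates to around 65 Hz" figure is just the reciprocal of the response time. A quick sketch of that arithmetic (this treats response time as a hard cap on distinct frames per second, which is a simplification; the function name is mine):

```python
# Back-of-the-envelope conversion: pixel response time -> rough cap on
# distinct frames per second. A simplification; response time and refresh
# rate are not the same thing, as later posts point out.

def max_distinct_fps(response_time_ms):
    """Rough upper bound on distinct frames per second for a given response time."""
    return 1000.0 / response_time_ms

for ms in (25, 16, 12):
    print(f"{ms} ms -> ~{max_distinct_fps(ms):.0f} frames/sec")

# 16 ms works out to ~62 updates a second, hence the "around 65 Hz" ballpark.
```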

Why people put so much emphasis on gaming performance when both AMD and Intel processors have already broken the speed barrier needed is beyond me, especially since almost every game is video-card intensive.
 
Won't you feel it while playing a game, though? Like, if my fps drops I can feel it :eek:
 
Originally posted by Mikesta
Won't you feel it while playing a game, though? Like, if my fps drops I can feel it :eek:

Yeah, when your FPS drops down really low, like 30 or 40 fps. But if you're netting more than 70 fps, you should be thinking about turning up the eye candy and stop worrying about FPS.
 
The eye doesn't scan full frames (far from it), so saying that it can only see 30 Hz is a bit imprecise, too.

I think it's some of the same thing that makes 60 Hz monitors so unpleasant to use: even if you can't consciously see them flicker (not that that is too hard), you still feel it. The same goes for 3D games, where your reflexes come into play and the whole scene is interpreted as something real. At 30 fps, movements don't seem perfectly smooth, especially fast movement, which I guess throws some subconscious system off a bit.

I am not an optician or a sensory psychologist, though :D
 
This is an age-old debate. You need quite a bit more than 30 fps. In movies/TV you have motion blur encoded in the picture (freeze an action frame on TV and you'll see motion blur in the frame). This helps the brain really believe there is motion.

In 3D games there is no motion blur encoded in the frame; each frame is a static picture, frozen in time. To make your brain see motion, you have to feed it many more frames per second.

If that didn't make any sense, do some Googling. There have been many debates on this very issue. :p
 
It's the same thing that makes TVs OK to watch even at low refresh rates, really: the phosphors stay lit longer than on a computer monitor. (LCDs are a science in themselves.)
 
Also, in some games, most notably Q3, game physics calculations are tied to FPS. The more frames you can render, the more detailed the physics calculations. So even though your eye can't see it directly, gameplay smoothness is increased by a higher FPS.
 
Originally posted by defcom_1
This is an age-old debate. You need quite a bit more than 30 fps. In movies/TV you have motion blur encoded in the picture (freeze an action frame on TV and you'll see motion blur in the frame). This helps the brain really believe there is motion.

In 3D games there is no motion blur encoded in the frame; each frame is a static picture, frozen in time. To make your brain see motion, you have to feed it many more frames per second.

If that didn't make any sense, do some Googling. There have been many debates on this very issue. :p

You hit it right on the nose. A human sees motion blur: wave your hand in front of your face and it has motion blur. Video games, unfortunately, do not. It would take forever for your video card to render motion blur in a video game. So instead of running a video game at 30 fps with motion blur, they double it to 60 fps without motion blur. 60 fps is the lowest you can play at without noticing the frame rate; anything above that looks smoother, but not by much. It's much like the law for encoding analog audio to digital: the sample rate has to be twice the highest frequency. So in terms of video games, the frame rate has to be doubled. Hopefully in future games motion blur will be usable, so that we won't need such high frame rates.
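The "twice the sampling rate" part is the Nyquist criterion from digital audio; carrying it over to frame rates is only a loose analogy, but the arithmetic being referenced looks like this (a sketch with textbook numbers, nothing measured in this thread):

```python
# Nyquist criterion from audio: to capture content up to f_max without aliasing,
# you need a sample rate of at least 2 * f_max. The jump from this to "double
# the frame rate" is an analogy, not an established law of vision.

def nyquist_rate(f_max_hz):
    """Minimum sample rate needed to represent content up to f_max_hz."""
    return 2 * f_max_hz

print(nyquist_rate(20_000))  # audio: 40 kHz for 20 kHz hearing (CDs use 44.1 kHz)
print(nyquist_rate(30))      # the post's reasoning: 60 fps for "30 Hz" eyes
```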
 
Sure they do, but not in the literal sense. It's not like there's a video card pumping out frames for you to "see". And technically you don't ever see anything; you have the sensation of seeing. It's all perception. We are just using frame rates as a means of comparison.
 
Originally posted by pistola
You hit it right on the nose. A human sees motion blur: wave your hand in front of your face and it has motion blur. Video games, unfortunately, do not. It would take forever for your video card to render motion blur in a video game. So instead of running a video game at 30 fps with motion blur, they double it to 60 fps without motion blur. 60 fps is the lowest you can play at without noticing the frame rate; anything above that looks smoother, but not by much. It's much like the law for encoding analog audio to digital: the sample rate has to be twice the highest frequency. So in terms of video games, the frame rate has to be doubled. Hopefully in future games motion blur will be usable, so that we won't need such high frame rates.
Well, it's not that your graphics card can't produce motion blur; it's that since almost nothing is predetermined in games, it would be impossible for motion blur to occur, and if it could, then your CPU/GPU could see into the future.
 
Originally posted by obyj34
Well, it's not that your graphics card can't produce motion blur; it's that since almost nothing is predetermined in games, it would be impossible for motion blur to occur, and if it could, then your CPU/GPU could see into the future.

Last time I looked, motion blur was determined by figuring out where the object was, where it is now, how long it took to get there, and what the appropriate motion blur should be. Go play GTA3.

Anyway, there are several reasons why you would want greater than 60-70 fps.

#1) FPS is not static, in 99% of cases. Therefore, if you've got 150 fps in a regular scene but in a high-action scene your fps drops to only 80 or 90, you won't notice it. If you've only got 70 fps in that regular scene, in the action scene you'll be lucky to see 30, which you *will* notice.
#2) Your eyes can "see" a lot more than the old 30 fps, or 60 fps, or whatever figure you prefer. In fact, your eyes can detect images displayed for 1/200th of a second (which, in series, would be 200 fps).
#3) As mentioned previously, physics is calculated alongside FPS. If your FPS drops to, say, 30 in Quake 3, you'll "feel" it because you'll notice the game going slower, not just the frame rate being lower (consequently, this is part of why Halo runs just fine at 30 fps).

There are reasons why you would *not* want fps higher than 60-70, as well.

#1) Beyond, well, let's just say 100 fps, you can't tell *much* of a difference.
#2) Since most people are probably running their monitors at something like 75 or 80 Hz, you'll either be left with tearing above 75 or 80 fps, or you simply won't notice a difference anyway. If your monitor is running at 80 Hz, it's only putting out 80 images per second to the screen, regardless of how many hundreds your video card is sending it. (Tearing occurs, to the best of my knowledge, when the video card sends a new image while the monitor is still drawing the current one, and this happens in a consistent pattern, so you see a tear move down the screen.) A sketch of this is below.
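To put numbers on #2, here is a toy simulation of how many rendered frames ever make it to the screen at a given refresh rate. It ignores tearing and assumes perfectly even frame times, so treat it as illustrative only:

```python
# Toy model: at each refresh tick the monitor shows whatever the newest
# finished frame is. Ignores tearing and assumes perfectly even frame times.

def frames_actually_shown(render_fps, refresh_hz, seconds=1.0):
    """Count how many distinct rendered frames ever appear on screen."""
    shown = set()
    for tick in range(int(refresh_hz * seconds)):
        t = tick / refresh_hz
        shown.add(int(t * render_fps))  # index of the newest frame finished by time t
    return len(shown)

print(frames_actually_shown(200, 80))  # 80: the other 120 rendered frames are never seen
print(frames_actually_shown(60, 80))   # 60: some frames get shown on two refreshes
```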
 
Ultimately, you want a system that runs flawlessly, so that the graphics are awesome and, when the system does do something stupid that slows it down, you don't notice it as much. It's good to run it as fast as you can to get the best gameplay experience. And it's good to show up your friends in 3DMark03 every now and then!!! :p
 
:rolleyes::rolleyes::rolleyes:
Technically, you're trying to sound smart but failing.

Originally posted by pistola
Sure they do, but not in the literal sense. It's not like there's a video card pumping out frames for you to "see". And technically you don't ever see anything; you have the sensation of seeing. It's all perception. We are just using frame rates as a means of comparison.
 
Originally posted by burningrave101
Usually if you're getting 60-70 fps you shouldn't worry about getting any higher, because you're not going to notice a difference. When people say they can notice a difference between 70 fps and something like 170 fps, it's most likely because they knew which one was 70 fps and which one was 170 fps when they compared. If you looked at both without knowing which was which, you're not going to see any real difference. Plus, you're going to have a hard time getting your monitor to refresh at 170 Hz. Top LCDs are only 16 ms, which equates to around 65 Hz. That means if you are getting 200 fps in UT2003 on your souped-up rig, it doesn't mean much, considering you're actually viewing around 65-70 fps.

Why people put so much emphasis on gaming performance when both AMD and Intel processors have already broken the speed barrier needed is beyond me, especially since almost every game is video-card intensive.

What are you talking about? LCDs don't have a refresh rate, they have an update rate; the only thing they draw is changes. That's why they're so much easier on your eyes.
 
You could miss something. If you turn quickly and it's only 30 fps, the frame where you would see the enemy might be completely missed.

I don't know how they count that, but I'm sure seeing an enemy for no matter how short a time is better than him never even showing up and you getting fragged.
 
Originally posted by pistola
You hit it right on the nose. A human sees motion blur: wave your hand in front of your face and it has motion blur. Video games, unfortunately, do not. It would take forever for your video card to render motion blur in a video game. So instead of running a video game at 30 fps with motion blur, they double it to 60 fps without motion blur. 60 fps is the lowest you can play at without noticing the frame rate; anything above that looks smoother, but not by much. It's much like the law for encoding analog audio to digital: the sample rate has to be twice the highest frequency. So in terms of video games, the frame rate has to be doubled. Hopefully in future games motion blur will be usable, so that we won't need such high frame rates.

Because saying "draw this frame with a little of the last frame" is real time-consuming?
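For what it's worth, the "a little of the last frame" trick is basically an accumulation-buffer blend, and it really is cheap. A minimal sketch of the idea (the weights and names here are made up, not from any particular engine):

```python
import numpy as np

def blend(screen, new_frame, persistence=0.3):
    """Keep a fraction of what was on screen and mix in the new frame."""
    return persistence * screen + (1.0 - persistence) * new_frame

# Toy example: a bright dot moving one pixel per frame across a 1-D "screen".
screen = np.zeros(8)
for x in range(4):
    frame = np.zeros(8)
    frame[x] = 1.0
    screen = blend(screen, frame)

print(np.round(screen, 3))  # the fading trail behind the dot is the "blur"

# Caveat raised above: this only smears where things *were*; it cannot know
# where they are going, so it is a trail rather than true motion blur.
```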
 
I can definitely tell if something is drawing at 30 fps vs. 60 fps. All I care about now is that the game is at least a constant 60-70 fps and the image quality is superb.
 
From studies in cognitive psych, they say that under 30 ms we view things as simultaneous. At around 60 ms we detect movement, and going up to and beyond a 200 ms delay it appears as successive images, or whatever you might be talking about.
 
Now I want a gaming monitor that automatically adds motion blur, just so my Voodoo3 can play Doom 3. Will my 10-year-old passive TFT LCD suffice?
 
Originally posted by kronchev
What are you talking about? LCDs don't have a refresh rate, they have an update rate; the only thing they draw is changes. That's why they're so much easier on your eyes.

"Top LCD's are only 16 ms which equates to around 65 Hz."

-----------------
what he said is true, you only get around 65 updates a second, so any frames above that number are completely wasted.

updates on an lcd tell the screen where there is a change, refresh on a screen updates the whole screen. same difference. it just takes a little bit of math to turn the response time into being comparible with (more easily fps relative) Hz.

lower response time equates higher Hz.

but Hz usually refers to an electromagnetic frequency forced controlled travel (propelling cathode rays, bus speeds, radio waves, etc). LCD are just lit by a backlight.

the reason they are easier on the eyes is because the screen doesnt have a charge losing phosphorus layer that creats a blank/black interval between every update. this blank interval causes your eyes to momentarily lose focus and refocus, which causes eyestrain.

if you crank up the Hz to something really high the phosphorus layer is less likely to get dim, and your eyes wont adjust for the blank interval and your muscles will be less stressed.

but the higher contrast of crt also contributes to pupil fatigue.
 
Originally posted by Ocean
"Top LCD's are only 16 ms which equates to around 65 Hz."

-----------------
what he said is true, you only get around 65 updates a second, so any frames above that number are completely wasted.

updates on an lcd tell the screen where there is a change, refresh on a screen updates the whole screen. same difference. it just takes a little bit of math to turn the response time into being comparible with (more easily fps relative) Hz.

lower response time equates higher Hz.

but Hz usually refers to an electromagnetic frequency forced controlled travel (propelling cathode rays, bus speeds, radio waves, etc). LCD are just lit by a backlight.

the reason they are easier on the eyes is because the screen doesnt have a charge losing phosphorus layer that creats a blank/black interval between every update. this blank interval causes your eyes to momentarily lose focus and refocus, which causes eyestrain.

if you crank up the Hz to something really high the phosphorus layer is less likely to get dim, and your eyes wont adjust for the blank interval and your muscles will be less stressed.

but the higher contrast of crt also contributes to pupil fatigue.

Both of you are technically correct, but saying an LCD has a refresh rate is wrong. That equates it with the same illumination method as a CRT, and that's wrong. Even if the whole screen is changing every possible update, as you said, you won't see it.

And I don't think the top-of-the-line ones are only 16 ms; hell, my friend's LCD, a 17-inch Sceptre (unknown but decent LCD company) that cost him like 300, does 75 Hz at 1280x1024.
 
Originally posted by kronchev
And I don't think the top-of-the-line ones are only 16 ms; hell, my friend's LCD, a 17-inch Sceptre (unknown but decent LCD company) that cost him like 300, does 75 Hz at 1280x1024.

The LCDs that are considered the "top" LCDs, at least here on HardOCP, like the Dell 2001FP, the Viewsonic vp171b, and the NEC LCDs, are only 16 ms.

Doesn't ghosting occur when you're playing a fast-paced game like an FPS and your LCD can't update fast enough to keep up?
 
I don't know if anyone knows the true limits of the eye, but you can certainly see things "faster" than 30 Hz. This article points to a study of Air Force pilots who could identify planes they were shown for only 1/220th of a second.
 
Using motion blur in that sense deceives your eyes (well, the brain) and doesn't really work well. Your lower visual cortex will shout out "WTF is going on!?!?"

Originally posted by kronchev
Because saying "draw this frame with a little of the last frame" is real time-consuming?
 
Originally posted by kronchev
And I don't think the top-of-the-line ones are only 16 ms; hell, my friend's LCD, a 17-inch Sceptre (unknown but decent LCD company) that cost him like 300, does 75 Hz at 1280x1024.

He's an idiot if he changes the refresh rate to anything but 60 with an LCD. Set it as low as possible and turn off V-Sync. As stated before, LCDs don't "refresh" like CRTs, so there's no need to set the refresh rate any higher.
 
This is one of those debates that I don't think we will ever settle, and even if hard facts arise, some people will still think differently.

But anyway, here is my theory...

I think the people who claim they can tell the difference between 70 and 100 or so are actually feeling the difference. Games just seem to feel slower, mouse reactions and such. But that is just my theory.

It's not so much a sight thing as it is a feeling.
 
Originally posted by Spinal
This is one of those debates that I don't think we will ever settle, and even if hard facts arise, some people will still think differently.

But anyway, here is my theory...

I think the people who claim they can tell the difference between 70 and 100 or so are actually feeling the difference. Games just seem to feel slower, mouse reactions and such. But that is just my theory.

It's not so much a sight thing as it is a feeling.

It might, perhaps, be because in games you expect immediate reactions to your actions, and while 30 fps doesn't feel slow when you're just observing, it might be noticed by whatever mechanism(s) track the effect of our movements.
 
Well, even if your eyes see at 30 fps, to say all you need is 30 on the monitor is wrong. Think of your eyes' fps as a light turning on and off, and the monitor's fps as one as well. If 30 fps on the monitor were all you needed, then the lights would have to flicker at exactly the same speed. That just doesn't happen; since everyone's eyes 'refresh' at different moments, the best bet is to get the fps on the monitor as high as possible. So there might be some flickers while the light is off in your eyes, but hopefully there will be a flicker every time the light is on in your eyes. Yes, that means some of those frames on your monitor are wasted, but that doesn't mean any fps over 30 or 60 or 70 are wasted; it just fills in the gaps more and decreases the chance of your eyes flickering on without anything to see.
 
WOW...

What a bunch of bullshit going on in this thread.

Eyes do not see in FPS. They can tell the difference between a monitor operating at 60 Hz and a monitor at 120 Hz. They can tell the difference between 30 fps, 60 fps, and 120 fps or more.
 
Oh yeah, and they have motion blur on TVs because when they were first conceived higher fps was not possible, so they HAD to use motion blur; if you relax your eyes you can see the blur. HDTV is at 60 fps and they removed most of the motion blur.
 
Originally posted by theTIK
Well, even if your eyes see at 30 fps, to say all you need is 30 on the monitor is wrong. Think of your eyes' fps as a light turning on and off, and the monitor's fps as one as well. If 30 fps on the monitor were all you needed, then the lights would have to flicker at exactly the same speed. That just doesn't happen; since everyone's eyes 'refresh' at different moments, the best bet is to get the fps on the monitor as high as possible. So there might be some flickers while the light is off in your eyes, but hopefully there will be a flicker every time the light is on in your eyes. Yes, that means some of those frames on your monitor are wasted, but that doesn't mean any fps over 30 or 60 or 70 are wasted; it just fills in the gaps more and decreases the chance of your eyes flickering on without anything to see.

Ehrm, I had the distinct impression that eyes don't do that?
(If they did, we would get assorted strobe effects, and we don't, just to begin to describe why it seems unlikely.)
 
Originally posted by HHunt
Ehrm, I had the distinct impression that eyes don't do that?
(If they did, we would get assorted strobe effects, and we don't, just to begin to describe why it seems unlikely.)

We wouldn't get strobe effects, not as long as each time the light flickers on the image or object is there. The reason running a monitor at 30 fps gives you strobing is that a lot of the times your eyes flicker on, there is nothing there. And if there were someone who had seen at 120 Hz his whole life and then dropped down to our 30, our vision would seem strobe-like to that person. Our human brains compensate by creating motion blur.
 
Originally posted by theTIK
We wouldn't get strobe effects, not as long as each time the light flickers on the image or object is there. The reason running a monitor at 30 fps gives you strobing is that a lot of the times your eyes flicker on, there is nothing there. And if there were someone who had seen at 120 Hz his whole life and then dropped down to our 30, our vision would seem strobe-like to that person. Our human brains compensate by creating motion blur.

This goes so directly against everything I've read and heard that I'd be happy for some documentation backing you up :)

Anyway.
If you look at something that repeats itself exactly 30 times a second (like a spoked wheel spinning at the right speed), it doesn't appear to freeze, yet lit by a 30 blinks/s strobe it does. The eye and brain wouldn't know what happened between the updates in either case, so "creating motion blur" seems to be out.
 
Originally posted by HHunt
This goes so directly against everything I've read and heard that I'd be happy for some documentation backing you up :)

Anyway.
If you look at something that repeats itself exactly 30 times a second (like a spoked wheel spinning at the right speed), it doesn't appear to freeze, yet lit by a 30 blinks/s strobe it does. The eye and brain wouldn't know what happened between the updates in either case, so "creating motion blur" seems to be out.

Your brain figures stuff like that out. I remember reading about how they tested people by giving them glasses that completely inverted their vision. Within 24 hours their brains had all inverted the vision back; they never took the glasses off during this time. After their brains inverted it, the light was being presented to the eyes inverted, but they were seeing normally. Then, when they took the glasses off, normal vision seemed inverted to them, until the brain caught up again and inverted it back to normal.
 
Originally posted by theTIK
Your brain figures stuff like that out. I remember reading about how they tested people by giving them glasses that completely inverted their vision. Within 24 hours their brains had all inverted the vision back; they never took the glasses off during this time. After their brains inverted it, the light was being presented to the eyes inverted, but they were seeing normally. Then, when they took the glasses off, normal vision seemed inverted to them, until the brain caught up again and inverted it back to normal.

Only when it has enough information :D
Another thing: rods and cones have response times and recovery times, of course. However, these depend on how much light hits them and on whether it's a rod or a cone, among other things. As most things you look at aren't a uniform brightness (far from it), they will neither react nor recover at the same time.
 