How Many Frames Are Needed To Fool The Eyes?

Hey all, I was just wondering how many FPS are needed to fool the eyes into thinking that something is in motion?

The average home video camera runs at approximately 24 fps, and movie cameras run at approximately 40 fps. Around 10-12 fps is enough to fool the human eye, and that can be achieved very easily; the thaumatrope is one example.

The term for this is “persistence of vision.” I believe early studies put film projection around 16fps, which is probably why Super 8 film runs at 18fps (just above the threshold). However, other media (like a flip book) might be different.

At any rate, for cinema-like animations, a rate of 24fps is the general rule.

@GraphiX: Movie cameras run at 24fps most of the time; I’m not sure where you’re getting the 40fps. And NTSC video runs at 30fps, while PAL video is 25fps. (If you want to be picky and talk about fields, then it’s 60 and 50 fields per second, technically.) Sometimes when shooting for television, a film camera will run at 30 or 25 to match the rate of the television. Most of the newer prosumer cameras (DSLR, e.g.) have an option to run at 24 to look more like film, but include other video-esque framerates as well (30, 60).

Personally I can’t stand looking at a screen that is under 25-28 fps, but that is because the frame rate different people’s eyes can perceive varies (not massively) from person to person.
That’s one of the main reasons I don’t watch TV at all, because I just can’t stand it when it has a bad signal or whatever.
So “what it takes to trick the eye” is actually quite a broad question, because different people see at different frame rates and so the answer changes.
Although I’m pretty sure that approx. 24 is average.

Movies and animations are usually filmed at 24 frames per second. 24 fps is standard within the movie industry.

6
But I like 50

Most things use 30 (motion graphics, video), but you will be working in 24 almost all the time.

What a mess

Many high-quality cinema-grade cameras run at 40, I do believe.

You should have asked “how many frames are needed to comfortably fool the eyes”. Star Fox on the SNES fooled my eyes at just 12 FPS, and yet it still looked as bad as old B&W short movies. :slight_smile:

Movies are usually 24 FPS, but they have natural motion blur which is easier on the eye. Have you seen modern TVs playing Blu-ray 3D animations? Ultra smooth at 120 FPS, it looks out of this world…

Clergymen believe.

It would be quite unusual to find such a 40 fps camera.
It is more expensive to produce a camera that can capture 40 fps, and the only benefit is that you could do some nicer slow-motion stuff. But for that, film productions use high-speed cameras anyway.

25 and 30 fps became the most common rates because of television and the frequency of the electrical grid: 50 Hz in Europe -> 25 fps, 60 Hz in the US of A -> 29.97 fps (no idea why it isn’t exactly 30).
So for broadcasting, 40 fps makes no sense.
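
If anyone’s curious, the odd 29.97 is exactly 30000/1001. A quick Python sketch of where the usual broadcast rates come from (the ~0.1% NTSC slowdown was added for colour compatibility, as far as I know):

```python
# Where the common broadcast frame rates come from (the NTSC figure assumes
# the standard 1000/1001 slowdown that came with colour TV).
PAL_FPS = 50 / 2                     # 50 Hz mains, 2 interlaced fields per frame -> 25.0
NTSC_FPS = 60 / 2 * 1000 / 1001      # 60 Hz mains, slowed by ~0.1% -> 29.97...

print(f"PAL:  {PAL_FPS} fps")        # 25.0
print(f"NTSC: {NTSC_FPS:.3f} fps")   # 29.970
```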

24p became the standard for film production, as you don’t need interlaced footage for film, and it will soon have its 100-year anniversary.
So for cinema, 40 fps doesn’t make sense either.

Beyond that, the framerates in use are:
50i, 60i, 24p, 25p, 30p, 50p, 60p and 72p (I find the latter three rather exotic; one is “experimental” and the other two are gaining ground in HDTV for whatever reason).

For the stuttering…
If you’ve got something at 24 fps and it seems to stutter, it is not because the framerate is too low, it is because the intervals are not constant.
24 fps means you see 24 images in one second, that much is obvious.
In a movie it also means you see one image every 1/24th of a second.
In a game, on the other hand, you can be getting 50 fps and it still seems to stutter.
That is because you aren’t necessarily seeing one frame every 1/50th of a second. One frame might show up after 1/75th of a second and the next after 1/10th of a second; in the end you still have your 50 frames per second, but it is perceived as stutter because the frames aren’t displayed at constant intervals (see the little sketch below).
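
Here’s a toy Python sketch of that point (the interval numbers are just made up for illustration): both runs deliver 50 frames in one second, so both are “50 fps”, but the second spaces them unevenly, and that uneven spacing is what reads as stutter.

```python
# Frame pacing toy example: both runs deliver 50 frames in one second
# (so both average 50 fps), but the second one spaces them unevenly.
even_intervals = [0.020] * 50            # one frame every 20 ms
uneven_intervals = [0.005, 0.035] * 25   # alternating 5 ms and 35 ms gaps

for name, intervals in [("even", even_intervals), ("uneven", uneven_intervals)]:
    fps = len(intervals) / sum(intervals)
    print(f"{name:7}: {fps:.0f} fps average, worst gap {max(intervals) * 1000:.0f} ms")
```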

Those are not 120 Hz -.-
The highest framerate the Blu-ray standard allows is 60 fps at 1080.
Just because a TV is 100 or 120 Hz does not mean you see 120 different images per second.

I heard the TV’s electronics do the job of interpolating between two frames and creating a new one, akin to motion blur.

Even TVs with those ridiculous numbers, like a newish Sony (or was it Panasonic…) one claiming 300 or something: it doesn’t mean that they PLAY at that, that’s the refresh rate. Although the hardware is capable, neither the software nor the image is designed for it, so they don’t bother.
Although a high refresh rate does give a slightly improved image, it is nonetheless a deceiving thing that many companies use to advertise. Like Subway saying they have a “great selection of Subway six-inch subs with 6 grams of fat or less”. Yes, they DO have that, but when you walk in there you don’t GET one. You end up with one that has like 40 grams. It’s just advertising; it has no real application.

For decent motion, you want 24 frames per second. 1/24th of a second is the length of time an image is retained in your eyes’ sensory memory (i.e. close your eyes: what you were just looking at lingers for ~1/24th of a second). This is the threshold where it becomes impossible to see the individual frames. Sure… you can see “motion” at lower frame rates, but it looks choppy.

Now the threshold for smoothness (the frame rate beyond which higher fps are a waste of resources that don’t contribute to motion smoothness) is debated, but it is usually placed somewhere between 60 and 65 fps.

Don’t confuse this with REFRESH RATE. The refresh rate of a screen is how often the image is flashed up on the screen. Televisions that claim refresh rates of 240 Hz… well, that’s not a bad thing. What this typically buys you is not a higher frame rate, but less motion blur caused by your eyes flicking across the screen.

And yes, one trend lately which I absolutely despise is that some televisions nowadays do this frame interpolation crap, where they make a really smooth image by using motion interpolation to create an artificial frame between two real frames (which in my opinion ruins the cinematic feel of things).
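
To illustrate what those TVs are doing, here’s a deliberately naive Python sketch (the blend_frames function and the tiny 4x4 “frames” are made up for the example; real sets use motion-compensated interpolation rather than a plain blend):

```python
import numpy as np

def blend_frames(frame_a, frame_b, t):
    """Naive in-between frame: a plain weighted blend of two frames.
    Real TVs do motion-compensated interpolation, which is much smarter,
    but the idea of manufacturing an extra frame between two real ones is the same."""
    return (1.0 - t) * frame_a + t * frame_b

# Toy example: two tiny grayscale "frames" (all-black and all-white)
frame_a = np.zeros((4, 4), dtype=np.float32)
frame_b = np.full((4, 4), 255.0, dtype=np.float32)

middle = blend_frames(frame_a, frame_b, 0.5)  # inserted between a and b -> doubled frame rate
print(middle[0, 0])  # 127.5
```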

Neurophysiologically, the eyes cannot transmit signals to the brain faster than about 72 Hz (different images per second); neurons are slow, and the distance to the part of the brain closest to the optic nerve has to be taken into account. Stroboscopic effects mean that you can still “register” that a signal is faster than 72 different images per second (for example, on a 100 Hz vs a 60 Hz TV you will notice the difference in flicker).
People looked into this when researching the JPEG format (if memory serves me; of course you could use Google).
There are cinemas showing movies shot at 60 frames per second, and you will definitely notice the images being smoother and crisper (Futuroscope near Poitiers has great examples of this; Omniversum and IMAX also use higher framerates).

Those are your “average Joe” measurements from the ’80s though, so the values might’ve changed a bit.