For one thing, it’s much easier to measure spans of time when you have an integer frame rate. For example, 1 hour at 30 fps is exactly 108,000 frames, but at 29.97 fps it’s only 107,892 frames. Since time code still counts whole frames at a nominal 30 fps, “drop-frame” time code is used: two frame numbers are skipped at the start of every minute except each tenth minute (108 skipped numbers per hour, exactly the shortfall), so that by the end of each hour the time code lines up with elapsed real time, i.e. “01:00:00;00” falls after exactly one hour has passed. This is of course crucial when scheduling programs, advertisements, and so on. It’s a confusing mess and has caused all kinds of headaches for the TV industry over the years.
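To make the counting concrete, here is a minimal Python sketch of the standard SMPTE drop-frame numbering described above (skip frame numbers ;00 and ;01 at the start of every minute except minutes divisible by ten). The function name and structure are my own; it only illustrates the arithmetic, not any particular library's API.

```python
def frames_to_dropframe_timecode(frame: int) -> str:
    """Convert a frame count at 29.97 fps to SMPTE drop-frame time code (HH:MM:SS;FF)."""
    fps = 30                 # time code counts at a nominal 30 fps
    drop_per_min = 2         # frame *numbers* skipped each minute (except every 10th)
    frames_per_10min = 10 * 60 * fps - 9 * drop_per_min   # 17,982 real frames per 10 minutes
    frames_per_min = 60 * fps - drop_per_min               # 1,798 real frames per "dropped" minute

    ten_min_blocks, rem = divmod(frame, frames_per_10min)
    # The first minute of each 10-minute block keeps all 1,800 frame numbers;
    # the following nine minutes each skip numbers ;00 and ;01.
    if rem >= 1800:
        dropped_minutes = (rem - 1800) // frames_per_min + 1
    else:
        dropped_minutes = 0
    frame += drop_per_min * (9 * ten_min_blocks + dropped_minutes)

    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"


# One hour of 29.97 fps video is 107,892 frames; with 108 skipped numbers
# the display comes out to exactly "01:00:00;00".
print(frames_to_dropframe_timecode(107892))
```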
In addition to the other poster's on-point remarks, film cameras have always run at integer frame rates. We have generations of motion pictures shot at 24 FPS.
Many TV shows (all, before video tape) were shot on film too, but I'm not sure if they were at an even 30 FPS.