So for the longest time, TVs displayed 60 half-frames per second (due to interlacing). Running a game at 60 Hz therefore gave slightly smoother animation, but it took a lot more processing power that could otherwise be used to enhance the detail. So typically, 60 fps games were smoother, while 30 fps games were much more detailed. At any other rate, the smoothness will vary, which is rather jarring and hurts the immersion.
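To put rough numbers on that tradeoff, here's a quick back-of-the-envelope calculation (just a sketch; the two rates are the common console targets mentioned above):

for fps in (60, 30):
    budget_ms = 1000 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms to build each frame")

# 60 fps -> 16.7 ms to build each frame
# 30 fps -> 33.3 ms to build each frame
# At 30 fps the game gets twice as long per frame, and that extra time
# can go into more detailed scenes instead of smoother motion.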
All of this only applies to consoles, though, since PCs have always had monitors that could do a variety of framerates, and therefore their games have always strived for the highest framerate possible (where 60 Hz is on the far low end) and don't really worry about the framerate dipping at times. High-end modern monitors can typically do at least 144 Hz, and some older games on new hardware can actually produce several hundred frames per second if certain settings (such as vertical sync) are disabled.
Regardless of PC or console, the same programming technique is used for creating the frames and managing the process of sending them to the screen. Each object is placed in position in a virtual 3D space in RAM, textures are applied (only to the surfaces facing the virtual camera, since the back-facing ones are skipped), the view from the virtual camera is established, and the objects the camera can see are flattened into a 2D image. This whole process is called rendering.
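As a very rough illustration of that "flattening" step, here is a minimal sketch in plain Python (no graphics library; the camera position, field of view, screen size, and points are made-up values) that projects a few 3D points onto a 2D image the way a simple perspective camera would:

import math

def project(point, camera_pos, fov_deg=90, screen_w=320, screen_h=240):
    """Flatten a 3D point into 2D screen coordinates (simple perspective projection)."""
    # Shift the world so the camera sits at the origin.
    x = point[0] - camera_pos[0]
    y = point[1] - camera_pos[1]
    z = point[2] - camera_pos[2]
    if z <= 0:
        return None  # behind the camera: not visible, skip it
    # Perspective divide: farther objects shrink toward the center of the screen.
    f = (screen_w / 2) / math.tan(math.radians(fov_deg) / 2)
    screen_x = screen_w / 2 + f * x / z
    screen_y = screen_h / 2 - f * y / z
    return (round(screen_x), round(screen_y))

camera = (0.0, 0.0, -5.0)  # the virtual camera, placed in the 3D space
square_corners = [(-1, -1, 0), (1, -1, 0), (1, 1, 0), (-1, 1, 0)]
for corner in square_corners:
    print(corner, "->", project(corner, camera))

A real renderer does this for thousands of triangles per object, fills them in with their textures, and handles which surfaces hide which, but the core idea of turning 3D positions into 2D pixels is the same.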
Then, the rendered image is put into a piece of memory (usually on the graphics chip itself) that has been designated as a "frame". This frame is then sent to the TV/monitor, and while it is being sent, a second image is being rendered into a second "frame". Once the first frame has been sent to the screen completely, the second frame is designated as the primary frame, and a third image is rendered over the first frame. This process is called frame buffering (with two frames like this, it's usually called double buffering).
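Here's a tiny sketch of that idea, assuming two plain Python lists stand in for the two frames in video memory (a real game would use GPU buffers and an actual display, not print statements):

WIDTH, HEIGHT = 8, 4  # a tiny "screen" so the idea is easy to see

# Two frame buffers: one being shown, one being drawn into.
front = [[" "] * WIDTH for _ in range(HEIGHT)]  # the frame sent to the screen
back = [[" "] * WIDTH for _ in range(HEIGHT)]   # the frame rendered into meanwhile

def render(buffer, frame_number):
    """Pretend-render: draw a marker that moves one column per frame."""
    for row in buffer:
        for x in range(WIDTH):
            row[x] = " "
    buffer[HEIGHT // 2][frame_number % WIDTH] = "#"

def send_to_screen(buffer):
    print("\n".join("".join(row) for row in buffer))
    print("-" * WIDTH)

for frame_number in range(3):
    render(back, frame_number)   # draw the next image off-screen
    front, back = back, front    # swap: the finished frame becomes the one shown
    send_to_screen(front)        # the old front frame is now free to be overwritten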
PCs (and maybe HDMI TV connections, I'm not sure) have the capability for the monitor to send a signal back to the game program telling it when a frame is done being drawn. This lets the game avoid switching frames while one is still being sent to the screen (vertical sync), preventing the top section and bottom section of the displayed image from coming from two different frames, which is referred to as "tearing". Since consoles know what frequency the TV is operating at based on the region (or, more recently in Europe, through an option in the settings for either the game or the system), they can simply use an internal timer as an artificial vertical sync.
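A console-style "artificial vertical sync" might look something like this sketch (the 60 Hz rate and the sleep-based timing are assumptions for illustration; real hardware waits on an actual vertical-blank signal or interrupt rather than a clock):

import time

REFRESH_HZ = 60                  # assumed NTSC-style display rate
FRAME_TIME = 1.0 / REFRESH_HZ    # ~16.7 ms between screen refreshes

def render_frame(n):
    # Stand-in for the real rendering work.
    return f"frame {n}"

next_deadline = time.perf_counter()
for n in range(5):
    image = render_frame(n)
    # Artificial vsync: don't swap buffers until the next refresh is due,
    # so the display never has its frame switched mid-draw (no tearing).
    next_deadline += FRAME_TIME
    delay = next_deadline - time.perf_counter()
    if delay > 0:
        time.sleep(delay)
    print(f"{time.perf_counter():.3f}s  presented {image}")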
That being said, there are games that experience a drop in FPS in processor/graphics-intensive moments. Ideally, these moments wouldn't exist, but they still show up in some games because they push the hardware harder than it can handle while keeping the animation smooth.
If you play games on a computer and turn on an FPS display for the game you're playing, you will notice it generally stays around a fairly constant value with a few fluctuations. If you upgrade your hardware, you will notice the FPS increase. Adjusting the graphics quality in the game's options will also change the FPS you get while playing.
It's a direct relationship between how much work the game is asking of the hardware and how much the hardware can put out at a given rate.
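An on-screen FPS counter is basically just this measurement. Here's a minimal sketch of one (time.sleep stands in for the real per-frame work, and the 20 ms workload and 3-second run time are made-up values):

import time

def do_frame_work():
    time.sleep(0.02)  # pretend the game spends ~20 ms rendering each frame

frames = 0
window_start = time.perf_counter()
end_time = window_start + 3.0                 # run the demo for about 3 seconds
while time.perf_counter() < end_time:
    do_frame_work()
    frames += 1
    elapsed = time.perf_counter() - window_start
    if elapsed >= 1.0:                        # once per second, report the rate
        print(f"{frames / elapsed:.1f} fps")  # heavier frames -> lower number
        frames = 0
        window_start = time.perf_counter()

Make do_frame_work take longer (a heavier scene, or weaker hardware) and the printed FPS drops; make it cheaper (lower graphics settings, or a faster GPU) and it rises.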
Kurt
http://www.myinternetmarketinggroup.com