Can anyone clarify this for me please? I purchased a new graphics card today, and entered into a conversation with the shop's IT guru on the subject of frame rates.
I have always worked on the basis that TV frame rates are 25fps - based on the interlaced 50Hz signal and the fact that movies are generally shot at 24fps (hence they are marginally speeded up on transmission in the UK). Therefore any graphics card that will produce a consistent frame rate of 25fps will be at least as good as TV quality in terms of the fluidity of the picture.
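To put rough numbers on that speed-up, here is a quick, purely illustrative Python sketch (the two-hour runtime is just an example figure, not from the thread):

```python
# Illustrative only: how a 24fps film maps onto 25fps PAL transmission.
film_fps = 24.0        # native cinema frame rate
pal_fps = 25.0         # UK/PAL full-frame rate (50Hz interlaced)

speedup = pal_fps / film_fps              # each frame is shown slightly sooner
runtime_film_min = 120.0                  # a notional two-hour film
runtime_tv_min = runtime_film_min / speedup

print(f"Speed-up: {speedup:.4f}x ({(speedup - 1) * 100:.1f}% faster)")
print(f"A two-hour film runs in about {runtime_tv_min:.1f} minutes on PAL TV")
```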
The guru said that this wasn't true as anything less than 50fps was less than the rate at which a UK TV picture is refreshed.
Now I've a feeling that there may be some truth in what he is saying, and in any event the matter is probably more complicated than my simplistic approach. But is my simplistic approach right? That any card that produces 25fps is giving the same smoothness of movement as a TV picture and anything above this is a gain - or does that gain only start to become apparent over 50fps, as the guru is telling me?
Woodchip - like I say it may not be as simple as my simplistic approach!
I use my machine for just one gaming application (the rest is business!!). That application is FS2004. Running at a resolution of 1280x1024 in 32-bit colour, FS2004 will give me a frame rate of between 9 and 50fps depending on the complexity of the screen image it is displaying (you can obtain this info as an onscreen display). The question is, does 25fps equate to the fluidity of a TV picture, or does it have to reach 50fps? The quality of the image is another matter (I can't watch a TV from the distance at which I view my monitor as the quality is so poor!) - which is why I am using the term fluidity to describe the refresh rates.
25fps is fast enough for TV, as most computer TV cards capture at that rate (from camcorders etc).
Does it not surprise you that you get a better picture on your monitor?
With frame rates for games it is important to realise that frames are rendered "on the fly" by the graphics card. This means that in scenes where there is little action the frame rate will be high, but where there is a lot of action or complicated graphics processing to be done the frame rate will drop, often quite dramatically. A movie at 24fps plays smoothly because each frame is just a picture needing pretty much the same amount of rendering, hence a smooth transition from frame to frame and smooth action on screen. With a game, if the average frame rate is the same 24fps then the minimum frame rate you get will be much lower than this, and the eye will almost certainly pick that up as jerky movement and imprecise control of your character.
It is for this reason you will see magazines quoting a figure of 50-60fps as the desirable average frame rate from any game benchmarks. It will guarantee that practically all the in-game frame rates stay above the minimum of around 30fps needed for smooth game play.
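To illustrate the point, here is a minimal Python sketch using made-up per-frame render times, showing how a respectable average fps can hide the dips the eye picks up as stutter:

```python
# Made-up per-frame render times (ms) - a few slow frames among many fast ones.
frame_times_ms = [12, 14, 13, 15, 80, 90, 14, 13, 75, 12]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
average_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
minimum_fps = min(fps_per_frame)

print(f"Average: {average_fps:.1f} fps")   # looks fine on paper (~30fps)
print(f"Minimum: {minimum_fps:.1f} fps")   # this is what you notice as jerkiness
```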
Incidentally, fps is not the same thing as the monitor/TV refresh rate - that is the rate at which the image is refreshed by the monitor itself. Things get more complicated there, so I won't say any more on that other than that it does not matter in terms of smooth playback of either games or movies. It does play a part, but not in a simple explanation!
Essentially a graphics card needs to be able to stay above the 24fps of a movie at all times in order to guarantee smooth game play... the further above that figure the better.
What the guru says about over 50fps being better for games is correct... but for the wrong reason.
A lot of people overlook the fact that the vast majority of computer monitors use a non-interlaced display, redrawing every line of information on the screen with each refresh. An interlaced monitor refreshes only every other line; the picture quality is not as clear as a non-interlaced monitor and flicker is more likely to be noticeable.
For playback of DVD on your computer monitor or via TV-out, almost any AGP or PCIe graphics card will suffice (depending on your motherboard); frames per second are not an issue for DVD playback.
The picture may look better if you output the video to your TV screen when playing back DVD; you can do this using the extended desktop or clone mode functions of your graphics card.
A TV always looks better for video, even DVD, because of the way it works: it blurs the image slightly, which makes it appear more fluid to the eye. A bit like anti-aliasing... so I am told.
With games, though, it's a different story. At least 25-30 frames per second are needed for most games to appear smooth and playable. The latest games do require a fairly high-end system to get the best out of them.
A standard TV picture is 50Hz interlaced. Many modern TVs actually work at 100Hz, to try and reduce picture flicker. 50Hz interlaced means every other line on the screen is drawn 50 times/sec - thus the complete frame is redrawn 25 times/sec.
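A minimal Python sketch of that arithmetic (the function name is just illustrative):

```python
def full_frame_rate(refresh_hz: float, interlaced: bool) -> float:
    """Complete frames drawn per second: interlacing draws only every other line per pass."""
    return refresh_hz / 2 if interlaced else refresh_hz

print(full_frame_rate(50, interlaced=True))    # UK TV: 50 fields/s -> 25.0 full frames/s
print(full_frame_rate(60, interlaced=False))   # typical non-interlaced PC monitor -> 60.0
```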
Picture flicker is different to motion flicker - motion flicker on film is not perceived at 25fps - but picture flicker on a CRT may be perceived up to 75Hz.
The reason is the persistence of the image - a film image is persistent, CRT images are not - so they need to be refreshed very fast to avoid flicker. People vary in their sensitivity to flicker too. TFT images are more persistent than CRT - thus the standard refresh rate for TFTs is only 60Hz.
So to answer your question, 25fps is equal in motion fluidity to a TV or a true film projector.
And note that fps is nothing to do with refresh rate. The refresh rate is the automatic redrawing on the screen of whatever image is in the graphics card; fps is the rate at which new pictures are sent to the card.
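As a rough illustration of that distinction (the numbers are just examples):

```python
# The monitor redraws whatever image the graphics card holds at a fixed refresh
# rate; the game supplies new images at its own (varying) rate.
refresh_hz = 60    # monitor redraws per second (fixed by the display mode)
game_fps = 25      # new frames produced by the game per second (varies with the scene)

redraws_per_frame = refresh_hz / game_fps
print(f"Each game frame is shown for roughly {redraws_per_frame:.1f} monitor refreshes")
```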
Thanks for a very concise answer which is in accord with my understanding of matters. Points taken from other posts - FPS is not everything. But the basic principle is as I previously thought it to be. Thanks.