I have a monitor that can display a range of refresh rates: 56Hz, 60Hz, 70Hz, 72Hz, 75Hz and 85Hz.
I find that anything below 75Hz causes the screen to flicker and gives me a headache, so I am currently running at 75Hz, which is fine.
What I was wondering is: if I went for 85Hz, does this put extra work on the graphics card?
What I mean is, if you run a graphics-intensive game (like Toca Race Driver 2, for example) at 85Hz, would you get slower frame rates than at my current setting of 75Hz, or would it make no difference?
128MB - it is a Radeon 9800 Pro
You should always run the refresh rate at the highest the monitor can take. It shouldn't really affect your card enough to notice.
That card with that much RAM should handle 85Hz and more. I agree with Fruit Bat, ramp it up m8.
If you experience any adverse reaction you can always bring it back down a tad.
I find that higher refresh rates make the screen image much clearer and easier to read, which is why I always use 85Hz for normal work or games. Anything lower seems to give me a headache and seems bad for my eyes, i.e. flickering. You may find that some people feel differently about this and find that the other way works better for them.
I didn't know that the on-board memory (128MB in my case) was directly related to the refresh rate?
What I mean is what you said in your last post, and I quote: "That card with that much ram should play with 85hz and more."
I hate using a monitor below 75Hz. Within about 20 seconds my eyes hurt, and if I don't give up at that point the migraine starts not long after. Puke, puke, etc... The strange thing is that no one else at work can tell the difference.
There's one PC at work that has a driver problem, and I can't find a refresh rate setting on it to change, and can't use it long enough to find out what the problem is. Needless to say, I never use it.
My humble 32MB graphics card can support 85Hz, so I don't think it's 'directly' related to the amount of memory (but I may be wrong).
To learn how your graphics card is related to the refresh rate read this helpful article - click here.
"The RAMDAC is the device in the video card that is responsible for reading the contents of the video memory, converting the digital values in memory into analog video signals, and sending them over the video cable to the monitor.
The RAMDAC's ability to translate and transfer this information directly controls the REFRESH RATE for the video mode it is operating in. The refresh rate is the number of times per second that the RAMDAC is able to send a signal to the monitor and the monitor is able to repaint the screen..."
My card's RAMDAC = 300MHz (source: click here).
Your card's RAMDAC = 400MHz (most probably - check with your manufacturer's specs).
Doesn't seem much in it, does it? HTH : ) G
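To put those RAMDAC figures in perspective, here is a rough back-of-the-envelope sketch (my own, not from the linked article). It assumes the common rule of thumb that a RAMDAC clocked at F MHz can push roughly F million pixels per second, with around 25% of each frame's timing lost to blanking intervals; the exact overhead varies by video mode, so treat the numbers as ballpark only.

```python
def max_refresh_hz(ramdac_mhz, width, height, blanking_overhead=0.25):
    """Estimate the highest refresh rate a RAMDAC could drive at a
    given resolution, assuming ~25% of the pixel clock goes to blanking."""
    usable_pixels_per_sec = ramdac_mhz * 1_000_000 * (1 - blanking_overhead)
    return usable_pixels_per_sec / (width * height)

# At a typical 1024x768 desktop, both cards in this thread have huge headroom:
print(round(max_refresh_hz(300, 1024, 768)))  # 300MHz RAMDAC -> ~286 Hz
print(round(max_refresh_hz(400, 1024, 768)))  # 400MHz RAMDAC -> ~381 Hz
```

Either way, 85Hz is nowhere near the limit of either RAMDAC, which is why the choice between 75Hz and 85Hz makes no practical difference to the card.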
Another excerpt from the same link that may answer your query about frame rates vs. refresh rates:
"Note: Do not confuse the refresh rate with the term "frame rate", often used for games. The frame rate of a program refers to how many times per second the graphics engine can calculate a new image and put it into the video memory. The refresh rate is how often the contents of video memory are sent to the monitor. Frame rate is much more a function of the type of software being used and how well it works with the acceleration capabilities of the video card. It has nothing at all to do with the monitor."