Something puzzles me about Nvidia graphics drivers

  Mr Mistoffelees 13:22 06 Feb 05
Locked

Correct me if I am wrong, but I believe most, if not all, LCD monitors can only display a maximum of 24-bit colour. So why do Nvidia, and maybe others, only give you the option of 16- or 32-bit colour? 16-bit doesn't look good, but 32-bit is wasting resources. Many graphics drivers used to support 24-bit, so why not any more?
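(For context, as rough arithmetic rather than anything from Nvidia's documentation: 16-bit colour gives 2^16 = 65,536 colours, while 24-bit gives 2^24 = 16,777,216, which is about the most a typical TFT panel can reproduce. A "32-bit" mode is usually still 24 bits of colour plus 8 bits of alpha or padding.)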

  HappySoul 14:44 06 Feb 05

I get a choice of 8, 16 or 32 and it set itself to 32. My 15" TFT looks fine.

  freaky 16:11 06 Feb 05

I think you would only notice the full benefit of 32-bit colour on an LCD if it has a DVI socket connected to a DVI output on the graphics card.

If this is not the case, and the LCD is receiving an analogue VGA signal, then setting the graphics card to 32-bit will not use any more resources, because they are not being used.

The drivers supplied by Nvidia and others are for general use and are not specific to the type of display on the user's PC. It is therefore up to the user to select the most appropriate setting, depending on their personal preferences.

  Mr Mistoffelees 16:37 06 Feb 05

I think you miss the point. If your screen can only display 24-bit colour, why should you have to feed it 32-bit? Running at 24-bit would reduce the load on your graphics card, enabling higher frame rates, but Nvidia do not allow a 24-bit option. Regardless of your display, if you set your graphics card to 32-bit, that is what it will run at. TFT LCD displays can only show a maximum of 24-bit colour, so the extra processing to generate the additional colour information is wasted.
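For what it's worth, one plausible explanation (my assumption, not anything Nvidia state in their driver notes) is that the "32-bit" mode is still only 24 bits of actual colour: the remaining 8 bits are alpha or padding, there so that every pixel starts on a 4-byte boundary, which is cheaper for the hardware to address than 3-byte pixels. A minimal C sketch of the two layouts:

    #include <stdint.h>
    #include <stdio.h>

    /* "32-bit" XRGB8888: 24 bits of colour plus 8 unused (or alpha) bits,
       so every pixel starts on a 4-byte boundary. */
    typedef uint32_t pixel32;

    static pixel32 pack32(uint8_t r, uint8_t g, uint8_t b)
    {
        /* Top byte is left as zero padding; only 24 bits carry colour. */
        return ((pixel32)r << 16) | ((pixel32)g << 8) | (pixel32)b;
    }

    /* Packed 24-bit RGB888: no padding, but successive pixels straddle
       4-byte word boundaries, which is awkward for the hardware. */
    typedef struct { uint8_t b, g, r; } pixel24;

    int main(void)
    {
        pixel32 p = pack32(255, 128, 0);
        printf("32-bit pixel 0x%08X uses %u bytes\n",
               (unsigned)p, (unsigned)sizeof(pixel32));
        printf("24-bit pixel uses %u bytes, so the next pixel starts mid-word\n",
               (unsigned)sizeof(pixel24));
        return 0;
    }

On that view the extra byte isn't really extra colour processing at all: the card simply carries a padding byte it ignores, and the real cost of a true 24-bit mode would be unaligned memory access.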

This thread is now locked and can not be replied to.
