Any relation between graphics chip and screen brightness?

Let's say I have a computer with dual graphics cards: one for lighter graphics work and the other for more demanding work (gaming or whatever). Regardless of whether switching between the two cards is automatic or manual, would this result in any difference in screen brightness?

In my opinion, there is no relation. For example, with a screen that uses LEDs for the backlight, my understanding is that brightness is controlled through PWM of the LEDs (changing the on/off timing, but so quickly that we perceive only differences in brightness and can't see the switching).
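Here's roughly how I picture that PWM dimming, as a quick sketch. The peak-luminance figure and the duty-cycle settings below are made up purely for illustration; they're not from any real panel's spec.

```python
# Minimal sketch of PWM backlight dimming (illustrative numbers only).
# Perceived brightness is roughly proportional to the duty cycle, because
# the switching (assumed here to be on the order of ~1 kHz) is far too fast
# for the eye to follow, so it just averages out.

PEAK_LUMINANCE_NITS = 300      # assumed full-on panel luminance

def perceived_brightness(duty_cycle):
    """Average luminance for a given duty cycle (0.0 = backlight off, 1.0 = fully on)."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return PEAK_LUMINANCE_NITS * duty_cycle

for setting in (0.25, 0.5, 1.0):
    print(f"duty cycle {setting:.2f} -> ~{perceived_brightness(setting):.0f} nits")
```

Nothing in that picture depends on which graphics card produced the frame.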

In other words, different graphics cards may result in different resolutions and rendering capabilities, but not different brightnesses. Is my understanding correct? Thanks to anyone willing to help out.

It will not change screen brightness. It may change the way colors are shown slightly, but that has nothing to do with brightness. The only way it would change brightness is if you were using an NVIDIA card and a monitor that supported their LightBoost technology. Almost all graphics cards can run 1920x1200 or higher, so it shouldn't really matter too much on that front either.

Thank you for that. Would you happen to know how brightness is controlled? I think we've established that it's a completely separate control mechanism (completely separate circuitry). So when we make adjustments through the monitor's menu, my guess is that we are dimming the fluorescent tubes or LEDs by varying the PWM duty cycle. There are probably also configurations in which some of the fluorescent tubes or LEDs are selectively turned on and off (I have no idea about this, but it seems reasonable that somebody would have thought of using this type of control as well).
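Just to make my own hand-waving concrete, here's a hypothetical sketch of that "selectively turned on and off" idea (local dimming, I believe it's called). The zone grid, the rule of driving each zone to its brightest pixel, and the test image are all my own invention for illustration, not how any particular monitor actually does it.

```python
# Hypothetical zone-based ("local") dimming: drive each backlight zone only as
# hard as its brightest pixel requires. Everything here is invented for illustration.

def zone_levels(image, zones_x, zones_y):
    """image: 2D list of pixel luma values (0..255); returns per-zone duty cycles (0..1)."""
    height, width = len(image), len(image[0])
    zone_h, zone_w = height // zones_y, width // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            block = [image[y][x]
                     for y in range(zy * zone_h, (zy + 1) * zone_h)
                     for x in range(zx * zone_w, (zx + 1) * zone_w)]
            row.append(max(block) / 255.0)   # dim the zone to its brightest pixel
        levels.append(row)
    return levels

# 4x4 test image: bright top-left quadrant, dark everywhere else.
test_image = [[200, 200, 10, 10],
              [200, 200, 10, 10],
              [ 10,  10, 10, 10],
              [ 10,  10, 10, 10]]
print(zone_levels(test_image, zones_x=2, zones_y=2))
```

As far as I can tell, whatever drives those zones responds to the image content and the monitor's own settings, not to which card rendered the image.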

Forget about weird explanations.

Each chip has its own software setup, which overrides the system's settings. If you have an nVidia Optimus system (nVidia + Intel graphics chipsets), the nVidia configuration tool lets you adjust EVERYTHING, from color vibrance to brightness, saturation, and color profiles. The Intel chipset is more limited, both in graphical abilities and in configuration options, but it also lets you adjust the monitor gamma and similar settings (which will result in a different brightness).

Check both configurations and make sure that the parameters are the same (at least the ones you can see). If you can't find where the problem is, you may need to tweak the gamma output in one (or both) of them.
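To see why a gamma tweak shows up as a brightness difference, here's a rough sketch. The exact gamma values are just examples; drivers typically apply something like a per-channel lookup table along these lines, though the real ramps are more involved.

```python
# Rough illustration: a driver-level gamma setting is basically a lookup table
# applied to every output level, so changing it shifts the mid-tones up or down.

def gamma_lut(gamma):
    """Build an 8-bit lookup table: output = round((input/255) ** (1/gamma) * 255)."""
    return [round(((level / 255.0) ** (1.0 / gamma)) * 255.0) for level in range(256)]

lut_a = gamma_lut(1.0)   # e.g. one card's control panel left at its default ramp
lut_b = gamma_lut(1.2)   # e.g. the other card's gamma slider nudged up slightly

mid_gray = 128
print("card A drives mid-gray as", lut_a[mid_gray])   # 128
print("card B drives mid-gray as", lut_b[mid_gray])   # around 144 -> looks brighter
```

Same pixel data in, different levels out, so the image looks brighter on one card even though the backlight never changed.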

[quote=“Blaquesmith”]Forget about weird explanations.

Each chip has its own software setup, which overrides the system's settings. If you have an nVidia Optimus system (nVidia + Intel graphics chipsets), the nVidia configuration tool lets you adjust EVERYTHING, from color vibrance to brightness, saturation, and color profiles. The Intel chipset is more limited, both in graphical abilities and in configuration options, but it also lets you adjust the monitor gamma and similar settings (which will result in a different brightness).

Check both configurations and make sure that the parameters are the same (at least the ones you can see). If you can't find where the problem is, you may need to tweak the gamma output in one (or both) of them.[/quote]

Thank you for that, Blaquesmith.

I do patent work. Basically, an examiner is telling me that switching between two different graphics cards in an earlier patent will inherently result in two different screen brightnesses. I don't believe he's right. Screen brightness has to do with control of the backlighting, in my opinion: a very specific kind of control involving the duty cycle and maybe selectively turning LEDs on and off (when the backlight is an LED backlight).

Let me add a bit more. I think it's feasible for the brightness to differ as you switch between graphics cards, but that would not be a result of the switching itself; it would be a result of the setup you have established. So if the screen goes from bright to dim as you switch from graphics card A to graphics card B, it would be a simple matter to set things up so that the opposite happens. This is not unlike the setup you might have when switching between AC and battery power (you can configure things so the brightness changes or stays the same).

Do you agree that I’m on the right track here? Thank you!

The thing is that each company does its silicon in a different way, with different optimizations regarding color treatment. In 24-bit digital format, the values are usually stored as RRGGBB, 8 bits per color (in hexadecimal, each digit represents 4 bits, so two hex digits per channel). The brightness depends on the color: #FFFFFF represents white and #000000 represents black. These absolute colors should look the same on the same monitor, but sometimes they don't. Why? (There's a quick sketch after the list below showing how an RRGGBB value maps to a rough brightness figure.)

  1. The output signal is not digital. With old monitors, the signal may not be digital, so different cards will have different digital-to-analog converters, creating differences.

  2. Each company has its own configuration and optimizations. If you pick two different graphics cards built around the same nVidia chip, one made by Asus and the other by Gigabyte, and run a benchmark on both, you'll see that the results differ, even though they use the exact same drivers and the same core chip. One may give your image a greenish cast, the other may not; one may be more saturated. Everything depends on the factory settings and tweaks for each card. That is why professional software like Photoshop uses color profiles and asks you to calibrate your monitor, to compensate for the differences between hardware rigs.
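To make the RRGGBB point concrete, here's a quick sketch of how a 24-bit value maps to a rough brightness figure. The Rec. 709 luma weights are standard; the rest (including the decision to ignore gamma) is just for illustration.

```python
# Sketch: a 24-bit #RRGGBB value reduced to an approximate luma figure.

def hex_to_rgb(color):
    """'#RRGGBB' -> (r, g, b), each channel 0..255."""
    color = color.lstrip("#")
    return tuple(int(color[i:i + 2], 16) for i in (0, 2, 4))

def relative_luma(color):
    """Approximate luma in 0..1 using Rec. 709 weights (gamma ignored for simplicity)."""
    r, g, b = hex_to_rgb(color)
    return (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0

for c in ("#FFFFFF", "#808080", "#000000"):
    print(c, "->", round(relative_luma(c), 3))   # 1.0, ~0.5, 0.0
```

Two correctly working cards sending those same values digitally should produce the same result on the same monitor; the differences creep in at the two points listed above.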

[quote=“Blaquesmith”]The thing is that each company does its silicon in a different way, with different optimizations regarding color treatment. In 24-bit digital format, the values are usually stored as RRGGBB, 8 bits per color (in hexadecimal, each digit represents 4 bits, so two hex digits per channel). The brightness depends on the color: #FFFFFF represents white and #000000 represents black. These absolute colors should look the same on the same monitor, but sometimes they don't. Why?

  1. The output signal is not digital. With old monitors, the signal may not be digital, so different cards will have different digital-to-analog converters, creating differences.

  2. Each company has its own configuration and optimizations. If you pick two different graphics cards built around the same nVidia chip, one made by Asus and the other by Gigabyte, and run a benchmark on both, you'll see that the results differ, even though they use the exact same drivers and the same core chip. One may give your image a greenish cast, the other may not; one may be more saturated. Everything depends on the factory settings and tweaks for each card. That is why professional software like Photoshop uses color profiles and asks you to calibrate your monitor, to compensate for the differences between hardware rigs.[/quote]

Thank you, again! Let me think about my problem a bit more with your input in mind.

Here are a couple of replies I got elsewhere in case anyone is interested in this topic:

[quote]The video card doesn’t provide the energy for the screen, so different video cards wouldn’t make a difference as long as they were operating correctly.

One thing to consider - are the video cards providing the signal in analog or digital format? If the output signal is digital, the two video cards would render the same exact value for their output - provided that they are functioning correctly… If they were analog, there might be a variation, and some cards may indeed be limited in that output signal level because of some defect, but if the cards are operating properly then the difference between the two would be minor. It also wouldn’t necessarily be simply brighter or darker - there would be a small amount of variation (because of an analog device) for each colour output and the net result could be brighter, darker, or imperceptible - most likely the latter. The purpose of the video card is to render things equally from one card to the next, based upon the same inputs from the program.
[/quote]
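To picture that analog point, here's a toy model I put together myself: two cards send the identical digital level, but each card's digital-to-analog stage has a slightly different gain and offset, so the voltages on the wire differ a little. The gain and offset figures are invented; real cards are specified much more tightly than this.

```python
# Toy model of analog output variation: same 8-bit level, slightly different
# gain/offset in each card's DAC stage. All error figures are invented.

FULL_SCALE_VOLTS = 0.7   # nominal full-scale level for analog RGB video

def dac_output(level, gain_error=0.0, offset_volts=0.0):
    """Convert an 8-bit level to an analog voltage with a simple gain/offset error."""
    nominal = (level / 255.0) * FULL_SCALE_VOLTS
    return nominal * (1.0 + gain_error) + offset_volts

level = 200
card_a = dac_output(level, gain_error=+0.010)                      # gain runs 1% high
card_b = dac_output(level, gain_error=-0.005, offset_volts=0.002)  # gain a touch low, tiny offset
print(f"card A: {card_a:.4f} V, card B: {card_b:.4f} V")           # same input, slightly different output
```

With a digital link there is no such conversion, which matches the reply's point that two correctly working cards should produce identical output.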

[quote]Also, it could be the software gamma setting (the I/O curve defining the contrast curve), as there are differences between the Mac (γ = 1.8) and Windows (γ = 2.2–2.4) platforms.

If you remember back a few years ago, there was the problem of games created on Macs appearing too dark when played on a PC: this is directly attributable to the difference in γ between the two platforms.[/quote]
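And here's a quick back-of-the-envelope check of that gamma point; this is just my own arithmetic rather than anything from either reply:

```python
# Same mid-gray pixel value decoded at gamma 1.8 vs. gamma 2.2.

def decoded_intensity(pixel_value, gamma):
    """Relative light output (0..1) when an 8-bit value is displayed at the given gamma."""
    return (pixel_value / 255.0) ** gamma

mid_gray = 128
print("gamma 1.8 (classic Mac):  ", round(decoded_intensity(mid_gray, 1.8), 3))  # ~0.29
print("gamma 2.2 (Windows/sRGB): ", round(decoded_intensity(mid_gray, 2.2), 3))  # ~0.22
```

So the same mid-tones come out noticeably dimmer at gamma 2.2, which fits the story about Mac-authored games looking too dark on PCs.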