By now it's well known that HDR is part of the new normal for monitors and gaming. High dynamic range makes images brighter and more vivid, adds realism to displays, and helps colors pop with life. BenQ HDRi technology takes HDR a step further by auto-adjusting display brightness based on ambient lighting, rather than relying solely on static HDR metadata.
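As a rough illustration of the general idea of ambient-light-driven adjustment (this is not BenQ's actual HDRi implementation; the lux thresholds, function name, and brightness targets are all hypothetical), the logic might look something like this:

```python
# Conceptual sketch only: NOT BenQ's HDRi algorithm, just an illustration of
# adjusting a display's brightness target from an ambient light reading.
# The lux thresholds and scaling factors below are hypothetical.

def adjust_hdr_brightness(ambient_lux: float, panel_peak_nits: float = 400.0) -> float:
    """Pick a target peak brightness based on how bright the room is."""
    if ambient_lux < 50:      # dim room: ease off to reduce eye strain
        return panel_peak_nits * 0.7
    elif ambient_lux < 300:   # typical living room or office lighting
        return panel_peak_nits * 0.85
    else:                     # bright daylight: push the panel to its peak
        return panel_peak_nits

print(adjust_hdr_brightness(120))  # 340.0 nits on a hypothetical 400-nit panel
```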
However, HDR is also a delicate technology. The line between standard dynamic range (SDR) and high dynamic range isn't always that clear. Generally, VESA DisplayHDR certification requires a minimum of 400 nits (candelas per square meter) of peak brightness to qualify as HDR. Formats such as HDR10 settle for slightly lower figures, around 350 nits. Anything below that looks essentially like SDR and is HDR in name only. For context, DisplayHDR certification currently goes up to 1400 nits, and several TV manufacturers routinely make models capable of 4000 nits. So don't worry, HDR isn't like staring at the sun.
The advent of HDR has essentially made the brightness setting redundant: monitors and TVs that want to deliver good HDR performance must always run at peak brightness, except when displaying SDR content, in which case they can tone brightness down automatically without user intervention.
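To make that automatic behavior concrete, here's a minimal sketch of the decision a display might make, assuming a hypothetical 400-nit panel and treating roughly 100 nits as the typical SDR mastering reference; this isn't any specific vendor's firmware, just the shape of the logic:

```python
# Illustrative sketch: how a display might pick its brightness target
# depending on whether the incoming signal is HDR or SDR.
# The nit values and function name are assumptions, not a real spec.

SDR_REFERENCE_NITS = 100   # SDR content is typically mastered around ~100 nits
HDR_PEAK_NITS = 400        # minimum peak brightness for VESA DisplayHDR

def target_brightness(signal_is_hdr: bool) -> int:
    if signal_is_hdr:
        # HDR content expects the panel's full peak brightness
        return HDR_PEAK_NITS
    # SDR content can be toned down automatically, no user input needed
    return SDR_REFERENCE_NITS

print(target_brightness(True))   # 400
print(target_brightness(False))  # 100
```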
Note that even the most hardcore, top-of-the-line, expensive TVs have now essentially done away with brightness settings and offer only gamma scales. Even those don't do much, because in the age of HDR, reducing brightness makes as much sense as trying to turn water a little drier. Games, movies, TV shows, and even photos are rendered and mastered with HDR baked in, so lowering their brightness would do nothing but detract from the way they were intended to be viewed.