Set analog LED brightness resolution to more than 8 bits

Hi there,
I want to drive my analog CCT strip with 10-bit LEDC duty resolution. If I just change 8 → 10 in ledcSetup (line 376 of bus_manager.h), the maximum brightness is severely limited. No surprise, since everything seems to be capped at a maximum brightness of 255, while the maximum at 10 bits would be 1023.
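For context, a minimal standalone example of what I mean (plain Arduino-ESP32 2.x LEDC API, i.e. the same ledcSetup/ledcWrite calls; pin, channel and frequency are just example values): at 10 bits the duty range becomes 0–1023, so writing an unscaled 8-bit 255 gives only about a quarter of full duty.

```cpp
#include <Arduino.h>

const uint8_t  PWM_PIN     = 5;      // example GPIO
const uint8_t  PWM_CHANNEL = 0;      // example LEDC channel
const uint8_t  PWM_BITS    = 10;     // 10-bit duty resolution -> duty range 0..1023
const uint32_t PWM_FREQ    = 19531;  // WLED's ESP32 default PWM frequency

void setup() {
  ledcSetup(PWM_CHANNEL, PWM_FREQ, PWM_BITS);
  ledcAttachPin(PWM_PIN, PWM_CHANNEL);

  ledcWrite(PWM_CHANNEL, 255);   // an unscaled 8-bit maximum is only ~25% duty at 10 bits
  delay(2000);
  ledcWrite(PWM_CHANNEL, 1023);  // full brightness now requires the full 10-bit duty value
}

void loop() {}
```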

So my question: is there currently a way to use 10-bit brightness resolution with WLED?
Maybe it just takes editing a few lines, but I'd like to know exactly where.

Check the ESP documentation on LEDC and PWM.
You can have higher resolution at the expense of PWM frequency.

That's what I'm trying to do, but a higher PWM resolution means more brightness steps, and in WLED everything seems to be capped at 255. So one cannot just change ledcSetup: 10-bit resolution would mean a maximum brightness of 1023, and with 12 bits it would be 4095. If there are specific lines where I could simply replace 255 with one of those values, please tell me.
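From what I can tell so far, it wouldn't be a single 255 to swap out: the 8-bit value would have to be scaled into the wider duty range wherever it is written. A rough sketch of the principle only (scale8To and scale8To10 are names I made up, not WLED functions):

```cpp
#include <stdint.h>

// Scale an 8-bit brightness (0..255) to an N-bit duty so that 255 still maps to full duty.
uint32_t scale8To(uint8_t value, uint8_t bits) {
  uint32_t maxDuty = (1u << bits) - 1;     // 1023 for 10 bits, 4095 for 12 bits
  return (uint32_t)value * maxDuty / 255;  // 0 -> 0, 255 -> maxDuty
}

// Equivalent shortcut for 8 -> 10 bits via bit replication: 0 -> 0, 255 -> 1023.
uint32_t scale8To10(uint8_t value) {
  return ((uint32_t)value << 2) | (value >> 6);
}
```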

Interestingly, WLED uses 19531 Hz as the default PWM frequency for the ESP32. That is the maximum PWM frequency the ESP32 can output at 12-bit resolution, yet WLED still uses just 8 bits.
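To make that number concrete (assuming the LEDC timer runs from the 80 MHz clock, as it does by default on the classic ESP32): the highest achievable PWM frequency is the source clock divided by 2^bits, which is exactly where 19531 Hz comes from at 12 bits.

```cpp
#include <stdint.h>
#include <stdio.h>

// Maximum LEDC PWM frequency for a given duty resolution,
// assuming an 80 MHz source clock (classic ESP32 default).
uint32_t maxLedcFreq(uint8_t bits) {
  return 80000000u / (1u << bits);
}

int main() {
  printf("%u Hz at  8 bits\n", (unsigned)maxLedcFreq(8));   // 312500 Hz
  printf("%u Hz at 10 bits\n", (unsigned)maxLedcFreq(10));  // 78125 Hz
  printf("%u Hz at 12 bits\n", (unsigned)maxLedcFreq(12));  // 19531 Hz, WLED's ESP32 default
  return 0;
}
```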

You would need to rewrite WLED to support 16-bit color depth. Not in our immediate plans.

Just out of curiosity: do you actually discern a 1/256th brightness level change? Because I do not.

Yes, I do. Fading the lights in and out does not feel smooth. It's bearable when dimming from 100% to off, but gets noticeably worse when dimming from a lower brightness or making only a small adjustment.

Nothing can be done about the digital LEDs in use today (although I've seen some promising new addressable LEDs like the HD108 that advertise 16 bits), but analog LEDs are only limited by their driver, and the ESP32's PWM hardware is very capable. So using the full potential of the ESP32 would be high on my wish list for WLED.

Are you a lighting professional? I am a photographer and have a keen eye for light, but I do not perceive a brightness change from 244 to 245, for example.

You may be seeing some other issue that is not related to the 8-bit vs. 16-bit difference but rather to the frequency of strip updates.

Where you really need 16-bit depth is when you start mixing colors or doing large color shifts, but that is not the domain of the LED driver.
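A small illustration of the quantization meant here (my own example with made-up values, not WLED code): when low 8-bit channel values are scaled down, distinct inputs collapse onto the same output step, which is where a wider working depth would help.

```cpp
#include <stdint.h>
#include <stdio.h>

int main() {
  // Scale three distinct 8-bit channel values down to ~10% brightness.
  // At 8 bits they all collapse onto the same step; at 16 bits they stay distinct.
  const uint8_t vals[] = {20, 24, 28};
  for (uint8_t v : vals) {
    uint8_t  scaled8  = (uint16_t)v * 26 / 255;            // ~10% of full scale at 8 bits
    uint16_t scaled16 = (uint32_t)v * 257 * 6554 / 65535;  // ~10% of full scale at 16 bits
    printf("in=%3u  8-bit=%3u  16-bit=%5u\n", (unsigned)v, (unsigned)scaled8, (unsigned)scaled16);
  }
  return 0;
}
```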

It's not so much about a single brightness step but about the fade. The problem is that the fade time seems to be constant, whether one changes from 0 to 255 or from 0 to 10. So if the fade time is set to 1000 ms and one fades from 0 to 255, there is a brightness step every ~4 ms. If one fades from 0 to 10, there is a brightness step every 100 ms, and that is perceivable and feels choppy. Of course that is an extreme example, but this is why I perceive a fade below 10% total brightness as not smooth. Those lower brightness levels are especially important with high-power LEDs (I use several meters of 240 LED/m high-power strip).
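The arithmetic behind those step times, and what a higher resolution would change (a rough sketch of the effect, not WLED's actual transition code):

```cpp
#include <stdint.h>
#include <stdio.h>

// Time between visible brightness steps for a fixed fade duration,
// given how many distinct levels the fade passes through.
uint32_t stepIntervalMs(uint32_t fadeMs, uint32_t levels) {
  return fadeMs / levels;
}

int main() {
  printf("%u ms per step\n", (unsigned)stepIntervalMs(1000, 255)); // 3 ms (the ~4 ms above): smooth
  printf("%u ms per step\n", (unsigned)stepIntervalMs(1000, 10));  // 100 ms: visibly choppy
  printf("%u ms per step\n", (unsigned)stepIntervalMs(1000, 40));  // same fade at 10 bits (0 -> 40): 25 ms
  return 0;
}
```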

That problem would be fixed by having more overall brightness steps, i.e. a higher resolution.

Another fix would be to make the fade time dependent on the amount of brightness change to be performed. I've experimented with this: below 10% total brightness the fade feels smooth with a transition time of 200 ms, but 200 ms doesn't look good for larger brightness fades, where 700–1000 ms is better. My Hue lights seem to use that approach and their fade is smooth under all circumstances.
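A rough sketch of that idea (my own illustration with made-up thresholds, not how Hue or WLED actually implement it): scale the transition time with the size of the brightness change, so small changes finish quickly while full-range fades still take the long duration.

```cpp
#include <stdint.h>

// Pick a transition time proportional to the brightness delta:
// 200 ms for a tiny change, up to 1000 ms for a full 0..255 fade.
// The 200/1000 ms endpoints are illustrative only.
uint32_t adaptiveFadeMs(uint8_t from, uint8_t to) {
  uint8_t delta = (from > to) ? (from - to) : (to - from);
  return 200 + (uint32_t)delta * 800 / 255;  // delta 0 -> 200 ms, delta 255 -> 1000 ms
}

// Example: adaptiveFadeMs(0, 25) is ~278 ms, adaptiveFadeMs(0, 255) is 1000 ms.
```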

Actually I just found an issue that essentially relates to that topic as well:

Is that on the roadmap?