7-byte-per-pixel format RGB+CCT LED strips and WLED

Hi! I have some extra tape from a Feit (Costco) addressable 24v LED strip (shop page). Each pixel has 6 RGB LEDs and 6 pairs of CW/WW (I believe this is CCT) LEDs.

I can get it to run and change colors with various protocol settings (e.g. WS2805) but regardless of which setting I pick, none of the LEDs get set to the right color.

From poking at LED strip drivers, I believe the reason is that this strip uses 7 bytes per pixel: 3 for RGB, and then 2 bytes for each of the white channels (precisely how the second byte is relevant is not clear to me; I appear to get the full 0->255 range on the first byte while the second is ignored entirely).
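
For reference, here's the per-pixel layout I think I'm looking at, written out as a struct. The ordering of the two white channels and the role of the second byte in each pair are my guesses from probing, so treat this as an assumption rather than a datasheet:

```cpp
#include <cstdint>

// Assumed 7-byte on-wire layout for one physical pixel on this strip.
// Only the first byte of each white pair seems to do anything; the
// second byte appears to be ignored (or I just haven't found its use).
struct FeitPixel {
  uint8_t r;       // red,   0-255
  uint8_t g;       // green, 0-255
  uint8_t b;       // blue,  0-255
  uint8_t w1;      // first white channel (CW or WW), full 0-255 range
  uint8_t w1_pad;  // second byte of that pair, seemingly ignored
  uint8_t w2;      // second white channel
  uint8_t w2_pad;  // second byte of that pair, seemingly ignored
};
```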

I am wondering if there is any setting in a standard build of WLED (I just installed 0.15.0-b7 and tried most of the strip options) that can let me match this format? I’d be almost as interested in a solution that just lets me set the RGB bytes and skip the 4 white bytes. I’m also willing to dig in a bit and try to add this data format into a custom build of WLED, but maybe that’s best kept as a fallback.

Did you check the WS2815 protocol? That may send the real sequence to the string.

I probably should mention that I’m running 0.15.0-b7 in case that is important.

I'm not entirely sure what you mean, could you possibly elaborate? There's no WS2815 option in the LED preferences, just e.g. WS281x (although that's an RGB pixel) and WS2805, which is RGB+CCT.

I am able to get the first RGB pixel to display exactly what I want with either. If I understand correctly, WLED sends the color data for the first pixel as R,G,B, so it displays as I want. However, to help illustrate my issue, the next two bytes in the 2805 are CW/WW, but my LED strip thinks they're two redundant bytes in the CW channel.

Here's a concrete example of two pixels. My strip is expecting 7 bytes per pixel: RGBWWCC RGBWWCC. If I pick the WS2805 protocol, set red (R=255,G=0,B=0,W=0,C=0) and send a length of two pixels, my LED strip sees (255,0,0,0,0,255,0) (0,0,0,0,0,0,0), so my first pixel shows full red on the color LEDs plus the CW LED at full, and the second pixel is off.

OTOH if I send pure blue (0,0,255,0,0) for WS2805 then I get a full blue first pixel and a full red second pixel, because the strip sees (0,0,255,0,0,0,0) (255,0,0,0,0,0,0).
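
To make the framing error easier to see, here's a tiny standalone sketch that builds the 5-byte-per-pixel WS2805 stream for the red example and prints how a strip consuming 7-byte frames would regroup it (pixel layout assumed as above; the white ordering in the 5-byte stream is also a guess):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
  // What WLED sends for two WS2805 pixels set to pure red:
  // 5 bytes per pixel (R, G, B and two white bytes, order assumed).
  std::vector<uint8_t> sent = {255, 0, 0, 0, 0,   // logical pixel 0
                               255, 0, 0, 0, 0};  // logical pixel 1

  // How a strip that consumes 7 bytes per pixel regroups that stream.
  const size_t stripPixelSize = 7;
  for (size_t p = 0; p * stripPixelSize < sent.size(); ++p) {
    printf("strip pixel %zu:", p);
    for (size_t i = 0; i < stripPixelSize; ++i) {
      size_t idx = p * stripPixelSize + i;
      printf(" %3d", idx < sent.size() ? sent[idx] : 0);  // missing bytes shown as 0
    }
    printf("\n");
  }
  // Prints:
  //   strip pixel 0: 255   0   0   0   0 255   0   -> red plus one white channel on
  //   strip pixel 1:   0   0   0   0   0   0   0   -> off
}
```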

I could fudge something with the WS2811 white strip, since I'd have full control over each byte, but WLED wouldn't be aware of the color (only off->on) IIUC, and I couldn't use anything like the color palette.
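
Just to spell out what that fudge would look like: with the strip declared as 7×N single-channel white pixels, these would be the logical indices mapping onto physical pixel i (using my assumed layout from above, so the white offsets in particular are a guess):

```cpp
#include <cstdint>

// Hypothetical index math for driving the strip as 7*N one-byte "white"
// pixels: which logical indices belong to physical pixel i.
struct ChannelIndices {
  uint16_t r, g, b, w1, w2;  // w1/w2 = the first byte of each white pair
};

ChannelIndices channelsForPixel(uint16_t i) {
  const uint16_t base = i * 7;
  return {static_cast<uint16_t>(base + 0),   // red
          static_cast<uint16_t>(base + 1),   // green
          static_cast<uint16_t>(base + 2),   // blue
          static_cast<uint16_t>(base + 3),   // first white channel
          static_cast<uint16_t>(base + 5)};  // second white channel (skipping a pad byte)
}
```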

I don't think I could use the WS281x setting either, since I'm not sure that I can set sub-pixel offsets. I can only see options to add offsets in whole pixels (i.e. sets of 3 channels), rather than the 4-channel offset I'd need to skip over the WW/CW channels.

Looks to me like it's actually two 3-byte-per-pixel strips interlaced, given that there are two WS2811 ICs per segment.

Supposing this were the case, any suggestions on how to configure? The RGB LEDs take 3 bytes and are separated by intervals of 4 bytes (the CxWx bytes). If I select e.g. WS281x it seems like all of the offset/skip options are by the length of the pixel (i.e. 3 bytes), and of course 3 and 4 are coprime :slight_smile: so I could at best get 1 in every 4 LEDs to light up with that setting.
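
Here's the byte arithmetic behind that, as a quick sketch: it prints which strip bytes each 3-byte WS281x logical pixel would land on, against the strip's 7-byte frames (frame size assumed per the layout above):

```cpp
#include <cstdio>

int main() {
  const int logicalSize = 3;  // bytes per WS281x logical pixel that WLED sends
  const int stripSize   = 7;  // bytes per physical pixel the strip consumes
  for (int k = 0; k < 8; ++k) {      // first few logical pixels
    int first  = k * logicalSize;    // first strip byte this logical pixel writes
    int pixel  = first / stripSize;  // which physical pixel that byte falls in
    int offset = first % stripSize;  // offset within that 7-byte frame
    printf("logical %d -> bytes %2d..%2d (physical %d, offset %d)\n",
           k, first, first + 2, pixel, offset);
  }
  // A logical pixel starts exactly on a frame boundary (offset 0) only
  // every lcm(3,7) = 21 bytes; everything in between spills across the
  // white bytes and frame boundaries of neighboring physical pixels.
}
```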

Is there a way to tell WLED to skip so many bytes?

FWIW, I've tried locally creating a new NeoPixel bus that has 7 bytes per pixel, with format RGBCxWx (x = padding bytes), and whether I use that library directly or jam it into a local WLED build, I can get the correct RGB+CCT color mapping.
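
In case it helps anyone else poking at one of these strips, here's the gist of that byte packing as a small standalone sketch (not the actual NeoPixelBus color-feature plumbing, which is more involved; the RGBCxWx ordering and the padding bytes are still my guesses from probing):

```cpp
#include <cstdint>
#include <vector>

// Pack one RGB+CCT value into the assumed 7-byte RGBCxWx on-wire format
// (x = padding byte that the strip seems to ignore). This shows only the
// byte layout; the real integration goes through NeoPixelBus's
// color-feature classes instead.
void packPixel(uint8_t* dest, uint8_t r, uint8_t g, uint8_t b,
               uint8_t cw, uint8_t ww) {
  dest[0] = r;
  dest[1] = g;
  dest[2] = b;
  dest[3] = cw;
  dest[4] = 0;   // padding byte after cold white
  dest[5] = ww;
  dest[6] = 0;   // padding byte after warm white
}

int main() {
  const size_t pixelCount = 2;
  std::vector<uint8_t> buffer(pixelCount * 7, 0);

  packPixel(&buffer[0 * 7], 255, 0, 0, 0, 0);  // pixel 0: pure red
  packPixel(&buffer[1 * 7], 0, 0, 255, 0, 0);  // pixel 1: pure blue

  // buffer now holds: 255,0,0,0,0,0,0, 0,0,255,0,0,0,0
  // which lines up with the strip's 7-byte frames, unlike the 5-byte
  // WS2805 stream in the earlier example.
  (void)buffer;
}
```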