How do I dereference a pixel in a CRGBPalette16?

LEDs[i] = myPalette[i]; // This doesn’t seem to work

The function you want is ColorFromPalette.
Check out the example code for some details:

I started on the page you referenced. The problem with the ColorFromPalette function is the automatic blending - I want to output values pixel for pixel straight from the palette, instead of having it interpolate the palette into 256 entries. I also haven’t had success with my own CRGB data structures. There are just three bytes of data per entry, no? I can’t seem to just output them directly. Suggestions?

Does this work?
LEDs[i].r = myPalette[i].r;
LEDs[i].g = myPalette[i].g;
LEDs[i].b = myPalette[i].b;

Should I just use a 256-byte palette even though I’m not using the whole thing - and maybe save pointers to particular sections? I’m sure it’s doable, but with my C++ skills it seems awkward.

Your library is fabulous but I’m having a hard time parsing it to do what I want to do, due to numerous C tricks and ASM sections.

Try just using ColorFromPalette, with a final argument of NOBLEND. That should retrieve the raw color. Let me know if that doesn’t work.

Check out the ColorPalette example for more sample code that might help.

leds[i] = ColorFromPalette( myPalette, colorIndex, brightness, NOBLEND);

Or use LINEARBLEND.

Sorry if I’m a bit confused here. How many distinct colors are you loading in? Just 16?

Let’s see if I can explain this. Using a CRGBPalette16, when I assign colors to pixels I end up with color 0 from the palette on the first sixteen LEDs, color 1 on the next sixteen, and so on.

What I would like is colors 0, 1, 2, 3 … on pixels 0, 1, 2, 3 … and so on, rolling over at 16 - but I can do that with modulo if necessary. Hope that is clearer.

I’m also puzzled about why the direct assignment fails and yields a color I haven’t specified. All the Arduino printing razzmatazz doesn’t make it easy to debug what is going on.

Thanks for your help. It’s fabulous software, it’s just taking me longer than it should to learn to drive it.

You might want just an array of CRGBs:

CRGB myColors[16];

and take it from there with your code.

The whole point of our odd little 16-entry palettes (which take up only 16x3=48 bytes) is to act as if there are actually 256 color entries (which would take up 256x3=768 bytes), but using far less SRAM. The color interpolation gives smooth color blends, as you might get from a 256-entry palette, without the (very) high cost in SRAM.

The reason you’re seeing what you did is an artifact of the way you assign just 16 color points into the palette, but then you can retrieve 256 color points. The 16 you put in get spread out across 256, with blending between them.

For your purposes, since you don’t want the interpolation or the appearance of 256 entries, you might be happier with just a simple array of CRGBs.

What do you think of that approach?

Seems like I tried that too, but I didn’t get the color I wanted unless I did this:

LEDs[i].r = myArray[i].r;
LEDs[i].g = myArray[i].g;
LEDs[i].b = myArray[i].b;

If I just did this:
LEDs[i] = myArray[i];

I got the wrong color. It’s not really that big a deal because when the hex is decompiled, I imagine that both are probably pretty similar.

Any thoughts welcome. If you don’t have any, don’t worry, I’ll just bang on things for a while until I get to the bottom of what is failing and why.

One thing I thought might be happening is the swap for the color order, but I imagine that happens in showLEDs(), no?

Could you share your sketch in a pastebin so I can look at a bigger view?