So… the Kickstarter has been going incredibly well (100% funded in less than 48 hours!), so I’m working on some stretch goals. The first is updating to FastLED 3.x to get the new chipsets. I wasn’t sure whether it would blow past my RAM limitations (and mean a lower pixel count), but it actually uses even less RAM, so great work there!
Question about dithering, though. The docs say to call show() or delay() frequently for it to work best. The way the AllPixel works, however, show() gets called whenever new data arrives at the device, so I don’t currently have any control over that frequency. I thought about calling FastLED.delay(0) every time I check for new data (which means it gets called either right after there WAS data, or on every pass when there isn’t, since the loop just keeps checking for serial data nonstop), but then I noticed it just calls show() internally. Which makes total sense. Here’s the confusion, though: is that going to decrease my frame rate? My tests suggest it doesn’t, but I wanted to be sure.
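For reference, this is roughly the loop structure I’m describing. It’s just a minimal sketch, not the actual AllPixel firmware: the pin numbers are placeholders and the Serial.readBytes() call stands in for the real protocol handling:

#include <FastLED.h>

#define NUM_LEDS   680
#define DATA_PIN   11   // placeholder pins; the AllPixel's wiring differs
#define CLOCK_PIN  13

CRGB leds[NUM_LEDS];

void setup() {
  Serial.begin(115200);
  FastLED.addLeds<LPD8806, DATA_PIN, CLOCK_PIN>(leds, NUM_LEDS);
  FastLED.setBrightness(64);  // dithering only kicks in below full brightness
}

void loop() {
  if (Serial.available()) {
    // Stand-in for the real AllPixel protocol: read a full frame, then push it out.
    Serial.readBytes((char*)leds, sizeof(leds));
    FastLED.show();
  } else {
    // No new frame: FastLED.delay(0) calls show() internally, re-outputting
    // the last frame so the dither cycle keeps advancing.
    FastLED.delay(0);
  }
}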
For example, I hooked up the maximum 680 LEDs (LPD8806), turned on dithering, and let a simple rainbow gradient rip as fast as it could. Without dithering, an update (pushing bytes to the AllPixel, and FastLED pushing them to the strip) took ~8ms. With dithering turned on it took… ~8ms. So no degradation. Is it really pushing the data out to the strip that fast? Or is something about my setup preventing dithering from actually happening (I do drop the brightness to 64 for these tests, so it should be active)?
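In case it helps, this is roughly how I’d isolate the show() cost on its own (a minimal sketch with placeholder pins and fill_rainbow() as the test pattern; my ~8ms figure above includes the serial transfer to the AllPixel as well):

#include <FastLED.h>

#define NUM_LEDS 680
CRGB leds[NUM_LEDS];

void setup() {
  Serial.begin(115200);
  FastLED.addLeds<LPD8806, 11, 13>(leds, NUM_LEDS);  // placeholder pins
  FastLED.setBrightness(64);
  fill_rainbow(leds, NUM_LEDS, 0);  // simple test pattern

  // Time one show() with dithering disabled...
  FastLED.setDither(DISABLE_DITHER);
  uint32_t t0 = micros();
  FastLED.show();
  Serial.print(F("show() without dithering: "));
  Serial.print(micros() - t0);
  Serial.println(F(" us"));

  // ...and one with binary (temporal) dithering enabled.
  FastLED.setDither(BINARY_DITHER);
  t0 = micros();
  FastLED.show();
  Serial.print(F("show() with dithering: "));
  Serial.print(micros() - t0);
  Serial.println(F(" us"));
}

void loop() {}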
Part of the problem is that I can’t really see any visual difference, at least with what I’m testing. So I just want to make sure it’s being done right.