Hello,
I'm trying to implement a fading buffer to use for rendering particle "trails". The effect is pretty much what is described here; the fiddle linked there demonstrates it quickly.
The core idea behind the approach (a code sketch follows this list) is:
1. Allocate 2 buffers (renderTextures).
2. Render the particles into the active texture.
3. For the fade: clear the alternate texture, render the active texture into it at reduced alpha (the fade), then swap active/alternate.
4. Render the active renderTexture to the screen (after everything else that goes under it).
5. Loop back to step 2 and repeat.
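Here is a stripped-down sketch of that loop. It assumes a PIXI v4-style API (PIXI.RenderTexture.create and renderer.render(displayObject, renderTexture, clear)); the names (trailA/trailB, trailSprite, fadeSprite) and the fade value are just placeholders, not anything from PIXI itself.

```js
var renderer = PIXI.autoDetectRenderer(800, 600);
document.body.appendChild(renderer.view);

var stage     = new PIXI.Container();   // everything that renders under the trails
var particles = new PIXI.Container();   // the particles drawn into the trail buffer

// The two offscreen buffers we ping-pong between.
var trailA = PIXI.RenderTexture.create(renderer.width, renderer.height);
var trailB = PIXI.RenderTexture.create(renderer.width, renderer.height);

// One sprite shown on screen, one used only to copy the previous frame
// into the other buffer at reduced alpha (the "fade").
var trailSprite = new PIXI.Sprite(trailA);
var fadeSprite  = new PIXI.Sprite(trailA);
fadeSprite.alpha = 0.92;                 // how quickly the trail fades
stage.addChild(trailSprite);

function frame() {
  // 1. Copy last frame's trails into the inactive buffer, faded.
  //    The explicit `true` clears the target before the copy.
  fadeSprite.texture = trailA;
  renderer.render(fadeSprite, trailB, true);

  // 2. Draw this frame's particles on top of the faded copy (no clear).
  renderer.render(particles, trailB, false);

  // 3. Swap: B becomes the active buffer, A is the target next frame.
  var tmp = trailA; trailA = trailB; trailB = tmp;
  trailSprite.texture = trailA;

  // 4. Render the stage (which contains trailSprite) to the screen.
  renderer.render(stage);

  requestAnimationFrame(frame);
}
frame();
```

The detail I care about is the explicit clear flag: the faded copy clears its target, the particle pass does not.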
I have this pretty much working, but I am fighting two things: performance, and a weird bug that occurs only on-device (iPad 3/4, iOS 8/9). Because of that, I wanted to get some feedback on my approach.
I'd like to ask 2 things. First, in the setup above, there are 2 options I can imagine for swapping the textures: either swap the texture references on the sprite objects, or swap the sprites themselves. Since it would be annoying to add/remove the wrapper sprites on each "swap", I chose to simply swap the texture references on the sprites (roughly as in the snippet below). Any red flags around this? Seems ok to me.
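To make the two options concrete (the variable names here are placeholders of mine, not PIXI API):

```js
// Option A (what I do now): the wrapper sprites stay in the display list,
// only the texture references move between them.
var tmp = activeRT; activeRT = altRT; altRT = tmp;
trailSprite.texture = activeRT;

// Option B (what I'm avoiding): swap the wrapper sprites themselves,
// i.e. remove one child and add the other on the parent container every frame.
parent.removeChild(activeSprite);
parent.addChild(altSprite);
```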
The second question: is my PIXI setup around this high-level approach correct (2 renderTextures and 2 wrapper Sprites, roughly as in the sketch above)?
And of course, if anyone can think of a more optimal approach I'm all ears. It seems that no matter how you slice it, the "fade" requires touching every pixel in the buffer and will therefore always be an expensive operation. (The completely different approach of storing the trail history and re-rendering it every frame is a non-starter.)
Sincerely,
Michael Z.