Megaparticle Beginners' Thread
category: code [glöplog]
I've been messing with Mad's 4k particle implementation kindly provided in the download of http://www.pouet.net/prod.php?which=57507.
So far I've worked out how it works a bit. I could be totally wrong. :3
A 1,000,000-point 3D array is initialized with random positions. That is then sent to the GPU using GL buffer objects. The positions of the particles are then animated per frame with a geometry shader. What I don't get is what 4klang is doing here:
Code:
float lz = (&_4klang_envelope_buffer)[3*3+0]*.2f*(&_4klang_envelope_buffer)[3*4+0]+.2f; // (an operator seems to have been dropped in the paste; '*' assumed here)
VERTEXCOUNTANIMATED = (int)((float)VERTEXCOUNT*lz);
And why the particle draw function is called multiple times. And I have no idea what the 3d texture is doing. Is it for normals? Lighting?
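For anyone else digging through it, this is roughly how I picture the setup step. Just a minimal sketch with plain GL buffer objects and made-up names (VERTEXCOUNT etc. as above), not the actual intro code:
Code:
// sketch only: standard GL buffer setup for a big point cloud
// (assumes a context/loader that exposes GL 1.5+ buffer objects)
#include <GL/gl.h>
#include <cstdlib>
#include <vector>

static const int VERTEXCOUNT = 1000000;

GLuint createParticleBuffer()
{
    // fill a 1,000,000 x 3 float array with random positions in [-1,1]
    std::vector<float> positions(VERTEXCOUNT * 3);
    for (size_t i = 0; i < positions.size(); ++i)
        positions[i] = (float)rand() / RAND_MAX * 2.0f - 1.0f;

    // upload once; the geometry shader animates the points every frame,
    // so the CPU never has to touch the data again
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, positions.size() * sizeof(float),
                 positions.data(), GL_STATIC_DRAW);
    return vbo;
}

// per frame: bind the buffer and glDrawArrays(GL_POINTS, 0, VERTEXCOUNTANIMATED);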
I don't know about that particular piece of code, but depending on the complexity and solution you're using for letting the GPU handle things, you'll end up doing multiple passes to simulate, shade and draw.
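By multiple passes I mean something like this per frame, just to illustrate the structure (all names made up, not from any particular codebase):
Code:
// skeleton of a multi-pass GPU particle frame (names are made up)
void simulatePass(float t) { /* update particle state on the GPU */ }
void shadowPass()          { /* e.g. accumulate density/shadow data for shading */ }
void drawPass()            { /* draw the points into the framebuffer */ }

void renderFrame(float t)
{
    simulatePass(t); // pass 1: simulation
    shadowPass();    // pass 2: data needed for lighting/shading
    drawPass();      // pass 3: the visible result - which is why the same
                     //         particle draw call can legitimately run several times
}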
I didn't take a look yet at the code ;)
But from what mad told me (at least what I remember): The 3D texture is used in order to get the normal - by using central differences or something like that (So I wouldn't be surprised if that 3D texture contains an SDF).
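So something along these lines, i.e. the normalized gradient of the field is the normal. CPU-style sketch of the idea only; in the intro this would be texture3D lookups in the shader:
Code:
// central differences on a 3D scalar field (e.g. an SDF stored in a 3D texture):
// the normalized gradient is the surface normal at that point
#include <cmath>

struct Vec3 { float x, y, z; };

// stands in for a texture3D() fetch of the scalar field (N x N x N grid)
float sample(const float* field, int N, int x, int y, int z)
{
    return field[(z * N + y) * N + x];
}

// assumes 1 <= x,y,z <= N-2 so the neighbour samples stay in bounds
Vec3 normalFromField(const float* field, int N, int x, int y, int z)
{
    Vec3 g;
    g.x = sample(field, N, x + 1, y, z) - sample(field, N, x - 1, y, z);
    g.y = sample(field, N, x, y + 1, z) - sample(field, N, x, y - 1, z);
    g.z = sample(field, N, x, y, z + 1) - sample(field, N, x, y, z - 1);
    float len = std::sqrt(g.x * g.x + g.y * g.y + g.z * g.z) + 1e-8f;
    g.x /= len; g.y /= len; g.z /= len;
    return g;
}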
Yeah that would pretty much only work if the data can be _interpreted_ as an actual surface I guess. Nice idea though, hadn't considered this yet (head still stuck in more conventional particle systems).
That code takes the current envelope levels of two instruments from the song and uses them to scale the number of vertices that get animated, so the amount of visible particles follows the music.
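Right, so if I read it correctly the count then just limits the draw call, something like this (my guess at how it's used, not taken from the actual source):
Code:
// continuing the snippet from the first post: lz follows the two envelopes
VERTEXCOUNTANIMATED = (int)((float)VERTEXCOUNT * lz);
glDrawArrays(GL_POINTS, 0, VERTEXCOUNTANIMATED); // only draw/animate that many of the 1,000,000 points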