Need Help With Performance: Dynamic Particles

I have created this: https://ellie-app.com/T9NxzWvnd7a1

It generates a particle at a specified frequency across the XY axis for black pixels on a canvas. Then the particles grow and shrink at random. It creates quite a neat effect. However, it has a bit of a lag that I think revolves around the fact that every Tick creates a new Cmd to deal with randomness.

I am not sure where to focus to improve the lag and am looking for guidance from those a little more experienced.

That's a pretty neat effect! Just to make sure I understand what's going on here:

  1. Thanks to the use of a canvas, you retrieve the point coordinates corresponding to the rasterization of some text.
  2. You pass the point coordinates and the canvas size to the elm program at initialization.
  3. The canvas is not used anymore; all effects are done in SVG with elm, and what is called "canvas" in the elm code is simply the dimensions of what was the canvas and is now the SVG drawing.
  4. In elm, you randomly update all the points making up the text in SVG on every animation frame.

In my experience, the thing that hinders performance most is garbage collection. Since elm has immutable data structures, every function that updates data allocates new values and so induces garbage collection. Not for all of the data, since there is some structural sharing in the way the data structures are implemented, but garbage still.

I'm having a look to see if I can spot some obvious issues. PS: My laptop almost died after a few minutes of looking at the code (swap was reaching critically high levels XD)

You nailed it! Those are the exact steps. I imagine my particle update functions are inducing a lot of fun garbage collection, mapping through the list of particles. I wonder if keying the SVG nodes would help… that way thousands of SVG circles are not completely re-rendered, but only have their size updated.
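
Something like this is what I have in mind, a keyed version of the particle layer. This is only a sketch assuming elm/svg's Svg.Keyed, Elm 0.19-style String.fromFloat, and a hypothetical id field on each particle that the real code may not have:

```elm
import Svg exposing (Svg)
import Svg.Attributes exposing (cx, cy, r)
import Svg.Keyed

-- Hypothetical particle record; the id only exists to serve as a stable key.
type alias Particle =
    { id : String, x : Float, y : Float, radius : Float }

-- Keyed group of circles: the virtual DOM diff can match old and new nodes
-- by key instead of by position in the list.
viewParticles : List Particle -> Svg msg
viewParticles particles =
    Svg.Keyed.node "g"
        []
        (List.map
            (\p ->
                ( p.id
                , Svg.circle
                    [ cx (String.fromFloat p.x)
                    , cy (String.fromFloat p.y)
                    , r (String.fromFloat p.radius)
                    ]
                    []
                )
            )
            particles
        )
```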

I believe the virtual dom will reuse nodes instead of moving them if there are 'minimal' changes, not sure exactly how it behaves in this situation though. Would be interested to see if keyed nodes help.

Actually, my first thought was to offload this to the GPU. You've got a bunch of little points that you want to update. Should be possible with WebGL. Not sure how that would look though, it would require an interesting shader.
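
Just to make the idea concrete, here is a minimal sketch of drawing the points with elm-explorations/webgl, using a point mesh and a tiny shader. The Particle fields, the canvas size, and the assumption that coordinates are already in clip space are all made up for the sketch:

```elm
import Html exposing (Html)
import Html.Attributes exposing (height, width)
import Math.Vector2 exposing (Vec2, vec2)
import WebGL exposing (Mesh, Shader)

-- Hypothetical particle record for the sketch.
type alias Particle =
    { x : Float, y : Float, radius : Float }

type alias Vertex =
    { position : Vec2, size : Float }

-- One vertex per particle, drawn as GL points.
mesh : List Particle -> Mesh Vertex
mesh particles =
    WebGL.points
        (List.map (\p -> { position = vec2 p.x p.y, size = p.radius * 2 }) particles)

-- The vertex shader positions each point and sets its size;
-- coordinates are assumed to already be in clip space (-1 to 1).
vertexShader : Shader Vertex {} {}
vertexShader =
    [glsl|
        attribute vec2 position;
        attribute float size;
        void main () {
            gl_Position = vec4(position, 0.0, 1.0);
            gl_PointSize = size;
        }
    |]

-- The fragment shader just paints every point black.
fragmentShader : Shader {} {} {}
fragmentShader =
    [glsl|
        precision mediump float;
        void main () {
            gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
        }
    |]

view : List Particle -> Html msg
view particles =
    WebGL.toHtml
        [ width 800, height 600 ]
        [ WebGL.entity vertexShader fragmentShader (mesh particles) {} ]
```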

Yes, particles are typically the kind of thing that is better handled by parallelized hardware like a GPU, so strictly performance-wise the best approach would probably be WebGL.

This is a lot of fun so I played a bit with the code. According to Chromium and Firefox performance analysis, almost 1/3 of the time is already spent just on the browser's style computation + rendering. The time spent in the VDOM is also quite big, since all SVG elements are recomputed.
So in my opinion, you won't be able to get a huge boost anyway. There also seems to be a performance issue when running the code in Ellie (probably because of the debugger) that makes it a lot slower than running it directly outside Ellie.

Using Html.Keyed will not help you on this one. Keys do not prevent the VDOM implementation from performing a diff. What is sometimes useful is Html.Lazy, but in your case it will not help since all points are updated on every animation frame, and Lazy does not work at the attribute granularity.
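
To illustrate why Lazy buys nothing here, a rough sketch with made-up field names and Elm 0.19-style code: Svg.Lazy.lazy only reuses the cached node when its argument is unchanged from the previous render, and here every particle's radius changes on every animation frame, so the cache never hits.

```elm
import Svg exposing (Svg)
import Svg.Attributes exposing (cx, cy, r)
import Svg.Lazy

-- Hypothetical particle record for the sketch.
type alias Particle =
    { x : Float, y : Float, radius : Float }

viewParticle : Particle -> Svg msg
viewParticle p =
    Svg.circle
        [ cx (String.fromFloat p.x)
        , cy (String.fromFloat p.y)
        , r (String.fromFloat p.radius)
        ]
        []

-- lazy caches per particle value; since every radius changes each frame,
-- every circle ends up being re-rendered anyway.
viewParticles : List Particle -> Svg msg
viewParticles particles =
    Svg.g [] (List.map (Svg.Lazy.lazy viewParticle) particles)
```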

All that being said, I played a lot with the code, and I can share another version I ended up with. The key difference is that I'm keeping a Seed for random generation, so that I trigger a random generation only when it is actually needed for each particle. You can have a look in this Ellie, but I recommend running it outside Ellie.

https://ellie-app.com/TmJLmt2LqNa1
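
The core of the change looks roughly like this: a Seed lives in the Model and Random.step replaces Random.generate, so randomness is produced synchronously in update without a Cmd round-trip. All names and ranges below are illustrative, not the exact code in the Ellie:

```elm
import Random exposing (Generator, Seed)

-- Hypothetical particle record for the sketch.
type alias Particle =
    { x : Float, y : Float, radius : Float }

type alias Model =
    { seed : Seed
    , particles : List Particle
    }

-- Generator for a fresh particle; the ranges are made up.
spawn : Generator Particle
spawn =
    Random.map3 Particle
        (Random.float 0 800)
        (Random.float 0 600)
        (Random.float 1 4)

-- Stepping the generator returns both the value and the next seed,
-- so no Cmd is needed to obtain random values.
addParticle : Model -> Model
addParticle model =
    let
        ( particle, nextSeed ) =
            Random.step spawn model.seed
    in
    { model | seed = nextSeed, particles = particle :: model.particles }
```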


What is the reason for actually doing all the circle effects with SVG? I think that using a canvas for this would probably be a lot faster, since the browser then does not have to keep track of each circle, and rather than a VDOM diff, you'd just end up re-drawing the circles to a flat picture.

If by canvas you mean the 2D canvas, its API is not usable from elm since it requires JS function calls or typed array manipulation. It is doable with ports though, as are most things not directly doable in elm.
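
For example, a hypothetical port (not something in the shared code) could push the plain particle data out after each update, and a few lines of JS would draw them on a 2D context:

```elm
port module Particles exposing (drawParticles)

-- Hypothetical port: elm sends the particle data out, and the JS side
-- subscribes with app.ports.drawParticles.subscribe(particles => ...)
-- to clear the canvas and draw one filled arc per particle on a
-- CanvasRenderingContext2D.
port drawParticles : List { x : Float, y : Float, radius : Float } -> Cmd msg
```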

Alternatively, one can use evancz/elm-graphics, which is backed by the canvas API. I've updated the Ellie to offer the choice between SVG and elm-graphics: https://ellie-app.com/TqwkM4Sw8pa1. The performance with elm-graphics is worse. I'm not familiar enough with WebGL to make a WebGL version.
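
Roughly, the elm-graphics side looks like this. This is a sketch against the 0.18-era Collage API with made-up particle fields, and note that Collage coordinates are centered on the origin rather than at the top-left corner like SVG:

```elm
import Collage exposing (circle, collage, filled, move)
import Color
import Element
import Html exposing (Html)

-- Hypothetical particle record for the sketch.
type alias Particle =
    { x : Float, y : Float, radius : Float }

-- Draw every particle as a filled black circle into a single flat picture,
-- instead of keeping one DOM node per particle.
viewParticles : ( Int, Int ) -> List Particle -> Html msg
viewParticles ( w, h ) particles =
    particles
        |> List.map (\p -> circle p.radius |> filled Color.black |> move ( p.x, p.y ))
        |> collage w h
        |> Element.toHtml
```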

Have you tried running it outside of Ellie, without the debugger running?

Ellie has the debugger enabled, which means that every state of the Model and every Msg is kept in memory so that you can travel back in time and examine them. Whilst useful for debugging, when it comes to animation it can amount to a serious memory leak.

My laptop is a beast with 64GB of RAM, not that Chrome is likely configured to let me use all of that on every SPA, but… I found that after running for a few minutes the page would crash (Chrome/Linux). That makes me suspect a memory leak; unfortunately I was not able to run a performance profile, as that tended to crash too.

If it is still too laggy running without the debugger, it may well be that you are reaching the limits of SVG. SVG keeps a DOM node for every element of the image, and this really limits how much you can animate at once. It would likely perform much better in WebGL.

Interested to hear how you get on, as I am also doing some animation in SVG at the moment.


This ran much faster.


Can you explain why this version is so much faster? I looked into the Random code and can see that calling Random.generate gives you a Cmd which that module then evaluates as an 'effect'. However, it too keeps a seed and chains it along the random generation, just as you do in your improved version of the code. So why the difference? Is there something inherently less performant about running a Cmd?

If you observe the particles you will see that they follow the pattern: creation -> growing -> shrinking -> creation. Creation is the only moment when an actual random generation is needed; the rest of the time, the particle's life is pre-determined. In the original code, @wking-io is regenerating all the particles via the generator Cmd mechanism, using Random.constant when not at the creation step. In the code I shared, data is only "generated" at the creation step, saving a lot of extra data structure creation I think.
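
In code, the lifecycle idea looks roughly like this (phase names, fields, and growth steps are all made up for the sketch): the seed is only consumed in the creation branch and passes through untouched otherwise.

```elm
import Random exposing (Seed)

type Phase
    = Creating
    | Growing
    | Shrinking

-- Hypothetical particle record for the sketch.
type alias Particle =
    { x : Float, y : Float, radius : Float, phase : Phase }

-- Only the Creating branch consumes randomness; growing and shrinking are
-- fully deterministic, so no generator (and no Cmd) is involved for them.
updateParticle : Seed -> Particle -> ( Particle, Seed )
updateParticle seed particle =
    case particle.phase of
        Creating ->
            let
                ( startRadius, nextSeed ) =
                    Random.step (Random.float 1 4) seed
            in
            ( { particle | radius = startRadius, phase = Growing }, nextSeed )

        Growing ->
            ( { particle | radius = particle.radius + 0.1 }, seed )

        Shrinking ->
            ( { particle | radius = particle.radius - 0.1 }, seed )
```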

I also grouped the update processing into one call instead of pipelining functions that each update only 1 or 2 fields of the record, reducing the amount of intermediate data structure creation. This had a very tiny impact from what I recall.
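
Concretely, the difference is roughly this (field names are made up for the sketch):

```elm
-- Hypothetical particle record for the sketch.
type alias Particle =
    { radius : Float, opacity : Float }

-- Pipelined style: each step allocates an intermediate record,
-- so two extra records per particle per frame here.
updatePipelined : Float -> Particle -> Particle
updatePipelined delta particle =
    particle
        |> (\p -> { p | radius = p.radius + delta })
        |> (\p -> { p | opacity = p.opacity - delta })

-- Grouped style: compute everything and build the updated record once.
updateGrouped : Float -> Particle -> Particle
updateGrouped delta particle =
    { particle
        | radius = particle.radius + delta
        , opacity = particle.opacity - delta
    }
```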

