Run pure computation without blocking the UI?

Hello,

Say I have a text input where the user can type text. Each time the user types a new letter, I want to compute something about the text. The computation should usually be fast, but it might sometimes take up to a second or two. I’d like to keep the cursor in the text input from freezing while the computation is running. This is a pure computation, done by the Elm code; there’s no IO involved.
Is there a way I can perform some kind of async work to avoid freezing the UI?

Thanks.

Could you please describe your problem in a bit more depth?
E.g. what is (at a high level) the content of the input, and what computation do you want to do on the text?

There might be different suggestions for dealing with the problem depending on the details, but right now it’s a bit too zoomed in 🙂

Adding it as a Task may help. Task - core 1.0.5

Edit: it doesn’t help.

I am currently using `Gizra/elm-debouncer` 2.0.0 and it works well for that kind of use case.

You can Json.Encode the arguments to the function and send them to a Web Worker to run in a separate thread.

Or you can chop the compute job up into tens, hundreds, or thousands of pieces using continuations, and pass them through your own update function, so that other Cmds can execute without being completely blocked. This is a form of cooperative multitasking (the sort of thing primitive OSs did in the 90s; remember terminate-and-stay-resident apps in MS-DOS?). I have tried this technique and do not recommend it: no matter how large or small you make the chunks, it still leaves the application noticeably less responsive, since you max out a CPU core at 100%.
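To make the chunking idea concrete, here is a minimal sketch of feeding a continuation back through `update` (all names — `Computation`, `Step`, `chunkSize`, the toy accumulator — are illustrative, not from any real codebase). Each `Step` message runs at most `chunkSize` iterations and then re-enqueues itself, so user input can interleave:

```elm
module Chunked exposing (Msg(..), update)

import Task

type alias Model =
    { result : Maybe Int }

-- The paused state of a pure computation: either still going
-- (with its captured arguments) or finished with a result.
type Computation
    = Ongoing Int Int -- iterations remaining, accumulator
    | Done Int

type Msg
    = Step Computation

chunkSize : Int
chunkSize =
    1000

-- Run one iteration of the (stand-in) computation.
step : Computation -> Computation
step comp =
    case comp of
        Done _ ->
            comp

        Ongoing 0 acc ->
            Done acc

        Ongoing remaining acc ->
            Ongoing (remaining - 1) (acc + remaining)

-- Run up to n iterations before yielding back to the runtime.
stepN : Int -> Computation -> Computation
stepN n comp =
    if n <= 0 then
        comp

    else
        case comp of
            Done _ ->
                comp

            _ ->
                stepN (n - 1) (step comp)

update : Msg -> Model -> ( Model, Cmd Msg )
update msg model =
    case msg of
        Step comp ->
            case stepN chunkSize comp of
                Done result ->
                    ( { model | result = Just result }, Cmd.none )

                stillGoing ->
                    -- Re-enqueue the continuation; other Cmds and user
                    -- input get a chance to run in between steps.
                    ( model, Task.perform Step (Task.succeed stillGoing) )
```

Cancellation falls out for free: on a new keystroke, just stop re-enqueuing (or replace the `Computation` in flight with a fresh one). But as noted above, the core is still pegged at 100% while it runs.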

I recommend looking into the web worker approach, as it can genuinely make use of more than one CPU core. You can use Platform.worker to create an Elm program that you run inside a web worker. If you want the web worker thread to be interruptible (say the user presses Cancel, or adds another keystroke before the last one completes), then have it chop the job up into lots of pieces too, so that it can periodically poll for a cancel message. Yes, cooperative multitasking style…
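A minimal `Platform.worker` skeleton might look like this (the port names, the `expensiveCompute` stand-in, and the string payload are all assumptions for the sketch):

```elm
port module Worker exposing (main)

import Json.Decode as Decode
import Json.Encode as Encode

-- Inbound: the main thread posts the encoded arguments here.
port request : (Decode.Value -> msg) -> Sub msg

-- Outbound: the computed result goes back to the main thread.
port respond : Encode.Value -> Cmd msg

type Msg
    = GotInput Decode.Value

main : Program () () Msg
main =
    Platform.worker
        { init = \_ -> ( (), Cmd.none )
        , update = update
        , subscriptions = \_ -> request GotInput
        }

update : Msg -> () -> ( (), Cmd Msg )
update (GotInput value) () =
    case Decode.decodeValue Decode.string value of
        Ok text ->
            ( (), respond (Encode.string (expensiveCompute text)) )

        Err _ ->
            ( (), Cmd.none )

-- Stand-in for the real pure computation.
expensiveCompute : String -> String
expensiveCompute text =
    String.reverse text
```

On the JS side you would load the compiled program inside `new Worker(...)` and bridge the worker’s `postMessage`/`onmessage` to the Elm ports; the main thread stays free while the worker computes.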

Also, here is an example of a computation that uses continuations to chop the work up into smaller steps; this one has a configurable step size `n`. The `Ongoing` constructor creates a continuation of the search. It’s usually fairly easy to do this with a functional-programming-style computation, since at the recursion step you just insert a continuation:

`recurseFn a b c` becomes `\() -> recurseFn a b c`, or it could be `RecurseFn a b c` if you capture the recursive arguments in a custom type constructor, as I did with `Ongoing`.

https://package.elm-lang.org/packages/the-sett/ai-search/latest/Search#nextN

We’ve had a similar problem (well, it’s hard to say until we hear more details) at my previous job: a JSON response contained ~20 MB of a tree structure: folders of (folders of) Questions, which in turn contained Datapoints and Suffixes.

Making a tree out of it at the time of decoding would freeze the UI for 5-10 seconds. Instead we amortized the cost by creating a lazy tree out of it, with unevaluated thunks (\() -> ...) where each thunk did just a little bit of the computation (finding the current item’s children etc.)

Since not all the work happened at the same time, and the amount of work needed as the user expanded another level of the tree was small enough, this was enough to de-freeze our UI. In the end, most of the work was never needed, as users only explore a fraction of the tree in any given session!
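The lazy-tree idea above can be sketched like this (a simplified shape, not our actual types; `childrenOf` is a hypothetical function that finds an item’s children in the flat decoded data):

```elm
module LazyTree exposing (LazyTree(..), buildLazy, expand)

-- A tree whose children sit behind a thunk, so no work is done
-- until the node is actually expanded.
type LazyTree a
    = Node a (() -> List (LazyTree a))

-- Build the tree without doing any work up front; the child lookup
-- only runs when a thunk is forced.
buildLazy : (a -> List a) -> a -> LazyTree a
buildLazy childrenOf item =
    Node item (\() -> List.map (buildLazy childrenOf) (childrenOf item))

-- Force one level of the tree, e.g. when the user expands a node.
expand : LazyTree a -> List (LazyTree a)
expand (Node _ thunk) =
    thunk ()
```

Each `expand` call does only one node’s worth of lookup, which is what amortizes the 5–10 second decode cost across the user’s interactions.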

FWIW see also this answer I wrote:
