Cmd.batch sometimes encourages ignoring race conditions?

From my limited exposure to Cmd.batch in a codebase, it seems like at least 20% of the time the batched commands could introduce race conditions, because people might not reason about whether sequencing is needed. Are people using elm-review or other guidance to help prevent unknowingly using Cmd.batch improperly?

Could you add an example of such a race condition?

One example was using a JS API that sent data to an external service. The JS API's documentation says you should finish all method1 calls before any method2 calls begin. There was a port for each of the two API methods, and various people wrote Cmd.batch [ method1Cmd, method2Cmd ]. This could have been caught in code review, but it wasn't. And even if it had been caught, there is some inertia against correcting it, since you have to introduce state-machine-like logic to sequence the calls properly.
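A minimal Elm sketch of the problematic pattern (the port and type names here are hypothetical, not the actual codebase's):

```elm
port module Api exposing (Msg(..), update)

-- Hypothetical ports, one per JS API method
port method1Cmd : () -> Cmd msg
port method2Cmd : () -> Cmd msg

type alias Model =
    {}

type Msg
    = Submit

update : Msg -> Model -> ( Model, Cmd Msg )
update msg model =
    case msg of
        Submit ->
            -- Race: Cmd.batch makes no ordering guarantee, so the JS
            -- side may see method2 before method1 has finished.
            ( model, Cmd.batch [ method1Cmd (), method2Cmd () ] )
```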

I would not rely on any execution order in Cmd.batch. Instead, have a single methodCmd port and solve this issue on the JS side by calling the two methods in sequence, for example with promises.
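A sketch of what that could look like on the JS side (the handler and method names are assumptions for illustration): the single port handler awaits method1 before calling method2.

```javascript
// Sketch: run the two external API methods strictly in sequence.
// `api` stands in for the real external API object.
async function handleMethodCmd(payload, api) {
  // method1 must fully complete before method2 starts
  const first = await api.method1(payload);
  const second = await api.method2(payload);
  return [first, second];
}

// Wiring it up to a single Elm port would look roughly like:
// app.ports.methodCmd.subscribe(payload => handleMethodCmd(payload, externalApi));
```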

I understand that specific issue and would consider those approaches, although I prefer an Elm solution with a state machine over a JS escape hatch. (I would lean towards Rescript if I didn't want Elm's purity.)

My broader point is that these sorts of issues pop up repeatedly: we can solve this one, but that won't help much in preventing the next one in another use of Cmd.batch.

Maybe you can write a custom elm-review rule for your problem case. It could catch the disallowed uses of Cmd.batch.

I think the difficulty is that there is no general rule here. Sometimes you don't care what order effects happen in, other times you do. Is there some automatic way to distinguish? I don't think there is without understanding the intention behind a program.

Always a good idea to draw out the state machine for each part of your UI.

I think there is not, unless you encode the relationships between data/requests, which is probably too difficult.

I somewhat agree that Elm should give you a way to indicate that order matters to you.
One easy way to do this would be to go back to Task and give us andThen (yes, that's basically how many other pure languages do it: Haskell and PureScript, for example).
I know the Elm community is very much against this (because you could/would end up with logic there instead of in update, I guess), but this is one of my bigger concerns right now.
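For what it's worth, Task.andThen does already exist for values that are Tasks; what Elm lacks is a way to sequence arbitrary Cmds such as port calls. A hypothetical sketch of chaining two tasks (imagine these wrapping real Http.task requests):

```elm
import Http
import Task exposing (Task)

-- Hypothetical tasks standing in for real HTTP requests
login : Task Http.Error String
login =
    Task.succeed "token"

fetchProfile : String -> Task Http.Error String
fetchProfile token =
    Task.succeed ("profile for " ++ token)

type Msg
    = GotProfile (Result Http.Error String)

-- The second task only starts after the first has succeeded
loadProfile : Cmd Msg
loadProfile =
    login
        |> Task.andThen fetchProfile
        |> Task.attempt GotProfile
```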

Cmd.batch is specifically for situations where you don’t care about order (the docs even warn about this). As others have mentioned, it’s tricky to catch this automatically since there are valid uses of Cmd.batch.

One way might be to write an alias

withoutOrder : List (Cmd a) -> Cmd a
withoutOrder =
    Cmd.batch

and use a custom elm-review rule to force all code to use it instead of Cmd.batch. That way authors will be more aware that this function is order-independent.

For situations where order does matter, there are a few common solutions. From preferred to least preferred (IMO) they are:

  1. Task is the best solution but doesn’t support every kind of side effect (you can chain HTTP requests with Task, but not port calls)
  2. Having a separate message that triggers when each step completes. update then returns a single command for the next step (e.g. a Method1Complete message whose branch returns (model, method2Cmd))
  3. Sequencing the work on the JS side and only triggering a single port command
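Option 2 can be sketched like this (the names are hypothetical, and Method1Complete would arrive via a port subscription or HTTP response):

```elm
-- Hypothetical ports for the two steps
port method1Cmd : () -> Cmd msg
port method2Cmd : () -> Cmd msg

type alias Model =
    {}

type Msg
    = StartSequence
    | Method1Complete

update : Msg -> Model -> ( Model, Cmd Msg )
update msg model =
    case msg of
        StartSequence ->
            -- only kick off the first step here
            ( model, method1Cmd () )

        Method1Complete ->
            -- the second step is issued only after the first reports back
            ( model, method2Cmd () )
```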

Also, I think this package lets you chain Cmds sequentially:

https://package.elm-lang.org/packages/brian-watkins/elm-procedure/1.1.0/

Looks very cool, still grokking it.

Out of curiosity, if I ended up using this, would it mean that I couldn’t simulate these Cmds with elm-program-test the way I could with the more verbose state machine approach?

This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.