Help gather data on build times!

What are build times like as Elm 0.19.1 projects grow larger?

I feel like Elm 0.19.1 is relatively fast for a compiler, but I want to measure it. Some people have already contributed data. Here are some stats when using the --output=/dev/null flag:

  • 136,049 lines with worst incremental time of 283ms (here)
  • 12,016 lines with worst incremental time of 282ms (here)

There’s no typo there! So I think this data is going to be pretty interesting, but I need your help to gather more data points!
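For context, an incremental time like the ones above can be measured by hand: touch one module, then time a compile with `--output=/dev/null` so no JavaScript is written. A minimal sketch, assuming GNU coreutils `date` (the `elm make` call is commented out so the snippet runs even without an Elm project on hand):

```shell
# Time one incremental build in milliseconds (GNU date's %N gives nanoseconds).
start=$(date +%s%3N)
# touch src/Main.elm && elm make src/Main.elm --output=/dev/null
end=$(date +%s%3N)
echo "build took $((end - start))ms"
```

measure-elm-make automates this kind of measurement across every file in the project, which is why running it by hand is not necessary.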

How to Help

I created a measure-elm-make executable that gathers a bunch of data about your project: build time from scratch, incremental build times, line counts, etc.

Step 1: Navigate to your largest Elm 0.19.1 project, and download measure-elm-make with these instructions.

Step 2: Run it in your largest Elm 0.19.1 project. It takes one argument: the main entry point of your application. So calling it might look like this:

./measure-elm-make src/Main.elm

It will run the compiler a bunch of times to gather data (twice for each Elm file) and then provide instructions on how to share the data.

Run this on your normal development machine. The goal is to get numbers that are representative of day-to-day development, not to get the best numbers possible.

Stay focused on the terminal while measure-elm-make runs. The goal is to get data on “what it is like in practice” so it is okay to have a bunch of applications open (whatever you would normally have open!) but actively using them while this script runs will make the data on incremental build times noisy and confusing.

Turn off any file watchers. Running this alongside tools like elm-live that run elm-make when a file is touched will mess this data up pretty badly.

It prints out a lot of repetitive stuff. That’s expected! I want it to be as realistic as possible, and doing the terminal animations with \r may have some performance impact. The bigger your project, the longer it will be printing stuff out since it is running elm make twice for each file in your project.
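The "twice per file" behavior described above can be pictured as a loop over the project's modules. A simplified sketch (not the real measure-elm-make; the `elm make` call is commented out so the loop itself runs without an Elm project, and the directory name is made up for illustration):

```shell
# Touch each module, then time a fresh compile for each touch.
demo=$(mktemp -d)
touch "$demo/Main.elm" "$demo/Page.elm"
for f in "$demo"/*.elm; do
  touch "$f"                                   # invalidate just this module
  # elm make src/Main.elm --output=/dev/null   # the step the tool times
  echo "would measure: $f"
done
```

This is why the runtime grows with project size: two compiles per file adds up quickly on large codebases.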

I am especially keen on getting data from projects with more than 50k lines of code, so please give it a shot if you are in that group!

Troubleshooting

Please ask about anything weird you run into in this thread or in the #core-coordination channel in Slack.

Hi. We have an Elm project of around 80k LOC. I would like to share results for our code, but we have many Elm programs. All of the programs share one runtime in the end, but there is no single main entry point. Would you still be interested in data for our project, and does the executable support such a setup? Thank you.

@akoppela, the binaries are only set up for a single entry point. Are you running elm make src/A.elm src/B.elm src/C.elm as you develop, or only at certain times? I was surprised to hear from someone that they run something like that with a file watcher. Maybe you have a similar build command?

(Aside: the perf for 80k lines with one entry point should be the same as for 80k lines with multiple entry points, so whether it is worth building new binaries may depend on how much data is coming in.)

I’m also running something like this. I have a project with six separate Elm entry points (with a fair amount of shared code between some of them). They get compiled with a Parcel command that includes all of them and runs as I develop.

We use Webpack, and under the hood it runs elm make src/A.elm src/B.elm src/C.elm. Please let us know if we can share stats at this time, or maybe we can do it next time. Thank you for the smart work, as usual :wink:

I guess this scenario is more common than I imagined based on what I knew, so I updated the binaries on the release page to handle multiple entry points. If you download them now, this means:

# if you normally compile with
elm make src/Home.elm src/Settings.elm src/Shop.elm --output=elm.js

# you can now run the measurements like this
./measure-elm-make src/Home.elm src/Settings.elm src/Shop.elm

The names of the entry points will be included in the OVERVIEW as before, but no names are included in the build.log file.

But only call it this way if that’s what you do during normal development! Do not try to get your line count higher by running it in ways you never do during day-to-day development!

So if you have multiple entry points, you can now follow the directions in the original message and get a build.log file.

I get:

-- DEBUG REMNANTS --
...
...

ERROR: Ran into a problem when running:

    elm make src/Main.elm --output=elm-stuff/0.19.1/temporary/elm.js --optimize

Your project must compile successfully for these benchmarks to work.

which means we won’t be able to contribute our 100k+ LOC app to this benchmark.

I started removing Debug.log/Debug.toString calls, but then I was reminded that we use EveryDict, which internally uses Debug.toString for its unique keys. :frowning:

@mordrax, we use turboMaCk/any-dict at work and like it! But it needs you to give it toComparable functions for the dicts, so it doesn’t easily solve the problem.

assoc-list might, but there are probably some performance hits, so it’s not something to be done blindly (and/or in production)…

I’m hitting an issue with the Windows binary …

C:\workspace\app\src\Web\elm > .\measure-elm-make.exe .\src\Main.elm
ERROR: Could not find `elm` on your PATH.
C:\workspace\app\src\Web\elm > elm
Hi, thank you for trying out Elm 0.19.1. I hope you like it!

-------------------------------------------------------------------------------
I highly recommend working through <https://guide.elm-lang.org> to get started.
It teaches many important concepts, including how to use `elm` in the terminal.
-------------------------------------------------------------------------------

Anything I can do to help with resolving this?

You have to install elm globally (edit: using the official installer), or add the directory containing your elm binary (maybe some node_modules subdirectory) to your PATH:

set PATH=%PATH%;C:\path\to\elm\binary\directory\

I appreciate the suggestion. Still not working, though.

I have it installed globally via npm install -g; I was hoping the output from just running `elm` would convey that.

For curious parties, I made a gist of what I’m seeing (adding platform info to the gist).

The reason I was hoping to contribute is that we’re developing on Windows, and the application is reasonably sized at 31,251 SLoC total.

Try to add C:\Users\andrew.lenards\AppData\Roaming\npm\node_modules\elm\bin\ to your PATH.

Installing elm using the official Windows installer should also most likely work (it can be removed afterwards).

Use the normal Windows installer that @dmy linked to. It comes with an uninstall.exe that removes everything.

Why? The way executables work with npm on Windows is pretty confusing. More specifically, if you install a package that provides a `binary` command, npm creates two scripts: one bash script named `binary` and one batch file named `binary.cmd`. Both of these files just call `node index.js` to do whatever work is specified by the package. There is no other way to do it as far as I know.

The issue is that Haskell’s default function for searching for binaries does not appear to consider the PATHEXT environment variable on Windows, so it does not recognize `.cmd` files when it searches. (One implication of all this is that if you install with npm on Windows, you always run a batch file, initialize the Node.js runtime, and then run the Elm binary.)
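For illustration, such a `.cmd` shim looks roughly like this (a simplified sketch based on the description above, not the exact file npm writes; the path is made up):

```bat
:: Simplified sketch of an npm-generated .cmd shim on Windows.
:: It just hands the real work to node, forwarding all arguments.
@ECHO off
node "%~dp0\node_modules\elm\bin\index.js" %*
```

Since a tool searching PATH for `elm` without honoring PATHEXT will never match `elm.cmd`, the shim is invisible to it even though typing `elm` in the terminal works fine.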

There’s something odd going on with the reports: 3 out of 4 of the Windows benchmarks have the same line count.

Edit: It looks like the correct line count is shown in the body of the report. It’s just the title that’s wrong.

Ah, that is the number that appears in the instructions for what the title should be. I guess it’s clearer to folks on non-Windows that that number should be swapped out. I updated the issues. Thanks @MartinS!

Using the installer as @dmy instructed works.

Thank you to both of you. I appreciate the assistance (and I do realize that npm executables on Windows are non-trivial).
