What we are doing at my company is using Docker to build everything; in particular, we run the elm compiler from a Docker container. This means we do not care which specific machine the CI runs on, as long as it has Docker installed. Since we build not only elm projects but also C++, Fortran, etc., this is quite convenient and scales well.
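To illustrate the idea, here is a minimal sketch of such an invocation (the image name `elm-build:0.19.1` and the paths are made-up examples, not our actual configuration):

```shell
# Run the elm compiler from a container instead of installing it on the runner.
# "elm-build:0.19.1" is a hypothetical image containing only the elm binary.
# -v mounts the project into the container, -w sets the working directory.
docker run --rm \
  -v "$PWD":/app \
  -w /app \
  elm-build:0.19.1 \
  elm make src/Main.elm --output=elm.js
```

The only thing the CI machine needs is a working Docker installation; the toolchain itself lives in the image.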
We used to simply put the 0.19.1 directory under version control, which is inconvenient because it bloats the repository and duplicates dependencies for each elm project. Then we tried removing those submodules (i.e. downloading from the internet at each fresh build), and literally four days later the elm package repository broke…
The problems I have with caching the .elm directory for CI are:
- Each project has different dependencies, so your .elm cache eventually grows into a very large subset of the whole elm package registry
- CI jobs can no longer be executed on any runner (the runners stop being tool-agnostic)
In this thread, Evan says that he does not want to enable private repositories because it would add a lot of complexity to the package system. I think that simply allowing dependencies from any git repository (not just Github) would be sufficient: Github would only be required for published packages. This is more or less the idea developed here.
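Concretely, this could look like an ordinary dependency entry that points at a git URL instead of a registry name. To be clear, this is purely hypothetical syntax to illustrate the idea; elm.json does not support anything like it today, and the package name and URL are invented:

```json
{
    "dependencies": {
        "direct": {
            "elm/core": "1.0.5",
            "my-company/internal-ui": "git+ssh://git@git.example.com/internal-ui.git#1.2.0"
        }
    }
}
```

Pinning to a tag or commit in the URL would keep builds reproducible without involving the central registry at all.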
It would bring the following benefits:
- Use private packages instead of git submodules
- No single point of failure (the elm package repository currently is one)