[Nix-dev] Incremental recompilation

Nicolas Pierron nicolas.b.pierron at gmail.com
Sat Apr 30 17:39:39 CEST 2016


On Fri, Apr 29, 2016 at 7:56 PM, stewart mackenzie <setori88 at gmail.com> wrote:
> IMHO, incremental compilation is a strong boon for Nix adoption. It is the
> only serious issue I can see which prevents Nix from being a Make replacement.

I agree with you on this point; Nix has the potential of being a
better ccache & distcc in one shot.

Note, this is not the same issue as the one mentioned in Volkhov's
email.  That email suggests caching the artifacts of compilations
spawned by Nix, independently of the build system, while a Make
replacement involves replacing/wrapping the existing build system and
using Nix to cache the artifacts of compilations spawned in
user-land.  But, I guess a solution to the second problem could be a
means of solving the first.

Also, I admit I have thought about how to solve this issue before,
but the main problem I ran into is not even the build itself; it is
simply making a copy of the source directory into the Nix store.
When your project sources are larger than 1 GB (excluding the VCS),
you don't want to read every byte before any compilation can start.
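
For context, here is the naive pattern that forces the full copy; a
minimal sketch, where the project name and build phases are only
illustrative:

    with import <nixpkgs> {};
    stdenv.mkDerivation {
      name = "my-project";
      # `src = ./.` imports the whole source tree into the Nix store
      # before the build even starts; on a tree larger than 1 GB this
      # means hashing and copying every byte up front.
      src = ./.;
      buildPhase = "make";
      installPhase = "make install PREFIX=$out";
    }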

Thus, none of the previously proposed approaches are practical for
large projects.  So, how can we make this scale?

The only approach I can think of would be the following (see the
sketch after this list):
 - Upload each individual file to the Nix store as it is needed.
 - Make a symlink farm to re-build fake source directories.
 - Use hash-by-content to prevent recompilation.  (When using "gcc
-M" to filter out the directory content, you want to make sure you
get the same set of headers, and only trigger a recompilation if one
of the headers changed.)
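
Here is a minimal sketch of how these three steps could fit together
as a Nix expression.  It is only an illustration of the idea, not an
existing implementation: builtins.path, linkFarm, and the file names
are assumptions about how one might wire it up.

    let
      pkgs = import <nixpkgs> {};

      # Step 1: import a single file into the store.  builtins.path
      # is content-addressed, so an unchanged file keeps the same
      # store path and does not invalidate anything depending on it.
      fileInStore = path: builtins.path { inherit path; };

      # Step 2: re-build a fake source directory as a symlink farm
      # over the per-file store paths.  `files` maps relative names
      # in the fake tree to the real files.
      sourceFarm = name: files:
        pkgs.linkFarm name (pkgs.lib.mapAttrsToList
          (n: p: { name = n; path = fileInStore p; }) files);

      # Step 3: one derivation per translation unit.  Its only inputs
      # are the .c file and the headers reported by "gcc -M", so it
      # is rebuilt only when one of those files actually changed.
      compile = name: cFile: headers:
        pkgs.runCommand "${name}.o" {
          src = sourceFarm "fake-src"
            ({ "${name}.c" = cFile; } // headers);
        } ''
          gcc -c -I $src $src/${name}.c -o $out
        '';
    in
      compile "main" ./main.c { "util.h" = ./util.h; }

Because the store paths are content-addressed, touching a file
without changing its content keeps the same store paths, so none of
the per-unit derivations has to be rebuilt.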

-- 
Nicolas Pierron
http://www.linkedin.com/in/nicolasbpierron - http://nbp.name/

