[Nix-dev] Distributing unreleased software

Marc Weber marco-oweber at gmx.de
Sun Dec 13 00:19:02 CET 2009


Hi,

Eelco Dolstra: Thank you for your review. I didn't know that lib is
passed by stdenv. I should have known earlier.

I'm fine with putting all this stuff into a dev branch.
But I'm not fine with merging svn branches at all. It takes too much
time. Or I have never learned how to do it properly.

If this should be a separate branch I'll just add another overlay the
way I've done for haskell and upload it to github.
I'm not going to merge changes from trunk into a -head repository daily.

There were additional reasons why I chose not to commit the haskell
stuff to the main repository:
  a) many updates
  b) very experimental
  c) building is slow

But only a) may apply to this use case now.

Eelco Dolstra: About groups and two function arguments:

    Very often you have a set of packages which you have to update at the
    same time. Use cases:
      a) netsurf and dependencies
      b) xorg
      c) various groups of haskell packages

    Then you can use the group name to update them all at once.
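
    As a sketch of what such a grouping could look like in a package
    expression (the `groups` attribute and its spelling are my assumption
    here, not the current interface):

      # hypothetical sketch: annotate a package with a group name so the
      # repository manager can update all members of "haskell" in one run
      src = sourceFromHead {
        name   = "hack-nix";      # identifies the local repository
        groups = [ "haskell" ];   # updated together with the other haskell packages
      } (throw "not published");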

    Eelco Dolstra: You're also right that I should add a special attr called
    version containing only the revision part, so that I can replace "-devel"
    by "-head-rev-233" or such.

  I tried explaining the phases in the README: http://github.com/MarcWeber/nix-repository-manager
  But I'll repeat them here for convenience:
    Updating a package is done in two phases:

    a) get sources and build package

    b) package works so upload it.


    After step a) it looks like this. The first argument is used to
    identify the local repository file

      # REGION AUTO UPDATE: ...
      src = sourceFromHead "hack-nix-07c4f810c13183325cd920919e7fb3d2f9128bce.tar.gz"
                           (throw "not published");
      # END


    b) After publishing, the (throw "not published") gets replaced by
       the fetchurl call.
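
       For illustration, after step b) the region might read like this
       (the URL and hash below are placeholders, not the real published
       values):

         # REGION AUTO UPDATE: ...
         src = sourceFromHead "hack-nix-07c4f810c13183325cd920919e7fb3d2f9128bce.tar.gz"
                              (fetchurl {
                                url = "http://example.org/hack-nix-07c4f810c13183325cd920919e7fb3d2f9128bce.tar.gz";
                                sha256 = "0000000000000000000000000000000000000000000000000000";
                              });
         # END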

    I recall now that we do have a basename function. So maybe I should
    add only the fetchurl call and extract the local .tar.gz name from
    the url argument.
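
    A sketch of that simplification, assuming sourceFromHead could derive
    the local name itself via baseNameOf (the URL and hash are made up):

      # hypothetical: only the fetchurl call is written; sourceFromHead
      # recovers the local .tar.gz name from the url argument, roughly
      #   localName = baseNameOf args.url;
      src = sourceFromHead (fetchurl {
        url    = "http://example.org/hack-nix-07c4f810c13183325cd920919e7fb3d2f9128bce.tar.gz";
        sha256 = "0000000000000000000000000000000000000000000000000000";
      });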


About bleeding edge in nixpkgs in general: The more the better.
Because if errors creep in they should be caught as early as possible.
That's what continuous integration is about.

But I also agree that it makes sense to warn users. E.g. in Gentoo you
have to unmask experimental packages.

At the moment most users of NixOS know what they are doing. And most
of the experimental packages that do exist in nixpkgs are uncommon, and
for those you usually want the latest version anyway.

> >> Why use Nix if not for zero-risk trying out bleeding-edge stuff?
You never have zero risk. E.g. using a more recent compiler can even
make old kernels insecure ..
The question I ask is: does NixOS generate more value than risk?
I'm still unsure, but I take the risk and say yes :)

Marc Weber


