Add a `dune tools env` command to add dev tools to PATH #12521
Conversation
CC @art-w as this is similar to your PR #11408, but takes advantage of some more recent work to simplify the change. @Leonidas-from-XIV I know you've been generally against this feature in discussions we've had in the past. I'd be open to adding additional mechanisms for launching programs in an environment with access to dev tools in the future, but currently I'm just trying to wrap up #10964 for the MSP of package management.
The issue I see is that this creates a huge API surface, i.e. once we implement that, we have to support it. And that means dealing with things like …
I think the suggestion in #10964 (comment) is a much better one, achieving the same without adding the trouble of managing a whole shell environment. In my opinion, all the "env"-style managers like Python's virtualenv or opam's env are intensely unpleasant to use because you always need to be careful that the right environment is enabled (and if things go wrong, the blame falls on the users for not being careful enough). Having an environment with potentially mutable state doesn't mesh well with the stateless mode of dune pkg.
I think that's a reasonable alternative. One suggestion that would make it conceptually simple and easier to implement would be to introduce a new command rather than add this functionality to …
What would be the semantics of …
Correct, it would run the …
As long as we don't try to intercept what the user is running it seems reasonable to me (i.e. we just add the tools to the … The downside is that it might lead to some unexpected interactions where the user did not expect the "wrong" …
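For concreteness, a minimal OCaml sketch of the exec-style alternative being discussed, under assumed names (`dev_tool_bin_dirs`, `prog`, and `args` are illustrative, not dune's actual API): prepend the dev-tool bin directories to PATH for a single child process and exec the user's command, leaving the interactive shell untouched.

```ocaml
(* Hypothetical sketch only (requires the unix library): run a command
   with the dev-tool bin directories prepended to PATH, affecting just
   that child process. Not dune's actual implementation. *)
let exec_with_dev_tools ~dev_tool_bin_dirs ~prog ~args =
  let sep = if Sys.win32 then ";" else ":" in
  let initial_path = try Sys.getenv "PATH" with Not_found -> "" in
  let path = String.concat sep (dev_tool_bin_dirs @ [ initial_path ]) in
  (* Only this process and the exec'd program see the modified PATH;
     the user's shell is never touched. *)
  Unix.putenv "PATH" path;
  Unix.execvp prog (Array.of_list (prog :: args))
```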
Do we have a …
@Alizter Yes, …
I do not agree with that assessment. IMO we need two things:
This change addresses (2). Printing a single line with one path variable seems to be nearly the smallest API surface we could possibly offer on *nix systems.
IMO we should do nothing in this case. It won't be in the environment, and however the user tries to invoke the tool, it will just fail. I fully agree that we should not intercept every binary with a shell script.
In my view, the extent of the environment update should be as minimal as possible. Dynamic updates should not be relevant. But if new bins are added to the dir(s) added to the path (or whatever), those will of course be available by normal Unix means. If users run nested invocations of the env update, they deal with the complications.
This is a separate question, IMO. I think we need a general-purpose way of accessing/finding binaries installed (e.g., as dev tools). But to start with, we should at least have a way of getting access to the blessed dev tools being installed.
IMO, this is because:
I think you are overstating the problems with tools that make it easy to update shell environments. But perhaps that is because I envision something that has very little logic to it, and is just abstracting away access to a static location (and maybe later a few other bits of information). I understand that you don't like tools that alter shell environments (such as direnv), and no one should force you to use them, but these conventions are used widely, and in my experience they cause very little trouble when used responsibly, and give a lot of benefits. Requiring that a user prefix any invocation of a dune-installed binary with …
```ocaml
  else (
    let initial_path = Env.get Env.initial Env_path.var in
    let new_path =
      List.fold_left dev_tool_bin_dirs ~init:initial_path ~f:(fun acc bin_dir ->
```
One problem I see with this is that a new env will be needed every time a new tool is installed.
Have you considered, instead, creating a single directory with all the binaries symlinked? This would ensure that once someone was "in" the dune environment, any additionally installed executables would be available immediately (as I think one would expect). It will also avoid having unnecessarily long paths.
I'm somewhat inclined to think that the desirable behavior I've described is a prerequisite for actually having an environment like this that would not be very surprising and confusing to work with. Without this simplification, I do think we run into the kinds of problems @Leonidas-from-XIV is anticipating. And, on the other hand, once we have such a stable path with all the binaries, the invocation of a `dune tools env` operation becomes basically a trivial convenience.
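As a rough illustration of the symlink-farm suggestion above (a sketch under assumed names such as `shared_bin_dir` and `dev_tool_bin_dirs`; not dune's rules or API), the shared directory could be populated like this:

```ocaml
(* Hypothetical sketch (requires the unix library): symlink every
   dev-tool executable into one shared directory so a single PATH entry
   also covers tools installed later. Illustrative names only. *)
let populate_shared_bin ~shared_bin_dir ~dev_tool_bin_dirs =
  if not (Sys.file_exists shared_bin_dir) then Unix.mkdir shared_bin_dir 0o755;
  List.iter
    (fun bin_dir ->
      Array.iter
        (fun exe ->
          let target = Filename.concat bin_dir exe in
          let link = Filename.concat shared_bin_dir exe in
          (* Drop any stale link so updated tools are picked up. *)
          (try Unix.unlink link with Unix.Unix_error (Unix.ENOENT, _, _) -> ());
          Unix.symlink target link)
        (Sys.readdir bin_dir))
    dev_tool_bin_dirs
```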
Have you considered, instead, creating a single directory with all the binaries symlinked? This would ensure that once someone was "in" the dune environment, any additionally installed executables would be available immediately (as I think one would expect). It will also avoid having unnecessarily long paths.
The downside is that Dune will have to maintain a "state" of this directory, e.g. adding or removing symlinks whenever dev tools are installed, removed, updated, etc. The advantage with adding directories to `PATH` as needed is that they're only valid until the process exits.
The downside of the former seems like a very tractable problem for the implementation, but the upside of the latter is actually a downside for users.
I thought about it, but I'm not sure exactly how to do it within the framework of dune rules (I suspect there is a way if I dig deep enough). From a user's point of view there will be no difference though. Why do you see it as a problem to have a separate entry in PATH for each dev tool?
If/when we allow arbitrary packages to be installed as dev tools, keeping each dev tool in a separate directory will let us better handle the issue where different opam packages contain executables with the same name, since they won't collide in the filesystem.
Oh, reading through your other comment I think I understand. It appears you believe that this command will only add entries to PATH for dev tools which are currently installed; however, this command instead adds entries to PATH for all dev tools which dune knows about, regardless of whether they are installed. That way no env update is needed when a new tool is installed.
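A minimal sketch of that behavior, with a placeholder list of dev tools and a placeholder directory layout (neither is dune's actual configuration): the PATH entries are derived from every known dev tool up front, so installing a tool later requires no refresh.

```ocaml
(* Hypothetical sketch: compute PATH entries from every dev tool dune
   knows about, installed or not. The tool list and layout below are
   placeholders for illustration only. *)
let known_dev_tools = [ "ocamlformat"; "ocamllsp"; "odoc" ]

let dev_tools_path ~root ~initial_path =
  let sep = if Sys.win32 then ";" else ":" in
  let bin_dir tool = Filename.concat (Filename.concat root tool) "bin" in
  String.concat sep (List.map bin_dir known_dev_tools @ [ initial_path ])
```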
On further thought, since this does behave OK with respect to not having the behavior I incorrectly assumed, perhaps it is good enough for the current state of things, given that there are relatively few dev tools supported. Any improved, general solution would still be compatible from an end-user perspective, IIUC.
So, with the caveat that I still think it is non-ideal and I don't see how the current approach can work for general dev tooling support, I think I am OK with this approach for now if others are.
Perhaps this indicates that the proposed solution here should actually go through the design rethinking we need in the general solution to installing dev tools?
I agree with this. When dev tools were first designed, the intention was that they represented the recommended way of performing certain tasks with dune that required external tools, such as launching a top-level in the project's context, generating documentation, or starting an LSP server. The current UX around dev tools isn't well suited for installing arbitrary packages as dev tools. I think we should take some time to come up with a more general solution for dev tools, or to introduce running exes from arbitrary packages as a different concept from dev tools entirely (however both are out of scope of this PR).
The downside of the former seems like a very tractable problem for the implementation
It's not only that, it's also generally counter to the philosophy of package management in Dune which generally does not keep a state past the execution of a Dune command. This focus allows it to avoid all kinds of issues where the state is desynchronized from the declared configuration, and keeps the focus on Dune operations being fast, because they need to be quick to repeat.
So while it is possible, we do need to evaluate every feature through the lens of "does it conceptually fit?". Sometimes breaking that is worth it, and sometimes the breakage can be contained.
I don't think that "others do it" is as good an argument in general. For example, Nix breaks with a lot of conventions of regular package managers, and that's generally considered a good thing; otherwise one could just use those other package managers.
I am certainly not arguing that we should do things just because other systems do it a certain way. But if we introduce added complexity that has no precedent and no benefit to users, I think it is worth remarking, since familiarity and precedent can sometimes be a reason to justify choosing slightly inferior options.
Speaking of Nix, it uses `nix-profiles/bin` to provide a single directory with binaries for users. Though, as a counterpoint to my claim, `nix-shell` uses an approach like the one here, adding the separate binary dir for each package from the store into the path. I am still not keen on this, but happy to defer on that point.
It's not only that, it's also generally counter to the philosophy of package management in Dune which generally does not keep a state past the execution of a Dune command.
Why is generation of a symlink in a directory more stateful than generation of all the other artifacts produced by package management?
From the perspective of a build problem, is there any reason in principle that we couldn't take the production of the symlinked executable(s) in a shared `./bin` directory to be the ultimate build target for tools, with the rest of the artifacts just being intermediate artifacts?
I think we should take some time to come up with a more general solution for dev tools, or to introduce running exes from arbitrary packages as a different concept from dev tools entirely (however both are out of scope of this PR).
Yep, makes sense to me!
Code looks quite OK. I would somewhat prefer if the feature attempted to evaluate `$SHELL` instead of having to pass `--fish` by hand, but that can be added later.
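A small sketch of that suggestion, assuming detection via the `SHELL` environment variable (quoting concerns omitted; this is not what the PR currently implements):

```ocaml
(* Sketch of $SHELL-based detection: fall back to POSIX syntax when
   SHELL is unset or unrecognised. Illustrative only. *)
let shell_is_fish () =
  match Sys.getenv_opt "SHELL" with
  | Some shell -> Filename.basename shell = "fish"
  | None -> false

let print_path_export path =
  if shell_is_fish ()
  then Printf.printf "set -gx PATH %s\n" path
  else Printf.printf "export PATH=%s\n" path
```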
bin/tools/tools_common.ml (outdated)
```ocaml
        & flag
        & info [ "fish" ] ~doc:"Print command for the fish shell raher than posix shells")
  in
  let _ = Common.init builder in
```
I think it would be good to ascribe the result type here, to make sure that `_` does not get bound to a curried function application if `Common.init` gets extended with an additional argument.
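A self-contained toy example of the hazard, with a stand-in `Common` module (not dune's actual one), showing how the ascription turns a silent partial application into a compile error:

```ocaml
(* Toy illustration: with a bare [_], adding a required argument to
   [init] would silently bind a partial application instead of failing. *)
module Common = struct
  type config = { verbose : bool }
  let init ~verbose = { verbose }
end

(* With the result type ascribed, this line stops compiling if
   [Common.init] later gains another required argument, rather than
   quietly binding a closure. *)
let (_ : Common.config) = Common.init ~verbose:false
```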
bin/tools/tools_common.ml (outdated)
```ocaml
      Arg.(
        value
        & flag
        & info [ "fish" ] ~doc:"Print command for the fish shell raher than posix shells")
```
& info [ "fish" ] ~doc:"Print command for the fish shell raher than posix shells") | |
& info [ "fish" ] ~doc:"Print command for the fish shell rather than POSIX shells") |
Something I learnt while working on the binary dune install script is that …
Ah, I was not aware of that. It's a good point and I agree that this PR can go ahead without this feature.
The expected use-case is for the output of this command to be eval'd in the user's shell. In order for a terminal-based editor to run the ocamllsp dev tool, `eval $(dune tools env)` must be run in that terminal before launching the editor.

Fixes ocaml#10964

Signed-off-by: Stephen Sherratt <[email protected]>