Enhance the experience for the testing of packages in development #6579
Comments
I guess the intent here is to use this for testing? For example, in my day-to-day flow I have some extra targets that let me add a new source and a new global packages folder, so each time I have a new iteration of the package restored. Personally, I'm not confident that "teaching" NuGet PackageReference to work with individual nupkgs is a good idea, but there are definitely some improvements we can make here.
I updated the title a little bit. The easy workaround for customers facing this is to define a new global packages folder, potentially by adding a new nuget.config file in the root of the project they are working in.
Of course, you can always use some MSBuild goo, but lots of people would be more comfortable with the above.
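A minimal sketch of that workaround, assuming a repo-local nuget.config (the folder name `packages-dev` is just an example):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Restores for projects under this folder use a disposable, repo-local
         cache instead of the user-wide %USERPROFILE%\.nuget\packages -->
    <add key="globalPackagesFolder" value="packages-dev" />
  </config>
</configuration>
```

Deleting that folder (or just the one package inside it) then forces the next restore to pick up the freshly packed nupkg.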
I don't think it would solve the problem. I guess the intention is to replace the same-version package in the project on every restore. Maybe if we had the ...
Yeah, I forgot to mention it... they'd just clear the global packages folder when ready for a new version. That's the best you can do without any extra work or MSBuild tricks.
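For reference, the clearing commands that exist today, both of which wipe the entire global packages folder rather than a single package:

```bash
# nuget.exe
nuget locals global-packages -clear

# dotnet CLI equivalent
dotnet nuget locals global-packages --clear
```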
HintPaths get broken by trying to use this unless you can get everyone to use the same local folder path. The way I figured out to get around the OP's issue in development was to use a PowerShell script to build the package I'm testing. It takes the current epoch, splits it roughly in half, and supplies the halves as the Build and Revision elements of the version number when packing. As long as I don't run the script twice within the same second, there isn't a problem with caching. Consuming the package for testing is just a matter of setting up a local folder as a source, within VS's sources and not in the nuget.config for the solution. The latest one is right on the Updates tab of the NuGet Package Manager UI.
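A rough bash sketch of that approach (the original was a PowerShell script; `MyPackage.csproj` and the `1.0` version prefix are placeholders):

```bash
#!/bin/bash
# Split the current Unix epoch roughly in half and use the two halves as the
# Build/Revision parts of the pack version, so each run in a new second
# produces a version NuGet has never cached.
EPOCH=$(date +%s)              # e.g. 1718900000
BUILD=$((10#${EPOCH:0:5}))     # first five digits
REV=$((10#${EPOCH:5}))         # remaining digits (base 10 drops leading zeros)
dotnet pack MyPackage.csproj -c Debug -p:Version=1.0.${BUILD}.${REV}
```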
Who are these people you refer to? I only know ...
My comment about the usage of the global packages folder is meant for PackageReference. The global packages folder is the de facto installation directory for PackageReference, while in packages.config it's merely used as another source. packages.config is fundamentally broken for the above scenario because of hint paths.
The content in packages.config cannot really be materially different from PackageReference. How it's implemented in VS/NuGet is what is creating the incorrect hint paths, etc. Why you bothered with that file and created additional work instead of just addressing the hint pathing is still a mystery to me, but it seems no amount or delivery manner of feedback makes a lick of difference. Nevertheless, have some more...

Having a global packages folder sounds nice, but the command to clear it is not (you use "locals" to clear the "globals", and there's no mention of "cache" in the command names). Then again, I don't think I necessarily want a single folder with all the packages (in development and known good). If so, I sure wouldn't want to clear everything just to refresh a development package, or clear everything just to trim out a few unreleased development packages.

De facto is not the same as default, so what you are saying is that there is and will only be one global folder, and that by doing so you will again be creating an additional burden for those who try to use this system to author packages.
Hint paths are a bad design, it's as simple as that. They create a lot of management overhead and are very error prone. That's why packages.config is not a priority for the future.
That's what this issue is for.
That's exactly how Package Reference works. You can call it default if you will.
In the packages.config case, the global packages folder is still used, though only as a priority source.
Hint paths are part of the project system and not of NuGet or packages.config. The PackageReference implementation could have used them too, and instead of fixing the packages.config implementation to not use them, you all decided to do what is generally the most expensive and disruptive choice in software...

npm can use a global folder, but I'm not required to do so. I'm pretty sure Bower is the same.

And the difference in word choice between "de facto" and "default" is important. The former means "this is what it is, there are no choices", but I think the truth is the latter: there is a choice to be had, made by the individual developer and not governed by a config that is checked into version control and locks everyone in. I hope it's not going to be as infuriating as the GAC is with regard to loading assemblies, where the GACed assembly is loaded before one adjacent to the program's location.

packages.config could use a global folder, but I found it was broken due to hint paths.
Hint paths are hard-coded values. That's why in PackageReference, the knowledge of where exactly dependencies come from is decoupled from the knowledge of "which" dependencies I want. Again, packages.config uses the global packages folder as an extra source. Regarding de facto vs. default ...
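To illustrate the distinction being drawn here (the package id, version, and path below are just examples): a PackageReference records only which dependency and version is wanted, while the packages.config flow also baked a layout-specific hint path into the project file.

```xml
<!-- PackageReference: "which" dependency; "where" is resolved at restore time -->
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />

<!-- packages.config era: the project file additionally carried a hard-coded hint path -->
<Reference Include="Newtonsoft.Json">
  <HintPath>..\packages\Newtonsoft.Json.13.0.3\lib\net45\Newtonsoft.Json.dll</HintPath>
</Reference>
```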
packages.config works just fine from the command line, be that devenv.exe or MSBuild.exe or nuget.exe, so that statement is just not correct. NuGet is the package manager for Visual Studio projects, so I don't think it matters how well it works outside the project system. Other uses of the nupkg use it differently anyway. (This argument slightly reminded me of the bad old days when there were only classic ASP and web site projects: a bunch of unrelated files in a folder that you hoped could be made to stay working.)
Merge conflicts from NuGet happen because of two things...
I can't use a single source folder all the time, and with the example given of authoring a package, I wouldn't want to. Other package managers don't coerce this either. Stop trying to make me do things that have no value to me. And stop doing things that create work for me that also have no value.
You... are kidding right? They are files. You Copy Them.
"The whole concept" is nuget package management, but even at a smaller scope, package ref and package config are fundamentally conceptually the same. If one is fundamentally flawed, both are. However the major flaw here is the idea that writing an entirely new replacement system is going to somehow be easier and have less ripple effect and cost than addressing a set of problems in the existing one. I try to keep pointing out that you (MSFT in general) are consistently making things difficult with no added benefit or value. You all (including the netcore group) need to look at the overall change you are trying to make a bit more objectively.
... is how that reads. Again, change with no benefit. Back to the OP: the easier and safer authoring workflow is going to require either naming specific packages or multiple local sources. The idea of a single "local-global" one is not going to work.
That's not correct. PackageReference is transitive, readable, and provides a consistent experience from the command line and VS.
It's not a change with no benefit. Regardless, let's stay on topic. The local/global trick is just a workaround/good practice for testing right now. That'd likely require a new gesture.
Concept and implementation are different things. You are describing the implementation. The concept of both is to distribute packaged software components for use in Visual Studio projects. The choice of an entirely new implementation (with all-new bugs being created in both implementations) over correcting the issues with the existing one seems both dumb and extremely costly. I'd rather have 15 locations instead of being coerced into one that is subject to disagreement by different running instances of Visual Studio. File locking isn't handled right when there is only one VS instance touching a packages folder; I can't imagine how problematic it will be when all of them are competing and disagreeing. Back on topic then...
This issue just keeps coming up, spread all over the place. Here's a thread (with a myriad of links) on this very topic going back years: dotnet/sdk#1151. And here's a year-old thread from Twitter with a bunch of comments and feedback: https://twitter.com/natemcmaster/status/1099021447920406529.
What would that look like? It sounds like what you could do today by pointing a local NuGet feed at the bin folder of a project and manually updating version numbers and building packages.
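For concreteness, a minimal sketch of that setup in a nuget.config next to the consuming solution (the source name `local-dev` and the path are examples):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Treat the library's build output folder as a package source -->
    <add key="local-dev" value="..\MyLibrary\bin\Debug" />
  </packageSources>
</configuration>
```

Any nupkg that `dotnet pack` drops into that folder then shows up like a package from any other feed.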
I would say it's pretty common to have at least a few dependencies, at least in the cases where I've needed it.
Thanks for the links. @khellang
An idea would be to mark a package with certain metadata, and then we can maybe overload previous gestures like ... Given that restore runs on every build, it'd be unperformant to check whether the packages are up to date, so a solution would require the user saying something like "I changed the package". The Package/Project Reference switch is certainly one way to solve this. I don't have a concrete proposal for that though, given that it involves more than just NuGet.
I don't know that much about the inner details of the NuGet magic, but here's an idea: extend the child element of ...
In this case, adding the package to the cache is part of the problem: we want to improve the development and testing of NuGet packages, so caching that particular package is not needed. Step 2 lets us move away from versioning issues, using ...

Edit: adding the script from the closed issue:

```bash
#!/bin/bash
if [[ -z $1 ]]
then
    echo "Version is required"
    exit 1
fi

# Check whether the cached copy of the package should be cleared
if [[ -x `which nuget` ]]
then
    echo "Using nuget at `which nuget`"
    # Output format: 'global-packages: C:\Users\<username>\.nuget\packages' (note the two colons)
    PACKAGES=`nuget locals global-packages -list`
    # Take the path after the drive colon and replace backslashes with forward slashes
    PACKAGES=$(echo $PACKAGES | cut -d ':' -f3 | sed 's/\\/\//g')
    # Add the missing drive letter (Git Bash style path)
    PACKAGES="/c$PACKAGES"
    echo "Deleting cached NuGet directory '$PACKAGES/<package>'"
    rm -rf "${PACKAGES}/<package>"
else
    echo "No nuget.exe found in path"
    exit 1
fi

ROOT=`pwd`
dotnet pack -c Debug <proj> -p:RepoRoot=${ROOT}

# A pre-release version is required, so pass it explicitly
dotnet add <proj> package <package> -v $1
```
I understand what the problem is. Let's look at the "Pack" target. This target does almost nothing; everything happens in its dependencies. So I should run my "BeforePack" target before those dependencies.
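A hedged sketch of that idea: hook a custom target in before the nuspec is generated (GenerateNuspec is one of the targets Pack depends on; the target name and the version suffix below are made up for illustration):

```xml
<!-- Runs before the package metadata/contents are generated, so the bumped
     version is what actually gets packed. -->
<Target Name="StampDevVersion" BeforeTargets="GenerateNuspec">
  <PropertyGroup>
    <PackageVersion>$(PackageVersion)-dev.$([System.DateTime]::UtcNow.ToString("yyyyMMddHHmmss"))</PackageVersion>
  </PropertyGroup>
</Target>
```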
You can use ...

In the consumer project you can use a floating version. It will automatically update your package.

In the consumer project you can also make a new package on every rebuild event. I'm just not sure whether I specified the properties correctly or not.
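For reference, a floating version on the consuming side looks something like this (`MyPackage` is a placeholder; the wildcard resolves to the highest matching version available from the configured sources on each restore):

```xml
<PackageReference Include="MyPackage" Version="1.0.*" />
```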
@Denis535 I like the idea, but after saving the .csproj file I see many repeated errors in the Package Manager output window, even if the package is created successfully.
Those errors don't show up with standard naming like:
Making a package every time you build the project is a waste of time for non-trivial packages. You also lose the ability to update the nuspec and try rebuilding the package without having to wait for the whole project to rebuild. Having the two steps separate means at most an ...

I mention the nuspec because historically, packing without a nuspec invites weird, unwanted, and inconsistent behavior from NuGet.
@LazZiya
@StingyJack
@Denis535 many thanks for the package, it helped me. I created another version format, sharing it below for those who may want a short and readable version:
The output package name is something like:
It requires more processing, but it looks nice :)
@Denis535 what's the dotnet command for creating the nuspec file that you can edit/tweak and then later feed into the pack command?
@StingyJack |
@joacar https://docs.microsoft.com/en-us/nuget/reference/msbuild-targets#packing-using-a-nuspec
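In short, the linked page describes handing pack a nuspec via MSBuild properties, roughly like this (the file names and property values are placeholders):

```bash
dotnet pack MyProject.csproj \
  -p:NuspecFile=MyPackage.nuspec \
  -p:NuspecProperties="version=1.2.3-dev;configuration=Debug"
```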
If you are hitting any of these scenarios, do let us know by creating an issue.
@nkolev92 Perhaps that was just when doing it from Pipelines. The documentation says either csproj or nuspec, which I can see wouldn't work if you need the csproj file.
Maybe the simpler way to think about it is that through dotnet.exe you always call pack against a project file. Then the project file itself can either use the csproj packing conventions for PackageReference projects, or it can point to a nuspec file like in the docs I linked.
@nkolev92 - I have used ...

The biggest example I run into somewhat often, where packing a csproj doesn't work in any setup, is when there are two assemblies in a branch that produce packages, and one of them depends on the other. An example would be a common objects-and-interfaces library, and a services library that implements the interfaces and uses the objects in the common library, and whose project file consumes the common project as a project reference. The common library could be installed into a project, or the services library could be installed, and that should install the common library package of the correct and corresponding version.

I tried to explain a facet of this before, but apparently did not explain it well enough. The services library package doesn't include the common library package as a dependency when packing the proj, and in some versions of nuget.exe, packing the services library's proj file has not packaged the common library file as part of the services library package. I thought I entered that as its own issue but I can't find it at the moment.

I'm not sure how often this comes up in others' development environments, but it has come up for me and my colleagues often enough that we had to draft a wiki page for this topic.
This is the issue: #4491. In general, ...
@nkolev92 I would really love to see some solutions for an improved/faster inner-loop experience when developing packages. My first thought would be to have a mechanism to switch quickly between package references and project references in VS/VS Code/etc., maybe along these lines: dotnet/sdk#1151
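One way to approximate that switch by hand today is a property-gated pair of item groups in the consuming project (the `UseProjectRefs` property, package id, version, and path are all illustrative, not an existing NuGet feature):

```xml
<ItemGroup Condition="'$(UseProjectRefs)' != 'true'">
  <!-- Normal flow: consume the published package -->
  <PackageReference Include="MyLibrary" Version="1.2.3" />
</ItemGroup>
<ItemGroup Condition="'$(UseProjectRefs)' == 'true'">
  <!-- Inner loop: consume the library source directly -->
  <ProjectReference Include="..\MyLibrary\MyLibrary.csproj" />
</ItemGroup>
```

Building with `dotnet build -p:UseProjectRefs=true` then bypasses the package entirely.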
It fails to restore until you build the first project, and it's fraught with caching issues, e.g. NuGet/Home#6579
Just another note: if the package you are testing contains assemblies that are going to be loaded by MSBuild, this environment variable may be needed to keep MSBuild from locking files, which would otherwise prevent clearing the cache:
Take this into account if this issue is ever addressed...
I have been spending the last couple of days trying to figure this out. I did something like this to achieve a similar effect:
If anyone knows a more elegant way to handle this, I'm all ears. Changing version numbers each time just for local development is silly. SemVer is great and everything, but we shouldn't be forced to adhere to it just for the sake of purity when doing so results in dirty hacks or workarounds.
Wow, this has been open since 2018? I've been digging around the internet for days to find a way to have the freedom to create/change a package locally and immediately test it from another project without being annoyed by:
I opened this request here: #13918. There I ask to skip saving packages developed locally (i.e. taken from a local feed) to the global packages folder. This should be what you are asking for. And actually I somehow "home-made" this myself.
I no longer have to deal with cache problems, pointless bumps of the package version (in both the package and the project that consumes it), or useless pushes to a remote repo while waiting for a slow CI/CD pipeline to produce a pre-release version of the package (which will be tested locally anyway...). It works well, but it would be nice if it were already provided out of the box by NuGet itself. What do you think?

@nickhoeferpickpro just now I noticed that we and @thesushil have come to the same conclusion.
https://twitter.com/Mpdreamz/status/965325828455321600