Multi-targeted Engine Extensions #394
@ChrisMaddock The specification at https://github.com/nunit/docs/wiki/Engine-Addins-Spec makes a distinction between Extensions and Addins. We have so far implemented Extensions but not Addins. What you are talking about here, I think, is Extension packages, but the same change would apply to Addins if we later implement them. FYI, the file extension ... In any case, I agree we should do multi-targeting, provided there are multiple targets. 😄 I think that depends on the outcome of #388, specifically in terms of what targets are supported for the primary engine. It seems as if actual implementation of this issue is blocked until that issue is resolved. Discussing the design makes sense though. So...
[One side note here: we don't use nuget or chocolatey either in the most common, standard way. That's clear. However, the way we use them was based on in-depth discussions with the folks who developed those apps and I think is a bit beyond what the average app does.]
The last paragraph in the issue description references something "below" but there's no below that I can see! Could you clarify?
Thanks as always for your thoughts Charlie!
#388 is the main driver, but I believe this functionality is useful anyway for things like nunit/vs-project-loader#25. I was hoping I could split it out of Rob's work, and work on it independently. 🙂
Agreed. My plan was that the extension which can be loaded is always based on (2). This is related to nunit/vs-project-loader#25 again - it means we can provide/use a .NET 4.6 addin, even though the current engine targets .NET 2.0.
Can I ask how you'd propose packaging for NuGet/Choco, Charlie? My thoughts were:
Do you think we should do that under ... Regarding the msi, everything currently lives directly under the addins directory. I'd propose a similar structure:
Sorry, I reordered without proofreading! 😄 Reworded:
Nooo - apologies if my draft read like that! 😄 Here's the NuGet version:
The msi version:
The choco version:
I'd propose simply adding an extra level of wildcard dir to each path, and letting the engine resolve the 'best' extension to choose. Paths should be duplicated for backwards compat. So something like...
msi (installed as single item - so no backwards compat concerns):
Choco
I'm less certain whether choco has any conventions for packaging different dlls for running on different frameworks. Maybe @gep13 could advise. 😄
Let's back up for a moment to what we already have built into the
What if we handled the issue of multiple extension versions by enhancing #4 to examine the target runtime of the extension as well as the version. Then, rather than having structured packages with multiple targets, we could simply have different packages. Use of structured packages seems to me to be advantageous in the usual nuget situation because nuget does the work for us. But in this situation, we have to do the work of selecting ourselves. I'm not completely opposed to adding structure to our packages but I would lean toward exhausting the options we already have first. If we do want to add that structure, then we may want to put annotations about the runtime platform into the ... PS: If you leveraged what's already there as described, then nunit/vs-project-loader#25 would no longer depend on #388.
That's exactly what I was suggesting! 😄 I must have done a rubbish job of explaining it! You're correct we gain nothing 'automated' from using a structured NuGet package, but we should have a 'convention' to package an extension targeting multiple platforms. We can't put two
I agree - I don't see there being a dependency/blocker. #388 being on the horizon is just an additional motivation to do this work.
I've missed an edge case. If we start using subdirectories in Extension NuGet packages, then 'old' NuGet engines won't be able to locate 'new' extensions without a manual edit to the ... (The above also of course applies to choco)
I was imagining option (c) - use separate packages, which automatically puts things into separate directories. The packages would have different names, e.g.: vs-project-loader and fancy-vs-project-loader. The name of the included assembly would be the same or different depending on whether the two items were mutually exclusive. So, if "fancy" had all the old functionality plus the new, then you would give them the same name and the engine would pick the best. OTOH, if "fancy" were an addition to the basic one, then you'd give them different names. Fancy would respond yes when asked if it could handle the new project format and the original one would respond no. And here's another option to handle the same thing: implement extensions to extensions, which has been sitting there as an issue for a long time. It's not a hard thing to do and I'd been thinking of doing it for vs-project-loader when I was still working on the engine. VS project loader already has a number of alternative things it tries and might be more maintainable as a set of separate assemblies. Just throwing ideas against the wall here. 😄
@ChrisMaddock RE: Chaining Yes, that's available. Small correction: there's no "default" location. The directory containing the engine itself is the required location for the initial addins files (there can be more than one). Currently, of course, all chained addins files are processed. We could add some annotation to an addins file that says "only process this for runtime X". The easiest way would be a structured comment on the first line.
Sorry, by default location, I meant "what's currently specified in our default (distributed) addins files." i.e.
Ah! I see, there is a third option! I'd disagree that the user should need to install a separate package depending on which version of the .NET Framework they are running, however. With packaging the extensions together, the user installs the 'feature' they want (in this case, loading VS projects), and gets the best assembly for the job automatically. With separate packages, it seems the user would need to a) be aware of the existence of multiple packages and b) manage which package to pull themselves - which may change in future, if they upgrade the framework they are running on. It would also add additional overhead to our own packaging, and cost with our bundled packages. (What would we include there - the lowest target? All targets? The 'most commonly used', which would presumably be later than net20?) In my opinion, the end-user experience will be much nicer if the engine resolves the 'best' target framework to use, and the user is only concerned with which 'features' they require. This is consistent with how we package the framework again - we don't require users to install a separate package per framework version, they just get the 'highest' build available for their platform. The difference of course is that NuGet does the resolution for us there - rather than NUnit being responsible for it.
I don't think option (c) works for the case of the project loader because the functionality is the same in the eyes of the user: "Open VS Projects" We want two implementations for... implementation reasons! But in other cases, there could be separate functionality from a user POV and we might create separate packages. IMO it's worth experimenting with this and I thought the project issue might be a place to do it. The code you would write to use msbuild would not be all that different and you could later transform it to use a structured package.
I think use of a directory structure is a no-brainer within a package. The real question is whether the engine needs to understand the names of the directories, i.e. a set of constants like net20, etc. It may be better to examine all assemblies (in all directories) and have the metadata specify when they should be used. I'm not sure about this - I'd have to write some code first 😄 - but there could be an advantage to allowing extensions of multiple types to live in the addins directory, for example. As a reminder... the ExtensionService can examine metadata first, before even trying to load an assembly. The second stage of actually loading it is where you might get an exception, but because we use Mono.Cecil, we can always (so far) look at metadata without error. Once we have loaded, we can ask the assembly things like "can you handle this particular project" and proceed accordingly. It's a fairly simple but powerful structure, if I do say so myself. 😄
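As a rough, language-agnostic illustration of that two-stage approach (sketched in Python with hypothetical names - the real engine reads .NET assembly metadata via Mono.Cecil, which has no direct analogue here):

```python
from dataclasses import dataclass

# Hypothetical stand-in for the metadata the engine can read without
# ever loading the assembly into the runtime.
@dataclass
class ExtensionMetadata:
    path: str
    extension_point: str     # e.g. ".../IProjectLoader"
    target_framework: str    # e.g. "net20", "net46"

def scan(candidates, loadable_frameworks):
    """Stage 1: examine metadata only; nothing is loaded, so nothing can throw."""
    return [m for m in candidates if m.target_framework in loadable_frameworks]

def load_and_query(metadata, project_path):
    """Stage 2: only now would the assembly actually be loaded and asked,
    e.g., 'can you handle this particular project?'. Placeholder logic."""
    return metadata.path.endswith(".dll") and project_path.endswith(".csproj")

candidates = [
    ExtensionMetadata("addins/loader/net20/loader.dll", ".../IProjectLoader", "net20"),
    ExtensionMetadata("addins/loader/net46/loader.dll", ".../IProjectLoader", "net46"),
]
# On a runtime that can only load net20/net35, only the net20 build survives stage 1.
usable = scan(candidates, loadable_frameworks={"net20", "net35"})
print([m.path for m in usable])
```

The point of the split is that stage 1 is cheap and safe to run over every directory, while the potentially failing load happens once, for the chosen candidate.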
RE: Edge Case Yes, as it now stands, there is no simple way to install a new extension. The user has to edit the So basically, without this, you have to install extensions where the engine expects to find them. A new package with two versions could do this:
The addins file would contain
Agreed!
Yes - I agree basing things on directory name doesn't really fit our current model. If it's not too inefficient, I'd prefer to have the engine examine everything - and rely on directory structure just as the 'human readable' part of the system!
Exactly what I was thinking. 😄 I've done a rough spike on the nunit-project-loader over at nunit/nunit-project-loader#14. That's my current thoughts on how we'd want to package - thoughts welcome. 😄
Does the Chocolatey package just contain dlls, or does it contain exes as well?
@gep13 - thanks for picking up! Just dlls - below is an example of where we're at now, supporting only full .NET Framework.
@gep13 The packages are extensions to our engine, identified by the prefix "nunit-extension-" and contain dlls that are loaded dynamically by the engine itself using the path
@ChrisMaddock @CharliePoole the reason that I asked is that with the latest release of Chocolatey 0.10.9, there is a new convention of using x86 and x64 folders, in order for the shimming process to correctly identify which exe should be shimmed based on the architecture of the machine being installed onto. If the Chocolatey packages only contain dlls, then the ball is pretty much in your court. We don't really specify any best practices in this area, and instead leave it up to the maintainers of the packages. Chocolatey won't do anything "special" in the case of only dlls. Hope that helps!
Great, thanks @gep13. It does!
I've updated the plan above - it seems @CharliePoole and I are in agreement. If anyone else has thoughts - would be keen to hear them. Otherwise, I'll start writing code, as time allows.
This sounds great! I do think that there are benefits to specially understanding target framework monikers and not looking through the contents of every folder. They might have different kinds of files in them; you wouldn't want to always load exactly one DLL no matter what.
Bit confused what you mean by this bit - can you give an example of the sort of thing you're thinking of? 🙂 The main downside I can think of in not recognising tfm identifiers as directory names is just the inefficiency - hopefully negligible in the scheme of everything else the engine is doing. My concern is marrying up identifying tfm directories with the current
If we start handling directories containing subdirectories differently, things feel like they get a little messy - potentially in a non-backwards-compatible way. (Although I'm sure niche enough not to worry!) What should happen if a dir has subdirs named after tfms? Does it then ignore all assemblies, and just look in the 'best' tfm dir? How about if some subdirs are recognised tfms and others aren't? Or if there are assemblies in root and tfm subdirs - do we start ignoring the root assemblies? It just all feels a little unclear to me - whereas 'examine everything' is a simple, understandable process to document and maintain.
Seems to me there is a compatible way to do this without having special naming of the directories. When we examine a directory for extensions, we first look for
Yeah, I see. I'm thinking there may be extra DLLs in one folder, like AsyncBridge.dll for example. How would the engine decide which of the DLLs to load in that case? What would the .addins file look like when you need to load one DLL if |
It depends on how the extension organizes things. Normally, a single extension has everything in a single directory, unlike our msi where we put everything into an addins directory. The ExtensionService is designed to handle both situations. So, taking your situation and assuming you use two subdirectories to avoid name clashes...
where xxx.addins has
and yyy.addins has
Alternatively, you could put all five files in one directory. It's up to the extension author. BTW... I took you at your word that both ... It will be much easier to deal with this in the context of actual extensions, rather than theoretically.
I thought I said the opposite: there may be extra DLLs in one folder, like AsyncBridge.dll for example. In other words there may be non-extension dependencies for one target framework that should not be loaded for newer frameworks.
I misunderstood then. There are often such dependencies. See the v2 framework driver for example. They are either ignored (if there is an addins file) or examined and found to contain no extensions.
Mmm, we do already have a 'convention' for addins with dll dependencies - I don't see the need to change that as part of this issue. Sorry - I didn't realise that was what you were initially referring to, Joseph. My thinking was that the engine would continue to examine everything pointed at by the addins file as a potential extension. The engine looks for an ... The above is all current functionality. My thinking is that we don't want to change any of that, and instead just extend the definition of 'best' to also look at 'highest target framework'. Other than that - we continue listing all potential extension assemblies to the engine (via the variety of different formats available in the ... Adding new 'special comment' functionality is an option, however I don't see the need to complicate the
@ChrisMaddock Of course, that has to be "highest target framework loadable under the current runtime." WRT "special comment" it's an example of how you could add special handling per runtime without constraining the names of the directories. I agree we shouldn't add it unless there's a demonstrated need.
This makes sense! Thanks, both of you!
Reviving this issue as we now need it in order to implement nunit/nunit-project-loader#61. Current status: (UPDATED)
Since this requires modifying both the NetCore runner and the extension, I'll probably need to do multiple merges to the runner main. This seems safe since the runner already throws in the situation we are dealing with. All our default
will have additional lines added like
The original lines are needed for single-target extensions. The added lines will be used for multi-targeted extensions. Normally, the subdirectories will represent standard tfms but may, of course, be used for other purposes.
The netcore runner has been using the directory `tools/net6.0/Any` for its binaries. This is not necessary and results in placing the console code a level deeper in the structure. The purpose of '/Any' for content and tools is to tell NuGet how to process the contained files when copying to the output directory of an application that references the package. Our package is an executable tool, which is not intended to be referenced at all. We install it and use it right where it is. Neither runners nor extensions should use this feature of NuGet. It would probably be less confusing if we didn't use the
@OsirisTerje @rprouse @manfred-brands Although I resolved this long-standing issue, there's a remaining question on which I would appreciate your eyes and brains. :-) As a part of the solution, I added additional entries to the default
This allowed the runners to locate, for example, the ... Because I was working on one package at a time (i.e. the console runner and the nunit project loader extension), I only later realized that versions 3.15.5 and 3.17.0 would not be able to use the new extension. Thinking about it, using ... We already had a solution for that. For years, we've suggested that extensions should include their own ... In the end, I gave nunit-project-loader its own ... A further step would be to require all extensions to have an ... As I said, I'd appreciate your feedback. I'm here as a helper and it's really up to the project to decide. FWIW, I would lean toward removing the extra wildcard in the defaults and forcing each extension to add a separate
@CharliePoole I don't know anything about these add-ins, but here is my opinion: Are NUnit console extensions NuGet packages? Then they should follow .NET standard layout.
The above should allow the extension loader to find binaries for the appropriate version for the Framework and OS the runner is executing on. If running on linux, it can look for linux->unix->any (or none) for the RID. See
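The RID probing described above could be sketched like this (the fallback chains here are assumptions extrapolated from the linux->unix->any example, not NuGet's full RID compatibility graph):

```python
# Most-specific-first runtime-identifier fallback chains (illustrative only).
RID_FALLBACK = {
    "linux-x64": ["linux-x64", "linux", "unix", "any"],
    "win-x64":   ["win-x64", "win", "any"],
}

def resolve_rid_dir(available_dirs, rid):
    """Pick the most specific RID subdirectory that actually exists in the package.

    available_dirs: set of subdirectory names found under the tfm directory.
    Returns None when no RID-specific directory applies (fall back to the tfm root).
    """
    for candidate in RID_FALLBACK[rid]:
        if candidate in available_dirs:
            return candidate
    return None

# A package shipping only 'unix' and 'any' dirs: a linux-x64 runner picks 'unix'.
print(resolve_rid_dir({"unix", "any"}, "linux-x64"))
```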
@manfred-brands Good questions and probably questions others would ask, so I'll answer at length. Yes, they are. They are also chocolatey packages in the case of the four we maintain. That is, if you are using the console runner from chocolatey, you must install extensions from chocolatey. If from nuget, then the extensions must come from nuget. That's because nuget expects to find things in a certain structure and so does chocolatey. @ChrisMaddock, myself and others had a lot of discussion on this issue about whether or not to actually recognize TFMs. If you read the thread, you'll see we decided to go with the simpler approach of examining the contents of directories to see if they were loadable, rather than implementing special handling based on directory names. This was a case of trying the "simplest thing that could possibly work" rather than trying to design the perfect solution up front. Anyway, that's the approach I implemented. Bear in mind that the extension manager already knows how to determine if an extension assembly is usable under the runtime in which we are running. I added the code to decide which of two possible builds of an extension is "best" - i.e. the latest version that is runnable. The engine- and console-provided ... I've always advised people to provide an
For the latest build of the nunit project loader extension, which is multi-targeted, it looks like this...
So, in short, our past discussions of this (I'm astonished to see they took place back in 2018) took the view that understanding directory names (TFMs) might be ideal in some sense, but was probably more work than we could take on and possibly unnecessary. That's because we have pretty efficient, non-reflection-based code for examining assemblies. So that's what I just implemented (six years later!). It seems to be working at this point. Of course, we could run into some flaw tomorrow and need to fix it. In any case, I do like the notion that the loader should know as little as possible about the internal structure of an extension.
It would be good to support multi-targeted engine addins. Two motivations:
To allow extensions to target two different .NET Framework versions, where there is a significant advantage to building a version targeting a higher framework, but we don't wish to drop support for older frameworks. "Convert to parsing project files using MSBuild" (vs-project-loader#25) is a great example of this.
To allow the packaging of extensions for potential separate .NET Standard and .NET Framework engines, which we may have in the future.
Here's a rough breakdown of what I think would be involved:
Here's some of my thoughts on how to go about this - comments welcome:
Updated post-discussion 02/04/18
Extension precedence
The engine already has code to select the extension of the highest assembly version, when multiple extensions of the same type are available. I propose we expand this to look at Highest Assembly Version and then 'Best' target framework.
For the sake of consistency with the wider .NET ecosystem, I think our definition of 'Best' framework should match NuGet's. We can limit the overhead by only initially implementing support for addins targeting .NET Framework or .NET Standard.
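A minimal sketch of that precedence rule (the framework ranking values are illustrative stand-ins, not NuGet's actual compatibility tables):

```python
# Illustrative ranking: a higher number means a 'better' target framework,
# mirroring NuGet's preference for the highest compatible target.
FRAMEWORK_RANK = {"net20": 0, "net35": 1, "net45": 2, "net46": 3, "netstandard2.0": 4}

def pick_extension(candidates):
    """candidates: list of (assembly_version_tuple, framework) pairs.

    Highest assembly version wins first; framework rank only breaks ties,
    matching the 'Highest Assembly Version, then Best target framework' rule.
    """
    return max(candidates, key=lambda c: (c[0], FRAMEWORK_RANK[c[1]]))

best = pick_extension([
    ((3, 7, 0), "net20"),
    ((3, 7, 0), "net46"),          # same version as net20, better framework
    ((3, 6, 0), "netstandard2.0"), # best framework, but older version - loses
])
print(best)
```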
NuGet packaging of Extensions
For consistency with the wider .NET ecosystem, I believe we should aim to make use of the existing multi-target packaging conventions introduced by NuGet - or the defaults created by `dotnet pack` on a multi-targeting project.

We currently seem to package our extensions under `/tools/` in their NuGet package. Is this the right location, given that the main engine NuGet package packages that under `/lib/`? The NuGet docs recommend `/tools/` for "Powershell scripts and programs accessible from the Package Manager Console". Taking account of backwards compatibility - I'd like to look at repackaging our extension dlls with the dlls under the `/lib/` directory - as I believe is more standard. I could be wrong there - open to discussion on this one with people more knowledgeable than me!

We'll continue to package extensions under the `tools` subdir. We'll add in framework-specific subdirectories, as below:

Addins file format/NuGet packaging
The format of the addins file itself doesn't need to change - it can of course be pointed at a specific dll targeting a specific platform already. The question to look at here is the paths we put in the default .addins files to cater for NuGet and .msi installations.
I'd be tempted just to add an extra level of wildcard directory to the existing paths for msi/NuGet resolution, and have the engine attempt to load all extensions following the extension precedence logic described above. This is less efficient than attempting to read directory names and work out the ..., but would simplify the code, as the below functionality would already be required, given the current functionality to specify specific paths in the .addins file. Thoughts on this bit?
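For illustration, the extra wildcard level might look like this in a default .addins file (the paths are hypothetical examples, not the actual shipped defaults):

```
# existing entry - matches single-target extensions directly under the addins dir
addins/*.dll
# proposed addition - one extra wildcard level for framework-specific subdirectories
addins/*/*.dll
```

Keeping both entries is what preserves backwards compatibility: old single-target extension layouts keep matching the first pattern while multi-targeted ones match the second.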
To ensure backwards compatibility between new extension packages and old engine installations, an `.addins` file will be added to the `tools` directory of new extensions with subdirectories. This will allow the .addins files to 'chain' - and old engines to locate new addins, without the user needing to make any manual edits to the addins file.

This is spun out of #388. @rprouse - if you had particular plans for this part, feel free to boot me off and take the issue back! 😄
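As a hedged sketch of that chaining (the file name and paths are hypothetical): a new-style extension package ships its own .addins file at the top of tools, so an old engine that only scans tools still gets pointed into the framework-specific subdirectory.

```
# tools/nunit-extension-example.addins - shipped inside the extension package;
# an old engine that processes tools finds this file and follows it.
net20/nunit-extension-example.dll
```

A newer engine that understands the extra wildcard level simply finds the subdirectory builds directly and picks the best one.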