
DYN-7535 Record package information in graphs(NodeLibraryDependencies) when using engines loaded from them #15530

Closed
wants to merge 2 commits into from

Conversation

zeusongit
Contributor

Purpose

This PR adds package information to a Dynamo graph when a Python engine loaded from a package is used in that graph, leveraging NodeLibraryDependencies, which is also used to record nodes that depend on packages.

When a Python engine is served from a package, the package name and version information is lost, which is why we were unable to display dependency mismatches in Workspace References. This PR preserves the package name and version as part of the NodeLibraryDependencies in the graph.

Implementation Notes:

  • A map is added to maintain a list of all extensions and the respective Python engine each of them added, if any.
  • A map is added to maintain a list of all extensions and the Dynamo package that loaded each extension.
  • Together these maps give us a way to link a package to a Python engine; since the loading of an extension is decoupled from the package loading process, this seemed necessary. It also saves us from assembly mining and reflection (see the sketch below).
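
A rough, stand-alone sketch of how the two maps chain together. Only the field names ExtensionPackageMap and PythonEngineExtensionMap come from the actual diff; the string keys and the helper class are illustrative stand-ins, not the real Dynamo types.

using System.Collections.Concurrent;

// Illustrative sketch only: string keys replace the real Dynamo extension/package/engine types.
internal static class PythonEnginePackageTracker
{
    // extension name -> (package name, package version) of the Dynamo package that loaded it
    private static readonly ConcurrentDictionary<string, (string Name, string Version)> ExtensionPackageMap = new();

    // Python engine name -> name of the extension that registered it
    private static readonly ConcurrentDictionary<string, string> PythonEngineExtensionMap = new();

    internal static void RecordExtensionPackage(string extension, string packageName, string packageVersion) =>
        ExtensionPackageMap.TryAdd(extension, (packageName, packageVersion));

    internal static void RecordEngineExtension(string engine, string extension) =>
        PythonEngineExtensionMap.TryAdd(engine, extension);

    // Resolve the package that ultimately provided a given Python engine by chaining the two maps.
    internal static (string Name, string Version)? GetPackageForEngine(string engine) =>
        PythonEngineExtensionMap.TryGetValue(engine, out var ext) &&
        ExtensionPackageMap.TryGetValue(ext, out var pkg)
            ? pkg
            : ((string, string)?)null;
}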

Possible cases to look into:

  • One extension loading multiple Python engines: is that possible? Should we guard against it, or only consider the last one?
  • Multiple Python engines with the same name? Update engine names to a more qualified name (like @pinzart mentioned, attaching the PackageName and Version to it).

Opening a graph with PythonEngine not loaded in Dynamo:

[video: DynamoSandbox_0MUnzezPDs]

Opening a graph with PythonEngine loaded in Dynamo, but with a version mis-match:

[video: DynamoSandbox_q0lBECS4VN.mp4]

Declarations

Check these if you believe they are true

  • The codebase is in a better state after this PR
  • Is documented according to the standards
  • The level of testing this PR includes is appropriate
  • User facing strings, if any, are extracted into *.resx files
  • All tests pass using the self-service CI.
  • Snapshot of UI changes, if any.
  • Changes to the API follow Semantic Versioning and are documented in the API Changes document.
  • This PR modifies some build requirements and the readme is updated
  • This PR contains no files larger than 50 MB

Release Notes

  • Record package information in graphs(NodeLibraryDependencies) when using engines loaded from them

Reviewers

@DynamoDS/dynamo

@github-actions github-actions bot left a comment

See the ticket for this pull request: https://jira.autodesk.com/browse/DYN-7535


github-actions bot commented Oct 8, 2024

UI Smoke Tests

Test: success. 11 passed, 0 failed.
TestComplete Test Result
Workflow Run: UI Smoke Tests
Check: UI Smoke Tests

@twastvedt
Contributor

  • Multiple Python engines with the same name? Update engine names to a more qualified name (like @pinzart mentioned, attaching the PackageName and Version to it).

I'm not sure what use there would be for having multiple engines with the same name active at the same time. (Also, I think parallel engines will be severely limited by only being able to load one instance of the native Python.) It might actually be nice to be able to override a given engine with your own version and have it "seen as" the same engine.

@@ -2174,6 +2180,11 @@ internal PackageInfo GetNodePackage(NodeModel node)
throw new Exception("There are multiple subscribers to Workspace.CollectingNodePackageDependencies. " +
"Only PackageManagerExtension should subscribe to this event.");
}
//in case of python node, return python engine dependency
if (node.IsPython && CollectingPythonEnginePackageDependencies != null)
Contributor

Why can't we just check for inheritance of PythonNodeBase?

Contributor

I guess to allow other base classes? Just not sure how likely that is, and it feels super specific to include in NodeModel.

Contributor Author

That is what I wanted to do as well, but it would introduce a PythonNodeModel project dependency in Core, so I refrained from that. We can go that route as well, though. Thoughts? @pinzart90

/// <returns>List of available unique python engines</returns>
private List<string> GetEngineList()
{
return PythonServices.PythonEngineManager.Instance.AvailableEngines.GroupBy(x => x.Name).Select(g => g.FirstOrDefault().Name).ToList();
Contributor

I think this can just be .Select(e => e.Name).Distinct().ToList()
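
For comparison, a minimal stand-alone illustration of the two equivalent queries; the Engine record here is a stand-in for the real engine type, assumed only to expose a Name property.

using System.Collections.Generic;
using System.Linq;

class EngineListExample
{
    record Engine(string Name);

    // Current form: group by name and take the first of each group.
    static List<string> ViaGroupBy(IEnumerable<Engine> engines) =>
        engines.GroupBy(x => x.Name).Select(g => g.First().Name).ToList();

    // Suggested form: project the names and de-duplicate; same result, simpler.
    static List<string> ViaDistinct(IEnumerable<Engine> engines) =>
        engines.Select(e => e.Name).Distinct().ToList();
}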

@pinzart90
Contributor

Will add review tonight

}
}
}

Contributor

I say get rid of this new property and rather add a new internal virtual method, GetNameOfAssemblyReferencedByNode.
It is already specialized for DSFunction; you could do the same for PythonNodeBase (return the assembly location of the PythonEngine type that corresponds with the currently selected engine name).

Contributor Author

So... add a PythonNodeModels dependency to Core?

Contributor

No. You can just override the base method in whatever class needs to customize it, like DSFunction and PythonNodeModelBase.

Contributor Author

@zeusongit zeusongit Oct 9, 2024

OK, but where should this new internal virtual method live? In WorkspaceModel? NodeModel?

Contributor

NodeModel
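
A hedged sketch of what that could look like, using simplified stand-ins for the real NodeModel / PythonNodeBase classes and for the engine lookup; the names and shapes here are illustrative, not the actual Dynamo API.

using System.Collections.Generic;
using System.Linq;

internal abstract class NodeModelSketch
{
    // Base implementation: most nodes do not reference an external assembly.
    internal virtual string GetNameOfAssemblyReferencedByNode() => null;
}

internal class PythonNodeBaseSketch : NodeModelSketch
{
    // Stand-in for PythonEngineManager.Instance.AvailableEngines.
    internal static List<(string Name, string AssemblyLocation)> AvailableEngines = new();

    public string EngineName { get; set; }

    // Override: return the assembly location of the engine whose name matches
    // the node's currently selected engine, or null if it is not loaded.
    internal override string GetNameOfAssemblyReferencedByNode() =>
        AvailableEngines.FirstOrDefault(e => e.Name == EngineName).AssemblyLocation;
}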

}
ExtensionPackageMap.TryAdd(ext, pkg);
}

Contributor

Instead of all these new data structures, you could just add a new handler to the PythonEngineManager.Instance.AvailableEngines.CollectionChanged event.
When a new engine is added, you could check which already-loaded package contains its assembly location. That way you could match the engine to its package.

Contributor Author

I tried that, but that approach was missing something. When the CollectionChanged event fires you get the engines that were added, but to do what you suggested you would again have to do assembly mining using some load context, or a file search in the package directories.

Contributor

You would only have to iterate through the already-loaded packages and match the root directory of each package to the Python engine's assembly location.
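
A rough sketch of that flow, under the assumption that each loaded package exposes its root directory and each engine its assembly location; the EngineInfo / PackageInfoSketch shapes below are stand-ins, not the actual Dynamo types.

using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.Linq;

record EngineInfo(string Name, string AssemblyLocation);
record PackageInfoSketch(string Name, string Version, string RootDirectory);

class EnginePackageMatcher
{
    private readonly IReadOnlyList<PackageInfoSketch> loadedPackages;
    internal readonly Dictionary<string, PackageInfoSketch> EngineToPackage = new();

    internal EnginePackageMatcher(ObservableCollection<EngineInfo> availableEngines,
                                  IReadOnlyList<PackageInfoSketch> loadedPackages)
    {
        this.loadedPackages = loadedPackages;
        // Stand-in for subscribing to PythonEngineManager.Instance.AvailableEngines.CollectionChanged.
        availableEngines.CollectionChanged += OnEnginesChanged;
    }

    private void OnEnginesChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        if (e.Action != NotifyCollectionChangedAction.Add || e.NewItems == null) return;
        foreach (EngineInfo engine in e.NewItems)
        {
            // Match the engine's assembly path against each loaded package's root directory.
            var pkg = loadedPackages.FirstOrDefault(p =>
                engine.AssemblyLocation != null &&
                engine.AssemblyLocation.StartsWith(p.RootDirectory, StringComparison.OrdinalIgnoreCase));
            if (pkg != null) EngineToPackage[engine.Name] = pkg;
        }
    }
}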

@pinzart90
Contributor

It looks like quite a bit of effort to make Dynamo itself support non-standard PythonEngine packages (by which I mean packages that actually wrap extensions, for assembly isolation purposes).
PythonEngines were originally meant to be loaded directly from a package, with no extensions. The only reason we added extensions is to enable assembly isolation before Dynamo fully supported it. I think in the near future we will want all these Python engines to go back to being simple packages again, and for Dynamo to manage all the assembly isolation and the actual loading of types.

}
var newEng = availableEngines.Last();
PythonEngineExtensionMap.TryAdd(newEng, ext);
}
Contributor

You shouldn't need this if you just hook into the AvailableEngines.CollectionChanged event.

@zeusongit
Contributor Author

It looks like quite a bit of effort to make Dynamo itself support non-standard PythonEngine packages (by which I mean packages that actually wrap extensions, for assembly isolation purposes). PythonEngines were originally meant to be loaded directly from a package, with no extensions. The only reason we added extensions is to enable assembly isolation before Dynamo fully supported it. I think in the near future we will want all these Python engines to go back to being simple packages again, and for Dynamo to manage all the assembly isolation and the actual loading of types.

That would make this a lot more straightforward, but until that happens we will have to deal with extensions, I guess.

@zeusongit zeusongit closed this Oct 9, 2024