Conversation

ailzhang commented Nov 11, 2025

Stack from ghstack (oldest at bottom):

[ghstack-poisoned]
ailzhang added a commit that referenced this pull request Nov 11, 2025
ghstack-source-id: 6bb849b
Pull-Request: #244
meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) Nov 11, 2025
ailzhang requested review from ezyang and wconstab November 11, 2025 01:20
ailzhang (Author) commented:

The first PR is just adding a unit test (with claude :P ) to make sure I understand the current behavior 🙏 Any feedback is appreciated!

ailzhang changed the title from "Add unit test for existing ac behavior" to "Add unit test for existing ac API behavior" Nov 11, 2025
return Transformer(model_args)


def create_joint_graph_from_model(model, input_args):
A reviewer (Member) commented:

this implementation is strange, any issues with just using the same joint graph capture frontend as the rest of the repo?

torch_ir_with_fqn = _export(self.model, inputs)
# TODO Can't use fake mode here because it clashes with the user-level
# fake mode. Ideally dynamo should reuse the user-level fake mode.
self.joint_with_descriptors = aot_export_joint_with_descriptors(
    self.stack,
    torch_ir_with_fqn,
    inputs,
    decompositions=decomp_table,
)
gm = self.joint_with_descriptors.graph_module
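
For reference, a minimal self-contained sketch of that capture path follows. The import location and the assumption that the exporter accepts an eager nn.Module directly are not verified against this repo; the wrapper above also runs _export first to preserve FQNs, which this sketch skips.

import contextlib

import torch
from torch._functorch.aot_autograd import aot_export_joint_with_descriptors


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    def forward(self, x):
        return self.linear(x).relu()


model = TinyModel()
inputs = (torch.randn(2, 4),)

with contextlib.ExitStack() as stack:
    # Same call shape as the snippet above: (stack, module, inputs, ...),
    # but without a custom decomposition table.
    joint_with_descriptors = aot_export_joint_with_descriptors(
        stack,
        model,
        inputs,
    )
    gm = joint_with_descriptors.graph_module
    gm.print_readable()  # prints the joint forward+backward graph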

# Define save list with operations that might be in the graph
save_list = {
    torch.ops.aten.mm.default,
    torch.ops.aten.addmm.default,
A reviewer (Member) commented:

i think addmm always gets decomposed away, and if it doesn't, it will mess up your saved node count lol
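
A quick way to check that before asserting on saved-node counts is to tally the call_function targets in the captured joint graph; a sketch, assuming gm is the joint graph_module produced above:

from collections import Counter

import torch


def count_ops(gm):
    # Tally call_function targets in the FX graph so the test can see whether
    # aten.addmm.default survives decomposition or is rewritten into
    # aten.mm.default plus an add before the save_list is applied.
    return Counter(n.target for n in gm.graph.nodes if n.op == "call_function")


# Example usage on the joint graph module captured above:
# ops = count_ops(gm)
# print(ops[torch.ops.aten.addmm.default], ops[torch.ops.aten.mm.default])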
