Conversation

samuelt0 (Contributor) commented Nov 8, 2025

What does this PR do?

Adds the is_cross_attention attribute to the LTX Attention class.

Fixes # (issue)
Mirrors the previous Attention class and allows us to distinguish between attention instances.

samuelt0 (Contributor, Author) commented Nov 8, 2025

@sayakpaul

self.dropout = dropout
self.out_dim = query_dim
self.heads = heads
self.is_cross_attention: Optional[bool] = None
Member commented:

I think we can set it to True / False based on the value of cross_attention_dim, no?

samuelt0 (Contributor, Author) replied:

self.cross_attention_dim is always 4096 for attn1 and attn2, though.

self.cross_attention_dim either gets set to cross_attention_dim or query_dim, which are both 4096 in my runs.

Member replied:

if cross_attention_dim is None: self.is_cross_attention = False -- what's up with this?
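A minimal sketch of what the suggested derivation could look like in the constructor (this is an illustrative stub, not the actual diffusers class; the signature and defaults here are assumptions based only on the diff context above):

from typing import Optional

class LTXAttention:  # illustrative stub, not the real diffusers implementation
    def __init__(self, query_dim: int, cross_attention_dim: Optional[int] = None,
                 heads: int = 8, dropout: float = 0.0):
        self.dropout = dropout
        self.out_dim = query_dim
        self.heads = heads
        # Suggested change: derive the flag from cross_attention_dim instead of
        # initializing it to None. This assumes self-attention layers are built
        # with cross_attention_dim=None, which is the point under discussion above.
        self.is_cross_attention = cross_attention_dim is not None

If both attn1 and attn2 are in fact constructed with cross_attention_dim=4096, as samuelt0 observes, this derivation would mark both as cross-attention, which is why the question above remains open.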
