
Add LoRALinearLayer #2

Merged: RyanJDick merged 2 commits into main from ryan/lora-linear-layer on Aug 2, 2023
Conversation

@RyanJDick (Collaborator) commented Aug 1, 2023

Adds a LoRALinearLayer with unit tests.

@RyanJDick mentioned this pull request Aug 1, 2023
A review comment was left on the constructor signature in the diff:

```python
out_features: int,
rank: int = 4,
alpha: float = 1.0,
device: torch.device = None,
```

Collaborator: Ah, here's the `device` I was asking about in the other PR
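For context, here is a minimal sketch of how a LoRA linear layer with this signature is commonly implemented. This is not necessarily the implementation merged in this PR; the `in_features` parameter, the `_down`/`_up` attribute names, and the initialization scheme are assumptions based on the standard LoRA formulation.

```python
import torch


class LoRALinearLayer(torch.nn.Module):
    """Sketch of a LoRA layer: a low-rank, residual update to a frozen Linear."""

    def __init__(
        self,
        in_features: int,
        out_features: int,
        rank: int = 4,
        alpha: float = 1.0,
        device: torch.device = None,
    ):
        super().__init__()
        if rank > min(in_features, out_features):
            raise ValueError("LoRA rank must be <= min(in_features, out_features).")

        # LoRA factors the weight update into two low-rank projections.
        self._down = torch.nn.Linear(in_features, rank, bias=False, device=device)
        self._up = torch.nn.Linear(rank, out_features, bias=False, device=device)

        # Common LoRA initialization: the down projection is random, and the
        # up projection starts at zero so the layer initially contributes nothing.
        torch.nn.init.normal_(self._down.weight, std=1.0 / rank)
        torch.nn.init.zeros_(self._up.weight)

        self._rank = rank
        self._alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scale the low-rank update by alpha / rank, as in the LoRA paper.
        return self._up(self._down(x)) * (self._alpha / self._rank)
```

A layer like this is typically applied residually alongside the frozen base layer, e.g. `y = base_linear(x) + lora_layer(x)`.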

Base automatically changed from ryan/init-repo to main August 2, 2023 13:40
@brandonrising (Collaborator) left a review:

Nothing stands out as breaking to me; I look forward to learning how to test it myself!

@RyanJDick (Collaborator, Author) commented:

Added 2225965 to address #3 (comment)

@RyanJDick merged commit 805097f into main on Aug 2, 2023
1 check passed
@RyanJDick deleted the ryan/lora-linear-layer branch on August 2, 2023 at 15:07