
Type Conversions in Layer Norm Fusion #17281

Closed
philipphack wants to merge 2 commits into openxla:main from philipphack:u_layer_convert_xla

Conversation

philipphack
Contributor

Enables the fusion of layer norm graphs with type conversions of input, scale and bias.
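
For context, a minimal sketch of the mechanism (illustrative only; SkipConvert is a hypothetical helper, not this PR's actual code): when matching the layer norm graph, the rewriter can now look through a kConvert surrounding the input, scale, or bias instead of rejecting the pattern.

#include "xla/hlo/ir/hlo_instruction.h"
#include "xla/hlo/ir/hlo_opcode.h"

// Looks through an optional type conversion: returns the convert's operand
// when `instr` is a kConvert, and `instr` itself otherwise, so the rest of
// the pattern match can proceed on the unconverted value.
const xla::HloInstruction* SkipConvert(const xla::HloInstruction* instr) {
  return instr->opcode() == xla::HloOpcode::kConvert ? instr->operand(0)
                                                     : instr;
}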

NaiyerRizz self-assigned this Sep 18, 2024
Comment on lines 959 to 963
// shapes of scale and bias must match. If a conversion to the type of the
// input is the only user of the output, set the output to the conversion.
// Similarly, if one of the users of the scale/bias is a conversion to the
// type of the bias/scale, set the scale/bias to the conversion.
if (instr->user_count() == 1 &&
Member

I am nervous about doing an incorrect rewrite based on these conversions you allow. What if the user, e.g., converts the input of layer norm from f32 to s2 and does all the layer norm logic in s2 before casting back to f32? It would be incorrect, I think, to rewrite this to a cuDNN layer norm in full precision. Can we at least check that all the types are floats with at least bf16 precision?
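
A minimal sketch of such a guard (IsAtLeastBf16Float is a hypothetical helper name, and bit width is used here as a proxy for "at least bf16 precision"):

#include "xla/primitive_util.h"
#include "xla/xla_data.pb.h"

// Accepts only floating-point element types at least as wide as bf16, so
// integer intermediates (e.g. s2) and narrower float types are not rewritten
// into a full-precision cuDNN layer norm.
bool IsAtLeastBf16Float(xla::PrimitiveType type) {
  return xla::primitive_util::IsFloatingPointType(type) &&
         xla::primitive_util::BitWidth(type) >=
             xla::primitive_util::BitWidth(xla::BF16);
}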

Comment on lines 969 to 975
for (HloInstruction* scale_user : scale->users()) {
if (scale_user->opcode() == HloOpcode::kConvert &&
ShapeUtil::SameElementType(scale_user->shape(), bias->shape())) {
scale = scale_user;
break;
}
}
Member

Why check if there is a convert user? This user might not even be part of the layer norm.
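
A hedged sketch of a stricter alternative, using the pattern matcher so that only a convert actually consumed by the layer norm graph is accepted (norm_operand is a hypothetical handle on the instruction the matcher is visiting; names follow xla/service/pattern_matcher.h):

#include "xla/service/pattern_matcher.h"

namespace m = xla::match;

// Succeeds only when `norm_operand` is convert(scale), i.e. the conversion
// sits on the path the layer norm pattern itself consumes; unrelated convert
// users of `scale` cannot influence the rewrite.
bool MatchesConvertedScale(xla::HloInstruction* norm_operand,
                           xla::HloInstruction* scale) {
  return xla::Match(norm_operand, m::Convert(m::Op().Is(scale)));
}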

@reedwm
Member

reedwm commented Oct 4, 2024

This looks good to me. As we discussed offline, you want to do some more testing on models first, so I'll wait until you do that before approving, otherwise this may get prematurely merged.

@reedwm
Member

reedwm commented Oct 8, 2024

Approving as you confirmed the testing did not find issues.

copybara-service bot pushed a commit that referenced this pull request Oct 8, 2024
Imported from GitHub PR #17281

Enables the fusion of layer norm graphs with type conversions of input, scale and bias.
Copybara import of the project:

--
2880fde by Philipp Hack <[email protected]>:

Layer norm fusion with type conversions of input, scale and bias.

--
898f002 by Philipp Hack <[email protected]>:

Layer norm fusion with type conversions of input, scale and bias.

Merging this change closes #17281

FUTURE_COPYBARA_INTEGRATE_REVIEW=#17281 from philipphack:u_layer_convert_xla 898f002
PiperOrigin-RevId: 683664734
copybara-service bot pushed a commit to tensorflow/tensorflow that referenced this pull request Oct 8, 2024
Imported from GitHub PR openxla/xla#17281 (same Copybara import as above; PiperOrigin-RevId: 683664734)
copybara-service bot closed this in becd9d6 Oct 8, 2024
copybara-service bot pushed a commit to tensorflow/tensorflow that referenced this pull request Oct 8, 2024
Imported from GitHub PR openxla/xla#17281 (same Copybara import as above; PiperOrigin-RevId: 683682296)
copybara-service bot pushed a commit to tensorflow/tensorflow that referenced this pull request Oct 8, 2024
FUTURE_COPYBARA_INTEGRATE_REVIEW=openxla/xla#17281 from philipphack:u_layer_convert_xla 898f002f419909f367e6f9eedc72c61f4c73d201
PiperOrigin-RevId: 671073597