
Conversation

@quic-muchhsu
Contributor

Description

A LayerNorm pattern without Mul(scale) and Add(bias) will be fused to LayerNorm(scale = 1, bias = nullptr).
If pattern matching breaks at a missing Mul(scale), it no longer falls back; instead it takes Div as the last node of the pattern, creates an initializer of ones as the scale, and returns a successful LayerNorm pattern with scale = 1 and bias = nullptr.
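The equivalence this fusion relies on can be sketched numerically: the subgraph that ends at Div (mean subtraction followed by division by the standard deviation) computes the same values as a full LayerNorm whose scale is an all-ones initializer and whose bias is zero. A minimal numpy sketch (the `layernorm` helper and epsilon value here are illustrative, not the QNN-EP implementation):

```python
import numpy as np

def layernorm(x, scale, bias, eps=1e-5):
    # Reference LayerNorm over the last axis.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps) * scale + bias

x = np.random.default_rng(0).standard_normal((2, 4)).astype(np.float32)

# Unfused pattern that stops at Div: no Mul(scale), no Add(bias).
mean = x.mean(axis=-1, keepdims=True)
div_out = (x - mean) / np.sqrt(x.var(axis=-1, keepdims=True) + 1e-5)

# Fusing with a ones initializer as scale (and zero bias) gives the same result.
fused = layernorm(x, np.ones(4, np.float32), np.zeros(4, np.float32))
assert np.allclose(div_out, fused)
```

This is why substituting a ones initializer for the missing Mul(scale) is safe: it changes the graph structure but not the computed values.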

Motivation and Context

An unfused LayerNorm pattern is computationally expensive. We should treat the pattern with a missing Mul(scale) as LayerNorm to get better performance.

Signed-off-by: Mu-Chein Hsu <[email protected]>
@quic-muchhsu changed the title Add missing scale and bias pattern to Layernorm fusion with scale = 1, bias = 0. [QNN-EP] Add missing scale and bias pattern to Layernorm fusion with scale = 1, bias = 0. Dec 18, 2025
