`quantized_bits` should behave the same with the default value `alpha = None` and with `alpha = 1.0`:

- `scale` is set to `1.0` with `self.alpha = None`: https://github.com/google/qkeras/blob/master/qkeras/quantizers.py#L550
- `scale` is set to `1.0` with `self.alpha = 1.0`: https://github.com/google/qkeras/blob/master/qkeras/quantizers.py#L597
And they do return the same values, as demonstrated with this example:
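A minimal sketch of such a comparison, assuming 8-bit `quantized_bits` and uniformly random inputs (the bit width, integer bits, and test data here are assumptions, not the original snippet):

```python
import numpy as np
from qkeras import quantized_bits

x = np.random.uniform(-1.0, 1.0, size=(1000,)).astype(np.float32)

q_none = quantized_bits(bits=8, integer=2, alpha=None)  # default alpha
q_one = quantized_bits(bits=8, integer=2, alpha=1.0)

# Indices where the two quantizers disagree; an empty array means they agree everywhere.
print(np.where(np.array(q_none(x)) != np.array(q_one(x)))[0])
```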
This returns an empty array, meaning that both quantizers return the same values.
However, with recurrent networks, switching between these two changes the result. I have attached a small reproducible example, based on the test code from `qrecurrent_test.py`, to demonstrate the behavior: `QLSTM` and `QSimpleRNN` both give results that do not match (see the sketch below the outputs).

Expected output:
Actual output:
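A minimal sketch of this kind of recurrent comparison, assuming a `QSimpleRNN` layer with identical weights and only `alpha` changed between the two models (the layer size, input shape, and quantizer settings are assumptions, not the values from the attached script):

```python
import numpy as np
import tensorflow as tf
from qkeras import QSimpleRNN, quantized_bits

np.random.seed(42)
x = np.random.uniform(-1.0, 1.0, size=(2, 4, 3)).astype(np.float32)

def build_model(alpha):
    # Same quantizer configuration everywhere; only alpha differs between the two models.
    q = quantized_bits(bits=8, integer=2, alpha=alpha)
    inputs = tf.keras.Input(shape=(4, 3))
    outputs = QSimpleRNN(
        units=5,
        kernel_quantizer=q,
        recurrent_quantizer=q,
        bias_quantizer=q,
    )(inputs)
    return tf.keras.Model(inputs, outputs)

m_none = build_model(alpha=None)
m_one = build_model(alpha=1.0)
# Copy the weights so the alpha setting is the only difference.
m_one.set_weights(m_none.get_weights())

y_none = m_none.predict(x)
y_one = m_one.predict(x)
# For non-recurrent layers these agree; with QSimpleRNN/QLSTM they reportedly differ.
print(np.max(np.abs(y_none - y_one)))
```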