When sending a tensor produced by a ReLU activation to `float2bit`, `f` can contain exact zeros. `log2` then returns `-inf`, with all the complications that entails (an out-of-bounds index error in the subsequent `gather` call).
One quick fix is to add a small constant to the tensor `f`, small enough that it doesn't change the `e_scientific` value of nonzero entries.
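To make the failure mode and the epsilon workaround concrete, here is a minimal sketch of the numerics (the helper name `pad_zeros` and the `eps` value are my own illustrative choices, not part of the library; the same behavior applies to `torch` tensors):

```python
import numpy as np

def pad_zeros(f, eps=1e-38):
    # Hypothetical helper, not part of the library.
    # eps is far below float32 precision of typical activations, so for
    # nonzero values f + eps rounds back to f and the scientific-notation
    # exponent (e_scientific) is unchanged; only exact zeros are lifted
    # to a tiny representable value with a finite log2.
    return f + eps

# ReLU output containing an exact zero
relu_out = np.maximum(np.array([-1.0, 0.0, 2.0], dtype=np.float32), 0.0)

with np.errstate(divide="ignore"):
    raw = np.log2(relu_out)       # -inf at the zero entry
fixed = np.log2(pad_zeros(relu_out))  # finite everywhere
```

Since `log2` of the padded tensor is finite everywhere, the exponent lookup via `gather` no longer receives an infinite (out-of-bounds) index.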
Thanks for this library, it's been very helpful! :)