Warning from LuxLib when using OneHotArrays about Mixed Precision #1197

Open
oxinabox opened this issue Jan 9, 2025 · 0 comments
Comments

@oxinabox
Contributor

oxinabox commented Jan 9, 2025

What is going wrong with the following?

```julia
julia> using OneHotArrays
       using Lux
       using Random

       embed = Dense(5 => 2)
       ps, st = Lux.setup(Xoshiro(1), embed)
       xs = onehotbatch("aabc", "abcde")
       ys, _ = embed(xs, ps, st)
┌ Warning: Mixed-Precision `matmul_cpu_fallback!` detected and Octavian.jl cannot be used for this set of inputs (C [Matrix{Float32}]: A [Matrix{Float32}] x B [OneHotMatrix{UInt32, Vector{UInt32}}]). Falling back to generic implementation. This may be slow.
└ @ LuxLib.Impl ~/.julia/packages/LuxLib/ru5RQ/src/impl/matmul.jl:145
(Float32[-0.50204086 -0.50204086 -0.5297963 -0.5723163; -0.93569875 -0.93569875 -0.50831497 0.033260167], NamedTuple())
```

Multiplication with OneHotArrays should be really fast, since the package converts matrix multiplication into indexing.
But I guess either its overloaded matrix multiplication isn't being hit, or LuxLib is failing to detect that it is being hit and warning anyway.
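For reference, here is a minimal sketch of the fast path I would expect, outside of Lux entirely (the weight matrix `W` is a stand-in for `ps.weight`): OneHotArrays overloads `*` so that multiplying by a `OneHotMatrix` reduces to selecting columns, with no GEMM involved.

```julia
using OneHotArrays  # provides onehotbatch and the `*` overloads

W = rand(Float32, 2, 5)           # stand-in for a Dense layer's weight matrix
x = onehotbatch("aabc", "abcde")  # 5×4 OneHotMatrix; 'a'→1, 'b'→2, 'c'→3

# OneHotArrays dispatches this `*` to column indexing rather than matmul,
# so it should be equivalent to (and as cheap as) plain indexing:
y = W * x
@assert y == W[:, [1, 1, 2, 3]]
```

If the `Dense` layer's internals (via LuxLib) call into `matmul_cpu_fallback!` instead of hitting this `*` method, that would explain both the warning and the performance concern.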

OneHotArrays is used in the Lux docs a fair bit, so I thought it would be well supported.
