Mirror of https://github.com/dogkeeper886/ollama37.git (synced 2025-12-10 07:46:59 +00:00)
Cross attention Q and K projections need to have their heads swapped, similar to the non-cross attention Q and K tensors.
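As a rough illustration of what "swapping heads" means here, the sketch below applies a rotary-style repack to a flat Q or K weight matrix, head by head: each head's rows are viewed as a [2, headDim/2] block and the two halves are interleaved. This is a minimal sketch under assumed shapes and names (`permuteQK`, row-major float32 slices), not the actual conversion code in ollama37.

```go
package main

import "fmt"

// permuteQK applies a head-wise permutation of the kind the commit message
// describes: within each attention head, the projection's rows are treated as
// a [2, headDim/2] block and the two halves are interleaved (transposed),
// mirroring the repack used for the non-cross attention Q and K tensors.
// The function name and flat layout are assumptions for illustration only.
func permuteQK(w []float32, rows, cols, nHeads int) []float32 {
	headDim := rows / nHeads
	half := headDim / 2
	out := make([]float32, len(w))
	for h := 0; h < nHeads; h++ {
		for i := 0; i < half; i++ {
			for j := 0; j < 2; j++ {
				src := (h*headDim + j*half + i) * cols // row in the original layout
				dst := (h*headDim + i*2 + j) * cols    // row after interleaving the halves
				copy(out[dst:dst+cols], w[src:src+cols])
			}
		}
	}
	return out
}

func main() {
	// Toy example: one head, headDim = 4, a single column.
	// Rows 0, 1, 2, 3 become 0, 2, 1, 3 after the permutation.
	w := []float32{0, 1, 2, 3}
	fmt.Println(permuteQK(w, 4, 1, 1)) // [0 2 1 3]
}
```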