Files
ollama37/convert
Michael Yang adff143bcd fix: mllama quality (#10807)
* fix mllama convert

- transform attn_gate and ffn_gate
- swap attention heads for vision models

* fix mllama

- the mlp gate was applied in the wrong place
2025-05-22 11:30:49 -07:00
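The commit mentions swapping attention heads during vision-model conversion. A minimal sketch of what such a head transform can look like, assuming a NumPy weight matrix of shape `(out_dim, in_dim)`; the function name `permute_attn_heads` and the split-vs-interleaved layout choice are illustrative assumptions, not the actual converter code:

```python
import numpy as np

def permute_attn_heads(w: np.ndarray, n_heads: int) -> np.ndarray:
    """Reorder the output rows of an attention projection so that,
    within each head, the two rotary half-blocks are swapped.
    Hypothetical sketch of a converter-style head permutation."""
    out_dim, in_dim = w.shape
    head_dim = out_dim // n_heads
    return (w.reshape(n_heads, 2, head_dim // 2, in_dim)
             .swapaxes(1, 2)
             .reshape(out_dim, in_dim))

# Example: one head, head_dim=4 -> row order [0, 2, 1, 3]
w = np.arange(4.0).reshape(4, 1)
print(permute_attn_heads(w, 1).flatten().tolist())
```

Transforms like this only rearrange rows; the matrix shape and values are preserved, so the converted checkpoint stays numerically identical once the runtime reads heads in the new layout.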