ollama37/model/models
Michael Yang adff143bcd fix: mllama quality (#10807)
* fix mllama convert

- transform attn_gate and ffn_gate
- swap attention heads for vision models

* fix mllama

the MLP gate was applied in the wrong place
2025-05-22 11:30:49 -07:00
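The "swap attention heads for vision models" bullet refers to reordering the rows of the attention projection weights during model conversion. A minimal sketch of that kind of transform, assuming the llama.cpp-style head permutation commonly used in HF-to-GGUF converters; the function name and exact layout here are illustrative assumptions, not the actual ollama converter code:

```python
import numpy as np

def permute_heads(w: np.ndarray, n_heads: int) -> np.ndarray:
    """Reorder the rows of an attention projection so each head's two
    rotary halves are interleaved (llama.cpp-style permute).
    Illustrative sketch only -- not the actual ollama converter code."""
    return (w.reshape(n_heads, 2, w.shape[0] // n_heads // 2, *w.shape[1:])
             .swapaxes(1, 2)
             .reshape(w.shape))

# Example: 2 heads, head_dim 4 -- rows within each head are
# reordered from [0, 1, 2, 3] to [0, 2, 1, 3].
q = np.arange(32).reshape(8, 4)
q_perm = permute_heads(q, n_heads=2)
```

The permute only moves rows around; it never changes values, so the output has the same shape and the same multiset of rows as the input, just in an order the inference code expects.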