Releases · Blaizzy/mlx-vlm
v0.1.13
v0.1.12
v0.1.11
What's Changed
- Chat in CLI by @chigkim in #168
- Fix skip-vision predicate and add utils unit test (quantize and inputs) by @Blaizzy in #172
- Refactor topk to use mlx.core (DS-VL2) by @Blaizzy in #175 (see the sketch at the end of these notes)
- Fix trainer and Qwen2-VL by @Blaizzy in #179
- Pin latest mlx by @Blaizzy in #184
New Contributors
Full Changelog: v0.1.10...v0.1.11
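PR #175 above replaces a hand-rolled top-k in the DeepSeek-VL2 port with mlx.core's built-in ops. As a generic illustration only (toy router logits, not the actual DS-VL2 gating code), top-k selection with mlx.core might look like this:

```python
import mlx.core as mx

# Toy router logits: 4 tokens, 8 experts.
logits = mx.random.normal((4, 8))
k = 2

# Values of the k largest logits per token, using mlx.core's built-in topk.
top_vals = mx.topk(logits, k, axis=-1)

# topk returns values only; the (unordered) indices of the k largest entries
# can be recovered with argpartition along the same axis.
kth = logits.shape[-1] - k
top_idx = mx.argpartition(logits, kth, axis=-1)[..., -k:]

print(top_vals.shape, top_idx.shape)  # (4, 2) (4, 2)
```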
v0.1.10
v0.1.9
v0.1.8
v0.1.7
What's Changed
- Fix multi-image support and 2x speed improvement (DS-VL2) by @Blaizzy in #157
- Refactor utils (model loading, inference and output processing) by @Blaizzy in #161
- Fix Llama-3.2-Vision (18x faster generation and 75% less memory usage) by @Blaizzy in #163
⚠️ Breaking Changes
This release introduces breaking changes, most notably from the utils refactor in #161 (model loading, inference, and output processing). If you encounter any issues, please open an issue or submit a PR.
Full Changelog: v0.1.6...v0.1.7
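For anyone crossing the utils refactor from #161, the high-level flow is unchanged: load a model and processor, build a chat-formatted prompt, then generate. The sketch below assumes the `load`, `generate`, `apply_chat_template`, and `load_config` helpers documented in the project README; exact argument names and order have shifted between releases, so treat it as illustrative rather than canonical.

```python
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

# Any mlx-community VLM checkpoint works here; this one is just an example.
model_path = "mlx-community/Qwen2-VL-2B-Instruct-4bit"
model, processor = load(model_path)
config = load_config(model_path)

# Build a chat-formatted prompt for a single image.
image = ["path/to/image.jpg"]
prompt = apply_chat_template(processor, config, "Describe this image.", num_images=len(image))

# Run inference; argument names and order may differ across releases.
output = generate(model, processor, prompt, image, verbose=False)
print(output)
```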