FLUX.2-dev (MLX) 4-bit (transformer + VAE)

This repo contains only the MLX transformer + VAE weights for black-forest-labs/FLUX.2-dev, quantized to 4-bit with mflux (MLX).
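For intuition, group-wise affine 4-bit quantization (the general scheme MLX-style quantizers use) can be sketched as below. This is an illustrative NumPy version, not mflux's actual implementation; the group size of 64 and the affine (min/scale) layout are assumptions for the example.

```python
import numpy as np

def quantize_4bit(w, group_size=64):
    """Illustrative group-wise affine 4-bit quantization (not mflux's exact scheme)."""
    w = w.reshape(-1, group_size)
    w_min = w.min(axis=1, keepdims=True)
    w_max = w.max(axis=1, keepdims=True)
    scale = (w_max - w_min) / 15.0  # 4 bits -> 16 levels (0..15)
    q = np.clip(np.round((w - w_min) / scale), 0, 15).astype(np.uint8)
    return q, scale, w_min

def dequantize_4bit(q, scale, w_min):
    """Reconstruct an approximation of the original weights."""
    return q.astype(np.float32) * scale + w_min

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 64)).astype(np.float32)
q, scale, zero = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale, zero).reshape(w.shape)
max_err = float(np.abs(w - w_hat).max())  # bounded by half a quantization step
```

Each 4-bit code plus a per-group scale and minimum replaces a full-precision weight, which is where the memory savings over the bf16 original come from.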

Notes:

  • The FLUX.2-dev text encoder is not included here (MFLUX-WEBUI loads it via Torch/Transformers).
  • Intended for use with the experimental flux2-dev loader in MFLUX-WEBUI.