Quantized Olmo 3 Collection
Verified models, all compatible with vLLM for very fast inference. Prefer the 3.1 models, as they are more recent.
23 items
This is allenai/Olmo-3-7B-Instruct quantized to NVFP4 with LLM Compressor. The model was created, tested, and evaluated by The Kaitchup. It is compatible with vLLM (tested with v0.11.1) and was run on an RTX 5090.
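A minimal inference sketch with vLLM's offline `LLM` API is shown below. The repo ID is a placeholder (assumption: substitute the actual quantized repo name from this page), and NVFP4 inference assumes a recent NVIDIA GPU such as the RTX 5090 used for testing; vLLM detects the quantization scheme from the model's config automatically.

```python
# Placeholder repo ID for this NVFP4 quantized model (assumption:
# replace with the actual repository name shown on this page).
MODEL_ID = "kaitchup/Olmo-3-7B-Instruct-NVFP4"


def run_inference(prompt: str) -> str:
    """Generate a completion with vLLM (requires vLLM >= 0.11.1 and a
    GPU with NVFP4 support). Imported lazily so this module can be
    read or inspected on machines without vLLM installed."""
    from vllm import LLM, SamplingParams

    llm = LLM(model=MODEL_ID)  # quantization config is auto-detected
    params = SamplingParams(temperature=0.7, max_tokens=128)
    outputs = llm.generate([prompt], params)
    return outputs[0].outputs[0].text


# On a supported machine, call e.g.:
# print(run_inference("Explain NVFP4 quantization in one sentence."))
```

Call `run_inference()` on a machine with vLLM installed and a compatible GPU; for serving instead of offline generation, the same repo ID can be passed to `vllm serve`.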
Subscribe to The Kaitchup; this helps me a lot to continue quantizing and evaluating models for free. Or you can buy me a coffee on Ko-fi.
Base model: allenai/Olmo-3-1025-7B