---
library_name: kernels
license: apache-2.0
---

This is the repository card for `kernels-community/quantization-eetq`, pushed to the Hub. It was built to be used with the [`kernels` library](https://github.com/huggingface/kernels). This card was automatically generated.

## How to use

```python
# Make sure `kernels` is installed: `pip install -U kernels`
from kernels import get_kernel

# Download the kernel from the Hub and load it as a module
kernel_module = get_kernel("kernels-community/quantization-eetq")
w8_a16_gemm = kernel_module.w8_a16_gemm

w8_a16_gemm(...)
```

## Available functions

- `w8_a16_gemm`
- `w8_a16_gemm_`
- `preprocess_weights`
- `quant_weights`

## Benchmarks

No benchmark available yet.
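
To give a rough intuition for what a weight-only int8 GEMM (`w8_a16`: int8 weights, 16-bit activations) computes, here is a hedged NumPy sketch. It is a reference of the assumed math only, not the kernel's real signature or dtype handling; the function name `w8_a16_gemm_reference`, the argument layout, and the per-output-channel scale convention are illustrative assumptions, and float32 is used in place of fp16 for clarity.

```python
# Illustrative sketch of the math a weight-only int8 GEMM is assumed
# to perform; names and argument layout are hypothetical, not the
# actual CUDA kernel's API.
import numpy as np


def w8_a16_gemm_reference(x, qweight, scale):
    """x: (m, k) float activations; qweight: (n, k) int8 weights;
    scale: (n,) per-output-channel dequantization scales."""
    # Dequantize the int8 weights back to float, then do a plain GEMM.
    w = qweight.astype(np.float32) * scale[:, None]
    return x @ w.T


rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4)).astype(np.float32)
w = rng.standard_normal((3, 4)).astype(np.float32)

# Symmetric per-output-channel int8 quantization of the weights.
scale = np.abs(w).max(axis=1) / 127.0
qweight = np.round(w / scale[:, None]).astype(np.int8)

y = w8_a16_gemm_reference(x, qweight, scale)
# y approximates x @ w.T up to quantization error
```

A real kernel fuses the dequantization into the matrix multiply on the GPU instead of materializing the float weights, which is where the memory savings come from.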