Liquid AI Quantised Models

A collection of Liquid AI quantised models (3 items) packaged for direct deployment with Ollama.
This is a quantized GGUF model (Q4_K_M) compatible with Ollama.
You can pull and run this model directly with Ollama:
ollama pull hf.co/Sadiah/ollama-q4_k_m-LFM2-2.6B:Q4_K_M
Then run it:
ollama run hf.co/Sadiah/ollama-q4_k_m-LFM2-2.6B:Q4_K_M "Write your prompt here"
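Beyond the CLI shown above, a locally running Ollama server also exposes a streaming REST API. The sketch below is a minimal, hedged example of calling it from Python: it assumes Ollama's default local endpoint (`http://localhost:11434/api/generate`) and the default line-delimited JSON streaming format, where each line carries a `response` chunk and a `done` flag. The helper names are illustrative, not part of any official client.

```python
import json
import urllib.request

# Assumed defaults: Ollama serving locally on its standard port.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "hf.co/Sadiah/ollama-q4_k_m-LFM2-2.6B:Q4_K_M"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": MODEL, "prompt": prompt, "stream": True}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

def collect_stream(lines) -> str:
    """Ollama streams one JSON object per line; join the 'response' chunks."""
    out = []
    for line in lines:
        chunk = json.loads(line)
        out.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(out)

# Usage (requires a running Ollama server):
#   req = build_request("Write your prompt here")
#   with urllib.request.urlopen(req) as resp:
#       print(collect_stream(resp))
```

The streaming format means you can print chunks as they arrive instead of buffering the whole reply, which is useful for longer generations.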
Please refer to the original model card for licensing information.
Quantisation: 4-bit
Base model: LiquidAI/LFM2-2.6B