AI4Bharat State Expert - Rajasthan

Qwen3-1.7B fine-tuned on the AI4Bharat Indic Languages dataset for Rajasthan. Trained at the SRMIST Vadapalani AI4Bharat Tune-Athon (Feb 26, 2026). The trained checkpoint is saved in this repository.

Training

Param           Value
Base            Qwen3-1.7B (4-bit)
Target modules  q_proj, k_proj, v_proj, o_proj, gate_proj, up_proj, down_proj
Splits          indic + conv + cult

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("Kitler2205/AI4Bharat-State-Expert-Rajasthan")
tok = AutoTokenizer.from_pretrained("Kitler2205/AI4Bharat-State-Expert-Rajasthan")

Merged FP16 model; no PEFT adapter loading is required.
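For end-to-end generation, a minimal sketch like the following can be used. It assumes the standard transformers text-generation API and that the tokenizer ships a chat template (as Qwen3 tokenizers do); the helper names and the example question are illustrative, not part of the released card.

```python
REPO = "Kitler2205/AI4Bharat-State-Expert-Rajasthan"

def build_messages(question: str) -> list[dict]:
    # Single-turn chat message list in the shape expected by
    # tokenizer.apply_chat_template.
    return [{"role": "user", "content": question}]

def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    # Imports kept local so the pure helpers above work without
    # transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(REPO)
    model = AutoModelForCausalLM.from_pretrained(REPO, device_map="auto")
    prompt = tok.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tok.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate_answer("What is the traditional folk dance of Rajasthan?"))
```

Loading the merged model this way needs roughly 4 GB of memory in FP16; pass a `torch_dtype` or quantization config to `from_pretrained` if memory is tight.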

