BGE-Reranker-v2-m3 LoRA for Multi-hop Path Ranking

A fine-tuned LoRA adapter for BAAI/bge-reranker-v2-m3, trained to rerank multi-hop reasoning paths.

Model Details

  • Base Model: BAAI/bge-reranker-v2-m3 (568M parameters)
  • Adapter Type: LoRA (see the configuration sketch below)
  • Trainable Parameters: 1,247,233
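
The exact LoRA hyperparameters for this checkpoint are not published in this card; the snippet below is only a minimal sketch of how such an adapter is typically attached to the base reranker with peft. The values of r, lora_alpha, lora_dropout, and target_modules are assumptions, not the settings used to train this adapter.

from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

# Attach a LoRA adapter for training. The hyperparameters here are assumed
# illustrative values, not the ones used for this checkpoint.
base = AutoModelForSequenceClassification.from_pretrained("BAAI/bge-reranker-v2-m3")
lora_config = LoraConfig(
    task_type="SEQ_CLS",                # cross-encoder relevance-score head
    r=8,                                # assumed rank
    lora_alpha=16,                      # assumed scaling factor
    lora_dropout=0.05,
    target_modules=["query", "value"],  # assumed: XLM-R attention projections
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()      # reports the trainable-parameter count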

Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

# Load the base cross-encoder reranker, then attach the LoRA adapter.
base_model = AutoModelForSequenceClassification.from_pretrained(
    "BAAI/bge-reranker-v2-m3", trust_remote_code=True
)
model = PeftModel.from_pretrained(base_model, "minhnv7/bge-reranker-v2m3-lora")
tokenizer = AutoTokenizer.from_pretrained("minhnv7/bge-reranker-v2m3-lora")
model.eval()
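
Like the base reranker, the adapted model scores (query, passage) pairs with a cross-encoder. The following is a minimal scoring sketch; the example query and candidate paths are placeholders.

import torch

# Score (query, candidate path) pairs; higher logit = more relevant.
pairs = [
    ["who directed the film adapted from the novel?",
     "novel -> adapted as -> film -> directed by -> director"],
    ["who directed the film adapted from the novel?",
     "novel -> author -> birthplace"],
]
with torch.no_grad():
    inputs = tokenizer(pairs, padding=True, truncation=True,
                       max_length=512, return_tensors="pt")
    scores = model(**inputs).logits.view(-1).float()
print(scores)

For deployment, the LoRA weights can be folded into the base model with model.merge_and_unload(), which removes the adapter overhead at inference time.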