Samzy17/dpo-mistral-7b-harmless-v2

PEFT · Safetensors · arxiv:1910.09700
84.5 MB · 1 contributor · 4 commits
Latest commit: Update README.md (aebbfba, verified) by Samzy17, 4 months ago
.gitattributes                  1.52 kB     initial commit                                              4 months ago
README.md                       5.31 kB     Update README.md                                            4 months ago
adapter_config.json             736 Bytes   Upload folder using huggingface_hub                         4 months ago
adapter_model.safetensors       83.9 MB     Upload folder using huggingface_hub                         4 months ago
eval_results_50_samples.json    80.8 kB     Upload eval_results_50_samples.json with huggingface_hub    4 months ago
special_tokens_map.json         437 Bytes   Upload folder using huggingface_hub                         4 months ago
tokenizer.model                 493 kB      Upload folder using huggingface_hub                         4 months ago
tokenizer_config.json           2.13 kB     Upload folder using huggingface_hub                         4 months ago
training_args.bin               5.33 kB     Upload folder using huggingface_hub                         4 months ago