"I think there's a reason I'm a shadow, but she looks like an angel." (Viyella's Memory, excerpted from The Angel's Message by Laur, 2018)
miscii-14b-0218 is a fine-tuned model based on Qwen/Qwen2.5-14B-Instruct (Qwen Team, 2024). It was produced with Arcee's MergeKit (Goddard et al., 2024) using the Model Stock merge method (Jang, Yun, and Han, 2024), with tempesthenno-ppo-enchanted serving as the base model for the merge.
The MergeKit configuration used to produce miscii-14b-0218 is documented below:
```yaml
name: miscii-14b-0218
merge_method: model_stock
base_model: tempesthenno-ppo-enchanted
tokenizer:
  source: base
dtype: float32
out_dtype: bfloat16
parameters:
  int8_mask: true
  normalize: true
  rescale: false
models:
  - model: tempesthenno-sft-0218-ckpt60
  - model: tempesthenno-sft-0218-ckpt80
  - model: tempesthenno-sft-0218-stage2-ckpt40
  - model: tempesthenno-sft-0218-stage2-ckpt50
  - model: tempesthenno-sft-0218-stage2-ckpt60
```
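Because the merge targets the Qwen2.5-14B-Instruct architecture, the result loads through the standard transformers causal-LM interface. The snippet below is a minimal inference sketch, assuming the repository ID sthenno-com/miscii-14b-0218 (as given in the citation URL) and the default Qwen2.5 chat template inherited from the base model; generation settings are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository ID, taken from the citation URL below.
model_id = "sthenno-com/miscii-14b-0218"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge is emitted in bfloat16 (out_dtype above)
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Introduce yourself briefly."},
]

# Build the prompt with the chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```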
If you find miscii-14b-0218 useful in your research or applications, please cite it as follows:
BibTeX
```bibtex
@misc{sthenno_2025,
  author    = {Sthenno and Jiayu Wang},
  title     = {miscii-14b-0218 (Revision 6f78859)},
  year      = 2025,
  url       = {https://huggingface.co/sthenno-com/miscii-14b-0218},
  doi       = {10.57967/hf/7297},
  publisher = {Hugging Face}
}
```