Llama-3.2-3B-RCT-Spiral / adapter_config.json
Upload adapter_config.json with huggingface_hub
b9ab862 verified
```json
{
  "lora_layers": 8,
  "num_layers": 28,
  "lora_parameters": {
    "rank": 16,
    "scale": 2.0,
    "dropout": 0.05,
    "keys": ["self_attn.q_proj", "self_attn.v_proj"]
  }
}
```
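In MLX-LM's convention, `lora_layers: 8` applies adapters to the last 8 of the model's 28 transformer layers, attaching them to the listed `keys` (here the query and value projections). The sketch below, a minimal numpy illustration rather than the actual MLX implementation, shows how `rank`, `scale`, and `dropout` combine in a standard LoRA forward pass; `d_model` and all weight values are hypothetical.

```python
import numpy as np

# Illustrative shapes only; real projections come from Llama-3.2-3B.
d_model = 64   # hypothetical hidden size for this sketch
rank = 16      # "rank" from adapter_config.json
scale = 2.0    # "scale": multiplier on the low-rank update
p_drop = 0.05  # "dropout": applied to the LoRA path during training only

rng = np.random.default_rng(0)
W = rng.normal(size=(d_model, d_model))      # frozen base weight (e.g. q_proj)
A = rng.normal(size=(rank, d_model)) * 0.01  # trainable down-projection
B = np.zeros((d_model, rank))                # trainable up-projection, zero-init

def lora_forward(x, training=False):
    # Base path plus low-rank update: y = x W^T + scale * (x A^T) B^T
    h = x
    if training:
        mask = rng.random(x.shape) >= p_drop
        h = x * mask / (1.0 - p_drop)        # inverted dropout on the LoRA path
    return x @ W.T + scale * (h @ A.T) @ B.T

x = rng.normal(size=(4, d_model))
y = lora_forward(x)
# With B zero-initialized, the adapter starts as an exact no-op.
assert np.allclose(y, x @ W.T)
```

Because `B` starts at zero, training begins from the base model's behavior and only gradually deviates as the adapter weights are updated.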