Update README.md
README.md CHANGED
@@ -9,7 +9,7 @@ tags:
-Borealis-10.7B is a 10.7B model made of 48 Mistral 7B layers, fine-tuned for 70+ hours on 2x A6000 on a large RP and conversational dataset.
+Borealis-10.7B is a 10.7B model made of 48 Mistral 7B layers, fine-tuned for 70+ hours on 2x A6000 on a large RP and conversational dataset, using the llama2 configuration of Axolotl, like SOLAR.
 
 Next step would be to do a DPO train on top, but I don't know if it would be beneficial.
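
Not part of the diff, but for context: a minimal sketch of how a SOLAR-style, depth-upscaled model like this could be loaded and sanity-checked with Transformers. The repo id and prompt below are placeholders (this commit does not name them); the layer-count check simply reflects the 48-layer layout described above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; replace with the actual Hugging Face repo for Borealis-10.7B.
model_id = "Borealis-10.7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# SOLAR-style depth upscaling: 48 decoder layers instead of Mistral 7B's 32.
print(model.config.num_hidden_layers)  # expected: 48

# Placeholder prompt; the README's actual prompt template is not shown in this diff.
prompt = "Hello, how are you today?"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```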