Update README.md
This is a [BAdam](https://arxiv.org/abs/2404.02827 "BAdam: A Memory Efficient Full Parameter Optimization Method for Large Language Models") and [LoRA+](https://arxiv.org/abs/2402.12354 "LoRA+: Efficient Low Rank Adaptation of Large Models") fine-tune of the danube2 base model. It uses the ChatML template and was trained on the [SystemChat-1.1](https://huggingface.co/datasets/abacusai/SystemChat-1.1) dataset from [Abacus.AI](https://huggingface.co/abacusai).

## Quants

Thank you [mradermacher](https://huggingface.co/mradermacher)!

- [mradermacher/danube2-1.8b-SystemChat-1.1-GGUF](https://huggingface.co/mradermacher/danube2-1.8b-SystemChat-1.1-GGUF)
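A minimal sketch of running one of these quants locally with llama-cpp-python; the quant file name and generation settings below are assumptions, so check the GGUF repo's file list for the exact names:

```python
# Sketch: run a GGUF quant locally with llama-cpp-python.
# Assumption: the file below was already downloaded from the GGUF repo;
# the exact quant file name is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="danube2-1.8b-SystemChat-1.1.Q4_K_M.gguf",  # hypothetical file name
    n_ctx=2048,      # context size for the session; adjust as needed
    verbose=False,
)

# Build a ChatML prompt (see the Template section below).
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Summarize what this model is in one sentence.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

out = llm(prompt, max_tokens=128, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```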
## Template

```jinja
<|im_start|>system
{{system}}<|im_end|>
<|im_start|>user
{{instruction}}<|im_end|>
<|im_start|>assistant
{{response}}<|im_end|>
```
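In code, the template expands with plain string formatting; a small sketch (the system prompt and question are made-up examples), where the prompt ends at the assistant header so the model generates the `{{response}}` part:

```python
# Sketch: expand the ChatML template above for a single system + user turn.
# The returned prompt ends with the assistant header, so the model's
# completion fills in the {{response}} slot.
def build_prompt(system: str, instruction: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{instruction}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

print(build_prompt("You are a helpful assistant.", "Explain LoRA+ in one sentence."))
```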
## BAdam