Update README.md
README.md CHANGED
@@ -1,5 +1,4 @@
 ---
-
 language:
 - en
 - es
@@ -9,11 +8,11 @@ tags:
 base_model:
 - yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
 - TomGrc/FusionNet_7Bx2_MoE_14B
+license: apache-2.0
 ---

 # LogoS-7Bx2-MoE-13B-v0.1

 Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.

 LogoS is a model for experimenting with the MoE method, which can significantly increase the performance of the original model. The model has 12.9B parameters.
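For readers unfamiliar with the merge method named in the card, the sketch below illustrates spherical linear interpolation (SLERP) between two weight tensors in PyTorch. It is a toy illustration of the formula only; the actual merge pipeline, layer selection, and interpolation factors used to build LogoS are not described on this card.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors, with t in [0, 1]."""
    a, b = v0.flatten().float(), v1.flatten().float()
    # Angle between the two flattened weight vectors.
    cos_theta = torch.clamp(torch.dot(a, b) / (a.norm() * b.norm() + eps), -1.0, 1.0)
    theta = torch.arccos(cos_theta)
    if torch.sin(theta).abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        merged = (1 - t) * a + t * b
    else:
        merged = (torch.sin((1 - t) * theta) * a + torch.sin(t * theta) * b) / torch.sin(theta)
    return merged.reshape(v0.shape).to(v0.dtype)

# Example: blend two random "layer weights" halfway along the arc between them.
w0, w1 = torch.randn(4, 4), torch.randn(4, 4)
print(slerp(0.5, w0, w1))
```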
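A minimal usage sketch with the Hugging Face transformers library, assuming the model is published on the Hub under the repo id RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1 (inferred from the model name on this card; adjust if the actual path differs):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package; drop it to load on CPU.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = "Explain what a mixture-of-experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```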