Paper: Resolving Interference When Merging Models (arXiv:2306.01708)
This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method, with dphn/Dolphin-Mistral-24B-Venice-Edition as the base.
The following models were included in the merge:

* Gryphe/Codex-24B-Small-3.2
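
For context, the sketch below illustrates the TIES procedure on a single parameter tensor: trim each task vector to its largest-magnitude entries, elect a majority sign per parameter, and average only the values that agree with it. This is a minimal illustration in PyTorch, not mergekit's implementation; the `density` and `weight` names mirror the configuration shown further down, and the tensors are random stand-ins.

```python
# Illustrative sketch of TIES merging on one parameter tensor (not mergekit's code).
import torch

def ties_merge_tensor(base, finetuned, density=0.4, weights=None):
    weights = weights or [1.0] * len(finetuned)
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = (ft - base) * w                          # weighted task vector
        k = max(1, int(density * delta.numel()))         # keep top `density` fraction by magnitude
        cutoff = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        deltas.append(torch.where(delta.abs() >= cutoff, delta, torch.zeros_like(delta)))
    stacked = torch.stack(deltas)                        # (num_models, *param_shape)
    sign = torch.sign(stacked.sum(dim=0))                # elect the majority sign per parameter
    agree = (torch.sign(stacked) == sign) & (stacked != 0)
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged                                 # apply the merged delta to the base weights

# Toy example with random tensors standing in for one layer's weights
base = torch.randn(8, 8)
codex = base + 0.1 * torch.randn(8, 8)                   # stand-in for the fine-tuned model
merged = ties_merge_tensor(base, [codex], density=0.4, weights=[1.0])
```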
The following YAML configuration was used to produce this model:
base_model: dphn/Dolphin-Mistral-24B-Venice-Edition
merge_method: ties
models:
  - model: Gryphe/Codex-24B-Small-3.2
    parameters:
      density: 0.4
      weight: 1.0
dtype: bfloat16
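
As a usage sketch, the merge can be reproduced by saving the configuration above (e.g. as `config.yaml`) and running mergekit's `mergekit-yaml config.yaml ./merged-model` entry point; the snippet below then loads that assumed output directory with transformers. The output path and prompt are placeholders, not part of this model card.

```python
# Hedged usage sketch: load the merged checkpoint from an assumed local
# output directory (./merged-model) and generate a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder: output directory passed to mergekit-yaml
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```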