---
tags:
- gguf
- llama.cpp
- unsloth
license: mit
datasets:
- b-mc2/sql-create-context
language:
- en
metrics:
- accuracy
base_model:
- microsoft/Phi-3-mini-4k-instruct
---
This is a specialized Text-to-SQL model fine-tuned from Microsoft's Phi-3-mini-4k-instruct. It has been optimized with Unsloth to deliver high-accuracy SQL generation while remaining lightweight enough to run on consumer hardware.
## Key Features
- Architecture: Phi-3-mini (3.8B parameters)
- Quantization: Q4_K_M and Q5_K_M (GGUF)
- Training Technique: Fine-tuned with LoRA using Unsloth
- Format: GGUF (ready for Ollama, LM Studio, and llama.cpp; see the loading sketch below)

Provided files:
- phi-3-mini-4k-instruct.Q4_K_M.gguf
- phi-3-mini-4k-instruct.Q5_K.gguf
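The GGUF files can also be loaded directly with llama.cpp bindings instead of Ollama. Below is a minimal sketch using the llama-cpp-python package (the package and its usage here are assumptions, not part of this card); the prompt follows the same chat template as the Modelfile further down.

```python
# Minimal sketch using llama-cpp-python (an assumption; not covered by this card).
# pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="phi-3-mini-4k-instruct.Q4_K_M.gguf",  # Q4_K_M file from this repo
    n_ctx=2048,                                       # recommended context window
)

schema = "CREATE TABLE table_name_7 (nba_draft VARCHAR, school VARCHAR)"
question = "What was the NBA draft status for Northeast High School?"

# Prompt follows the Phi-3 chat template used in the Modelfile below.
prompt = f"<s><|user|>\nSchema: {schema}\nQuestion: {question}<|end|>\n<|assistant|>\n"

out = llm(
    prompt,
    max_tokens=128,
    temperature=0.0,            # deterministic output for SQL
    stop=["<|end|>", "</s>"],   # stop tokens from the recommended settings
)
print(out["choices"][0]["text"].strip())
```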
## Usage Instructions

### Ollama (Recommended)

To deploy locally:

1. Download the `.gguf` file (Q4 or Q5).
2. Create the `Modelfile` with the following instructions:
```
FROM ./phi-3-mini-4k-instruct.Q4_K_M.gguf

TEMPLATE """<s><|user|>
Schema: {{ .System }}
Question: {{ .Prompt }}<|end|>
<|assistant|>
"""

# Parameters for SQL stability
PARAMETER stop "<|end|>"
PARAMETER stop "<s>"
PARAMETER stop "</s>"
PARAMETER temperature 0.0
```
3. Create the model:

```bash
ollama create phi3-sql-expert -f Modelfile
```

4. Run it:

```bash
ollama run phi3-sql-expert "schema: CREATE TABLE table_name_7 (nba_draft VARCHAR, school VARCHAR) question: What was the NBA draft status for Northeast High School?"
```

The answer should be:

```sql
SELECT nba_draft FROM table_name_7 WHERE school = "Northeast"
```
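Because the Modelfile template maps `{{ .System }}` to the schema and `{{ .Prompt }}` to the question, the schema can also be supplied separately through Ollama's HTTP API. Below is a minimal Python sketch, assuming the requests package and a local Ollama server on the default port; treat the exact payload as an assumption if your Ollama version differs.

```python
# Minimal sketch against Ollama's /api/generate endpoint (assumed running on localhost:11434).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3-sql-expert",
        # {{ .System }} in the Modelfile template receives the schema:
        "system": "CREATE TABLE table_name_7 (nba_draft VARCHAR, school VARCHAR)",
        # {{ .Prompt }} receives the natural-language question:
        "prompt": "What was the NBA draft status for Northeast High School?",
        "stream": False,
        "options": {"temperature": 0.0, "stop": ["<|end|>"]},
    },
    timeout=60,
)
print(resp.json()["response"])
```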
## Training Data

The model was fine-tuned on the sql-create-context dataset (an illustrative record is sketched after this list), focusing on:
- Mapping natural language to SQL queries with SELECT, WHERE, and JOIN statements.
- Understanding table schemas provided in the prompt.
- Maintaining strict SQL syntax in the response.
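For illustration, a training record pairs a CREATE TABLE context with a question and the target query. The field names below are assumptions about the dataset layout; the values are taken from the usage example earlier in this card.

```python
# Hypothetical record in the style of sql-create-context (field names are assumptions).
example = {
    "context": "CREATE TABLE table_name_7 (nba_draft VARCHAR, school VARCHAR)",
    "question": "What was the NBA draft status for Northeast High School?",
    "answer": 'SELECT nba_draft FROM table_name_7 WHERE school = "Northeast"',
}
```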
## Recommended Settings

- Temperature: 0.0 or 0.1 (SQL requires deterministic output).
- Stop Tokens: Ensure <|end|> is set as a stop sequence to prevent "infinite looping" generation.
- Context Window: 2048 tokens.
Model Developer: msquared
Base Model: Phi-3-mini-4k-instruct