Phi-2 SQL LoRA (lr=2e-4)

Fine-tuned microsoft/phi-2 on b-mc2/sql-create-context using QLoRA, achieving 76% exact match on SQL generation, up from a 2% base-model baseline.

This is Run 1 (lr=2e-4), the best-performing run. See also: phi2-sql-lora-lr5e4 (lr=5e-4, 70% EM).

Results

| Model | Exact Match | ROUGE-L | Δ vs Base |
|---|---|---|---|
| Phi-2 Base | 2.0% | 0.886 | - |
| This model (lr=2e-4) | 76.0% | 0.9903 | +74pp |

Evaluated on 50 held-out samples from sql-create-context (seed=42). Zero regressions: every query the base model got right, this model also got right.
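The exact-match metric above can be sketched as a normalized string comparison. The evaluation script is not published, so the normalization below (whitespace collapsing plus lowercasing) is an assumption and may differ from the actual scoring:

```python
def normalize_sql(query: str) -> str:
    """Collapse runs of whitespace and lowercase the query for comparison.

    NOTE: assumed normalization; the card does not document the exact rules.
    """
    return " ".join(query.strip().split()).lower()

def exact_match(predictions: list[str], references: list[str]) -> float:
    """Fraction of predictions identical to their reference after normalization."""
    hits = sum(
        normalize_sql(p) == normalize_sql(r)
        for p, r in zip(predictions, references)
    )
    return hits / len(references)

preds = ['SELECT name FROM employees  WHERE department = "engineering"']
refs = ['select name from employees where department = "engineering"']
print(exact_match(preds, refs))  # 1.0
```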

Training Details

| Parameter | Value |
|---|---|
| Method | QLoRA (4-bit NF4 + LoRA) |
| LoRA rank | 16 |
| LoRA alpha | 32 |
| Target modules | q_proj, v_proj |
| Dataset | 20,000 samples from sql-create-context |
| Epochs | 2 |
| Learning rate | 2e-4 |
| Effective batch size | 16 |
| Hardware | Kaggle T4 x2 |
| Training time | ~7 hours |
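The hyperparameters above can be expressed as a `peft`/`bitsandbytes` configuration. This is a sketch, not the released training script: the compute dtype, LoRA dropout, and bias setting are assumptions not stated in the card.

```python
# Config sketch matching the training table; values not in the table are assumed.
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",           # 4-bit NF4 quantization, per the table
    bnb_4bit_compute_dtype=torch.float16,  # assumed compute dtype
)

lora_config = LoraConfig(
    r=16,                                # LoRA rank
    lora_alpha=32,                       # LoRA alpha
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,                   # assumed; not stated in the card
    bias="none",                         # assumed; not stated in the card
    task_type="CAUSAL_LM",
)
```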

Usage

from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, AutoConfig
import torch

model_name = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token

config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
config.pad_token_id = tokenizer.pad_token_id

base = AutoModelForCausalLM.from_pretrained(
    model_name, config=config,
    torch_dtype=torch.float16, device_map="auto", trust_remote_code=True
)
model = PeftModel.from_pretrained(base, "antony-bryan-3D2Y/phi2-sql-lora-lr2e4")
model.eval()

prompt = """### SQL Schema:
CREATE TABLE employees (id INT, name VARCHAR, department VARCHAR, salary INT)

### Question:
What are the names of employees in the engineering department?

### SQL Query:
"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=100, do_sample=False,
                            eos_token_id=tokenizer.eos_token_id)
n = inputs['input_ids'].shape[1]
result = tokenizer.decode(output[0][n:], skip_special_tokens=True)
result = result.replace("</s>", "").replace("<|endoftext|>", "").split('\n')[0].strip()
print(result)
# → SELECT name FROM employees WHERE department = "engineering"
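The three-section prompt layout used above can be factored into a small helper. `build_prompt` is a hypothetical convenience function for this card, not part of the released adapter:

```python
def build_prompt(schema: str, question: str) -> str:
    """Assemble the '### SQL Schema / ### Question / ### SQL Query' prompt format."""
    return (
        f"### SQL Schema:\n{schema}\n\n"
        f"### Question:\n{question}\n\n"
        "### SQL Query:\n"
    )

prompt = build_prompt(
    "CREATE TABLE employees (id INT, name VARCHAR, department VARCHAR, salary INT)",
    "What are the names of employees in the engineering department?",
)
print(prompt)
```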
