Fine-tuned BART for Medical Discharge Summarization

Model Description

BART-large-CNN fine-tuned on the MIMIC-III dataset, optimized for simplifying medical discharge summaries with smart-extraction preprocessing.

Training Configuration

  • Base Model: facebook/bart-large-cnn
  • Optimization: Smart extraction (150 tokens max)
  • Pipeline: Smart Extract → BART → RAG → Gemini
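
The pipeline above can be sketched as follows. This is a minimal illustration, not the card's actual code: `smart_extract` is a hypothetical stand-in for the smart-extraction step (here, simple sentence-level truncation to the 150-token budget), and the model id is a placeholder — substitute the fine-tuned checkpoint's actual Hub id. The RAG and Gemini stages are omitted.

```python
MAX_TOKENS = 150  # extraction budget from the training configuration


def smart_extract(text: str, max_tokens: int = MAX_TOKENS) -> str:
    """Keep whole sentences until the (whitespace) token budget is reached.

    A simplified stand-in for the card's smart-extraction step.
    """
    kept, used = [], 0
    for sentence in text.replace("\n", " ").split(". "):
        n = len(sentence.split())
        if used + n > max_tokens:
            break
        kept.append(sentence)
        used += n
    return ". ".join(kept)


def summarize(text: str, model_id: str = "facebook/bart-large-cnn") -> str:
    """Run the extract-then-summarize stage of the pipeline.

    model_id defaults to the base model; point it at the fine-tuned
    checkpoint for discharge-summary simplification.
    """
    from transformers import pipeline  # requires transformers + torch

    summarizer = pipeline("summarization", model=model_id)
    result = summarizer(smart_extract(text), max_length=142, min_length=30)
    return result[0]["summary_text"]
```

Extraction before summarization keeps the input within the budget the model was fine-tuned on, so long discharge notes are trimmed to their leading sentences rather than truncated mid-sentence by the tokenizer.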
  • Format: Safetensors
  • Model size: 0.4B params
  • Tensor type: F32