# Fine-tuned BART for Medical Discharge Summarization

## Model Description
BART-large-CNN fine-tuned on the MIMIC-III dataset, optimized for simplifying medical discharge summaries with a smart-extraction preprocessing step.

## Training Configuration
- Base Model: facebook/bart-large-cnn
- Optimization: Smart extraction (150 tokens max)
- Pipeline: Smart Extract → BART → RAG → Gemini