# Turkish Finance Summarizer

This is an abstractive summarization model for Turkish financial news, trained with a custom tokenizer and a custom training pipeline.
## Model Architecture
- Transformer encoder-decoder
- mBART-style architecture
- Trained from scratch
- No weights initialized from Meta/Facebook mBART checkpoints
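A minimal sketch of what "trained from scratch" means here: the architecture is built from a config, so all weights start randomly initialized instead of being loaded from a Meta/Facebook mBART checkpoint. The hyperparameter values below are illustrative assumptions, not this model's actual sizes.

```python
from transformers import MBartConfig, MBartForConditionalGeneration

# Illustrative, assumed hyperparameters -- not the model's actual configuration.
config = MBartConfig(
    vocab_size=32_000,          # assumed size of the custom Turkish tokenizer
    d_model=512,
    encoder_layers=6,
    decoder_layers=6,
    encoder_attention_heads=8,
    decoder_attention_heads=8,
)

# Constructing from a config (rather than from_pretrained) yields random
# weights, so nothing is inherited from the original mBART checkpoints.
model = MBartForConditionalGeneration(config)
print(f"{model.num_parameters():,} parameters")
```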
## Model Details
- Language: Turkish
- Task: Abstractive Summarization
- Domain: Finance / Economy / Markets
- Training: Custom pretraining + fine-tuning
## Usage

Load the model and tokenizer with Hugging Face Transformers and call `generate()` to produce an abstractive summary.
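The loading-and-generation steps above can be sketched as follows. Note that `your-org/turkish-finance-summarizer` is a placeholder repo id (the model card does not state the actual id), and the generation parameters are reasonable defaults, not tuned values.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer


def summarize(text: str, model_name: str = "your-org/turkish-finance-summarizer") -> str:
    """Summarize a Turkish financial news article with the model above."""
    # Load the custom tokenizer and the scratch-trained model weights.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    # Tokenize, truncating long articles to the encoder's input budget (assumed 1024).
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)

    # Beam search is a common default for abstractive summarization.
    summary_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)
```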
## Notes
- Optimized for Turkish financial news
- Suitable for research and production inference
- Trained from scratch on domain-specific data