Mol-LLaMA: Towards General Understanding of Molecules in Large Molecular Language Model
[Project Page] [Paper] [GitHub]
This repo contains the weights of Mol-LLaMA, including the LoRA weights and projectors, built with Llama: meta-llama/Llama-3.1-8B-Instruct. Llama 3.1 is licensed under the Llama 3.1 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.
Mol-LLaMA is trained on Mol-LLaMA-Instruct to learn the fundamental characteristics of molecules, along with reasoning ability and explainability.
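The full loading pipeline (including the molecular-encoder projectors) is defined in the authors' GitHub code. As a rough sketch, and assuming the LoRA adapter alone can be attached to the base model with Hugging Face `peft` — the repo id `ADAPTER` below is a placeholder, not the actual repository name — loading might look like:

```python
# Hypothetical sketch: attach the Mol-LLaMA LoRA adapter to the Llama 3.1 base
# model using Hugging Face transformers + peft. The projector weights for the
# molecular encoders are NOT handled here; they require the authors' own code.

BASE_MODEL = "meta-llama/Llama-3.1-8B-Instruct"  # base model named in this card
ADAPTER = "path/to/mol-llama-lora"               # placeholder repo id or local path

def load_mol_llama(base_model: str = BASE_MODEL, adapter: str = ADAPTER):
    """Load the base LLM and apply the LoRA adapter (downloads weights)."""
    # Imports are deferred so this sketch can be inspected without the
    # libraries installed; calling the function requires transformers and peft.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(base_model)
    model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype="auto")
    model = PeftModel.from_pretrained(model, adapter)  # apply the LoRA weights
    return tokenizer, model
```

For actual inference with molecular inputs, follow the instructions in the linked GitHub repository, since the 2D/3D encoders and projectors are wired up there.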
If you find our model useful, please consider citing our work.
@misc{kim2025molllama,
title={Mol-LLaMA: Towards General Understanding of Molecules in Large Molecular Language Model},
author={Dongki Kim and Wonbin Lee and Sung Ju Hwang},
year={2025},
eprint={2502.13449},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
We appreciate LLaMA, 3D-MoLM, MoleculeSTM, Uni-Mol, and SciBERT for their open-source contributions.
Base model
meta-llama/Llama-3.1-8B