---
license: cc-by-nc-4.0
datasets:
- NingLab/MuMOInstruct
language:
- en
base_model:
- meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: text-generation
tags:
- chemistry
- molecule optimization
library_name: transformers
---

## Model Sources


- **Repository:** https://github.com/ninglab/GeLLMO
- **Paper:** https://arxiv.org/abs/2502.13398

## Usage

For instructions on running the model, please refer to our repository.
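As a quick orientation, the model can be loaded through the standard `transformers` causal-LM interface. This is a minimal sketch only: the checkpoint ID below is a placeholder, and the prompt wording and generation settings are illustrative assumptions, not the prompt format used in the paper; see the repository for the actual instructions.

```python
# Hypothetical usage sketch. The model ID, prompt wording, and generation
# settings here are assumptions; consult https://github.com/ninglab/GeLLMO
# for the authors' actual instructions and prompt templates.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NingLab/GeLLMO"  # placeholder; substitute the released checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# An illustrative multi-property optimization instruction (SMILES for aspirin).
messages = [
    {
        "role": "user",
        "content": (
            "Modify the molecule CC(=O)Oc1ccccc1C(=O)O to improve its "
            "blood-brain barrier permeability while retaining high drug-likeness."
        ),
    }
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Since the base model is Llama-3.1-8B-Instruct, loading requires accepting the Llama license on the Hub and sufficient GPU memory (roughly 16 GB in bfloat16).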

## Bias, Risks, and Limitations
While our models are designed for research and drug discovery applications, 
they come with ethical and safety considerations:

1. **Potential for Misuse:** Although the model is not explicitly designed to generate toxic,
controlled, or harmful compounds, adversarial prompts or unintended biases in the pretrained model 
may lead to the generation of undesirable molecules.
2. **Unintended Harmful Outputs:** The model does not inherently filter out molecules with high toxicity, 
abuse potential, or environmental hazards. Users must implement additional safeguards to prevent misuse.
3. **Absence of Built-in Safety Mechanisms:** The model does not incorporate explicit regulatory or 
safety filters (e.g., toxicity or compliance checks). 
It is the responsibility of users to validate generated molecules for safety and ethical considerations.

We urge users to adopt best practices, including toxicity prediction pipelines, 
ethical oversight, and responsible AI usage policies, to prevent harmful applications of this model.

## Citation
```bibtex
@article{dey2025gellmo,
  title   = {GeLLMO: Generalizing Large Language Models for Multi-property Molecule Optimization},
  author  = {Vishal Dey and Xiao Hu and Xia Ning},
  journal = {arXiv preprint arXiv:2502.13398},
  year    = {2025},
  url     = {https://arxiv.org/abs/2502.13398},
}
```