## Example Usage
```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

torch_device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

tokenizer = AutoTokenizer.from_pretrained("laituan245/molt5-large", model_max_length=512)
model = T5ForConditionalGeneration.from_pretrained("MantasV/Procedure_molt5-large")
model.config.max_length = 512
model.to(torch_device)

# The reaction is given in RDKit canonical SMILES: reactants are separated by a bar (|)
# and the product follows '>>'.
reaction = 'Clc1nc(Cl)c2c(n1)CSC2|C1COCCN1>>Clc1nc2c(c(N3CCOCC3)n1)SCC2'
input_enc = tokenizer(reaction, padding=True, truncation=True, return_tensors='pt').to(torch_device)

output = model.generate(**input_enc, max_new_tokens=512, num_beams=3, early_stopping=True)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
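If you start from raw (non-canonical) SMILES, a minimal sketch of one way to assemble the model input is shown below. It assumes the `reactant|reactant>>product` layout of the example above; the `build_reaction_input` helper is hypothetical and simply uses RDKit's default canonicalization.

```python
from rdkit import Chem

def build_reaction_input(reactant_smiles, product_smiles):
    """Hypothetical helper: canonicalize SMILES with RDKit and join them
    into the 'r1|r2>>product' layout used in the example above."""
    canon = lambda s: Chem.MolToSmiles(Chem.MolFromSmiles(s))
    return '|'.join(canon(s) for s in reactant_smiles) + '>>' + canon(product_smiles)

reaction = build_reaction_input(
    ['Clc1nc(Cl)c2c(n1)CSC2', 'C1COCCN1'],
    'Clc1nc2c(c(N3CCOCC3)n1)SCC2',
)
```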
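Since the example already uses beam search (`num_beams=3`), you can also ask `generate` for several candidate procedures at once via `num_return_sequences`. The snippet below is an illustrative sketch, not part of the original card.

```python
# Sketch: return all three beam hypotheses instead of only the best one.
outputs = model.generate(
    **input_enc,
    max_new_tokens=512,
    num_beams=3,
    num_return_sequences=3,
    early_stopping=True,
)
for i, candidate in enumerate(tokenizer.batch_decode(outputs, skip_special_tokens=True), start=1):
    print(f"Candidate {i}: {candidate}")
```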
The publication is available online: "Language Models for Predicting Organic Synthesis Procedures" by M. Vaškevičius and J. Kapočiūtė-Dzikienė. All other models and the data are available in the GitHub repository.
Base model: laituan245/molt5-large-smiles2caption