Mol ID

A transformer encoder model pretrained on 50M ZINC SMILES strings using FlashAttention 2.
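
As a rough illustration of the attention call inside such an encoder, here is a minimal FlashAttention 2 sketch; the tensor shapes, head count, and dimensions are arbitrary placeholders, not the actual model's configuration:

```python
import torch
from flash_attn import flash_attn_func

# FlashAttention 2 takes (batch, seqlen, nheads, headdim) tensors
# and requires fp16 or bf16 inputs on a supported GPU.
q = torch.randn(8, 128, 12, 64, dtype=torch.bfloat16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# causal=False: bidirectional attention, as in a transformer encoder.
out = flash_attn_func(q, k, v, causal=False)  # (8, 128, 12, 64)
```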

Hardware:

  • a GPU that supports FlashAttention 2 and bf16 (e.g. NVIDIA Ampere or newer)

Software:

  • FlashAttention 2
  • PyTorch Lightning for mixed-precision training (bf16-mixed); see the sketch after this list
  • Weights & Biases (wandb) for logging
  • Hugging Face
    • tokenizers
    • datasets
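
A minimal sketch of how these pieces fit together, assuming a Lightning module wraps the encoder; the module, data, and project name below are illustrative stand-ins, not names from the repo:

```python
import torch
import lightning as L
from lightning.pytorch.loggers import WandbLogger

class ToyPretrainModule(L.LightningModule):
    # Hypothetical stand-in for the actual pretraining module.
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 32)

    def training_step(self, batch, batch_idx):
        loss = torch.nn.functional.mse_loss(self.layer(batch), batch)
        self.log("train/loss", loss)  # forwarded to wandb by the logger
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-4)

trainer = L.Trainer(
    accelerator="gpu",
    precision="bf16-mixed",                # mixed precision, as listed above
    logger=WandbLogger(project="mol-id"),  # project name is illustrative
    max_steps=10,
)
# Random tensors stand in for tokenized SMILES batches.
loader = torch.utils.data.DataLoader(torch.randn(256, 32), batch_size=64)
trainer.fit(ToyPretrainModule(), loader)
```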

GitHub repo: link
