---
datasets:
  - qiaojin/PubMedQA
  - kroshan/BioASQ
language:
  - en
library_name: transformers
pipeline_tag: table-question-answering
tags:
  - chemistry
  - biology
  - molecular
  - synthetic
  - language model
---

This model is an example of how a fine-tuned LLM, even without the full depth, size, and complexity of larger and more expensive models, can be useful in context-sensitive situations. In our use case, we apply this LLM as part of a broader electronic lab notebook platform for molecular and computational biologists. This GPT-2 model has been fine-tuned on datasets from BioASQ and PubMedQA, giving it enough biochemistry knowledge to assist scientists. It integrates not only as a copilot-like tool but also as a lab partner in the Design-Build-Test-Learn workflow that is growing in prominence in synthetic biology.
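
Since the card does not list the checkpoint's repository id, the sketch below loads the base `gpt2` model with the `transformers` library as a stand-in; substitute the actual fine-tuned repo id to reproduce the biochemistry-QA behavior described above.

```python
# Minimal usage sketch for a fine-tuned GPT-2 checkpoint via transformers.
# "gpt2" below is a placeholder -- replace it with this model's repo id.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "gpt2"  # stand-in for the fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Frame a biomedical question in the QA style of PubMedQA/BioASQ prompts.
prompt = "Question: What role does ATP play in cellular metabolism?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the output deterministic for a quick smoke test.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

For integration into an electronic lab notebook, the same calls would sit behind whatever request handler the notebook software exposes; only the prompt construction changes.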