BioGPT: generative pre-trained transformer for biomedical text generation and mining

  • Existing biomedical models like BioBERT and PubMedBERT, based on BERT and its variants, excel in discriminative tasks but lack generative capabilities.

  • This paper introduces BioGPT, a domain-specific generative Transformer language model (GPT-2 backbone) pre-trained from scratch on large-scale PubMed abstracts, which outperforms previous models on a range of biomedical natural language processing tasks.

  • BioGPT sets new state-of-the-art F1 scores on the BC5CDR, KD-DTI, and DDI end-to-end relation extraction tasks, and reaches a record 78.2% accuracy on PubMedQA. It also generates fluent, informative descriptions of biomedical terms.

    https://academic.oup.com/bib/article/23/6/bbac409/6713511?guestAccessKey=a66d9b5d-4f83-4017-bb52-405815c907b9&login=true
