Existing biomedical models like BioBERT and PubMedBERT, based on BERT and its variants, excel in discriminative tasks but lack generative capabilities.
This paper introduces BioGPT, a domain-specific generative Transformer language model trained on biomedical literature, which outperforms previous models on various biomedical natural language processing tasks.
BioGPT achieves 44.98%, 38.42% and 40.76% F1 score on the BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and sets a new record of 78.2% accuracy on PubMedQA. It also generates fluent descriptions for biomedical terms.
BioGPT: generative pre-trained transformer for biomedical text generation and mining