Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical-drug relation extraction?

Database (Oxford) 2023-02-15

Summary:

Collecting relations between chemicals and drugs is crucial in biomedical research. Pre-trained transformer models such as Bidirectional Encoder Representations from Transformers (BERT) are shown to have limitations on biomedical texts; more specifically, the lack of annotated data makes relation extraction (RE) from biomedical texts very challenging. In this paper, we hypothesize that enriching a pre-trained transformer model with syntactic information may help improve its performance on...

Link:

https://pubmed.ncbi.nlm.nih.gov/36006843/?utm_source=Other&utm_medium=rss&utm_campaign=journals&utm_content=101517697&fc=None&ff=20230215001918&v=2.17.9.post6+86293ac

From feeds:

📚BioDBS Bibliography » Database (Oxford)

Authors:

Anfu Tang, Louise Deléger, Robert Bossy, Pierre Zweigenbaum, Claire Nédellec

Date tagged:

02/15/2023, 00:19

Date published:

08/25/2022, 06:00