HIGHLIGHTS
- who: Yoojoong Kim and colleagues from The Catholic University of Korea published the paper "A pre-trained BERT for Korean medical natural language processing" in Scientific Reports (December 31, 2020)
- what: Through NLP experiments, the authors evaluated the language-understanding capability of KM-BERT and compared its performance with that of other language models. The study demonstrates the feasibility of domain-specific pre-training on top of an already pre-trained, language-specific model, and shows that this continued pre-training improves the model's language understanding.
- how: The . . .