A pre-trained BERT for Korean medical natural language processing

HIGHLIGHTS

  • who: Yoojoong Kim from The Catholic University of Korea and colleagues have published the paper: A pre-trained BERT for Korean medical natural language processing, in the journal Scientific Reports (December 31, 2020)
  • what: Through NLP experiments, the authors evaluated the language understanding capability of KM-BERT and compared its performance with that of other language models. This study shows the feasibility of domain-specific pre-training on top of a pre-trained language-specific model (see the sketch after this list). It also demonstrates that the model's language understanding improves when this pre-training approach is used.
  • how: The …

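For readers who want to experiment with the idea, the snippet below is a minimal sketch of domain-adaptive pre-training in the spirit of KM-BERT: a general-purpose Korean BERT checkpoint is further pre-trained on raw medical text with the masked language modeling objective, using the Hugging Face transformers library. The base checkpoint klue/bert-base and the corpus file korean_medical_corpus.txt are illustrative stand-ins, not the paper's actual resources; the paper's base model, corpus, and training objectives may differ.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Start from a general-purpose Korean BERT checkpoint.
# ("klue/bert-base" is a stand-in; the paper's actual base model may differ.)
base = "klue/bert-base"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

# "korean_medical_corpus.txt" is a hypothetical file of raw Korean medical text,
# one passage per line.
dataset = load_dataset("text", data_files={"train": "korean_medical_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking of 15% of tokens: the standard BERT masked-LM objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="km-bert-sketch",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

Starting from a language-specific checkpoint rather than a multilingual one reflects the design choice the summary highlights: the vocabulary and weights already capture Korean, so continued pre-training only has to absorb the medical sub-domain.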
     
