HIGHLIGHTS
SUMMARY
The model was pre-trained in a self-supervised way on full-text paragraphs from 2 million papers drawn at random from the database, i.e., by predicting masked words from their surrounding context. After pre-training the BERT model, the authors fine-tuned the paragraph classifier on 7,292 paragraphs labeled as "solid-state synthesis", "sol-gel precursor synthesis", "hydrothermal synthesis", "precipitation synthesis", or "none of the above". An extracted solution-based synthesis procedure includes the precursors and target materials, their quantities, and the synthesis actions with their attributes, properly sequenced. A bi-directional . . .
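The masked-word pre-training objective described above can be sketched in a few lines. This is an illustrative toy, not the authors' pipeline: `mask_tokens` is a hypothetical helper, and the 15% masking probability is BERT's standard default rather than a value stated in the summary.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Replace a random fraction of tokens with [MASK].

    Returns the masked sequence plus a position -> original-token map;
    during pre-training the model is trained to recover those originals
    from the surrounding context.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok  # original token becomes the prediction target
            masked.append("[MASK]")
        else:
            masked.append(tok)
    return masked, targets

# A synthesis-paragraph-like sentence (made up for illustration)
paragraph = "the precursor powders were calcined at 900 C for 12 h in air".split()
masked, targets = mask_tokens(paragraph)
```

In the real system this objective is applied at scale to paragraphs from the 2 million papers; the resulting language model is then fine-tuned on the 7,292 labeled paragraphs for the five-way classification task.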