A Multi-Attention Approach Using BERT and Stacked Bidirectional LSTM for Improved Dialogue State Tracking

HIGHLIGHTS

  • who: Muhammad Asif Khan and collaborators from the School of Computer Science and Engineering, Southeast University, Nanjing, China, have published the article "A Multi-Attention Approach Using BERT and Stacked Bidirectional LSTM for Improved Dialogue State Tracking".
  • what: The authors propose a model that exploits the sequential and overall features encoded in BERT to improve the performance of neural networks for dialogue state tracking. The models for the joint goals were applied to four informable slots (price range, food, area, name) separately. The input of the model consists of 160 features extracted from the feature functions . . . A minimal sketch of this kind of architecture is given after this list.
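
The authors' exact implementation is not reproduced here; the following is a minimal sketch, assuming a PyTorch / Hugging Face Transformers setup, of how BERT's token-level (sequential) and pooled (overall) features can feed a stacked bidirectional LSTM with an attention layer for per-slot classification. The class name, hidden sizes, and the single additive attention head are illustrative assumptions, not the paper's multi-attention design.

```python
# Illustrative sketch only: combines BERT token features with a stacked
# bidirectional LSTM and a simple additive attention head for slot-value
# classification. Names and hyperparameters are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertBiLstmSlotTracker(nn.Module):
    def __init__(self, num_slot_values, lstm_hidden=256, lstm_layers=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Stacked bidirectional LSTM over BERT's sequential (token) features.
        self.bilstm = nn.LSTM(
            input_size=self.bert.config.hidden_size,
            hidden_size=lstm_hidden,
            num_layers=lstm_layers,
            bidirectional=True,
            batch_first=True,
        )
        # Additive attention that scores each token representation.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        # Fuse the attended LSTM summary with BERT's pooled ([CLS]) output.
        self.classifier = nn.Linear(
            2 * lstm_hidden + self.bert.config.hidden_size, num_slot_values
        )

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        token_feats = outputs.last_hidden_state   # (batch, seq_len, 768)
        pooled = outputs.pooler_output            # (batch, 768)

        lstm_out, _ = self.bilstm(token_feats)    # (batch, seq_len, 2*hidden)

        # Mask padding positions before normalising the attention scores.
        scores = self.attn(lstm_out).squeeze(-1)  # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * lstm_out).sum(dim=1)  # (batch, 2*hidden)

        # Slot-value distribution from the fused sequential + pooled features.
        return self.classifier(torch.cat([context, pooled], dim=-1))


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertBiLstmSlotTracker(num_slot_values=10)
    enc = tokenizer("I want a cheap restaurant in the north", return_tensors="pt")
    logits = model(enc["input_ids"], enc["attention_mask"])
    print(logits.shape)  # torch.Size([1, 10])
```

In a setup like this, one such classifier would be trained separately for each informable slot (price range, food, area, name), mirroring the per-slot treatment described in the highlights above.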

     
