Binarized neural network with parameterized weight clipping and quantization gap minimization for online knowledge distillation

HIGHLIGHTS

  • who: JU YEON KANG and collaborators from the Department of Electrical and Computer Engineering, Sungkyunkwan University, have published the paper "Binarized Neural Network With Parameterized Weight Clipping and Quantization Gap Minimization for Online Knowledge Distillation" in the Journal: (JOURNAL)
  • what: In this study, two performance improvements are proposed for online KD when a BNN is applied as a student network: parameterized weight clipping (PWC), which reduces dead weights in the network, and quantization-gap-aware adaptive temperature scheduling between the teacher and student networks (illustrative sketches of both follow this list). Next, the authors propose a method to alleviate the capacity shortage of binarized student networks caused by . . .
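
The first idea, parameterized weight clipping, can be pictured as a learnable clipping level applied to the latent full-precision weights before binarization: when the clip level is fixed, weights pushed beyond it receive no gradient under the straight-through estimator and become "dead". The sketch below is a minimal PyTorch illustration of that idea, not the paper's implementation; the module name, the initial value of alpha, and the use of a single scalar clip level per layer are assumptions.

```python
import torch
import torch.nn as nn

class ParameterizedWeightClip(nn.Module):
    """Clip latent weights to a learnable range [-alpha, alpha] before
    sign binarization (illustrative sketch). A learnable alpha lets
    gradients keep flowing to weights that a fixed clip would leave
    permanently saturated ("dead")."""

    def __init__(self, init_alpha: float = 1.0):
        super().__init__()
        # One scalar clip level per layer (an assumption of this sketch).
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, w: torch.Tensor) -> torch.Tensor:
        alpha = self.alpha.abs()                 # keep the clip level positive
        w_clipped = torch.clamp(w, -alpha, alpha)
        # Sign binarization with a straight-through estimator:
        # the forward pass uses sign(w), the backward pass sees the
        # clipped identity, so gradients reach both w and alpha.
        w_bin = torch.sign(w_clipped)
        return w_clipped + (w_bin - w_clipped).detach()
```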

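The second idea ties the distillation temperature to the quantization gap between teacher and student. The exact schedule in the paper is not reproduced here; the sketch below assumes a simple proxy (mean absolute logit difference) for the gap and a hypothetical linear temperature schedule, so the function name and the `gap_scale` parameter are illustrative only.

```python
import torch
import torch.nn.functional as F

def gap_aware_kd_loss(student_logits, teacher_logits,
                      base_temp: float = 4.0, gap_scale: float = 1.0):
    """Online KD loss with a gap-aware temperature (illustrative sketch).

    The quantization gap is approximated by the mean absolute difference
    between teacher and student logits; a larger gap raises the
    temperature, softening the targets so the lower-capacity binarized
    student can follow them. The paper's actual schedule may differ."""
    with torch.no_grad():
        gap = (teacher_logits - student_logits).abs().mean()
        temp = base_temp * (1.0 + gap_scale * gap)   # hypothetical schedule
    p_teacher = F.softmax(teacher_logits / temp, dim=1)
    log_p_student = F.log_softmax(student_logits / temp, dim=1)
    # Temperature-scaled KL divergence, rescaled by temp**2 as is standard
    # in knowledge distillation (Hinton et al.).
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temp ** 2
```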
     
