An ASIP for Neural Network Inference on Embedded Devices with 99% PE Utilization and 100% Memory Hidden under Low Silicon Cost

HIGHLIGHTS

  • who: Muxuan Gao et al., from the School of Information and Electronics, Beijing Institute of Technology, Beijing, China, published the research "An ASIP for Neural Network Inference on Embedded Devices with 99% PE Utilization and 100% Memory Hidden under Low Silicon Cost" in the journal Sensors (2022, 3841).
  • what: This work focused on low-power and low-latency data access with minimized . . . The research was implemented as an ASIP (application-specific instruction set processor) in which the ISA was based on the Caffe2 inference operators and the hardware design was based . . .
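The "100% memory hidden" claim in the title refers to overlapping memory accesses entirely with PE computation. The paper's actual mechanism is not described in this summary, but a common way to achieve such overlap is double-buffered prefetching: while the processing elements compute on the current data tile, the next tile is loaded in parallel. The sketch below is purely illustrative and assumes nothing about the authors' design; all names (`run_tiles`, `load`, `compute`) are hypothetical.

```python
# Illustrative sketch, NOT the paper's implementation: double-buffered
# prefetching hides load latency behind compute by starting the next
# tile's load before working on the current one.
import threading

def run_tiles(tiles, load, compute):
    """Process tiles while prefetching the next tile on a worker thread."""
    results = []
    buf = load(tiles[0])  # fill the first buffer up front
    for i in range(len(tiles)):
        nxt = {}
        worker = None
        if i + 1 < len(tiles):
            # Start loading the next tile; this overlaps the compute below,
            # mimicking how a PE array can be kept busy while memory moves.
            worker = threading.Thread(
                target=lambda j=i + 1: nxt.setdefault("data", load(tiles[j]))
            )
            worker.start()
        results.append(compute(buf))  # "PE" work on the current buffer
        if worker is not None:
            worker.join()             # prefetch finished; swap buffers
            buf = nxt["data"]
    return results

# Toy usage: "load" is the identity, "compute" sums a tile.
out = run_tiles([[1, 2], [3, 4]], load=lambda t: t, compute=sum)
```

When the per-tile load time is at most the per-tile compute time, every load is fully hidden and the compute units never stall, which is the effect the title's "100% memory hidden" describes.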

     
