HIGHLIGHTS
- What: The authors propose a hybrid mutual learning knowledge distillation (HMLKD) method to address the low accuracy of a pre-trained OCR model at recognizing text characters in customs scenarios. They also construct a customs knowledge graph (CDKG) from CUAD and, building on it, propose an integrated CDKG-based post-OCR correction method (iCDKG-PostOCR). The effectiveness of both HMLKD and iCDKG-PostOCR was validated on this model, and the advantages and disadvantages of the methods were analyzed.
- Who: Fengchun . . .
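The highlights name mutual learning knowledge distillation but do not describe it. As background, in mutual learning two networks train simultaneously, each adding a KL-divergence term that pulls its softened predictions toward its peer's. A minimal sketch of those mimicry terms (plain Python, not the authors' HMLKD implementation; function names and the temperature parameter are illustrative assumptions):

```python
import math

def softmax(logits, t=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(x / t) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q):
    """KL(p || q) for two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_learning_losses(logits_a, logits_b, t=2.0):
    """Mimicry loss terms for two peer networks A and B.

    Each network's total loss would add its term to the usual
    cross-entropy with the ground-truth labels.
    """
    pa, pb = softmax(logits_a, t), softmax(logits_b, t)
    return kl_div(pb, pa), kl_div(pa, pb)  # term for A, term for B
```

When both networks agree, the mimicry terms vanish; the larger the disagreement, the stronger each network is pulled toward its peer.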
