
Research driven OCR for paychecks



EVALUATION OF GENERATIVE AI Q&A CHATBOT CHAINED TO OPTICAL CHARACTER RECOGNITION MODELS FOR FINANCIAL DOCUMENTS

YU, QIU

College of Professional Studies, Northeastern University, qiu.yu1@northeastern.edu

VENKATA DUVVURI

SiriusMindShare LLC., venkata@siriusmindshare.com

PRATIBHA YADAVALLI

ElemenTek LLC., pyadavalli@elmtk.com

NEAL PRASAD

ElemenTek LLC., nprasad@elmtk.com

Financial statements are cornerstones of many analyses, from loan applications at banks to evidence collection and review at legal firms, and they exert a significant influence on the decisions of these institutions. Streamlining the processing of these statements, whether digital or hard copy, is therefore a pivotal objective for banks and similar firms. This research explores the integration of Optical Character Recognition (OCR) and generative AI to automate the extraction of crucial financial data from bank statement images. Furthermore, we design an architecture that enables generic analysis across multiple types of financial documents by employing a classification model tailored to categorize bank statement documents, facilitating seamless data preparation for subsequent analysis or model training. Emphasizing precision and efficiency, we investigate OCR model architectures designed specifically to enhance text-extraction accuracy from low-resolution bank statement images. The study evaluates two OCR model architectures; the FSRCNN-based model performs best, achieving above 93% OCR accuracy. Additionally, we analyze a generative AI-based Q&A chatbot that simplifies analysis for novice users.
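The chained architecture the abstract describes (document classification, FSRCNN-style super-resolution of low-resolution scans, OCR, then a generative Q&A step over the extracted text) can be outlined as a simple pipeline. The sketch below is purely illustrative: every function name and body is a stand-in stub, not the paper's implementation; a real system would substitute the trained classifier, an FSRCNN upscaler, the OCR model, and an LLM-backed Q&A component.

```python
# Illustrative sketch of the chained pipeline from the abstract.
# All stages are hypothetical stubs standing in for trained models.

def classify_document(image_bytes: bytes) -> str:
    """Stub classifier: routes each document so the right extractor runs."""
    return "bank_statement"  # hypothetical label

def super_resolve(image_bytes: bytes) -> bytes:
    """Stub for FSRCNN-style upscaling of a low-resolution scan."""
    return image_bytes  # a real FSRCNN model would return an enhanced image

def run_ocr(image_bytes: bytes) -> str:
    """Stub OCR stage: would return text extracted from the image."""
    return "Account 1234 Balance 5,000.00"  # hypothetical OCR output

def answer_question(context: str, question: str) -> str:
    """Stub Q&A stage: a generative model would answer over the OCR text."""
    if "balance" in question.lower():
        # Naive lookup: return the first numeric-looking token.
        return next(t for t in context.split() if "," in t or "." in t)
    return "unknown"

def pipeline(image_bytes: bytes, question: str) -> dict:
    """Chain the four stages: classify, enhance, extract, answer."""
    doc_type = classify_document(image_bytes)
    enhanced = super_resolve(image_bytes)
    text = run_ocr(enhanced)
    return {"type": doc_type, "answer": answer_question(text, question)}
```

For example, `pipeline(scan_bytes, "What is the balance?")` would classify the scan, enhance it, run OCR, and answer from the extracted text.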


Cite as: EVALUATION OF GENERATIVE AI Q&A CHATBOT CHAINED TO OPTICAL CHARACTER RECOGNITION MODELS FOR FINANCIAL DOCUMENTS, https://doi.org/10.1145/3647750.3647766, ICMLSC 2024: The 8th International Conference on Machine Learning and Soft Computing, Singapore, January 2024.

