Abstract
In the domain of handwritten character recognition, inpainting occluded offline characters is essential. Building on the remarkable achievements of transformers across a wide range of tasks, we present a novel framework called “Enhanced Inpainting with Multi-head Attention and stacked long short-term memory (LSTM) Network” (E-Inpaint). This framework restores occluded offline handwriting while recovering its online signal counterpart, enriched with dynamic characteristics. The proposed approach employs a Convolutional Neural Network (CNN) and a Multi-Layer Perceptron (MLP) to extract essential hidden features from the handwriting image. These features are then decoded by a stacked LSTM with multi-head attention, which performs the inpainting and generates the online signal corresponding to the uncorrupted version. To validate our work, we evaluate with the Beta-GRU recognition system on Latin, Indian, and Arabic On/Off dual datasets. The results demonstrate the effectiveness of the stacked-LSTM network with multi-head attention, which enhances the quality of the restored images and significantly improves the recognition rate achieved by the innovative Beta-GRU system. Overall, our research highlights the potential of E-Inpaint to enhance handwritten character recognition systems.
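
As a rough illustration of the pipeline described in the abstract, the following is a minimal PyTorch-style sketch of an encoder-decoder of this kind: a CNN/MLP encoder that extracts hidden features from the occluded character image, and a stacked LSTM decoder that attends to those features via multi-head attention while emitting an online trajectory. All layer sizes, module names, and the two-layer depth are illustrative assumptions and do not reproduce the authors' E-Inpaint implementation.

```python
# Illustrative sketch only: layer sizes, module names, and wiring are assumptions,
# not the authors' E-Inpaint architecture.
import torch
import torch.nn as nn


class ImageEncoder(nn.Module):
    """CNN + MLP mapping an occluded offline character image to a sequence of hidden features."""

    def __init__(self, feat_dim=256):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.mlp = nn.Sequential(nn.Linear(64, feat_dim), nn.ReLU(),
                                 nn.Linear(feat_dim, feat_dim))

    def forward(self, img):                      # img: (B, 1, H, W)
        fmap = self.cnn(img)                     # (B, 64, H/4, W/4)
        seq = fmap.flatten(2).transpose(1, 2)    # spatial positions as a sequence: (B, L, 64)
        return self.mlp(seq)                     # (B, L, feat_dim)


class OnlineDecoder(nn.Module):
    """Stacked LSTM with multi-head attention over encoder features; emits (x, y, pen) points."""

    def __init__(self, feat_dim=256, hidden=256, heads=4, out_dim=3):
        super().__init__()
        self.lstm = nn.LSTM(out_dim, hidden, num_layers=2, batch_first=True)  # stacked (2-layer) LSTM
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.kv_proj = nn.Linear(feat_dim, hidden)
        self.out = nn.Linear(hidden, out_dim)

    def forward(self, prev_points, enc_feats):   # prev_points: (B, T, 3), enc_feats: (B, L, feat_dim)
        h, _ = self.lstm(prev_points)            # (B, T, hidden)
        kv = self.kv_proj(enc_feats)
        ctx, _ = self.attn(h, kv, kv)            # each decoding step attends to the image features
        return self.out(h + ctx)                 # predicted online points (x, y, pen state)


if __name__ == "__main__":
    enc, dec = ImageEncoder(), OnlineDecoder()
    img = torch.randn(4, 1, 64, 64)              # batch of occluded character images
    prev = torch.zeros(4, 50, 3)                 # teacher-forced previous trajectory points
    points = dec(prev, enc(img))
    print(points.shape)                          # torch.Size([4, 50, 3])
```

In this sketch, the multi-head attention lets every decoding step consult the full set of spatial image features rather than a single pooled vector, which is the role the abstract attributes to attention in the stacked-LSTM decoder.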
| Original language | English |
|---|---|
| Article number | 6 |
| Journal | Cognitive Computation |
| Volume | 17 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Feb 2025 |
Keywords
- Attention mechanism
- Inpainting
- LSTM
- Occluded offline handwriting
- Transformer
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition
- Computer Science Applications
- Cognitive Neuroscience