Neural network structure modeling: an application to font recognition

Date

1988-12

Publisher

Texas Tech University

Abstract

Two neural network models, Model H-H1 (Hogg and Huberman, 1984) and Model H-H2 (Hogg and Huberman, 1985), were successfully applied to the font recognition problem and used to recognize 26 English capital letters, each represented in six fonts. Recognition rate, memory space requirement, learning speed, and recognition speed were used to measure the models' performance. Model parameters such as memory array size, Smin–Smax, and Mmin–Mmax were varied to elucidate the models' behavior.
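
As a minimal sketch of the experimental setup described above (the internal dynamics of the Hogg-Huberman models are not reproduced here), the 26 letters in six fonts could be organized as labeled bitmap samples, with recognition rate computed as the fraction of samples whose recovered label matches the true letter. The grid size and the dummy classifier below are illustrative assumptions, not the thesis code.

import numpy as np

# Hypothetical layout: 26 capital letters x 6 fonts, each rendered as a
# binary pixel array (the 16x16 size is assumed for illustration only).
LETTERS = [chr(ord('A') + i) for i in range(26)]
FONTS = 6
GRID = (16, 16)

def recognition_rate(classify, samples):
    """Fraction of (bitmap, true_letter) pairs the classifier labels correctly.

    `classify` stands in for whichever trained model (H-H1 or H-H2)
    maps an input bitmap to a letter label.
    """
    correct = sum(1 for bitmap, letter in samples if classify(bitmap) == letter)
    return correct / len(samples)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Random placeholder bitmaps; real runs would use the six font renderings.
    samples = [(rng.integers(0, 2, GRID), letter)
               for letter in LETTERS for _ in range(FONTS)]
    print(recognition_rate(lambda b: "A", samples))  # about 1/26 for this dummy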

As a result, both models achieved a 100% recognition rate when all six fonts were used as both the training set and the recognition set. When three of the six fonts were used for training, Model H-H1 achieved a maximum recognition rate of 87.82% and Model H-H2 achieved a maximum recognition rate of 89.10%, showing that basins of attraction existed for the letters in most of the font presentations. Model H-H2 significantly outperformed Model H-H1 in recognition rate, memory space usage, and learning speed when all six fonts were used as the training set, a result supported by a paired t-test.
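
The paired t-test comparison mentioned above could be carried out along the following lines; the per-condition recognition rates shown are placeholders, not data from the thesis.

from scipy import stats

# Placeholder paired recognition rates for the two models; in the study these
# would be measurements collected under matching experimental conditions.
rates_hh1 = [0.85, 0.87, 0.84, 0.88, 0.86]
rates_hh2 = [0.88, 0.89, 0.87, 0.90, 0.89]

# Paired t-test: each H-H1 rate is matched with the H-H2 rate from the
# same condition, testing whether the mean difference is significant.
t_stat, p_value = stats.ttest_rel(rates_hh1, rates_hh2)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")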
