Decision-level fusion for single-view gait recognition with various carrying and clothing conditions
Al-Tayyan A., Shanableh T.
Published in: Image and Vision Computing (Elsevier Ltd)
Year: 2017
Volume: 61
Pages: 54–69
Abstract
Gait recognition is an attractive recent biometric technique, owing to its potential for identifying individuals at a distance, unobtrusively, and even from low-resolution images. In this paper we focus on single lateral-view gait recognition under various carrying and clothing conditions. Such a system is needed in access-control applications where a single view is imposed by the system setup. The gait data is first processed using three gait representation methods as feature sources: the Accumulated Prediction Image (API) and two new gait representations, namely the Accumulated Flow Image (AFI) and the Edge-Masked Active Energy Image (EMAEI). Second, each of these representations is tested with three matching classification schemes: image projection with Linear Discriminant Functions (LDF), Multilinear Principal Component Analysis (MPCA) with a K-Nearest Neighbor (KNN) classifier, and MPCA plus Linear Discriminant Analysis (MPCA + LDA) with a KNN classifier. Gait samples are fed into the MPCA and MPCA + LDA algorithms using a novel tensor-based form of the gait images. This arrangement results in nine recognition sub-systems. Decisions from the nine classifiers are fused using a decision-level (majority-voting) scheme, and a comparison between unweighted and weighted voting schemes is also presented. The methods are evaluated on the CASIA B Dataset using four different experimental setups, and on the OU-ISIR Dataset B using two different setups. The experimental results show that the classification accuracy of the proposed methods is encouraging and outperforms several state-of-the-art gait recognition approaches reported in the literature. © 2017 Elsevier B.V.
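The decision-level fusion described above can be sketched as follows. This is a minimal illustrative sketch of (optionally weighted) majority voting over per-classifier decisions, not the paper's implementation; the subject labels, the weight values, and the `majority_vote` helper are assumptions made for the example.

```python
from collections import Counter

def majority_vote(predictions, weights=None):
    """Fuse per-classifier label predictions by (optionally weighted) voting.

    predictions: list of class labels, one per classifier.
    weights: optional per-classifier vote weights (e.g. validation
             accuracies); None gives unweighted majority voting.
    """
    if weights is None:
        weights = [1.0] * len(predictions)
    tally = Counter()
    for label, w in zip(predictions, weights):
        tally[label] += w
    # The label with the highest total vote weight wins.
    return tally.most_common(1)[0][0]

# Nine sub-system decisions (3 gait representations x 3 classifiers);
# the subject IDs below are purely illustrative.
decisions = ["s12", "s12", "s07", "s12", "s03", "s12", "s07", "s12", "s12"]
print(majority_vote(decisions))  # unweighted vote

# Hypothetical per-classifier weights for the weighted-voting variant.
weights = [0.9, 0.8, 0.6, 0.85, 0.5, 0.9, 0.6, 0.8, 0.9]
print(majority_vote(decisions, weights))
```

In the weighted variant, each sub-system's vote counts in proportion to its weight, so a less reliable classifier contributes less to the fused decision.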
About the journal
Journal: Image and Vision Computing
Publisher: Elsevier Ltd
ISSN: 0262-8856
Open Access: No