Abstract
When using AdaBoost to select discriminant features from some feature space (e.g. Gabor feature space) for face recognition, a cascade structure is usually adopted to leverage the asymmetry in the distribution of positive and negative samples. Each node in the cascade structure is a classifier trained by AdaBoost with an asymmetric learning goal: a high recognition rate but only a moderately low false positive rate. One limitation of AdaBoost arises in the context of skewed example distributions and cascade classifiers: AdaBoost minimizes the classification error, which is not guaranteed to achieve the asymmetric node learning goal. In this paper, we propose to use asymmetric AdaBoost (AsymBoost) as a mechanism to address the asymmetric node learning goal. Moreover, we decouple feature selection from the formation of the ensemble classifier, two steps that occur simultaneously in AsymBoost and AdaBoost. Fisher Linear Discriminant Analysis (FLDA) is applied to the selected features to learn a linear discriminant function that maximizes the separability of the data among the different classes, which we believe can improve recognition performance. The proposed algorithm is demonstrated on face recognition using a Gabor-based representation on the FERET database. Experimental results show that the proposed algorithm yields better recognition performance than AdaBoost itself.
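The abstract describes a two-stage pipeline: an asymmetric boosting pass selects discriminant features, and FLDA is then fit on only those features to form the final linear discriminant. The following is a minimal NumPy sketch of that decoupling, not the paper's exact procedure: the decision-stump weak learner, the positive-class up-weighting schedule (following the Viola-Jones AsymBoost idea), and all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def asym_boost_select(X, y, n_rounds, k=2.0):
    """Select feature indices with AsymBoost-style rounds (sketch).

    y must be in {-1, +1}. Before each round, positive-sample weights
    are multiplied by k**(1 / (2 * n_rounds)), spreading an overall
    asymmetry factor k across the rounds (assumed schedule). The weak
    learner is a decision stump thresholded at the column mean.
    """
    n, d = X.shape
    w = np.ones(n) / n
    selected = []
    for _ in range(n_rounds):
        # Asymmetric pre-weighting: gently up-weight positives each round.
        w = w * np.exp(y * np.log(k) / (2 * n_rounds))
        w /= w.sum()
        best = None
        for j in range(d):
            thr = X[:, j].mean()
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] > thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # standard AdaBoost reweighting
        w /= w.sum()
        selected.append(j)
    return sorted(set(selected))

def flda_direction(X, y):
    """Two-class FLDA direction: w = Sw^{-1} (mu_pos - mu_neg)."""
    Xp, Xn = X[y == 1], X[y == -1]
    mp, mn = Xp.mean(axis=0), Xn.mean(axis=0)
    # Within-class scatter = N * biased covariance, summed over classes.
    Sw = (np.atleast_2d(np.cov(Xp.T, bias=True)) * len(Xp)
          + np.atleast_2d(np.cov(Xn.T, bias=True)) * len(Xn))
    Sw += 1e-6 * np.eye(X.shape[1])  # regularize for invertibility
    return np.linalg.solve(Sw, mp - mn)

# Toy demo: features 0 and 1 are discriminative, the rest are noise.
rng = np.random.default_rng(0)
y = np.concatenate([np.ones(100), -np.ones(100)])
X = rng.normal(size=(200, 5))
X[:100, 0] += 2.0
X[:100, 1] += 2.0
sel = asym_boost_select(X, y, n_rounds=4)
proj = X[:, sel] @ flda_direction(X[:, sel], y)
acc = np.mean((proj > proj.mean()) == (y == 1))
```

The key design point mirrored from the abstract is that boosting is used only to pick `sel`; the discriminant weights come entirely from FLDA on that reduced feature set.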
Publication date
2008-03-13 (date of first posting on the China Journal Net platform; this does not represent the paper's publication date)