<XML><RECORDS><RECORD><REFERENCE_TYPE>0</REFERENCE_TYPE><REFNUM>7579</REFNUM><AUTHORS><AUTHOR>He,C.</AUTHOR><AUTHOR>Xu,L.X.</AUTHOR><AUTHOR>Zhang,Y.H.</AUTHOR></AUTHORS><YEAR>2001</YEAR><TITLE>Learning Convergence of CMAC Algorithm</TITLE><PLACE_PUBLISHED>Neural Processing Letters, 2001, 14(1): 61-74.</PLACE_PUBLISHED><PUBLISHER>Kluwer</PUBLISHER><PAGES>61-74</PAGES><LABEL>He:2001:7579</LABEL><KEYWORDS><KEYWORD>batch learning</KEYWORD></KEYWORDS><ABSTRACT>The convergence properties of CMAC in both batch and incremental learning are analyzed. Previous conclusions about CMAC convergence, which were derived under the condition that the articulation matrix is positive definite, are generalized into less restrictive conclusions that require no additional conditions. Based on the new conclusions, an improved CMAC algorithm with a self-optimizing learning rate is proposed. Simulation results confirm the correctness of the new conclusions and the advantages of the improved algorithm.</ABSTRACT></RECORD></RECORDS></XML>