<XML><RECORDS><RECORD><REFERENCE_TYPE>0</REFERENCE_TYPE><REFNUM>9140</REFNUM><AUTHORS><AUTHOR>Damoulas,T.</AUTHOR><AUTHOR>Girolami,M.A.</AUTHOR></AUTHORS><YEAR>2009</YEAR><TITLE>Combining Feature Spaces for Classification</TITLE><PLACE_PUBLISHED>Pattern Recognition, Volume 42, Issue 11</PLACE_PUBLISHED><PUBLISHER>Elsevier Science</PUBLISHER><PAGES>2671-2683</PAGES><LABEL>Damoulas:2009:9140</LABEL><KEYWORDS><KEYWORD>Multiple Kernel Learning</KEYWORD></KEYWORDS><ABSTRACT>In this paper we offer a variational Bayes approximation to the multinomial probit model for basis expansion and kernel combination. Our model is well-founded within a hierarchical Bayesian framework and is able to instructively combine available sources of information for multinomial classification. The proposed framework enables informative integration of possibly heterogeneous sources in a multitude of ways, from the simple summation of feature expansions to the weighted product of kernels, and it is shown to match and in certain cases outperform the well-known ensemble learning approaches of combining individual classifiers. At the same time, the approximation considerably reduces the CPU time and resources required with respect to both the ensemble learning methods and the full Markov chain Monte Carlo, Metropolis-Hastings within Gibbs solution of our model. We present our proposed framework together with extensive experimental studies on synthetic and benchmark datasets, and also report for the first time a comparison between summation and product of individual kernels as different methods for constructing the composite kernel matrix.</ABSTRACT><URL>http://www.sciencedirect.com/science?_ob=ArticleURL&amp;_udi=B6V14-4W32KJ9-3&amp;_user=121723&amp;_rdoc=1&amp;_fmt=&amp;_orig=search&amp;_sort=d&amp;_docanc</URL></RECORD></RECORDS></XML>