Abstract: In this paper, we study the incorporation of the support vector machine (SVM) into the (hierarchical) mixture of experts model to form a support vector mixture. We show that, in both classification and regression problems, the use of a support vector mixture leads to quadratic programming (QP) problems that are very similar to those for an SVM, with no increase in the dimensionality of the QP problems. Moreover, a support vector mixture, besides allowing for the use of different experts in different regions of the input space, also supports easy combination of different architectures such as polynomial networks and radial basis function networks.
Proceedings of the International Conference on Pattern Recognition (ICPR), pp.255-258, Brisbane, Australia, August 1998.
Postscript: http://www.cs.ust.hk/~jamesk/papers/icpr98.ps.gz
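The abstract's central idea, a gating function that softly assigns different SVM experts (possibly with different kernels) to different regions of the input space, can be illustrated with a minimal sketch. This is not the paper's QP formulation: the gate below is a fixed sigmoid on the first input coordinate rather than a learned gating network, the experts are off-the-shelf `sklearn.svm.SVC` classifiers trained separately on each region, and all data, region boundaries, and hyperparameters are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 2))

# Synthetic ground truth that changes across the input space:
# a roughly linear boundary on the left half, a curved one on the right.
left = X[:, 0] < 0
y = np.where(left,
             np.sign(X[:, 1] + 0.3 * X[:, 0]),
             np.sign(X[:, 1] - 0.5 * X[:, 0] ** 2)).astype(int)
y[y == 0] = 1  # avoid zero labels on the boundary

# Two experts with different kernels, each fit on its own region --
# echoing the paper's point about mixing polynomial and RBF architectures.
expert_poly = SVC(kernel="poly", degree=2, C=10.0).fit(X[left], y[left])
expert_rbf = SVC(kernel="rbf", gamma=2.0, C=10.0).fit(X[~left], y[~left])

def gate(X):
    # Soft weight for the polynomial expert; fixed here for illustration,
    # whereas a real mixture of experts would learn the gate.
    return 1.0 / (1.0 + np.exp(10.0 * X[:, 0]))

def mixture_decision(X):
    # Gated combination of the experts' real-valued SVM outputs.
    g = gate(X)
    return (g * expert_poly.decision_function(X)
            + (1 - g) * expert_rbf.decision_function(X))

pred = np.where(mixture_decision(X) >= 0, 1, -1)
acc = (pred == y).mean()
print(f"mixture training accuracy: {acc:.3f}")
```

Each expert only has to model its own region, which is why simple kernels suffice locally even when no single kernel fits the whole input space well.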