Efficient Hyperkernel Learning using Second-Order Cone Programming

Ivor W. Tsang, James T. Kwok

Abstract: The kernel function plays a central role in kernel methods. Most existing methods can only adapt the kernel parameters or the kernel matrix based on empirical data. Recently, Ong et al. introduced the method of hyperkernels, which can be used to learn the kernel function directly in an inductive setting. However, the associated optimization problem is a semidefinite program (SDP), which is computationally expensive to solve even with recent advances in interior point methods. In this paper, we show that this learning problem can be equivalently reformulated as a second-order cone program (SOCP), which can then be solved more efficiently than an SDP. Experimental results on both toy and real-world data sets show significant speedup. Moreover, in comparison with the kernel matrix learning method proposed by Lanckriet et al., our proposed SOCP-based hyperkernel method yields better generalization performance, with a speed that is comparable to their formulation based on quadratically constrained quadratic programming (QCQP).
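To illustrate the efficiency argument in the abstract, the following is a minimal sketch (not the paper's actual hyperkernel formulation) contrasting how an SDP constraint over a positive semidefinite matrix variable and a second-order cone constraint over a vector variable are posed; the toy problems, the choice of the CVXPY modeling library, and all variable names are assumptions made for illustration only.

    # Minimal sketch: an SDP-style problem versus an SOCP-style problem.
    # Neither is the paper's formulation; they only show the two problem classes.
    import cvxpy as cp
    import numpy as np

    n = 5
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    K = A @ A.T  # a toy positive semidefinite "kernel" matrix

    # SDP-style: optimize over an n-by-n PSD matrix variable,
    # which interior point solvers handle at a high per-iteration cost.
    X = cp.Variable((n, n), PSD=True)
    sdp = cp.Problem(cp.Minimize(cp.trace(K @ X)), [cp.trace(X) == 1])
    sdp.solve()

    # SOCP-style: optimize over a length-n vector with a second-order cone
    # constraint, which is much cheaper to solve at the same dimension.
    w = cp.Variable(n)
    t = cp.Variable()
    socp = cp.Problem(cp.Minimize(t),
                      [cp.norm(w - np.ones(n), 2) <= t, cp.sum(w) == 1])
    socp.solve()

    print("SDP optimal value :", sdp.value)
    print("SOCP optimal value:", socp.value)

The gap between the two problem classes is the point: for the same number of variables, second-order cone constraints are far cheaper for interior point methods to process than a full semidefinite constraint, which is why recasting the hyperkernel learning problem as an SOCP yields the reported speedup.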

Proceedings of the European Conference on Machine Learning (ECML-2004), pp. 453-464, Pisa, Italy, September 2004.

Postscript: http://www.cs.ust.hk/~jamesk/papers/ecml04.ps.gz
