Information Theoretic Learning with Infinitely Divisible Kernels
Luis Gonzalo Sánchez, Jose C. Principe
Jan 17, 2013 (modified: Jan 17, 2013) · ICLR 2013 conference submission · Readers: everyone
Abstract: In this paper, we develop a framework for information theoretic learning based on infinitely divisible matrices. We formulate an entropy-like functional on positive definite matrices based on Rényi's entropy definition and examine key properties of this functional that lead to the concept of infinite divisibility. The proposed formulation avoids plug-in density estimation and brings with it the representation power of reproducing kernel Hilbert spaces. We show how analogues of quantities such as conditional entropy can be defined, enabling solutions to learning problems. In particular, we derive a supervised metric learning algorithm with very competitive results.
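The construction sketched in the abstract can be illustrated numerically. The sketch below is an assumption-laden reading, not the paper's exact formulation: it builds a Gram matrix with a Gaussian kernel (one standard example of an infinitely divisible kernel), normalizes it to unit trace, and evaluates an entropy-like functional of order alpha on its eigenvalues, plus a conditional-entropy analogue via the Hadamard product of two Gram matrices. All function names and the choice of alpha = 2 are illustrative.

```python
import numpy as np

def gram_matrix(X, sigma=1.0):
    """Gram matrix from a Gaussian kernel (an infinitely divisible kernel).
    Illustrative choice; the paper's framework admits other such kernels."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def matrix_renyi_entropy(K, alpha=2.0):
    """Entropy-like functional on a positive definite matrix:
    normalize K to unit trace, then apply Renyi's formula
    S_alpha(A) = log2(sum_i lambda_i^alpha) / (1 - alpha)
    to the eigenvalues lambda_i of A = K / tr(K)."""
    A = K / np.trace(K)
    lam = np.linalg.eigvalsh(A)
    lam = np.clip(lam, 0.0, None)  # guard against tiny negative eigenvalues
    return np.log2(np.sum(lam**alpha)) / (1.0 - alpha)

def conditional_entropy(Kx, Ky, alpha=2.0):
    """Conditional-entropy analogue (assumed form): S(X, Y) - S(Y),
    with the joint term taken from the Hadamard product Kx * Ky."""
    return matrix_renyi_entropy(Kx * Ky, alpha) - matrix_renyi_entropy(Ky, alpha)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K = gram_matrix(X)
print(matrix_renyi_entropy(K))          # entropy estimate from the Gram matrix
print(conditional_entropy(K, gram_matrix(X[:, :1])))
```

As a sanity check on the functional: for an n x n identity Gram matrix the normalized eigenvalues are uniform and the entropy attains its maximum, log2(n); for a rank-one all-ones Gram matrix it is 0.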