Date 
November 24, 2010 
Speaker 
Dr. Masashi Sugiyama, Tokyo Institute of Technology, Japan 
Title 
Density Ratio Estimation: A New Versatile Tool for Machine Learning

Abstract 
Recently, we developed a new ML framework that allows us to
systematically avoid density estimation. The key idea is to directly
estimate the ratio of density functions, not densities themselves.
Our framework includes various ML tasks such as importance sampling
(e.g., covariate shift adaptation, transfer learning, multitask
learning), divergence estimation (e.g., two-sample test, outlier
detection, change detection in time-series), mutual information
estimation (e.g., independence test, independent component analysis,
feature selection, sufficient dimension reduction, causal inference),
and conditional probability estimation (e.g., probabilistic
classification, conditional density estimation).
In this talk, we introduce the density ratio framework, review methods
of density ratio estimation, and show various real-world applications
including brain-computer interface, speech recognition, image
recognition, and robot control.
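To make the key idea concrete, here is a minimal sketch of direct least-squares density-ratio fitting in the spirit of the least-squares approach cited in the references (Kanamori et al., JMLR 2009). The Gaussian kernel basis, parameter names, and default settings below are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def fit_density_ratio(x_num, x_den, centers=None, sigma=1.0, lam=0.1):
    """Estimate r(x) = p(x)/q(x) directly from samples x_num ~ p and
    x_den ~ q, without estimating p or q themselves (uLSIF-style sketch;
    parameter names are illustrative)."""
    if centers is None:
        # Use a subset of numerator samples as Gaussian kernel centers.
        centers = x_num[:min(100, len(x_num))]

    def phi(x):
        # Gaussian kernel design matrix, shape (n_samples, n_centers).
        d2 = (x[:, None] - centers[None, :]) ** 2
        return np.exp(-d2 / (2.0 * sigma ** 2))

    # Least-squares fit of r(x) = phi(x) @ alpha:
    # minimize E_q[(phi(x) @ alpha)^2]/2 - E_p[phi(x) @ alpha] + ridge term,
    # which reduces to the linear system (H + lam*I) alpha = h.
    Phi_den = phi(x_den)
    H = Phi_den.T @ Phi_den / len(x_den)   # second moment under q
    h = phi(x_num).mean(axis=0)            # first moment under p
    alpha = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    # Clip to keep the estimated ratio non-negative.
    return lambda x: np.clip(phi(x) @ alpha, 0.0, None)
```

For example, with samples from p = N(0, 1) and q = N(0, 4), the fitted ratio is larger near the origin (where p dominates q) than in the tails, and such ratio values can then serve directly as importance weights for covariate shift adaptation.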
http://sugiyama-www.cs.titech.ac.jp/~sugi/publications.html
References:
[Review of the density ratio framework of machine learning] Sugiyama, M., Kanamori, T., Suzuki, T., Hido, S., Sese, J., Takeuchi, I., & Wang, L. A density-ratio framework for statistical data processing. IPSJ Transactions on Computer Vision and Applications, vol.1, pp.183-208, 2009.
[Review of density ratio estimation methods] Sugiyama, M., Suzuki, T., & Kanamori, T. Density ratio estimation: A comprehensive review. In Statistical Experiment and Its Related Topics, Research Institute for Mathematical Sciences Kokyuroku, 2010.
[Recent papers] Kanamori, T., Hido, S., & Sugiyama, M. A least-squares approach to direct importance estimation. JMLR, vol.10, pp.1391-1445, 2009.
Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D. Conditional density estimation via least-squares density ratio estimation. AISTATS 2010.
Suzuki, T. & Sugiyama, M. Sufficient dimension reduction via squared-loss mutual information estimation. AISTATS 2010.
Yamada, M. & Sugiyama, M. Dependence minimizing regression with model selection for nonlinear causal inference under non-Gaussian noise. AAAI 2010.
Sugiyama, M. & Simm, J. A computationally-efficient alternative to kernel logistic regression. MLSP 2010.
Hachiya, H. & Sugiyama, M. Feature selection for reinforcement learning: Evaluating implicit state-reward dependency via conditional mutual information. ECML-PKDD 2010.

