Seminar: Learning Gradients for Variable Selection and Dimension Reduction

Speaker: Gui-Bo Ye (叶桂波), Assistant Professor, University of Iowa

Title: Learning Gradients for Variable Selection and Dimension Reduction

Time: June 13, 2013, starting at 3:30 PM

Venue: Room 703, Administration Building (行政楼)

Abstract: Variable selection and dimension reduction are two commonly adopted approaches for high-dimensional data analysis, but they have traditionally been treated separately. In this talk, we show that variable selection and dimension reduction are closely related through learning sparse gradients. We introduce a sparse gradient learning (SGL) model that imposes a sparsity constraint on the gradient. Variable selection is achieved by selecting the variables corresponding to non-zero partial derivatives, and effective dimension reduction directions are extracted from the eigenvectors of the resulting sparse empirical gradient covariance matrix. An efficient iterative soft-thresholding algorithm is developed to solve the SGL model, making the framework practically scalable for medium or large data sets. The utility of SGL for variable selection and feature extraction is illustrated on artificial data as well as real-world examples. The main advantages of our method include variable selection for both linear and nonlinear predictions, effective dimension reduction with sparse loadings, and an efficient algorithm for large $p$, small $n$ problems.
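
As a concrete illustration of the pipeline sketched in the abstract, the following is a minimal Python/NumPy sketch. It assumes the gradient estimates at the sample points are already available as a matrix `G` (the SGL model itself learns these from data, which is not reproduced here); the single soft-thresholding pass and the helper names `soft_threshold` and `sparse_gradient_directions` are illustrative stand-ins, not the speaker's algorithm, whose iterative soft-thresholding runs inside the full SGL optimization.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator: sign(x) * max(|x| - lam, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_gradient_directions(G, lam, d):
    """Toy version of the post-processing described in the abstract.

    G   : (n, p) array of estimated gradients of the regression
          function at the n sample points (assumed given).
    lam : threshold level controlling the sparsity of the gradient.
    d   : number of effective dimension reduction (EDR) directions.
    """
    n = G.shape[0]
    G_sparse = soft_threshold(G, lam)
    # Variable selection: keep variables with non-zero partial derivatives.
    selected = np.flatnonzero(np.abs(G_sparse).sum(axis=0) > 0)
    # Sparse empirical gradient covariance matrix (p x p).
    cov = G_sparse.T @ G_sparse / n
    # EDR directions: leading eigenvectors of the gradient covariance.
    _, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    edr = eigvecs[:, ::-1][:, :d]       # top-d eigenvectors
    return selected, edr

# Toy check: y = x0^2 + x1 depends only on the first two variables,
# so the (noisy) gradient vanishes on the remaining coordinates.
rng = np.random.default_rng(0)
n, p = 200, 30
X = rng.standard_normal((n, p))
G = 0.02 * rng.standard_normal((n, p))  # gradient-estimation noise
G[:, 0] += 2 * X[:, 0]                  # d/dx0 of x0^2 + x1
G[:, 1] += 1.0                          # d/dx1 of x0^2 + x1
selected, edr = sparse_gradient_directions(G, lam=0.2, d=2)
print(selected)                         # expected: [0 1]
```

The point the sketch tries to convey is that a single sparsity constraint serves both tasks: zero columns of the thresholded gradient give variable selection, while the leading eigenvectors of its covariance give dimension reduction with sparse loadings.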

 

Speaker bio: Gui-Bo Ye (叶桂波) received his Ph.D. from Fudan University in 2007. He was a visiting scholar at City University of Hong Kong and the National University of Singapore in 2007-2008, and a postdoctoral researcher at the University of California, Irvine from 2009 to 2012. Since 2012 he has been an Assistant Professor in the Department of Mathematics at the University of Iowa. Dr. Ye's research interests include learning theory, convex optimization, and their applications to bioinformatics and computational biology; he has published more than ten papers in high-quality SCI-indexed journals.