
Wuhan University, School of Mathematics and Statistics: faculty profile of 焦雨领

Site editor: Free考研考试, 2021-07-21






焦雨领

Gender: Male
Date of birth: 1986-07-01
Department: School of Mathematics and Statistics
Date of employment: 2020-11-01
Office: 东北楼 (Northeast Building), Room 212
Email:





Personal Profile
Research on scientific computing, statistical computing, and machine learning.

Recently, we have focused on the mathematics of data science, including theory for deep PDE solvers, theory and algorithms for deep generative models, representation learning, theory for deep estimation, and the design and analysis of sampling algorithms.
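As a generic illustration of the kind of sampling algorithm this line of work designs and analyzes, here is a minimal unadjusted Langevin sampler in NumPy. This is a sketch only: the function and parameter names are illustrative and it is not taken from any of the papers listed on this page.

```python
import numpy as np

def ula_sampler(grad_log_p, x0, step, n_steps, rng):
    """Unadjusted Langevin algorithm (ULA):
    x <- x + step * grad log p(x) + sqrt(2 * step) * N(0, I)."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * grad_log_p(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Target: standard Gaussian, so grad log p(x) = -x.
rng = np.random.default_rng(42)
samples = np.array([ula_sampler(lambda x: -x, [5.0], 0.05, 1500, rng)[0]
                    for _ in range(300)])
print(samples.mean(), samples.std())  # approximately 0 and 1
```

ULA is biased at any fixed step size (its stationary law is only close to the target), which is one reason the theoretical analysis of such samplers is delicate.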






Education
2010.9-2014.6: Wuhan University (WHU), Applied Mathematics


Research Directions
Machine Learning
Statistical Computing
Scientific Computing









Research Fields
Scientific computing, statistical computing, machine learning
Google Scholar profile: https://scholar.google.com/citations?user=yFDDsVgAAAAJ&hl=en





























Publications

[1]. A Primal Dual Active Set Algorithm with Continuation for Compressed Sensing. IEEE Transactions on Signal Processing. 62 (23). 6276-6285. 2014. MATLAB code: http://xllv.whu.edu.cn/pdascl1.zip.
[2]. A Primal Dual Active Set with Continuation Algorithm for the l0-Regularized Optimization Problem. Applied and Computational Harmonic Analysis. 39 (3). 400-426. 2015. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/pdascl0.zip.
[3]. Alternating Direction Method of Multipliers for Linear Inverse Problems. SIAM Journal on Numerical Analysis. 54 (4). 2114-2137. 2016. MATLAB code: http://xllv.whu.edu.cn/admm_lin_inv.rar.
[4]. Iterative Soft/Hard Thresholding Homotopy Algorithm for Sparse Recovery. IEEE Signal Processing Letters. 24 (6). 784-788. 2017. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/ishtc.zip.
[5]. Preasymptotic Convergence of Randomized Kaczmarz Method. Inverse Problems. 33 (12). 125012. 2017.
[6]. Group Sparse Recovery via the l0(l2) Penalty: Theory and Algorithm. IEEE Transactions on Signal Processing. 65 (4). 998-1012. 2017. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/gpdasc.zip.
[7]. Preconditioned Alternating Direction Method of Multipliers for Solving Inverse Problems with Constraints. Inverse Problems. 33 (2). 025004. 2017.
[8]. A Constructive Approach to L0 Penalized Regression. Journal of Machine Learning Research. 19 (10). 1-37. 2018. MATLAB code: see attachment.
[9]. Robust Decoding from 1-bit Compressive Sampling by Ordinary and Regularized Least Squares. SIAM Journal on Scientific Computing. 40 (4). A2062-A2086. 2018. MATLAB code: see attachment.
[10]. Deep Generative Learning via Variational Gradient Flow. ICML. 97. 2093-2101. 2019. PyTorch code: https://github.com/xjtuygao/VGrow.
[11]. Fitting Sparse Linear Models under the Sufficient and Necessary Condition for Model Identification. Statistics & Probability Letters. 168. 108925. 2020.
[12]. Generative Learning With Euler Particle Transport. MSML. 145. 1-33. 2021. PyTorch code: https://github.com/xjtuygao/EPT.
[13]. A Unified Primal Dual Active Set Algorithm for Nonconvex Sparse Recovery. Statistical Science. 36 (2). 215-238. 2021. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/updasc.zip.
[14]. REMI: Regression with Marginal Information and Its Application in Genome-wide Association Studies. Statistica Sinica. 31 (4). 2021. R code: https://github.com/gordonliu810822/REMI.
[15]. Deep Generative Learning via Schrödinger Bridge. ICML, arXiv preprint arXiv:2106.10410. 2021. Demo PyTorch code: see attachment.
[16]. Robust Decoding from Binary Measurements with Cardinality Constraint Least Squares. arXiv preprint arXiv:2006.02890.
[17]. Deep Dimension Reduction for Supervised Representation Learning. arXiv preprint arXiv:2006.05865. 2020. We provide a framework for deep supervised representation learning via dimension reduction. PyTorch code: https://github.com/Liao-Xu/DDR.
[18]. Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality on Hölder Class. arXiv preprint arXiv:2103.00542. 2021. One of the fundamental questions in deep learning is to establish the super approximation power of deep neural networks compared with traditional tools such as splines and wavelets. We prove that deep neural networks with ReLU-Sine-Exp activations overcome the curse of dimensionality on the Hölder class.
[19]. Convergence Rate Analysis for Deep Ritz Method. arXiv preprint arXiv:2103.13330. 2021. We prove a convergence rate of the deep Ritz method for solving elliptic equations with Neumann boundary conditions.
[20]. Convergence Analysis for the PINNs. Submitted to NeurIPS 2021. PINNs are a popular class of deep solvers for PDEs. We prove a convergence rate for PINNs on second-order elliptic equations with Dirichlet boundary conditions.
[21]. Deep Ritz Methods for Laplace Equations with Dirichlet Boundary Condition. Submitted to NeurIPS 2021. We provide a convergence rate of the DRM for Laplace equations with Dirichlet boundary conditions via Robin approximation.
[22]. Deep Nonparametric Regression on Approximately Low-dimensional Manifolds. arXiv preprint arXiv:2104.06708. 2021. We derive non-asymptotic upper bounds for the prediction error of deep regression. Our error bounds achieve the minimax optimal rate and significantly improve over the existing ones in the sense that they depend linearly or quadratically on the dimension d of the predictor, instead of exponentially on d. We show that the neural regression estimator can circumvent the curse of dimensionality if the data has an intrinsic low-dimensional structure.
[23]. Non-asymptotic Excess Risk Bounds for Classification with Deep Convolutional Neural Networks. arXiv preprint arXiv:2105.00292. 2021. We prove non-asymptotic excess risk bounds for classification with deep convolutional neural networks under a class of convex losses.
[24]. An Error Analysis of Generative Adversarial Networks for Learning Distributions. arXiv preprint arXiv:2105.13010. 2021. We provide the first error analysis of GANs that learn transformations from low-dimensional distributions to high-dimensional distributions. Our main results estimate the convergence rates of GANs under a collection of integral probability metrics defined through Hölder classes, including the Wasserstein distance as a special case. We also show that GANs can adaptively learn distributions with low-dimensional structure or with Hölder densities.
[25]. Relative Entropy Gradient Sampler for Unnormalized Distributions. Submitted to NeurIPS 2021. We propose a relative entropy gradient sampler (REGS) for sampling from unnormalized distributions. The main idea of REGS is an inexact Wasserstein gradient flow of the relative entropy, where the velocity fields are estimated from data with deep neural networks.
[26]. Schrödinger-Föllmer Sampler for Gaussian Mixtures. Submitted to NeurIPS 2021. We develop an efficient algorithm for sampling from mixtures of Gaussians based on the Schrödinger-Föllmer diffusion. Unlike existing sampling frameworks, our method enjoys strong theoretical guarantees without the ergodicity requirement.
[27]. Schrödinger-Föllmer Sampler: Sampling without Ergodicity. arXiv preprint arXiv:2106.10880. 2021. R and Python code: https://github.com/Liao-Xu/SFS_R and https://github.com/Liao-Xu/SFS_py.
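As a minimal illustration of the randomized Kaczmarz iteration studied in [5], here is a sketch in NumPy of the standard row-sampling variant (rows drawn with probability proportional to their squared norm). This is illustrative only, not the paper's code.

```python
import numpy as np

def randomized_kaczmarz(A, b, n_iters, rng):
    """Randomized Kaczmarz: at each step, project the iterate onto a random
    hyperplane {x : a_i^T x = b_i}, sampling row i proportional to ||a_i||^2."""
    m, n = A.shape
    row_norms2 = np.sum(A**2, axis=1)
    probs = row_norms2 / row_norms2.sum()
    x = np.zeros(n)
    for _ in range(n_iters):
        i = rng.choice(m, p=probs)
        x = x + (b[i] - A[i] @ x) / row_norms2[i] * A[i]
    return x

# Consistent overdetermined system: the iterates converge to the solution.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true
x_hat = randomized_kaczmarz(A, b, 3000, rng)
print(np.linalg.norm(x_hat - x_true))  # close to zero
```

For a consistent system, the expected squared error contracts geometrically at a rate governed by the smallest singular value of A relative to its Frobenius norm, which is the regime [5] refines with a preasymptotic analysis.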












Patents: none.











Monographs: none.











Research Projects

[1]. Fast Algorithm for Nonconvex Sparsity Regularization.
[2]. Nonconvex Sparsity Regularization: Model and Algorithm.
[3]. Parallel and Distributed Sparse Learning.
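The projects above concern nonconvex sparsity penalties; as a simpler convex baseline for the same sparse-recovery setting, here is a minimal iterative soft-thresholding (ISTA) sketch for l1-regularized least squares. All names and problem sizes are illustrative, not project code.

```python
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters):
    """Iterative soft-thresholding for min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x

# Noiseless sparse recovery: a 3-sparse signal from 60 Gaussian measurements in R^100.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100)) / np.sqrt(60.0)
x_true = np.zeros(100)
x_true[[5, 37, 81]] = [2.0, -3.0, 1.5]
b = A @ x_true
x_hat = ista(A, b, lam=0.02, n_iters=2000)
print(np.argsort(-np.abs(x_hat))[:3])  # indices of the largest recovered entries
```

Nonconvex penalties (l0, SCAD, MCP and the like) replace the soft-thresholding step with a different thresholding rule that reduces the bias the l1 penalty introduces on large coefficients.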












Research Team: none.












Teaching Resources
[1]. Introduction to Data Science
[2]. Statistical Learning

Courses: none.

Teaching achievements: none.



