Selected Publications
(Full List)
Statistical Machine Learning, Optimization for Machine Learning and TCS:
Dongfang Sun, Yingzhen Yang.
Locally Regularized Sparse Graph by Fast Proximal Gradient Descent.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2023 (Spotlight).
[Paper]
Yingzhen Yang, Ping Li.
Projective Proximal Gradient Descent for Nonconvex Nonsmooth Optimization: Fast Convergence Without Kurdyka-Lojasiewicz (KL) Property.
Proc. of International Conference on Learning Representations (ICLR) 2023.
[Paper]
Yingzhen Yang, Ping Li.
Noisy L0-Sparse Subspace Clustering on Dimensionality Reduced Data.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2022.
[arXiv]
Our theoretical analysis reveals the advantage of L0-Sparse Subspace Clustering (L0-SSC) over competing SSC methods on noisy data in terms of subspace affinity, and we propose provable random-projection-based methods to accelerate noisy L0-SSC.
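A minimal sketch of the random projection step that underlies such acceleration (the function name and dimensions below are illustrative, not taken from the paper):

```python
import numpy as np

def random_project(X, d, seed=0):
    """Compress the columns of X (a D x n data matrix) to d dimensions with
    a Gaussian random matrix; pairwise geometry is approximately preserved,
    so subspace clustering can run on the much smaller projected data."""
    D, n = X.shape
    rng = np.random.default_rng(seed)
    P = rng.normal(0.0, 1.0 / np.sqrt(d), size=(d, D))  # entries ~ N(0, 1/d)
    return P @ X

X = np.random.default_rng(1).normal(size=(1000, 50))  # 50 points in R^1000
Xp = random_project(X, d=64)                          # 50 points in R^64
```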
Yingzhen Yang, Ping Li.
Discriminative Similarity for Data Clustering.
Proc. of International Conference on Learning Representations (ICLR) 2022.
[Paper] [arXiv]
Our work analyzes the generalization error of similarity-based classification via Rademacher complexity on an augmented RKHS, where a general similarity function, which is not necessarily a PSD kernel, is decomposed into two PSD kernels. Such similarity-based classification is then applied to data clustering in an unsupervised manner, by joint optimization of class labels and classifier parameters.
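The decomposition can be illustrated at the matrix level: any symmetric similarity matrix splits into a difference of two PSD parts via its eigendecomposition. This is only an analogue of the paper's kernel decomposition, with illustrative names:

```python
import numpy as np

def psd_split(S):
    """Split a symmetric (possibly indefinite) similarity matrix S into
    S = K_plus - K_minus with both parts positive semidefinite."""
    w, V = np.linalg.eigh(S)
    K_plus = (V * np.maximum(w, 0.0)) @ V.T    # keep the nonnegative eigenvalues
    K_minus = (V * np.maximum(-w, 0.0)) @ V.T  # flip the sign of the negative ones
    return K_plus, K_minus

S = np.array([[1.0, 2.0],
              [2.0, 1.0]])  # indefinite: eigenvalues are 3 and -1
Kp, Km = psd_split(S)       # Kp - Km reconstructs S exactly
```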
Yingzhen Yang, Jiahui Yu.
Fast Proximal Gradient Descent for A Class of Non-convex and Non-smooth Sparse Learning Problems.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2019.
[Paper] [Supplementary] [Code]
Yingzhen Yang.
Dimensionality Reduced L0-Sparse Subspace Clustering.
Proc. of International Conference on Artificial Intelligence and Statistics (AISTATS) 2018.
[Paper] [Supplementary]
Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
Efficient Proximal Gradient Descent for L0 Sparse Approximation.
The Annual ACM Symposium on Theory of Computing (STOC) 2017 Workshop on New Challenges in Machine Learning - Robustness and Nonconvexity.
Yingzhen Yang, Jiashi Feng, Jiahui Yu, Jianchao Yang, Pushmeet Kohli, Thomas S. Huang.
Neighborhood Regularized L1-Graph.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2017.
[Paper]
Yingzhen Yang, Jiahui Yu, Pushmeet Kohli, Jianchao Yang, Thomas S. Huang.
Support Regularized Sparse Coding and Its Fast Encoder.
Proc. of International Conference on Learning Representations (ICLR) 2017.
[Paper]
Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
Subspace Learning by L0-Induced Sparsity.
International Journal of Computer Vision (IJCV) 2018, special issue on the best of European Conference on Computer Vision (ECCV) 2016.
Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
L0-Sparse Subspace Clustering.
Proc. of European Conference on Computer Vision (ECCV) 2016 (Oral Presentation, Among 11 Best Paper Candidates).
[Paper] [Supplementary] [Slides] [Code (CUDA C++ for high efficiency, and MATLAB)]
Our work establishes an almost sure equivalence between L0 sparsity and the subspace detection property under the mild condition of i.i.d. data generation from a nondegenerate distribution. This condition is much milder than those previously required in the L1 sparse subspace clustering literature.
The key points in the talk are:
- The goal of subspace clustering.
- A block-diagonal similarity matrix (where the similarity between data from different subspaces vanishes) follows from the Subspace Detection Property (SDP). Data are then partitioned according to their subspace structure by clustering with this similarity matrix (spectral clustering in this work).
- Existing SSC methods achieve SDP under various assumptions on the subspaces and data, such as inradius and subspace incoherence. L0-SSC achieves SDP almost surely under much milder assumptions (Thm. 1).
- No free lunch and no better deal: if there is a linear representation of the data that satisfies SDP, then the optimal solution to the L0 sparse representation problem can almost surely be obtained from it in polynomial time (Thm. 2).
- Therefore, an almost sure equivalence between L0 sparsity and SDP is established for the first time.
- Proximal gradient descent (PGD) is used for the optimization problem of L0-SSC, with a convergence guarantee (Prop. 1). Moreover, the obtained sub-optimal solution is close to the globally optimal solution (Thm. 3).
- The support shrinks during the iterations of PGD (Prop. 2), a property used to significantly reduce the computational burden of the optimization.
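The PGD iteration for an L0-regularized objective can be sketched as follows: the proximal map of the L0 penalty is hard thresholding, which zeroes small coefficients and thereby shrinks the support across iterations. This is a generic sketch under a least-squares data term, not the paper's exact formulation:

```python
import numpy as np

def pgd_l0(D, y, lam, step, iters=100):
    """Proximal gradient descent for min_x 0.5*||y - D x||^2 + lam*||x||_0."""
    x = np.zeros(D.shape[1])
    thresh = np.sqrt(2.0 * lam * step)            # hard-threshold level for prox of L0
    for _ in range(iters):
        grad = D.T @ (D @ x - y)                  # gradient of the smooth term
        z = x - step * grad                       # gradient step
        x = np.where(np.abs(z) > thresh, z, 0.0)  # prox of L0: hard thresholding
    return x

# Toy run: with D = I the solution keeps only entries of y above the threshold.
D = np.eye(3)
y = np.array([2.0, 0.05, -1.5])
x = pgd_l0(D, y, lam=0.1, step=1.0, iters=10)  # -> [2.0, 0.0, -1.5]
```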
Yingzhen Yang, Feng Liang, Shuicheng Yan, Zhangyang Wang, Thomas S. Huang.
On a Theory of Nonparametric Pairwise Similarity for Clustering: Connecting Clustering to Classification.
Proc. of Advances in Neural Information Processing Systems (NIPS) 2014.
[Paper] [BIBTEX] [Supplementary] [Poster]
Deep Learning (model compression), AutoML:
Yingzhen Yang, Jiahui Yu, Nebojsa Jojic, Jun Huan, Thomas S. Huang.
FSNet: Compression of Deep Convolutional Neural Networks by Filter Summary.
Proc. of International Conference on Learning Representations (ICLR) 2020.
[OpenReview] [ArXiv]
Xiaojie Jin, Yingzhen Yang, Ning Xu, Jianchao Yang, Nebojsa Jojic, Jiashi Feng, Shuicheng Yan.
WSNet: Compact and Efficient Networks Through Weight Sampling.
Proc. of International Conference on Machine Learning (ICML) 2018.
[Paper] [Supplementary]
Others:
Yingzhen Yang, Zhangyang Wang, Zhaowen Wang, Shiyu Chang, Ding Liu, Honghui Shi, Thomas S. Huang.
Epitomic Image Super-Resolution.
Proc. of AAAI Conference on Artificial Intelligence (AAAI) 2016 (Best Poster/Best Presentation Finalist for Student Poster Program).
[Project & Code]
Full Publication List
(Recent Publications)
Journal Papers:
Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
Subspace Learning by L0-Induced Sparsity.
International Journal of Computer Vision (IJCV) 2018, special issue on the best of European Conference on Computer Vision (ECCV) 2016.
Zhangyang Wang, Yingzhen Yang, Zhaowen Wang, Shiyu Chang, Jianchao Yang, Thomas S. Huang.
Learning Super-Resolution Jointly from External and Internal Examples.
IEEE Transactions on Image Processing, 2015.
[Paper]
Shiyu Chang, Guo-Jun Qi, Yingzhen Yang, Charu C. Aggarwal, Jiayu Zhou, Meng Wang, Thomas S. Huang.
Large-Scale Supervised Similarity Learning in Networks.
Knowledge and Information Systems, 2015 (the conference version of this paper received the ICDM 2014 Best Student Paper Award).
[Paper]
Yingzhen Yang, Yichen Wei, Chunxiao Liu, Qunsheng Peng, Yasuyuki Matsushita.
An Improved Belief Propagation Method for Dynamic Collage.
The Visual Computer, 25(5-7):431-439, 2009.
[Paper] [BIBTEX] [Slides] [Demo of Browsing Butterflies] [Demo of Browsing Flowers] [Project Page]
Chunxiao Liu, Yingzhen Yang, Qunsheng Peng, Jin Wang, Wei Chen.
Distortion Optimization based Image Completion from a Large Displacement View.
Computer Graphics Forum, 27(7): 1755-1764, 2008.
[Paper] [BIBTEX] [Supporting Figures] [Project Page]
Chengfang Song, Yang Yu, Yingzhen Yang, Fazhi He, Qingzhu, Qunsheng Peng.
Data-Driven Realistic Animation of Large-Scale Forest.
Journal of Computer-Aided Design & Computer Graphics, Vol. 20, No. 8, 2008.
Conference/Workshop Papers:
Dongfang Sun, Yingzhen Yang.
Locally Regularized Sparse Graph by Fast Proximal Gradient Descent.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2023 (Spotlight).
[Paper]
Yingzhen Yang, Ping Li.
Projective Proximal Gradient Descent for Nonconvex Nonsmooth Optimization: Fast Convergence Without Kurdyka-Lojasiewicz (KL) Property.
Proc. of International Conference on Learning Representations (ICLR) 2023.
[Paper]
Yingzhen Yang, Ping Li.
Noisy L0-Sparse Subspace Clustering on Dimensionality Reduced Data.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2022.
[arXiv]
Our theoretical analysis reveals the advantage of L0-Sparse Subspace Clustering (L0-SSC) over competing SSC methods on noisy data in terms of subspace affinity, and we propose provable random-projection-based methods to accelerate noisy L0-SSC.
Yingzhen Yang, Ping Li.
Discriminative Similarity for Data Clustering.
Proc. of International Conference on Learning Representations (ICLR) 2022.
[Paper] [arXiv]
Our work analyzes the generalization error of similarity-based classification via Rademacher complexity on an augmented RKHS, where a general similarity function, which is not necessarily a PSD kernel, is decomposed into two PSD kernels. Such similarity-based classification is then applied to data clustering in an unsupervised manner, by joint optimization of class labels and classifier parameters.
Yingzhen Yang, Ping Li.
FROS: Fast Regularized Optimization by Sketching.
IEEE International Symposium on Information Theory (ISIT) 2021.
Yingzhen Yang, Jiahui Yu, Nebojsa Jojic, Jun Huan, Thomas S. Huang.
FSNet: Compression of Deep Convolutional Neural Networks by Filter Summary.
Proc. of International Conference on Learning Representations (ICLR) 2020.
[OpenReview] [ArXiv]
Yingzhen Yang, Jiahui Yu.
Fast Proximal Gradient Descent for A Class of Non-convex and Non-smooth Sparse Learning Problems.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2019.
[Paper] [Supplementary] [Code]
Yingzhen Yang.
Dimensionality Reduced L0-Sparse Subspace Clustering.
Proc. of International Conference on Artificial Intelligence and Statistics (AISTATS) 2018.
[Paper] [Supplementary]
Xiaojie Jin, Yingzhen Yang, Ning Xu, Jianchao Yang, Nebojsa Jojic, Jiashi Feng, Shuicheng Yan.
WSNet: Compact and Efficient Networks Through Weight Sampling.
Proc. of International Conference on Machine Learning (ICML) 2018.
[Paper] [Supplementary]
Yingzhen Yang, Jiashi Feng, Jiahui Yu, Jianchao Yang, Pushmeet Kohli, Thomas S. Huang.
Neighborhood Regularized L1-Graph.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2017.
Yingzhen Yang, Jiahui Yu, Pushmeet Kohli, Jianchao Yang, Thomas S. Huang.
Support Regularized Sparse Coding and Its Fast Encoder.
Proc. of International Conference on Learning Representations (ICLR) 2017.
Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
L0-Sparse Subspace Clustering.
Proc. of European Conference on Computer Vision (ECCV) 2016 (Oral Presentation, Among 11 Best Paper Candidates).
[Paper] [Supplementary] [Slides] [Code (CUDA C++ for high efficiency, and MATLAB)]
Our work establishes an almost sure equivalence between L0 sparsity and the subspace detection property under the mild condition of i.i.d. data generation from a nondegenerate distribution. This condition is much milder than those previously required in the L1 sparse subspace clustering literature.
The key points in the talk are:
- The goal of subspace clustering.
- A block-diagonal similarity matrix (where the similarity between data from different subspaces vanishes) follows from the Subspace Detection Property (SDP). Data are then partitioned according to their subspace structure by clustering with this similarity matrix (spectral clustering in this work).
- Existing SSC methods achieve SDP under various assumptions on the subspaces and data, such as inradius and subspace incoherence. L0-SSC achieves SDP almost surely under much milder assumptions (Thm. 1).
- No free lunch and no better deal: if there is a linear representation of the data that satisfies SDP, then the optimal solution to the L0 sparse representation problem can almost surely be obtained from it in polynomial time (Thm. 2).
- Therefore, an almost sure equivalence between L0 sparsity and SDP is established for the first time.
- Proximal gradient descent (PGD) is used for the optimization problem of L0-SSC, with a convergence guarantee (Prop. 1). Moreover, the obtained sub-optimal solution is close to the globally optimal solution (Thm. 3).
- The support shrinks during the iterations of PGD (Prop. 2), a property used to significantly reduce the computational burden of the optimization.
Yingzhen Yang, Jianchao Yang, Wei Han, Thomas S. Huang.
On the Sub-Optimality of Proximal Gradient Descent for L0 Sparse Approximation.
ICML 2016 workshop on Advances in Non-Convex Analysis and Optimization.
[Paper]
Zhiding Yu, Weiyang Liu, Wenbo Liu, Yingzhen Yang, Ming Li and Vijayakumar Bhagavatula.
On Order-Constrained Transitive Distance Clustering.
Proc. of AAAI Conference on Artificial Intelligence (AAAI) 2016.
Zhangyang Wang, Yingzhen Yang, Shiyu Chang, Jinyan Li, Simon Fong, Thomas S. Huang.
A Joint Optimization Framework of Sparse Coding and Discriminative Clustering.
Proc. of International Joint Conferences on Artificial Intelligence (IJCAI) 2015.
Yingzhen Yang, Zhangyang Wang, Zhaowen Wang, Shiyu Chang, Ding Liu, Honghui Shi, Thomas S. Huang.
Epitomic Image Super-Resolution.
Proc. of AAAI Conference on Artificial Intelligence (AAAI) 2016 (Best Poster/Best Presentation Finalist for Student Poster Program).
[Project & Code]
Yingzhen Yang, Xinqi Chu, Zhangyang Wang, Thomas S. Huang.
Nonparametric Maximum Margin Similarity for Semi-Supervised Learning.
Advances in Neural Information Processing Systems (NIPS) 2014 workshop on Modern Nonparametrics: Automating the Learning Pipeline, spotlight talk.
[Paper] [Slides] [Poster]
Yingzhen Yang, Feng Liang, Shuicheng Yan, Zhangyang Wang, Thomas S. Huang.
On a Theory of Nonparametric Pairwise Similarity for Clustering: Connecting Clustering to Classification.
Proc. of Advances in Neural Information Processing Systems (NIPS) 2014.
[Paper] [BIBTEX] [Supplementary] [Poster]
Yingzhen Yang, Zhangyang Wang, Jianchao Yang, Jiawei Han, Thomas S. Huang.
Regularized L1-Graph for Data Clustering.
Proc. of British Machine Vision Conference (BMVC) 2014.
Yingzhen Yang, Zhangyang Wang, Jianchao Yang, Thomas S. Huang.
Data Clustering by Laplacian Regularized L1-Graph.
Proc. of AAAI Conference on Artificial Intelligence (AAAI) 2014.
Yingzhen Yang, Xinqi Chu, Thomas S. Huang.
Nonparametric Pairwise Similarity for Discriminative Clustering.
Advances in Neural Information Processing Systems (NIPS) 2013 workshop on Modern Nonparametric Methods in Machine Learning, spotlight talk.
[Link]
Yingzhen Yang, Xinqi Chu, Tian-Tsong Ng, Alex Yong-Sang Chia, Jianchao Yang, Hailin Jin, Thomas S. Huang.
Epitomic Image Colorization.
Proc. of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 2014 (Oral Presentation).
[Paper] [BIBTEX] [Slides]
Yingzhen Yang, Feng Liang, Thomas S. Huang.
Discriminative Exemplar Clustering.
Proc. of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 2014.
Yingzhen Yang, Xinqi Chu, Feng Liang, Thomas S. Huang.
Pairwise Exemplar Clustering.
Proc. of AAAI Conference on Artificial Intelligence (AAAI) 2012.
[Paper]
Yingzhen Yang, Yang Cai.
Virtual Gazing in Video Surveillance.
ACM Multimedia Workshop on Surreal Media and Virtual Cloning, in conjunction with ACM Multimedia 2010.
[Paper]
Yingzhen Yang, Yin Zhu, Qunsheng Peng.
Image Completion Using Structural Priority Belief Propagation.
Proc. of ACM International Conference on Multimedia 2009.
Yingzhen Yang, Yin Zhu, Qunsheng Peng.
Entertaining Video Warping.
Proc. of IEEE International Conference on CAD/Graphics (HCI Session) 2009.
[Paper]
Chunxiao Liu, Yingzhen Yang, Qunsheng Peng, Yanwen Guo.
A New Distortion Minimization Approach for Image Completion based on a Large Displacement View.
Proc. of Computer Graphics International (Texture Session) 2008.
Technical Reports:
Dan Gelb, Yingzhen Yang, Mitch Trott.
Efficient Markerless Augmented Reality.
Submitted to HP Tech Con. 2012.
Yichen Wei, Yasuyuki Matsushita and Yingzhen Yang.
Efficient Optimization of Photo Collage.
Technical Report, Microsoft Research, MSR-TR-2009-59, May 2009.
[Link] [Paper] [Project Page with Demos]