Yingzhen Yang


About me

I am an assistant professor at Arizona State University. My research covers statistical machine learning, deep learning, optimization and theoretical computer science.

Open Positions

I am always looking for self-motivated students. One PhD research assistantship is available, and students with a background in deep learning, or in machine learning and optimization, are highly encouraged to apply. If you would like to work with me, please send your CV to yingzhen.yang@asu.edu with a brief introduction to your research interests and the projects/directions that interest you most. Undergraduate students with a similar background and strong programming skills are also welcome to conduct research in my group.

Research Interests

I am interested in statistical machine learning and its theory, including the theory and applications of deep learning, subspace learning, manifold learning, sparse representation and compressive sensing, nonparametric models, probabilistic graphical models, and generalization analysis of classification, semi-supervised learning, and clustering. I also devote effort to optimization theory for machine learning and to theoretical computer science.

In my early years I also conducted research on computer vision and computer graphics.

Contact

Office: BYENG 590, 699 S Mill Ave., Tempe, AZ 85281
Email: yingzhen.yang -AT- asu.edu (official), superyyzg -AT- gmail.com (personal)

Honors and Awards

2016 ECCV Best Paper Finalist (one of 11 best paper candidates among all submissions)
2010 Carnegie Institute of Technology Dean's Tuition Fellowship
Before 2005: Bronze Medal in the National Senior High School Mathematics Competition; First Prize in the National Junior High School Mathematics Competition


Professional Services & Activities

Senior Program Committee Member: AAAI 2021-2022
Program Committee Member: IJCAI 2015, IJCAI 2017, IJCAI-ECAI 2018, CVPR 2018, IJCAI 2019, AAAI 2020, IJCAI-PRICAI 2020, ICML 2020-2022, NeurIPS 2020-2022
Reviewer: Journal of Machine Learning Research (JMLR), IEEE Transactions on Image Processing (TIP), Pattern Recognition (PR), Knowledge and Information Systems (KAIS), Machine Vision and Applications (Springer Journal)


Industrial Experience

Over ten years of experience in C/C++ programming and software design.

Research Intern, Microsoft Research, Redmond, WA. May 2015 to Aug. 2015. Developed online probabilistic topic models for large-scale applications with CUDA C/C++ programming.
Research Intern, Microsoft Research, Redmond, WA. May 2014 to Aug. 2014. Parallelized and accelerated probabilistic topic models with CUDA C/C++ programming.
Research Intern, Hewlett-Packard Labs, Palo Alto, CA. May 2011 to Aug. 2011. Developed efficient markerless augmented reality with C/C++ programming.


Selected Publications (Full List)

Statistical Machine Learning, Optimization for Machine Learning and TCS:

Dongfang Sun, Yingzhen Yang.
Locally Regularized Sparse Graph by Fast Proximal Gradient Descent.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2023 (Spotlight). [Paper]

Yingzhen Yang, Ping Li.
Projective Proximal Gradient Descent for Nonconvex Nonsmooth Optimization: Fast Convergence Without Kurdyka-Łojasiewicz (KL) Property.
Proc. of International Conference on Learning Representations (ICLR) 2023. [Paper]

Yingzhen Yang, Ping Li.
Noisy L0-Sparse Subspace Clustering on Dimensionality Reduced Data.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2022. [arXiv]
Our theoretical analysis reveals the advantage of L0-Sparse Subspace Clustering (L0-SSC) on noisy data over competing SSC methods in terms of subspace affinity, and we propose provable random-projection-based methods to accelerate noisy L0-SSC.
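
A rough sketch of the formulation (notation simplified here; see the paper for the exact statement): for each data point x_i, noisy L0-SSC solves

\min_{c \in \mathbb{R}^n} \|x_i - Xc\|_2^2 + \lambda \|c\|_0 \quad \text{subject to} \quad c_i = 0,

where X is the data matrix and \lambda > 0 balances data fidelity against sparsity; the random-projection methods solve the same problem on compressed data PX for a random projection matrix P of much lower dimension.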


Yingzhen Yang, Ping Li.
Discriminative Similarity for Data Clustering.
Proc. of International Conference on Learning Representations (ICLR) 2022. [Paper] [arXiv]
Our work analyzes the generalization error of similarity-based classification via Rademacher complexity on an augmented RKHS, where a general similarity function, not necessarily a PSD kernel, is decomposed into two PSD kernels. Such similarity-based classification is then applied to data clustering in an unsupervised manner, by jointly optimizing the class labels and the classifier parameters.
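
Informally, and with notation simplified from the paper, the decomposition takes the form

s(x, y) = K_+(x, y) - K_-(x, y),

where K_+ and K_- are PSD kernels, so the generalization error of similarity-based classifiers can be bounded by Rademacher complexity over the RKHS associated with the augmented kernels.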

Yingzhen Yang, Jiahui Yu.
Fast Proximal Gradient Descent for A Class of Non-convex and Non-smooth Sparse Learning Problems.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2019. [Paper] [Supplementary] [Code]

Yingzhen Yang.
Dimensionality Reduced L0-Sparse Subspace Clustering.
Proc. of International Conference on Artificial Intelligence and Statistics (AISTATS) 2018. [Paper] [Supplementary]

Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
Efficient Proximal Gradient Descent for L0 Sparse Approximation.
The Annual ACM Symposium on Theory of Computing (STOC) 2017 Workshop on New Challenges in Machine Learning - Robustness and Nonconvexity.

Yingzhen Yang, Jiashi Feng, Jiahui Yu, Jianchao Yang, Pushmeet Kohli, Thomas S. Huang.
Neighborhood Regularized L1-Graph.
Proc. of Conference on Uncertainty in Artificial Intelligence (UAI) 2017. [Paper]

Yingzhen Yang, Jiahui Yu, Pushmeet Kohli, Jianchao Yang, Thomas S. Huang.
Support Regularized Sparse Coding and Its Fast Encoder.
Proc. of International Conference on Learning Representations (ICLR) 2017. [Paper]

Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
Subspace Learning by L0-Induced Sparsity.
International Journal of Computer Vision (IJCV) 2018, special issue on the best of European Conference on Computer Vision (ECCV) 2016.

Yingzhen Yang, Jiashi Feng, Nebojsa Jojic, Jianchao Yang, Thomas S. Huang.
L0-Sparse Subspace Clustering.
Proc. of European Conference on Computer Vision (ECCV) 2016. (Oral Presentation, Among 11 Best Paper Candidates) [Paper] [Supplementary] [Slides] [Code (both CUDA C++ for efficiency and MATLAB)]
Our work establishes the almost sure equivalence between L0 sparsity and the subspace detection property, under the mild assumption of i.i.d. data generated from a nondegenerate distribution; this assumption is much milder than the conditions previously required in the L1 sparse subspace clustering literature.
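
For reference, a simplified statement of the L0-SSC problem solved for each data point x_i (with the data points as columns of X):

\min_{c \in \mathbb{R}^n} \|c\|_0 \quad \text{subject to} \quad x_i = Xc, \; c_i = 0.

The subspace detection property holds when every nonzero entry c_j of an optimal solution corresponds to a point x_j lying in the same subspace as x_i.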

Yingzhen Yang, Feng Liang, Shuicheng Yan, Zhangyang Wang, Thomas S. Huang.
On a Theory of Nonparametric Pairwise Similarity for Clustering: Connecting Clustering to Classification.
Proc. of Advances in Neural Information Processing Systems (NIPS) 2014. [Paper] [BIBTEX] [Supplementary] [Poster]

Deep Learning (Model Compression) and AutoML:

Yingzhen Yang, Jiahui Yu, Nebojsa Jojic, Jun Huan, Thomas S. Huang.
FSNet: Compression of Deep Convolutional Neural Networks by Filter Summary.
Proc. of International Conference on Learning Representations (ICLR) 2020. [OpenReview] [ArXiv]

Xiaojie Jin, Yingzhen Yang, Ning Xu, Jianchao Yang, Nebojsa Jojic, Jiashi Feng, Shuicheng Yan.
WSNet: Compact and Efficient Networks Through Weight Sampling.
Proc. of International Conference on Machine Learning (ICML) 2018. [Paper] [Supplementary]

Others:

Yingzhen Yang, Zhangyang Wang, Zhaowen Wang, Shiyu Chang, Ding Liu, Honghui Shi, Thomas S. Huang.
Epitomic Image Super-Resolution.
Proc. of AAAI Conference on Artificial Intelligence (AAAI) 2016 (Best Poster/Best Presentation Finalist, Student Poster Program). [Project&Code]