Hongyang Zhang

hongyanz AT ttic.edu

Postdoctoral Fellow

Toyota Technological Institute at Chicago (TTIC)



I am a postdoctoral fellow at the Toyota Technological Institute at Chicago (TTIC), hosted by Avrim Blum and Greg Shakhnarovich. Before joining TTIC, I completed my Ph.D. in 2019 after four wonderful years in the Machine Learning Department at Carnegie Mellon University, where I was fortunate to be co-advised by Maria-Florina Balcan and David P. Woodruff. Before CMU, I graduated from Peking University in 2015, working with Zhouchen Lin and Chao Zhang. My research interests broadly span the theory and applications of machine learning and algorithms, including adversarial defenses and attacks, non-convex/convex optimization, deep learning, low-rank subspace recovery, noise-tolerant active learning, property testing, and compressed sensing.


Interests

  • Machine Learning
  • AI Security
  • Optimization
  • Theoretical Computer Science


Education

  • Ph.D. in Machine Learning, 2015-2019

    Carnegie Mellon University

  • MSc in Intelligence Science and Technology, 2012-2015

    Peking University

Curriculum Vitae


Postdoctoral Fellow

Toyota Technological Institute at Chicago (TTIC)

Aug 2019 – Present Chicago

Research on:

  • Machine Learning
  • AI Security and Interpretability
  • Optimization

Visiting Researcher

Simons Institute for the Theory of Computing

Aug 2018 – Dec 2018 Berkeley

Research on:

  • Machine Learning


Petuum Inc.

May 2018 – Aug 2018 Pittsburgh

Research on:

  • Machine Learning

Visiting Researcher

IBM Research, Almaden

Jun 2017 – Aug 2017 San Jose

Research on:

  • Theoretical Computer Science
  • Machine Learning

Publications @ZERO Lab

On the Applications of Robust PCA in Image and Video Processing. PIEEE, 2018.

We survey the applications of RPCA in computer vision
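The Robust PCA model underlying this survey decomposes an observed matrix M into a low-rank part L plus a sparse corruption S, typically via the Principal Component Pursuit program min ||L||_* + λ||S||_1 subject to L + S = M. As a rough illustration (not code from the survey), the following is a minimal inexact augmented-Lagrangian sketch; the function name, iteration count, and penalty-parameter heuristics are my own choices, not the paper's:

```python
import numpy as np

def robust_pca(M, lam=None, n_iter=100):
    """Sketch of Principal Component Pursuit:
        min ||L||_* + lam * ||S||_1   s.t.  L + S = M
    solved by an inexact augmented-Lagrangian iteration."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))   # standard PCP weight
    mu = 1.25 / np.linalg.norm(M, 2)     # initial penalty (common heuristic)
    rho = 1.5                            # penalty growth factor
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                 # dual variable for L + S = M

    def shrink(X, tau):
        # elementwise soft-thresholding
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    for _ in range(n_iter):
        # low-rank update: singular-value thresholding
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * shrink(sig, 1.0 / mu)) @ Vt
        # sparse update: elementwise soft-thresholding
        S = shrink(M - L + Y / mu, lam / mu)
        # dual ascent on the constraint residual
        Y = Y + mu * (M - L - S)
        mu = min(mu * rho, 1e7)
    return L, S
```

On a synthetic low-rank matrix with a few large sparse corruptions, this recovers the two components to good accuracy; production uses would add a convergence test and partial SVDs for large matrices.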

Completing Low-Rank Matrices with Corrupted Samples from Few Coefficients in General Basis. TIT, 2017.

In this paper, we prove that the range space of an m × n matrix with rank r can be exactly recovered from a few coefficients with …

Fast Compressive Phase Retrieval under Bounded Noise. AAAI, 2017.

We study the problem of recovering a t-sparse vector ±x_0 ∈ R^n from m quadratic equations y_i = (a_i^T x_0)^2 with noisy measurements …

Relations among Some Low Rank Subspace Recovery Models. Neural Computation, 2015.

We discover that once a solution to one of the models is obtained, we can obtain the solutions to other models in closed-form …

Exact Recoverability of Robust PCA via Outlier Pursuit with Tight Recovery Bounds. AAAI, 2015.

We investigate the exact recovery problem of Robust PCA via Outlier Pursuit.

Robust Latent Low Rank Representation for Subspace Clustering. Neurocomputing, 2014.

We propose choosing the sparsest solution in the solution set.

A Counterexample for the Validity of Using Nuclear Norm as a Convex Surrogate of Rank. ECML/PKDD, 2013.

We conclude that even for rank minimization problems as simple as noiseless LatLRR, replacing rank with nuclear norm is not valid and …

Academic Activities

Reviewer for Journals

Chair/Reviewer for Conferences