Richard Y. Zhang

I am joining ECE Illinois (UIUC) in Fall 2019.

Postdoctoral Scholar in the Department of Industrial Engineering and Operations Research (IEOR) at UC Berkeley, mentored by Javad Lavaei.

Ph.D., MIT, EECS, 2017. Advisor: Jacob K. White.
S.M., MIT, EECS, 2012. Advisor: John G. Kassakian.
B.E. (hons), Univ. of Canterbury (NZ), EE, 2009.

My research is on computation, optimization, and machine learning, with applications in power and energy systems. My goal is to use advanced computational capabilities to learn from large datasets and solve societal problems.

Keywords. Semidefinite programming. Numerical linear algebra. Nonconvex optimization. Power systems. Power electronics. Transportation networks.

News (Recent / All)

Videos (Recent / All)

How Much Restricted Isometry is Needed In Nonconvex Matrix Recovery?
NeurIPS 2018 Spotlight (5 min)
[paper] [slides] [poster]

Recommendation engines (think YouTube and Netflix) frequently make use of low-rank matrix models. In practice, these are easily trained using SGD, apparently without getting stuck at local minima. In this paper, we show that we've just been getting lucky: SGD is readily defeated by bad models that "look easy" to train.
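The low-rank training setup the blurb refers to can be sketched as follows. This is a minimal toy (a rank-1 synthetic matrix, plain SGD on the factored least-squares objective), not the paper's construction; all names, sizes, and hyperparameters here are illustrative.

```python
import numpy as np

# Toy low-rank matrix recovery: recover a rank-1 matrix M = u v^T from a
# random subset of its entries by running SGD on the factored objective
#   f(U, V) = sum over observed (i, j) of (U[i] @ V[j] - M[i, j])^2.
rng = np.random.default_rng(0)
n, r = 20, 1
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))

mask = rng.random((n, n)) < 0.6          # observe ~60% of entries
obs = np.argwhere(mask)

U = 0.1 * rng.standard_normal((n, r))    # small random initialization
V = 0.1 * rng.standard_normal((n, r))
lr = 0.02                                 # illustrative step size

def loss(U, V):
    return np.mean((U @ V.T - M)[mask] ** 2)

initial = loss(U, V)
for epoch in range(300):
    rng.shuffle(obs)                      # one SGD pass over observed entries
    for i, j in obs:
        resid = float(U[i] @ V[j] - M[i, j])
        gU, gV = resid * V[j], resid * U[i]
        U[i] -= lr * gU
        V[j] -= lr * gV

final = loss(U, V)
print(f"loss: {initial:.3f} -> {final:.3f}")
```

On a benign random instance like this one, the loss drops steadily; the paper's point is that carefully constructed instances with the same superficial statistics can trap this exact procedure at a spurious local minimum.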

Large-Scale Sparse Inverse Covariance Estimation via Thresholding and Max-Det Matrix Completion
ICML 2018 (10 min)
[paper] [slides] [poster]

Graphical lasso can estimate a graphical model on \(n\) vertices from \(O(n\log(n))\) data points. We describe an algorithm that solves graphical lasso in linear \(O(n)\) time and memory, thereby allowing extremely large graphical models to be learned on laptop computers.
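The thresholding step can be sketched as below, assuming numpy. This is a simplified stand-in, not the paper's implementation: the chain-graph ground truth, the regularization value `lam`, and the function name are all illustrative, and the max-det matrix completion step that follows thresholding in the actual algorithm is omitted.

```python
import numpy as np

# Sketch of the thresholding idea: entries of the sample covariance with
# magnitude above the lasso parameter lam (plus the diagonal) predict the
# sparsity pattern that the max-det completion step would then fill in.
def threshold_pattern(S, lam):
    """Boolean support pattern: |S_ij| > lam off-diagonal, plus diagonal."""
    pattern = np.abs(S) > lam
    np.fill_diagonal(pattern, True)
    return pattern

rng = np.random.default_rng(1)
n = 50
# Chain-graph ground truth: tridiagonal inverse covariance (precision) matrix.
Theta = np.eye(n) + 0.4 * (np.eye(n, k=1) + np.eye(n, k=-1))
Sigma = np.linalg.inv(Theta)

X = rng.multivariate_normal(np.zeros(n), Sigma, size=4000)
S = np.cov(X, rowvar=False)
S = (S + S.T) / 2                         # symmetrize against rounding

pattern = threshold_pattern(S, lam=0.15)
print(f"pattern density: {pattern.mean():.2f}")
```

In this toy, the thresholded pattern is a narrow band around the diagonal: a small superset of the true chain graph, which keeps the subsequent completion problem sparse.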

Publications (Select / All)

Preprints 2019 2018 2017 2016 2015 2014 2011-2013

Trivia

I am a New Zealander of Chinese descent from Christchurch, New Zealand. I played guitar in the post-rock band Mammoth (see our hits Two Weeks and Life without Light). My last name 张/張 (Zhāng) is pronounced "Djahng", but I'm fine with the anglicized "Zang" and frequently use it myself. I also speak Mandarin.

© 2018 Richard Y. Zhang.