Liu Ziyin, UTokyo


Email: zliu (at)
Office: Room 949, Graduate School of Science Building 1
Department of Physics, University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033

I am a graduate student in physics at the University of Tokyo, where I study theoretical physics in the Ueda Group. I am also affiliated with the newly founded Institute for Physics of Intelligence, and I currently do research at Preferred Networks in Tokyo. In the past, I worked at RIKEN AIP in collaboration with Prof. Makoto Yamada. I graduated from Carnegie Mellon University, where as an undergraduate I worked in the MultiComp Lab in the LTI under Prof. Louis-Philippe Morency. I have also worked on experimental condensed matter physics in the LASE Lab under Prof. Kenneth Burch, and on computational physics at the Pittsburgh Supercomputing Center under Dr. Wang Yang. If you have questions, want to collaborate, or just want to say hi, please shoot me an email.

Reviewing Services: ICML, IJCAI, CVPR, ICCV, AISTATS, UAI, NeurIPS, ICLR...

Master's thesis: Mean-field learning dynamics of deep neural networks (2020).

Research Interest

I am broadly interested in physics, statistics, applied mathematics, and machine learning. In particular, the following keywords have occupied my mind recently:

Recent Preprints

  1. SGD May Never Escape Saddle Points
     Liu Ziyin, Botao Li, Masahito Ueda
     Preprint 2021
     [paper] [arXiv]
  2. What Data Augmentation Do We Need for Deep-Learning-Based Finance?
     Liu Ziyin, Kentaro Minami, Kentaro Imajo
     Preprint 2021
     [paper] [arXiv]
  3. Strength of Minibatch Noise in SGD
     Liu Ziyin*, Kangqiao Liu*, Takashi Mori, Masahito Ueda
     Preprint 2021
     [paper] [arXiv]
  4. Logarithmic Landscape and Power-Law Escape Rate of SGD
     Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda
     Preprint 2021
     [paper] [arXiv]

Selected Works

For the full list of my works, see here.
  1. On the Distributional Properties of Adaptive Gradients
     Zhang Zhiyi, Liu Ziyin
     UAI 2021
  2. Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent
     Kangqiao Liu*, Liu Ziyin*, Masahito Ueda
     ICML 2021
     [paper] [arXiv]
  3. Cross-Modal Generalization: Learning in Low Resource Modalities via Meta-Alignment
     Paul Pu Liang*, Peter Wu*, Liu Ziyin, Louis-Philippe Morency, Ruslan Salakhutdinov
     ACM Multimedia 2021
     NeurIPS 2020 Workshop on Meta Learning
     [arXiv] [code]
  4. Neural Networks Fail to Learn Periodic Functions and How to Fix It
     Liu Ziyin, Tilman Hartwig, Masahito Ueda
     NeurIPS 2020
     [paper] [arXiv]
  5. Deep Gamblers: Learning to Abstain with Portfolio Theory
     Liu Ziyin, Zhikang Wang, Paul Pu Liang, Ruslan Salakhutdinov, Louis-Philippe Morency, Masahito Ueda
     NeurIPS 2019
     [paper] [arXiv] [code]
  6. Think Locally, Act Globally: Federated Learning with Local and Global Representations
     Paul Pu Liang*, Terrance Liu*, Liu Ziyin, Ruslan Salakhutdinov, Louis-Philippe Morency
     NeurIPS 2019 Workshop on Federated Learning (oral, distinguished student paper award)
     [paper] [arXiv] [code]
  7. Multimodal Language Analysis with Recurrent Multistage Fusion
     Paul Pu Liang, Ziyin Liu, Amir Zadeh, Louis-Philippe Morency
     EMNLP 2018 (oral presentation)
     [paper] [supp] [arXiv] [slides]

Presentations and Invited Talks

  1. Stochastic Gradient Descent with Multiplicative Noise
     RIKEN AIP, 2021
  2. Careful Deep Learning: Learning to Abstain by Training on a Simple Loss Function
     IPI Seminar, 2019
  3. A Full Potential Approach to the Solution of Core States
     American Physical Society March Meeting, 2018
     [abstract] [ppt] [poster]


Here I keep personal reviews of the scientific books I own or have owned.
