Liu Ziyin, UTokyo

Email: zliu (at)
Office: Room 949, Graduate School of Science Building 1
Department of Physics, University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033

I am a graduate student in physics at the University of Tokyo, where I do research in theoretical physics in the Ueda Group. I am also affiliated with the newly founded Institute for Physics of Intelligence, and I currently work and do research at Preferred Networks in Tokyo. In the past, I worked at RIKEN AIP in collaboration with Prof. Makoto Yamada. I graduated from Carnegie Mellon University; as an undergraduate, I worked in the Multicomp Lab in the LTI under Prof. Louis-Philippe Morency, on experimental condensed matter physics in the LASE Lab under Prof. Kenneth Burch, and on computational physics at the Pittsburgh Supercomputing Center under Dr. Wang Yang. If you have questions, want to collaborate, or just want to say hi, please shoot me an email.

Master's thesis: Mean-field learning dynamics of deep neural networks (2020).

Research Interest

I am broadly interested in physics, statistics, applied mathematics, and machine learning. In particular, a few keywords have occupied my mind recently:

Recent Preprints

  1. What Data Augmentation Do We Need for Deep-Learning-Based Finance?
     Liu Ziyin, Kentaro Minami, Kentaro Imajo
     Preprint 2021
     [paper] [arXiv]
  2. Strength of Minibatch Noise in SGD
     Liu Ziyin*, Kangqiao Liu*, Takashi Mori, Masahito Ueda
     Preprint 2021
     [paper] [arXiv]
  3. Logarithmic landscape and power-law escape rate of SGD
     Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda
     Preprint 2021
     [paper] [arXiv]

Selected Works

For the full list of my works, see here.
  1. On the Distributional Properties of Adaptive Gradients
     Zhang Zhiyi, Liu Ziyin
     UAI 2021
  2. Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent
     Kangqiao Liu*, Liu Ziyin*, Masahito Ueda
     ICML 2021
     [paper] [arXiv]
  3. Neural Networks Fail to Learn Periodic Functions and How to Fix It
     Liu Ziyin, Tilman Hartwig, Masahito Ueda
     NeurIPS 2020
     [paper] [arXiv]
  4. Deep Gamblers: Learning to Abstain with Portfolio Theory
     Liu Ziyin, Zhikang Wang, Paul Pu Liang, Ruslan Salakhutdinov, Louis-Philippe Morency, Masahito Ueda
     NeurIPS 2019
     [paper] [arXiv] [code]
  5. Think Locally, Act Globally: Federated Learning with Local and Global Representations
     Paul Pu Liang*, Terrance Liu*, Liu Ziyin, Ruslan Salakhutdinov, Louis-Philippe Morency
     NeurIPS 2019 Workshop on Federated Learning (oral, distinguished student paper award)
     [paper] [arXiv] [code]
  6. Multimodal Language Analysis with Recurrent Multistage Fusion
     Paul Pu Liang, Ziyin Liu, Amir Zadeh, Louis-Philippe Morency
     EMNLP 2018 (oral presentation)
     [paper] [supp] [arXiv] [slides]

Presentation and Invited Talks

  1. Careful Deep Learning: Learning to Abstain by Training on A Simple Loss Function
     Liu Ziyin
     IPI Seminar, 2019
  2. A Full Potential Approach to the Solution of Core States
     Ziyin Liu, Wang Yang, Xianglin Liu
     American Physical Society March Meeting, 2018
     [abstract] [ppt] [poster]

Reviewing Services: ICML, IJCAI, CVPR, ICCV, AISTATS, UAI, NeurIPS, ICLR...


Here I keep a personal review of scientific books I own or owned.
