Note: I am currently visiting Harvard University; I can be found in Room 180.01 of the Northwest Building.

Liu Ziyin 刘子寅

Email: zliu (at) cat.phys.s.u-tokyo.ac.jp
Office: Room 949, Graduate School of Science Building 1
Department of Physics, University of Tokyo


I am a graduate student in physics at the University of Tokyo; technically speaking, I study and do research in theoretical physics in the Ueda Group. Personally, I am interested in art, literature, and philosophy, and I also play Go. Even at my best, I speak poor English, awful Japanese, and even worse Shanghainese. If you have questions, want to collaborate, or just want to say hi, please shoot me an email.

Reviewing Services: ICML, IJCAI, CVPR, ICCV, AISTATS, UAI, NeurIPS, ICLR, TMLR, IEEE-TSP...

Master's thesis: Mean-field learning dynamics of deep neural networks (2020).

Research Interests

I am particularly interested in the theoretical foundations of deep learning; the preprints and papers listed below reflect the keywords that have occupied my mind recently.

Recent Preprints

  1. Exact Phase Transitions in Deep Learning
    Liu Ziyin, Masahito Ueda
    Preprint 2022
    [paper] [arXiv]
  2. Universal Thermodynamic Uncertainty Relation in Non-Equilibrium Dynamics
    Liu Ziyin, Masahito Ueda
    Preprint 2022
    [paper] [arXiv]
  3. Stochastic Neural Networks with Infinite Width are Deterministic
    Liu Ziyin, Hanlin Zhang, Xiangming Meng, Yuting Lu, Eric Xing, Masahito Ueda
    Preprint 2022
    [paper] [arXiv]

Selected Works

For the full list of my works, see here (warning: severely outdated).
  1. Exact Solutions of a Deep Linear Network
    Liu Ziyin, Botao Li, Xiangming Meng
    NeurIPS 2022
    [paper] [arXiv]
  2. Posterior Collapse of a Linear Latent Variable Model
    Zihao Wang*, Liu Ziyin*
    NeurIPS 2022
    [paper] [arXiv]
  3. Theoretically Motivated Data Augmentation and Regularization for Portfolio Construction
    Liu Ziyin, Kentaro Minami, Kentaro Imajo
    ICAIF 2022 (3rd ACM International Conference on AI in Finance)
    [paper] [arXiv]
  4. Power Laws and Symmetries in a Minimal Model of Financial Market Economy
    Liu Ziyin, Katsuya Ito, Kentaro Imajo, Kentaro Minami
    Physical Review Research (2022)
    [paper] [arXiv]
  5. Logarithmic landscape and power-law escape rate of SGD
    Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda
    ICML 2022
    [paper] [arXiv]
  6. SGD with a Constant Large Learning Rate Can Converge to Local Maxima
    Liu Ziyin, Botao Li, James B. Simon, Masahito Ueda
    ICLR 2022 (spotlight: 5% of all submissions)
    [paper] [arXiv]
  7. Strength of Minibatch Noise in SGD
    Liu Ziyin*, Kangqiao Liu*, Takashi Mori, Masahito Ueda
    ICLR 2022 (spotlight: 5% of all submissions)
    [paper] [arXiv]
  8. On the Distributional Properties of Adaptive Gradients
    Zhang Zhiyi*, Liu Ziyin*
    UAI 2021
    [paper] [arXiv]
  9. Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent
    Kangqiao Liu*, Liu Ziyin*, Masahito Ueda
    ICML 2021
    [paper] [arXiv]
  10. Cross-Modal Generalization: Learning in Low Resource Modalities via Meta-Alignment
    Paul Pu Liang*, Peter Wu*, Liu Ziyin, Louis-Philippe Morency, Ruslan Salakhutdinov
    ACM Multimedia 2021
    NeurIPS 2020 Workshop on Meta Learning
    [arXiv] [code]
  11. Neural Networks Fail to Learn Periodic Functions and How to Fix It
    Liu Ziyin, Tilman Hartwig, Masahito Ueda
    NeurIPS 2020
    [paper] [arXiv]
  12. Deep Gamblers: Learning to Abstain with Portfolio Theory
    Liu Ziyin, Zhikang Wang, Paul Pu Liang, Ruslan Salakhutdinov, Louis-Philippe Morency, Masahito Ueda
    NeurIPS 2019
    [paper] [arXiv] [code]
  13. Think Locally, Act Globally: Federated Learning with Local and Global Representations
    Paul Pu Liang*, Terrance Liu*, Liu Ziyin, Ruslan Salakhutdinov, Louis-Philippe Morency
    NeurIPS 2019 Workshop on Federated Learning (oral, distinguished student paper award)
    [paper] [arXiv] [code]
  14. Multimodal Language Analysis with Recurrent Multistage Fusion
    Paul Pu Liang, Ziyin Liu, Amir Zadeh, Louis-Philippe Morency
    EMNLP 2018 (oral presentation)
    [paper] [supp] [arXiv] [slides]

Presentations and Invited Talks

  1. Phase Transitions in Deep Learning
    NeuroTheory Seminar, Harvard University, October 11th, 2022
    [poster]
  2. Does the Refractory Period Help Learning? A Spiking Neural Network Perspective
    Harvard-LMU Young Scientists' Forum, 2022
    [poster]
  3. Stochastic Gradient Descent with Multiplicative Noise
    RIKEN AIP, 2021
    [link] [video]
  4. Careful Deep Learning: Learning to Abstain by Training on A Simple Loss Function
    IPI Seminar, 2019
    [ppt]
  5. A Full Potential Approach to the Solution of Core States
    American Physical Society March Meeting, 2018
    [abstract] [ppt] [poster]

Miscellanea

Here I keep personal reviews of the scientific books that I own or have owned.


This page has been accessed several times since July 07, 2018.