Liu Ziyin, UTokyo
Email: zliu (at) cat.phys.s.u-tokyo.ac.jp
Office: Room 949, Graduate School of Science Building 1
Department of Physics, University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033
I am a graduate student in physics at the University of Tokyo, where I study and do research in theoretical physics in the Ueda Group. I am also affiliated with the newly founded Institute for Physics of Intelligence. Currently, I also work and do research at Preferred Networks in Tokyo. In the past, I worked at RIKEN AIP in collaboration with Prof. Makoto Yamada. I graduated from Carnegie Mellon University, where as an undergraduate I worked in the Multicomp lab in LTI under Prof. Louis-Philippe Morency. I have also worked on experimental condensed matter physics in the LASE lab under Prof. Kenneth Burch, and on computational physics at the Pittsburgh Supercomputing Center under Dr. Wang Yang. If you have questions, want to collaborate, or just want to say hi, please shoot me an email.
Reviewing Services: ICML, IJCAI, CVPR, ICCV, AISTATS, UAI, NeurIPS, ICLR...
Master's thesis: Mean-field learning dynamics of deep neural networks (2020).
I am broadly interested in physics, statistics, applied mathematics, and machine learning. In particular, the following keywords have occupied my mind recently:
- SGD May Never Escape Saddle Points
Liu Ziyin, Botao Li, Masahito Ueda
- What Data Augmentation Do We Need for Deep-Learning-Based Finance?
Liu Ziyin, Kentaro Minami, Kentaro Imajo
- Strength of Minibatch Noise in SGD
Liu Ziyin*, Kangqiao Liu*, Takashi Mori, Masahito Ueda
- Logarithmic landscape and power-law escape rate of SGD
Takashi Mori, Liu Ziyin, Kangqiao Liu, Masahito Ueda
For the full list of my works, see here.
- On the Distributional Properties of Adaptive Gradients
Zhang Zhiyi, Liu Ziyin
- Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent
Kangqiao Liu*, Liu Ziyin*, Masahito Ueda
- Cross-Modal Generalization: Learning in Low Resource Modalities via Meta-Alignment
Paul Pu Liang*, Peter Wu*, Liu Ziyin, Louis-Philippe Morency, Ruslan Salakhutdinov
ACM Multimedia 2021
NeurIPS 2020 Workshop on Meta Learning
- Neural Networks Fail to Learn Periodic Functions and How to Fix It
Liu Ziyin, Tilman Hartwig, Masahito Ueda
- Deep Gamblers: Learning to Abstain with Portfolio Theory
Liu Ziyin, Zhikang Wang, Paul Pu Liang, Ruslan Salakhutdinov, Louis-Philippe Morency, Masahito Ueda
[paper] [arXiv] [code]
- Think Locally, Act Globally: Federated Learning with Local and Global Representations
Paul Pu Liang*, Terrance Liu*, Liu Ziyin, Ruslan Salakhutdinov, Louis-Philippe Morency
NeurIPS 2019 Workshop on Federated Learning (oral, distinguished student paper award)
[paper] [arXiv] [code]
- Multimodal Language Analysis with Recurrent Multistage Fusion
Paul Pu Liang, Ziyin Liu, Amir Zadeh, Louis-Philippe Morency
EMNLP 2018 (oral presentation)
[paper] [supp] [arXiv] [slides]
Presentations and Invited Talks
- Stochastic Gradient Descent with Multiplicative Noise
RIKEN AIP, 2021
- Careful Deep Learning: Learning to Abstain by Training on A Simple Loss Function
IPI Seminar, 2019
- A Full Potential Approach to the Solution of Core States
American Physical Society March Meeting, 2018
[abstract] [ppt] [poster]
Here I keep personal reviews of scientific books I own or have owned.