Welcome to my academic site, where I share updates on my research projects.
2021 - now | I am a Master's student at the University of California, San Diego, majoring in Computer Science & Engineering.
2020 - 2021 | Due to COVID-19 hardships and visa delays, I deferred my master's to Fall 2021 and worked as a research intern at ByteDance AI Lab, where I contributed to several research projects on large-scale Machine Learning systems, DNN compilers, and automatic parallelization algorithms.
2019 - 2020 | During the 2019 school year, I worked as a year-round Research Intern at Alibaba Platform of AI (PAI), where I was fortunate to work with a fantastic team on problems on the systems side of Machine Learning. My main focus was the automatic planning of hybrid parallelism strategies for Deep Learning.
2016 - 2019 | I obtained my Bachelor's degree at the University of Wisconsin - Madison, where I double-majored in Computer Science and Mathematics and maintained GPAs above 3.91 in both majors. I graduated with distinction in 2019, at the end of my junior year.
M.S. in Computer Science, 2021-2023
University of California, San Diego
B.S. in Computer Science, 2016-2019
University of Wisconsin - Madison
B.S. in Mathematics, 2016-2019
University of Wisconsin - Madison
We propose DAPPLE, a synchronous training framework that combines data parallelism and pipeline parallelism for large DNN models. It features a novel parallelization strategy planner that solves the partition and placement problems and explores the optimal hybrid strategy of data and pipeline parallelism. We also propose a new runtime scheduling algorithm…
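To give a flavor of what hybrid-parallelism planning involves, here is a minimal, illustrative sketch, not DAPPLE's actual planner or cost model: it enumerates ways to split a device budget between pipeline stages and data-parallel replicas, and picks the cheapest under a toy cost estimate. All numbers, helper names, and modeling assumptions here are hypothetical.

```python
# Minimal sketch of hybrid-parallelism planning (NOT the DAPPLE planner):
# enumerate (pipeline_stages, data_parallel_replicas) factorizations of the
# device count and pick the cheapest under a toy cost model.

def factor_pairs(num_devices):
    """Yield (num_stages, num_replicas) pairs with num_stages * num_replicas == num_devices."""
    for stages in range(1, num_devices + 1):
        if num_devices % stages == 0:
            yield stages, num_devices // stages

def estimate_step_time(layer_costs, stages, replicas, micro_batches, comm_cost=1.0):
    """Toy cost model: pipeline fill/drain bubble plus a crude gradient-sync term."""
    per_stage = sum(layer_costs) / stages                  # assume a perfectly balanced split
    pipeline = (stages - 1 + micro_batches) * per_stage    # GPipe-style fill/drain + steady state
    allreduce = comm_cost * (replicas - 1) / replicas      # data-parallel gradient sync
    return pipeline / replicas + allreduce                 # replicas each see a slice of the batch

def plan(layer_costs, num_devices, micro_batches=4):
    """Return the (stages, replicas) pair with the lowest estimated step time."""
    return min(factor_pairs(num_devices),
               key=lambda p: estimate_step_time(layer_costs, p[0], p[1], micro_batches))

if __name__ == "__main__":
    layer_costs = [2.0, 3.0, 5.0, 2.0]   # made-up per-layer compute costs
    stages, replicas = plan(layer_costs, num_devices=8)
    print(f"chosen strategy: {stages} pipeline stages x {replicas} data-parallel replicas")
```

A real planner additionally has to decide where to cut the model so stages are balanced and how to place stages on devices; the sketch above sidesteps both by assuming a perfectly even split.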
IR-level AutoParallel
Large Model Training
DAPPLE
Auto-MAP
XLA AutoParallel
AArch64 binary instrumentation & rewriting
Conference Papers, Journal Articles, and Preprints
Course project reports and other drafts
WARNING: These are reports from various courses I have taken in the past, and they are of much lower quality in writing style and novelty than my formal publications.