Hello! I am a Ph.D. student at Penn State in the College of IST, where I work on machine learning, natural language processing, and data mining, advised by Xiang Zhang. I received my M.S. in Optimization from the University of Chinese Academy of Sciences, where I was advised by Yingjie Tian, and my B.E. from Renmin University of China, advised by Wei Xu.
In the summer of 2020, I was an intern research scientist at Moffett AI, where I was fortunate to be supervised by Ian En-Hsu Yen, working on model compression and inference acceleration. I also spent two wonderful summers (2018, 2019) as a research intern on the Data Science team at NEC Labs America, advised by Wei Cheng and Haifeng Chen, working on trend learning and anomaly precursor detection in time series.
I am interested in model compression, inference acceleration, and knowledge transfer. My current research investigates how to compress neural networks (for NLP, CV, and graphs) with negligible performance drop, and how to reason over graph-structured data in an unsupervised way. My research goal is to make AI learning at the extreme edge efficient and effective. My interests are summarized as follows:
Model Compression: Few/Zero-Shot Compression, Inference Acceleration, Lottery Ticket Hypothesis
Knowledge Transfer: Knowledge Distillation, Contrastive Learning, Multi-Task Learning
NLP: BERT-Based Models, Transformer Architectures
Data Mining: Temporal Graph Modeling, Time Series (Anomaly, Trend) Analysis