Tianyu Liu (刘天宇)



Hi! Welcome to Tianyu Liu (刘天宇)’s webpage

I am a final-year Ph.D. student (expecting to graduate in June 2021) at the Institute of Computational Linguistics (ICL, website in Chinese), Peking University. During my Ph.D. journey so far, I have been very fortunate to work with Prof. Zhifang Sui and Prof. Baobao Chang at Peking University, Dr. Chin-Yew Lin and Dr. Jin-Ge Yao at Microsoft Research Asia, Prof. Kevin Gimpel and Dr. Sam Wiseman at TTIC, and Dr. Yizhe Zhang (Microsoft Research) and Dr. Yi Mao (Microsoft Business AI). My educational background and professional experience can be found here.

My research interests include natural language processing (NLP) and deep learning. Specifically, I focus on:

  • Applications of large-scale transformer-based pretraining on natural language generation (NLG)
  • Investigations on efficient Transformer structures
  • Faithful, controllable and efficient neural text generation
  • On-the-fly and static identification of hallucinations in machine-generated text
  • Plan-based and template-like neural data-to-text generation
  • Evaluation metrics for the faithfulness/fidelity/factuality of NLG
  • Robustness, efficiency and interpretability in natural language inference
  • Approximate inference for higher-order structured prediction

My publications can be found here or on my Google Scholar page.

I am now on the industry job market, actively seeking an NLP researcher or engineer position. Please drop me an email (A@B, A=tianyu0421, B=pku.edu.cn) or contact me via LinkedIn if you would like to chat.