Han Wang

Hi! I am a second-year Ph.D. student at the University of North Carolina at Chapel Hill, advised by Prof. Mohit Bansal.

I am currently a research intern on the AMD GenAI team, working with Xiaodong Yu, Yusheng (Ethan) Su, and Zicheng Liu. I have also spent time at Amazon (with Rinat Khaziev), the NLC group at Microsoft Research Asia (with Lei Cui), the Microsoft Cognitive Services Research group (with Chenguang Zhu and Yang Liu), and Westlake University (with Prof. Yue Zhang). Previously, I received my master’s degree from New York University and my bachelor’s degree from Tongji University.

My research interests include making large language models (LLMs) more robust, efficient, and reliable; evaluating and improving their capabilities; and knowledge-enhanced NLP. I favor methods that are simple, intuitive, and effective. I am open to academic collaborations; please drop me an email if you are interested in working together.

Service
  • PC Member / Reviewer: ACL (2020-2025), EMNLP (2021-2023), NAACL (2024), COLM (2024-2025), ICLR (2024), NeurIPS (2023-2025), AAAI (2023-2024), COLING (2020, 2022, 2024), ACL Rolling Review (2021-Present)

News

Jun 9, 2025 New preprint! CLaMR is a contextualized late-interaction retriever that jointly encodes all modalities and dynamically selects the ones that carry the relevant signal!
May 27, 2025 I have started an internship on the AMD GenAI team!
Apr 18, 2025 Check out Retrieval-Augmented Generation with Conflicting Evidence, our new preprint on arXiv!
Jan 22, 2025 Our paper AdaCAD has been accepted to NAACL 2025!
Sep 11, 2024 New preprint! AdaCAD is a simple yet effective dynamic decoding method to automatically balance the contrast between contextual and parametric knowledge.