Hello, I am An Wang (王安). I received my Ph.D. in Computer Science from the Institute of Science Tokyo (formerly Tokyo Institute of Technology) in June 2025, advised by Prof. Naoaki Okazaki. I currently work at Tencent on the Hunyuan LLM Pretraining Team (Qingyun Project), where I am a core contributor to Hy3, mainly responsible for model architecture design and base model production. Previously, my research focused on model architecture, training strategies, and scaling laws for large language models; nowadays, I am passionate about AI agents and automating all workflows.

News

2026.04 🚀 Hy3 model released — Tencent Hunyuan's next-generation large language model.
2025.11 🏅 HMoE: Heterogeneous Mixture of Experts for Language Modeling received the SAC Highlight Award at EMNLP 2025.
2025.11 📄 Sparsifying Mamba accepted at EMNLP 2025 Findings.
2025.06 🎓 Received my Ph.D. degree from the Institute of Science Tokyo.
2025.05 📄 Scaling Laws for Floating-Point Quantization Training accepted at ICML 2025.
2025 🏆 Received the Excellence Award from the Association for Natural Language Processing (NLP 2025).
2024.05 💼 Joined Tencent Hunyuan LLM Pretraining Team (Qingyun Project).

Selected Publications

EMNLP 2025
HMoE: Heterogeneous Mixture of Experts for Language Modeling
An Wang*, Xingwu Sun*, Ruobing Xie, Shuaipeng Li, Jiaqi Zhu, Zhen Yang, Pinxue Zhao, JN Han, Zhanhui Kang, Di Wang, Naoaki Okazaki, Cheng-zhong Xu
EMNLP 2025 Main — 🏅 SAC Highlight Award
ICML 2025
Scaling Laws for Floating-Point Quantization Training
Xingwu Sun, Shuaipeng Li, Ruobing Xie, Weidong Han, Kan Wu, Zhen Yang, Yixing Li, An Wang, Shuai Li, Jinbao Xue, Yu Cheng, Yangyu Tao, Zhanhui Kang, Chengzhong Xu, Di Wang, Jie Jiang
ICML 2025
EMNLP 2025
Sparsifying Mamba
An Wang, Ruobing Xie, Shuaipeng Li, Xingwu Sun, Zhanhui Kang
EMNLP 2025 Findings
arXiv 2025
Towards a Comprehensive Scaling Law of Mixture-of-Experts
G Zhao, Y Fu, S Li, X Sun, R Xie, An Wang, W Han, Z Yang, W Sun, et al.
arXiv preprint, 2025
EACL 2023
DREEAM: Guiding Attention with Evidence for Improving Document-Level Relation Extraction
Youmi Ma, An Wang, Naoaki Okazaki
EACL 2023

Education

2022 – 2025
Ph.D. in Computer Science, Institute of Science Tokyo 🇯🇵
Advised by Prof. Naoaki Okazaki
2020 – 2022
M.Eng. in Computer Science, Tokyo Institute of Technology 🇯🇵
Advised by Prof. Haruo Yokota
2015 – 2019
B.Eng. in Mechatronic Engineering, Shanghai University 🇨🇳

Experience

2024 – Present
Tencent — Hunyuan LLM Pretraining Team (Qingyun Project), Beijing 🇨🇳
2023 – 2024
NEC Laboratories Europe — Human-centric Team, Heidelberg, Germany 🇩🇪
2019
Intsig — Natural Language Understanding Team, Shanghai 🇨🇳

Honors & Awards

2025 Excellence Award, The Association for Natural Language Processing (NLP 2025)
2023 🏆 Best Paper Award, 29th Annual Meeting of The Association for Natural Language Processing
2023 Spring Scholarship for Doctoral Students, Tokyo Institute of Technology
2022 Advanced Human Resource Development Fellowship, Tokyo Institute of Technology