Name: Ziqiao Wang
Title: Tenured Associate Professor
Discipline: Computer Science and Technology
Major: Computer Science and Technology
Research Interests: Machine learning, statistical learning principles, and information theory
Email: ziqiaowang@tongji.edu.cn
Mailing Address: Zhixin Building, 4800 Cao'an Road, Shanghai
Ziqiao Wang is a tenured Associate Professor in the College of Computer Science and Technology at Tongji University. He received his Ph.D. from the University of Ottawa, Canada. He has been selected for the national-level Overseas High-Level Young Talents program and the Shanghai Magnolia Overseas High-Level Young Talents program, and has led and participated in several national and provincial/ministerial research projects. His research focuses on the theoretical foundations of machine learning, learning theory and algorithms for large models, and information theory. His recent work has been published at top international conferences in artificial intelligence and machine learning, including NeurIPS, ICML, and ICLR. His doctoral thesis was nominated for the 2025 Canadian Artificial Intelligence Association Best Doctoral Dissertation Award, as well as for the 2025 University of Ottawa Governor General's Academic Gold Medal and the Pierre Laberge Thesis Prize. He has served as an Area Chair for conferences such as ICLR and NeurIPS, as Registration Chair of the 2025 China Conference on Embodied Intelligence, and as Co-Program Chair of the 2024 IEEE North American School of Information Theory (NASIT).
Homepage: https://ziqiaowanggeothe.github.io/
Admissions information: https://tongji.teacher.360eol.com/teacherBasic/preview?teacherId=36636
Selected recent publications:
Weak-to-Strong Generalization via Bregman Bias–Variance Decomposition
Gengze Xu, Wei Yao, Ziqiao Wang†, and Yong Liu†
ICML 2026
Theoretically Grounded Framework for LLM Watermarking: A Distribution-Adaptive Approach
Haiyun He, Yepeng Liu, Ziqiao Wang, Yongyi Mao, and Yuheng Bu
NeurIPS 2025
Generalization in Federated Learning: A Conditional Mutual Information Framework
Ziqiao Wang, Cheng Long, and Yongyi Mao
ICML 2025
Generalization Bounds via Conditional f-Information
Ziqiao Wang and Yongyi Mao
NeurIPS 2024
On f-Divergence Principled Domain Adaptation: An Improved Framework
Ziqiao Wang and Yongyi Mao
NeurIPS 2024
Sample-Conditioned Hypothesis Stability Sharpens Information-Theoretic Generalization Bounds
Ziqiao Wang and Yongyi Mao
NeurIPS 2023
Tighter Information-Theoretic Generalization Bounds from Supersamples
Ziqiao Wang and Yongyi Mao
ICML 2023 (Oral)
Information-Theoretic Analysis of Unsupervised Domain Adaptation
Ziqiao Wang and Yongyi Mao
ICLR 2023
Over-Training with Mixup May Hurt Generalization
Zixuan Liu*, Ziqiao Wang*, Hongyu Guo, and Yongyi Mao
ICLR 2023
On the Generalization of Models Trained with SGD: Information-Theoretic Bounds and Implications
Ziqiao Wang and Yongyi Mao
ICLR 2022