Academic Exchange



数苑经纬 Lecture Forum (77): Rethinking Generalisation: Beyond KL with Geometry and Comparators


Speaker: Benjamin Guedj (University College London)

Time: April 28, 2026 (Tuesday), 15:00 - 16:00

Venue: Tencent Meeting, ID 388-984-701

Abstract: Generalisation is arguably one of the central problems in machine learning and foundational AI. Generalisation theory has traditionally relied on KL-based PAC-Bayesian bounds, which, despite their elegance, often obscure geometry and limit applicability. In this talk, I will present recent advances that move beyond traditional bounds. One line of work replaces KL with Wasserstein distances, yielding high-probability bounds valid for heavy-tailed losses and leading to new, optimisable learning objectives. Another line introduces a general comparator framework, showing how optimal bounds naturally arise from convex conjugates of cumulant generating functions, unifying and extending many classical results. Together, these perspectives highlight how rethinking divergences and comparators opens new directions in both theory and practice. I will conclude by discussing links with information theory and how these ideas might shape the next generation of PAC-Bayesian learning algorithms.
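As background for the "KL-based PAC-Bayesian bounds" the abstract refers to (this sketch is not part of the announcement), a representative classical result is the McAllester-style bound: for a prior $\pi$ fixed before seeing the data, a bounded loss in $[0,1]$, and a sample of size $n$, with probability at least $1-\delta$ over the sample, simultaneously for all posteriors $\rho$,

$$
\mathbb{E}_{h \sim \rho}\big[L(h)\big] \;\le\; \mathbb{E}_{h \sim \rho}\big[\hat{L}(h)\big] \;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
$$

where $L$ and $\hat{L}$ denote the population and empirical risks. The two lines of work in the talk modify this template: one replaces the $\mathrm{KL}(\rho \,\|\, \pi)$ term with a Wasserstein distance (relaxing the boundedness assumption on the loss), and the other replaces the square-root comparator relating $L$ and $\hat{L}$ with comparators derived from convex conjugates of cumulant generating functions.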



Contact:

Address: No. 152 Luoyu Road, Hongshan District, Wuhan, Hubei Province

Postal code: 430079

Copyright © 2005-2020 Central China Normal University 鄂ICP备05003325号-9