
Learning Under Constraints: From Federated Collaboration to Black-Box LLMs


If you have a question about this talk, please contact Sally Matthews.

In both federated learning (FL) and large language model (LLM) optimization, a central challenge is learning effectively under constraints, ranging from data heterogeneity and personalization to limited communication and black-box access. In this talk, I present three approaches that address these challenges across different settings. FilFL improves generalization in FL by filtering clients based on their joint contribution to global performance. DPFL tackles decentralized personalization by learning asymmetric collaboration graphs under strict resource budgets. Moving beyond FL, I will present ACING, a reinforcement learning method for optimizing instructions for black-box LLMs under strict query budgets, where model weights and gradients are inaccessible. While these works tackle distinct problems, they share a common goal: developing efficient learning mechanisms that perform reliably under real-world constraints.
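The idea of filtering clients by their joint contribution can be illustrated with a toy sketch. This is a simplified illustration under stated assumptions, not the FilFL algorithm itself: `evaluate` is a hypothetical stand-in for global validation performance, and the greedy subset selection is one plausible way to score a client's contribution jointly with the clients already chosen.

```python
def evaluate(updates):
    """Hypothetical stand-in for global validation performance:
    here, the negative distance of the averaged update to a target."""
    target = 1.0
    return -abs(sum(updates) / len(updates) - target)

def filter_clients(client_updates, k):
    """Greedily select up to k clients whose combined (averaged)
    updates maximize the global validation score.

    client_updates: dict mapping client id -> scalar update (toy model)
    """
    selected = []
    for _ in range(min(k, len(client_updates))):
        best, best_score = None, None
        for cid, upd in client_updates.items():
            if cid in selected:
                continue
            # Score this client jointly with the already-selected set.
            trial = [client_updates[s] for s in selected] + [upd]
            score = evaluate(trial)
            if best_score is None or score > best_score:
                best, best_score = cid, score
        selected.append(best)
    return selected

# Clients 'a' and 'b' average to the target; outlier 'c' is filtered out.
chosen = filter_clients({"a": 0.9, "b": 1.1, "c": 5.0}, k=2)
```

The point of the sketch is that clients are scored jointly, not in isolation: a client is kept only if adding it to the already-selected set improves the global objective.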

This talk is part of the Cambridge ML Systems Seminar Series.



© 2006-2025 Talks.cam, University of Cambridge.