PGMODAYS 2025

Tuesday, November 18 & Wednesday, November 19

EDF Lab Paris-Saclay, Palaiseau, France

BOOK OF ABSTRACTS

 

 

Invited Speakers

 

  • CEVHER Volkan (EPFL, Switzerland)

 

  • DE SANTIS Marianna (University of Florence, Italy)


  • SUROWIEC Thomas (Simula Research Laboratory, Norway)


  • VAN HENTENRYCK Pascal (Georgia Tech, United States)

Volkan Cevher

EPFL, Switzerland

Training neural networks at any scale

At the heart of deep learning’s transformative impact lies the concept of scale, encompassing both data and computational resources, as well as their interaction with neural network architectures. Scale, however, presents critical challenges, such as increased instability during training and prohibitively expensive model-specific tuning. Given the substantial resources required to train such models, formulating high-confidence scaling hypotheses backed by rigorous theoretical research has become paramount.

To bridge theory and practice, the talk explores a key mathematical ingredient of scaling in tandem with scaling theory: the numerical solution algorithms commonly employed in deep learning, spanning domains from vision to language models. We unify these algorithms under a common master template, making their foundational principles transparent. In doing so, we reveal the interplay between adaptation to smoothness structures via online learning and the exploitation of optimization geometry through non-Euclidean norms. Our exposition moves beyond simply building larger models; it emphasizes strategic scaling, offering insights that promise to advance the field while economizing on resources.
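The master template itself is not reproduced in the abstract. As a minimal illustration of one ingredient it mentions, exploiting optimization geometry through non-Euclidean norms, the following sketch contrasts steepest descent under the ℓ2 and ℓ∞ unit balls; the ℓ∞ choice yields the sign-based update familiar from sign-descent optimizers. The toy objective and all names are illustrative, not taken from the talk.

```python
import numpy as np

def steepest_descent_step(x, grad, lr, norm="l2"):
    """One steepest-descent step under a chosen norm geometry.

    The descent direction maximizes <grad, d> over the unit ball of
    the given norm: 'l2' gives the usual normalized gradient step,
    'linf' gives the coordinate-wise sign update.
    """
    if norm == "l2":
        d = grad / (np.linalg.norm(grad) + 1e-12)
    elif norm == "linf":
        d = np.sign(grad)
    else:
        raise ValueError(norm)
    return x - lr * d

# Toy ill-conditioned quadratic f(x) = 0.5 * x^T A x, minimized at 0
A = np.diag([1.0, 100.0])
x = np.array([1.0, 1.0])
for _ in range(200):
    x = steepest_descent_step(x, A @ x, lr=0.05, norm="linf")
print(np.abs(x).max() < 0.1)  # True: iterates settle near the minimizer
```

Under the ℓ∞ geometry the step size per coordinate is independent of the gradient's magnitude, which is one simple way a non-Euclidean norm can tame ill-conditioning.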

 

Marianna De Santis

University of Florence

Exact approaches for multi-objective binary quadratic optimization

Multi-objective mixed-integer nonlinear programming (MOMINLP) provides a powerful framework for modeling complex real-world decision problems. Practical applications often involve multiple, conflicting objectives and require binary or general integer variables to capture logical constraints or discrete decisions. Solving MOMINLP problems entails identifying the set of efficient solutions - i.e., solutions where no objective can be improved without deteriorating at least one other.
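The efficiency notion above can be stated concretely as Pareto dominance. The following sketch (illustrative code, not from the talk) checks whether a point is efficient among a finite set of objective vectors, with all objectives minimized:

```python
import numpy as np

def is_efficient(i, F):
    """Return True if row i of F is efficient (Pareto-optimal).

    Each row of F holds the objective values of one feasible solution;
    all objectives are minimized.  A point is dominated if some other
    point is no worse in every objective and strictly better in at
    least one.
    """
    fi = F[i]
    for j, fj in enumerate(F):
        if j != i and np.all(fj <= fi) and np.any(fj < fi):
            return False
    return True

# Three solutions, two objectives to minimize
F = np.array([[1.0, 4.0],   # efficient
              [2.0, 2.0],   # efficient
              [3.0, 3.0]])  # dominated by (2, 2)
print([is_efficient(i, F) for i in range(len(F))])  # [True, True, False]
```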

An efficient integer assignment refers to a fixing of the integer variables such that there exists at least one efficient solution for the corresponding continuous subproblem. In many cases, correctly solving MOMINLPs requires exploring a large number of such assignments - potentially all integer-feasible ones - rendering full enumeration unavoidable. This significantly complicates the solution process compared to the single-objective case and poses substantial challenges for algorithm development.

In this talk, we review essential tools recently proposed for developing branch-and-bound methods for multi-objective mixed-integer nonlinear optimization, and we explore approaches specifically tailored to problems with quadratic objective functions. Building on the well-established technique of quadratic convex reformulation - originally developed for single-objective binary quadratic programs - we extend this methodology to the multi-objective setting. We propose a branch-and-bound algorithm where lower bound sets are obtained from appropriately defined convex quadratic subproblems. Computational experiments on multi-objective k-item Quadratic Knapsack and multi-objective Max-Cut instances demonstrate the effectiveness of our approach.
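Quadratic convex reformulation rests on the binary identity x_i² = x_i: shifting the diagonal of Q while compensating in the linear term leaves the objective unchanged on {0,1}ⁿ yet can make it convex. The sketch below uses the simplest valid shift, a uniform eigenvalue shift; the authors' actual perturbation (typically chosen via semidefinite programming to tighten the bound) is not specified in the abstract.

```python
import numpy as np

def convexify_binary_quadratic(Q, c):
    """Convexify x^T Q x + c^T x over binary x via a diagonal shift.

    On {0,1}^n we have x_i^2 = x_i, so adding u*I to Q and subtracting
    u from every entry of c preserves the objective value at each
    binary point.  Choosing u = -lambda_min(Q) makes the shifted Q
    positive semidefinite.  (SDP-based, non-uniform shifts give
    tighter continuous relaxations; this is only the simplest choice.)
    """
    lam_min = np.linalg.eigvalsh(Q).min()
    u = max(0.0, -lam_min)
    return Q + u * np.eye(len(Q)), c - u

Q = np.array([[0.0, 3.0], [3.0, 0.0]])   # indefinite
c = np.array([1.0, -2.0])
Qc, cc = convexify_binary_quadratic(Q, c)

# The two objectives agree on every binary point
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    x = np.array(x, float)
    assert np.isclose(x @ Q @ x + c @ x, x @ Qc @ x + cc @ x)
print(np.linalg.eigvalsh(Qc).min() >= -1e-9)  # True: now convex
```

The convexified form admits tractable continuous relaxations, which is what supplies the lower bound sets inside a branch-and-bound scheme.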

 

Thomas Surowiec

Simula Research Laboratory, Norway

Risk-Averse PDE-Constrained Optimization: Theory, Algorithms, and Statistical Foundations

Over the past few decades, risk-averse optimization has become an important framework for decision-making under uncertainty in systems governed by partial differential equations (PDEs). Uncertainty in the PDE parameters creates both theoretical and computational challenges. Addressing these challenges requires techniques from functional analysis, numerical optimization, and statistics. In this plenary, I will survey recent developments in risk-averse PDE-constrained optimization, with an emphasis on theory and algorithms as well as stability and asymptotic analysis.

The talk will address three central themes. First, we develop rigorous formulations that incorporate risk measures into PDE-constrained settings to capture the behavior of risk-averse decision makers. Second, we consider algorithmic strategies for handling nonsmooth risk measures that make large-scale computations both tractable and robust. Third, we turn our attention to results on stability and asymptotic behavior that characterize solution sensitivity to perturbations in data, discretization, and sampling, and provide statistical guarantees for risk-averse models.
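The abstract does not name a specific risk measure; a canonical nonsmooth example in this literature is the Conditional Value-at-Risk (CVaR). The sketch below, purely illustrative, computes a sample estimate via the Rockafellar-Uryasev formula, whose positive-part term is the source of the nonsmoothness that the algorithmic strategies above must handle:

```python
import numpy as np

def cvar(losses, alpha):
    """Sample estimate of Conditional Value-at-Risk at level alpha,
    via the Rockafellar-Uryasev formula
        CVaR_a(L) = min_t  t + E[(L - t)_+] / (1 - a),
    whose minimizer t is the alpha-quantile (VaR) of the losses.
    The (.)_+ term makes the objective nonsmooth in t.
    """
    t = np.quantile(losses, alpha)
    return t + np.maximum(losses - t, 0.0).mean() / (1.0 - alpha)

rng = np.random.default_rng(0)
losses = rng.normal(size=100_000)
# CVaR dominates VaR: it averages the whole tail beyond the quantile
print(cvar(losses, 0.9) > np.quantile(losses, 0.9))  # True
```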

Together, these directions show how risk-averse optimization enriches the mathematical landscape of PDE-constrained problems and offers a principled framework for reliable decision-making in the presence of uncertainty.

 

Pascal Van Hentenryck

Georgia Tech, United States

Learning to Optimize

In many industry settings, the same optimization problem is solved repeatedly for instances taken from a distribution that can be learned or forecasted. Indeed, such parametric optimization problems are ubiquitous in applications over complex infrastructures such as electrical power grids, supply chains, manufacturing, and transportation networks. The scale and complexity of these applications have grown significantly in recent years, challenging traditional optimization approaches. This talk studies how to speed up these parametric optimization problems to meet the real-time constraints present in many applications, covering concepts such as primal and dual optimization proxies, learning to optimize, contextual optimization, and decision-focused learning. The methodologies are highlighted on industrial problems in grid optimization, end-to-end supply chains, logistics, and transportation systems.
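The idea of a primal optimization proxy can be sketched on a toy parametric problem (the problem, the linear model, and all names below are illustrative assumptions, not the methods of the talk): solve many instances exactly offline, fit a model from instance parameters to optimal solutions, then answer fresh instances with a fast prediction instead of a solve.

```python
import numpy as np

# Parametric QP: min_x 0.5 x^T Q x + theta^T x, with exact solution
# x*(theta) = -Q^{-1} theta.  A primal proxy learns theta -> x*(theta).
rng = np.random.default_rng(0)
Q = np.array([[2.0, 0.5], [0.5, 1.0]])

def solve_exact(theta):
    return -np.linalg.solve(Q, theta)

# Offline: collect (instance, solution) pairs from the exact solver
thetas = rng.normal(size=(500, 2))
solutions = np.array([solve_exact(t) for t in thetas])

# Fit a linear proxy by least squares (exact here, since the true map
# is linear; realistic proxies use neural networks instead)
W, *_ = np.linalg.lstsq(thetas, solutions, rcond=None)

# Online: a new instance costs only a matrix-vector product
theta_new = np.array([0.3, -1.2])
gap = np.linalg.norm(theta_new @ W - solve_exact(theta_new))
print(gap < 1e-6)  # True: the proxy recovers the exact solution
```

In realistic pipelines the prediction is often followed by a cheap feasibility-restoration step, since a learned proxy alone need not satisfy the problem's constraints.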