Kang Liu (IMB Dijon) – "Representation and Regression Problems in Neural Networks: Relaxation, Generalization, and Numerics"
October 9 @ 10:30 – 12:30
Abstract: In this presentation, we explore three non-convex optimization problems related to the training of shallow neural networks (NNs) for exact and approximate representation, as well as for regression tasks. From a theoretical perspective, we convexify these problems using a "mean-field" approach and, leveraging a "representer theorem", demonstrate the absence of relaxation gaps. We then establish generalization bounds provided by solutions of the previous problems, characterizing their performance on test datasets. We also examine the sensitivity of these bounds to the hyperparameters of the optimization problems and propose reasonable choices for them. On the numerical side, we describe a discretization approach for the convexified problems and provide its convergence rates. For low-dimensional datasets, the discretized problems can be solved efficiently by the simplex method. For high-dimensional datasets, we develop a sparsification algorithm that, combined with gradient descent for over-parameterized shallow NNs, yields favorable results for the primal problems.
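To make the "mean-field" relaxation mentioned in the abstract concrete, the following is a minimal illustrative sketch of the general idea rather than the speaker's exact formulation: the loss $\ell$, activation $\sigma$, regularization weight $\lambda$, and total-variation regularizer below are assumptions chosen for concreteness.

\[
\text{(finite width)}\qquad
\min_{(a_j,w_j,b_j)_{j=1}^{m}}\;
\frac{1}{n}\sum_{i=1}^{n}\ell\!\left(\frac{1}{m}\sum_{j=1}^{m} a_j\,\sigma(\langle w_j,x_i\rangle + b_j),\; y_i\right)
\;+\;\frac{\lambda}{m}\sum_{j=1}^{m}|a_j|
\]

\[
\text{(mean-field relaxation)}\qquad
\min_{\mu\in\mathcal{M}(\Omega)}\;
\frac{1}{n}\sum_{i=1}^{n}\ell\!\left(\int_{\Omega} a\,\sigma(\langle w,x_i\rangle + b)\,\mathrm{d}\mu(a,w,b),\; y_i\right)
\;+\;\lambda\,\|\mu\|_{\mathrm{TV}}
\]

The relaxed problem is convex in the measure $\mu$ whenever $\ell$ is convex in its first argument, and representer theorems for such measure-valued problems guarantee, under suitable assumptions, an optimal $\mu$ supported on finitely many atoms, i.e. a finite-width network; the "absence of relaxation gaps" then means the relaxed and original problems attain the same optimal value.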
https://indico.math.cnrs.fr/event/12982/