Loucas Pillaud-Vivien
Courant Institute of Mathematics, NYU / Flatiron Institute, New York
Verified email at flatironinstitute.org - Homepage
Title
Cited by
Year
Statistical optimality of stochastic gradient descent on hard learning problems through multiple passes
L Pillaud-Vivien, A Rudi, F Bach
Advances in Neural Information Processing Systems 31, 2018
94 · 2018
Implicit bias of SGD for diagonal linear networks: a provable benefit of stochasticity
S Pesme, L Pillaud-Vivien, N Flammarion
Advances in Neural Information Processing Systems 34, 2021
83 · 2021
Gradient flow dynamics of shallow ReLU networks for square loss and orthogonal inputs
E Boursier, L Pillaud-Vivien, N Flammarion
Advances in Neural Information Processing Systems 35, 2022
45 · 2022
SGD with large step sizes learns sparse features
M Andriushchenko, AV Varre, L Pillaud-Vivien, N Flammarion
International Conference on Machine Learning, 2023
40 · 2023
Exponential convergence of testing error for stochastic gradient methods
L Pillaud-Vivien, A Rudi, F Bach
Conference On Learning Theory, 2018
33 · 2018
Last iterate convergence of SGD for Least-Squares in the Interpolation regime
AV Varre, L Pillaud-Vivien, N Flammarion
Advances in Neural Information Processing Systems 34, 2021
31 · 2021
Label noise (stochastic) gradient descent implicitly solves the Lasso for quadratic parametrisation
L Pillaud-Vivien, J Reygner, N Flammarion
Conference on Learning Theory, 2022
25 · 2022
Overcoming the curse of dimensionality with Laplacian regularization in semi-supervised learning
V Cabannes, L Pillaud-Vivien, F Bach, A Rudi
Advances in Neural Information Processing Systems 34, 2021
18 · 2021
Statistical estimation of the Poincaré constant and application to sampling multimodal distributions
L Pillaud-Vivien, F Bach, T Lelièvre, A Rudi, G Stoltz
International Conference on Artificial Intelligence and Statistics, 2020
13 · 2020
Central Limit Theorem for stationary Fleming-Viot particle systems in finite spaces
T Lelievre, L Pillaud-Vivien, J Reygner
ALEA, Lat. Am. J. Probab. Math. Stat. 15, 1163–1182, 2018
13 · 2018
On Learning Gaussian Multi-index Models with Gradient Flow
A Bietti, J Bruna, L Pillaud-Vivien
arXiv preprint arXiv:2310.19793, 2023
6 · 2023
On Single Index Models beyond Gaussian Data
J Bruna, L Pillaud-Vivien, A Zweig
Advances in Neural Information Processing Systems 36, 2023
6 · 2023
Kernelized Diffusion Maps
L Pillaud-Vivien, F Bach
Conference On Learning Theory, 2023
4 · 2023
Learning with reproducing kernel Hilbert spaces: stochastic gradient descent and Laplacian estimation
L Pillaud-Vivien
Université Paris sciences et lettres, 2020
2 · 2020
The Computational Complexity of Learning Gaussian Single-Index Models
A Damian, L Pillaud-Vivien, JD Lee, J Bruna
arXiv preprint arXiv:2403.05529, 2024
1 · 2024
Batch and match: black-box variational inference with a score-based divergence
D Cai, C Modi, L Pillaud-Vivien, CC Margossian, RM Gower, DM Blei, ...
arXiv preprint arXiv:2402.14758, 2024
1 · 2024
On the spectral bias of two-layer linear networks
AV Varre, ML Vladarean, L Pillaud-Vivien, N Flammarion
Advances in Neural Information Processing Systems 36, 2023
1 · 2023
An Ordering of Divergences for Variational Inference with Factorized Gaussian Approximations
CC Margossian, L Pillaud-Vivien, LK Saul
arXiv preprint arXiv:2403.13748, 2024
2024
La Résilience à Paris: états des lieux et préconisations multi-bénéfices pour l’espace public
A Hatchuel, A Labourdette, F Leduc, L Pillaud-Vivien, M Renaudin
Mairie de Paris, 2017
2017