Adil Salim
Microsoft Research
Verified email address: microsoft.com - Homepage
Title
Cited by
Year
Textbooks are all you need
S Gunasekar, Y Zhang, J Aneja, CCT Mendes, A Del Giorno, S Gopi, ...
arXiv preprint arXiv:2306.11644, 2023
301 · 2023
Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
S Chen, S Chewi, J Li, Y Li, A Salim, AR Zhang
The Eleventh International Conference on Learning Representations, 2022
197 · 2022
Maximum mean discrepancy gradient flow
M Arbel, A Korba, A Salim, A Gretton
Advances in Neural Information Processing Systems 32, 2019
144 · 2019
Phi-3 technical report: A highly capable language model locally on your phone
M Abdin, SA Jacobs, AA Awan, J Aneja, A Awadallah, H Awadalla, ...
arXiv preprint arXiv:2404.14219, 2024
121 · 2024
A non-asymptotic analysis for Stein variational gradient descent
A Korba, A Salim, M Arbel, G Luise, A Gretton
Advances in Neural Information Processing Systems 33, 4672-4682, 2020
85 · 2020
Phi-2: The surprising power of small language models
M Javaheripi, S Bubeck, M Abdin, J Aneja, CCT Mendes, ...
Microsoft Research Blog, 2023
82 · 2023
Optimal and practical algorithms for smooth and strongly convex decentralized optimization
D Kovalev, A Salim, P Richtárik
Advances in Neural Information Processing Systems 33, 18342-18352, 2020
79 · 2020
Towards a theory of non-log-concave sampling: first-order stationarity guarantees for Langevin Monte Carlo
K Balasubramanian, S Chewi, MA Erdogdu, A Salim, S Zhang
Conference on Learning Theory, 2896-2923, 2022
59 · 2022
The probability flow ODE is provably fast
S Chen, S Chewi, H Lee, Y Li, J Lu, A Salim
Advances in Neural Information Processing Systems 36, 2023
56 · 2023
The Wasserstein proximal gradient algorithm
A Salim, A Korba, G Luise
Advances in Neural Information Processing Systems 33, 12356-12366, 2020
51 · 2020
Improved analysis for a proximal algorithm for sampling
Y Chen, S Chewi, A Salim, A Wibisono
Conference on Learning Theory, 2984-3014, 2022
46 · 2022
Dualize, split, randomize: Toward fast nonsmooth optimization algorithms
A Salim, L Condat, K Mishchenko, P Richtárik
Journal of Optimization Theory and Applications 195 (1), 102-130, 2022
39 · 2022
Primal dual interpretation of the proximal stochastic gradient Langevin algorithm
A Salim, P Richtárik
Advances in Neural Information Processing Systems 33, 3786-3796, 2020
38 · 2020
A constant step Forward-Backward algorithm involving random maximal monotone operators
P Bianchi, W Hachem, A Salim
Journal of Convex Analysis 26 (2), 387-436, 2019
29 · 2019
Stochastic proximal Langevin algorithm: Potential splitting and nonasymptotic rates
A Salim, D Kovalev, P Richtárik
Advances in Neural Information Processing Systems 32, 2019
27 · 2019
A convergence theory for SVGD in the population limit under Talagrand’s inequality T1
A Salim, L Sun, P Richtárik
International Conference on Machine Learning, 19139-19152, 2022
26* · 2022
Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein space
MZ Diao, K Balasubramanian, S Chewi, A Salim
International Conference on Machine Learning, 7960-7991, 2023
25 · 2023
An optimal algorithm for strongly convex minimization under affine constraints
A Salim, L Condat, D Kovalev, P Richtárik
International Conference on Artificial Intelligence and Statistics, 4482-4498, 2022
25 · 2022
Distributed fixed point methods with compressed iterates
S Chraibi, A Khaled, D Kovalev, P Richtárik, A Salim, M Takáč
arXiv preprint arXiv:1912.09925, 2019
25 · 2019
Snake: a stochastic proximal gradient algorithm for regularized problems over large graphs
A Salim, P Bianchi, W Hachem
IEEE Transactions on Automatic Control 64 (5), 1832-1847, 2019
24 · 2019
Articles 1–20