Shaoduo Gan
Verified email at inf.ethz.ch
Title · Cited by · Year
Communication compression for decentralized training
H Tang, S Gan, C Zhang, T Zhang, J Liu
NeurIPS 2018, 2018
235 · 2018
Towards Demystifying Serverless Machine Learning Training
J Jiang*, S Gan*, Y Liu, F Wang, G Alonso, A Klimovic, A Singla, W Wu, ...
SIGMOD 2021, 2021
63 · 2021
1-bit Adam: Communication Efficient Large-Scale Training with Adam's Convergence Speed
H Tang, S Gan, AA Awan, S Rajbhandari, C Li, X Lian, J Liu, C Zhang, ...
ICML 2021, 2021
33 · 2021
Ease.ML: A Lifecycle Management System for Machine Learning
L Aguilar Melgar, D Dao, S Gan, NM Gürel, N Hollenstein, J Jiang, ...
CIDR 2021, 2021
17* · 2021
Bagua: Scaling up Distributed Learning with System Relaxations
S Gan, X Lian, R Wang, J Chang, C Liu, H Shi, S Zhang, X Li, T Sun, ...
VLDB 2022, 2022
14 · 2022
FRuDA: Framework for Distributed Adversarial Domain Adaptation
S Gan, A Mathur, A Isopoussu, F Kawsar, N Berthouze, ND Lane
IEEE Transactions on Parallel and Distributed Systems 33 (11), 3153-3164, 2021
5 · 2021
Few-shot Named Entity Recognition with Entity-level Prototypical Network Enhanced by Dispersedly Distributed Prototypes
B Ji, S Li, S Gan, J Yu, J Ma, H Liu
arXiv preprint arXiv:2208.08023, 2022
2 · 2022
Erpc: An edge-resources based framework to reduce bandwidth cost in the personal cloud
S Gan, J Yu, X Li, J Ma, L Luo, Q Wu, S Li
Web-Age Information Management: 17th International Conference, WAIM 2016 …, 2016
2 · 2016
In-Database Machine Learning with CorgiPile: Stochastic Gradient Descent without Full Data Shuffle
L Xu, S Qiu, B Yuan, J Jiang, C Renggli, S Gan, K Kara, G Li, J Liu, W Wu, ...
SIGMOD 2022, 2022
1 · 2022
A novel optimization scheme for caching in locality-aware P2P networks
S Gan, J Zhang, J Yu, X Li, J Ma, L Luo, Q Wu
2016 IEEE Symposium on Computers and Communication (ISCC), 1024-1031, 2016
1 · 2016
Stochastic Gradient Descent without Full Data Shuffle
L Xu, S Qiu, B Yuan, J Jiang, C Renggli, S Gan, K Kara, G Li, J Liu, W Wu, ...
arXiv preprint arXiv:2206.05830, 2022
2022
A System Study of Communication-Efficient Distributed Machine Learning
S Gan
ETH Zurich, 2021
2021
Distributed Asynchronous Domain Adaptation: Towards Making Domain Adaptation More Practical in Real-World Systems
S Gan*, A Mathur*, A Isopoussu, N Berthouze, ND Lane, F Kawsar
NeurIPS 2019 SysML Workshop, 2019
2019
Multi-Step Decentralized Domain Adaptation
A Mathur, S Gan, A Isopoussu, F Kawsar, N Berthouze, ND Lane
Articles 1–14