Hanlin Tang
Verified email at ur.rochester.edu · Homepage
Title · Cited by · Year
D²: Decentralized Training over Decentralized Data
H Tang, X Lian, M Yan, C Zhang, J Liu
International Conference on Machine Learning, 4848-4856, 2018
Cited by 403 · 2018
Communication compression for decentralized training
H Tang, S Gan, C Zhang, T Zhang, J Liu
Advances in Neural Information Processing Systems 31, 2018
Cited by 316 · 2018
Doublesqueeze: Parallel stochastic gradient descent with double-pass error-compensated compression
H Tang, C Yu, X Lian, T Zhang, J Liu
International Conference on Machine Learning, 6155-6165, 2019
Cited by 261 · 2019
Central server free federated learning over single-sided trust social networks
C He, C Tan, H Tang, S Qiu, J Liu
arXiv preprint arXiv:1910.04956, 2019
Cited by 89 · 2019
1-bit adam: Communication efficient large-scale training with adam’s convergence speed
H Tang, S Gan, AA Awan, S Rajbhandari, C Li, X Lian, J Liu, C Zhang, ...
International Conference on Machine Learning, 10118-10129, 2021
Cited by 80 · 2021
Distributed learning over unreliable networks
C Yu, H Tang, C Renggli, S Kassing, A Singla, D Alistarh, C Zhang, J Liu
International Conference on Machine Learning, 7202-7212, 2019
Cited by 70 · 2019
Deepsqueeze: Decentralization meets error-compensated compression
H Tang, X Lian, S Qiu, L Yuan, C Zhang, T Zhang, J Liu
arXiv preprint arXiv:1907.07346, 2019
Cited by 43 · 2019
1-bit LAMB: communication efficient large-scale large-batch training with LAMB’s convergence speed
C Li, AA Awan, H Tang, S Rajbhandari, Y He
2022 IEEE 29th International Conference on High Performance Computing, Data …, 2022
Cited by 30 · 2022
Decentralized online learning: Take benefits from others' data without sharing your own to track global trend
Y Zhao, C Yu, P Zhao, H Tang, S Qiu, J Liu
arXiv preprint arXiv:1901.10593, 2019
Cited by 25 · 2019
Errorcompensatedx: error compensation for variance reduced algorithms
H Tang, Y Li, J Liu, M Yan
Advances in Neural Information Processing Systems 34, 18102-18113, 2021
Cited by 13 · 2021
Mkq-bert: Quantized bert with 4-bits weights and activations
H Tang, X Zhang, K Liu, J Zhu, Z Kang
arXiv preprint arXiv:2203.13483, 2022
Cited by 12 · 2022
Apmsqueeze: A communication efficient adam-preconditioned momentum sgd algorithm
H Tang, S Gan, S Rajbhandari, X Lian, J Liu, Y He, C Zhang
arXiv preprint arXiv:2008.11343, 2020
Cited by 7 · 2020
Easyquant: An efficient data-free quantization algorithm for llms
H Tang, Y Sun, D Wu, K Liu, J Zhu, Z Kang
arXiv preprint arXiv:2403.02775, 2024
Cited by 5 · 2024
Razorattention: Efficient kv cache compression through retrieval heads
H Tang, Y Lin, J Lin, Q Han, S Hong, Y Yao, G Wang
arXiv preprint arXiv:2407.15891, 2024
Cited by 3 · 2024
PASTO: Strategic Parameter Optimization in Recommendation Systems--Probabilistic is Better than Deterministic
W Ding, H Tang, J Feng, L Yuan, S Yang, G Yang, J Zheng, J Wang, Q Su, ...
arXiv preprint arXiv:2108.09076, 2021
2021
Communication Efficient Machine Learning
H Tang
University of Rochester, 2021
2021
Systems/Subsytems
S Rajbhandari, AVN Jalajakumari, H Chun, G Faulkner, K Cameron, ...
Articles 1–17