Aojun Zhou
Incremental network quantization: Towards lossless cnns with low-precision weights
A Zhou, A Yao, Y Guo, L Xu, Y Chen
ICLR, 2017
Explicit loss-error-aware quantization for low-bit deep neural networks
A Zhou, A Yao, K Wang, Y Chen
Proceedings of the IEEE Conference on Computer Vision and Pattern …, 2018
Deep neural network compression with single and multiple level quantization
Y Xu, Y Wang, A Zhou, W Lin, H Xiong
arXiv preprint arXiv:1803.03289, 2018
Adversarial Robustness vs Model Compression, or Both?
S Ye, K Xu, S Liu, H Cheng, JH Lambrechts, H Zhang, A Zhou, K Ma, ...
arXiv preprint arXiv:1903.12561, 2019
HBONet: Harmonious Bottleneck on Two Orthogonal Dimensions
A Zhou*, D Li*, A Yao (*equal contribution), ICCV 2019
arXiv preprint arXiv:1908.03888, 2019
Deeply-supervised knowledge synergy
D Sun, A Yao, A Zhou, H Zhao
Proceedings of the IEEE Conference on Computer Vision and Pattern …, 2019
Towards Improving Generalization of Deep Networks via Consistent Normalization
A Zhou*, Y Ma*, Y Li, X Zhang, P Luo (*equal contribution)
2019
SnapQuant: A Probabilistic and Nested Parameterization for Binary Networks
K Wang, H Zhao, A Yao, A Zhou, D Sun, Y Chen
Deeply-supervised Knowledge Synergy Supplementary Materials
D Sun, A Yao, A Zhou, H Zhao