Zhengyang Geng
Verified email at cs.cmu.edu - Homepage
Title
Cited by
Year
Is attention better than matrix decomposition?
Z Geng, MH Guo, H Chen, X Li, K Wei, Z Lin
ICLR, 2020
142 · 2020
Medusa: Simple LLM inference acceleration framework with multiple decoding heads
T Cai, Y Li, Z Geng, H Peng, JD Lee, D Chen, T Dao
arXiv preprint arXiv:2401.10774, 2024
74* · 2024
Deep Equilibrium Optical Flow Estimation
S Bai, Z Geng, Y Savani, JZ Kolter
CVPR, 2022
65 · 2022
On Training Implicit Models
Z Geng, XY Zhang, S Bai, Y Wang, Z Lin
NeurIPS, 2021
60 · 2021
Residual Relaxation for Multi-view Representation Learning
Y Wang, Z Geng, F Jiang, C Li, Y Wang, J Yang, Z Lin
NeurIPS, 2021
31 · 2021
Eliminating Gradient Conflict in Reference-based Line-art Colorization
Z Li, Z Geng, Z Kang, W Chen, Y Yang
ECCV, 2022
24 · 2022
Deep Equilibrium Approaches to Diffusion Models
A Pokle, Z Geng, Z Kolter
NeurIPS, 2022
23 · 2022
One-step diffusion distillation via deep equilibrium models
Z Geng, A Pokle, JZ Kolter
Advances in Neural Information Processing Systems 36, 2024
6 · 2024
Equilibrium image denoising with implicit differentiation
Q Chen, Y Wang, Z Geng, Y Wang, J Yang, Z Lin
IEEE Transactions on Image Processing 32, 1868-1881, 2023
5 · 2023
TorchDEQ: A library for deep equilibrium models
Z Geng, JZ Kolter
arXiv preprint arXiv:2310.18605, 2023
3 · 2023
Consistency Models Made Easy
Z Geng, A Pokle, W Luo, J Lin, JZ Kolter
arXiv preprint arXiv:2406.14548, 2024
2024
Articles 1–11