Yoon Kim
Assistant Professor, MIT
Verified email at mit.edu - Homepage
Title
Cited by
Year
Convolutional Neural Networks for Sentence Classification
Y Kim
EMNLP 2014, 2014
19414*, 2014
OpenNMT: Open-Source Toolkit for Neural Machine Translation
G Klein, Y Kim, Y Deng, J Senellart, AM Rush
ACL 2017 (System Demonstrations), 2017
2364, 2017
Character-Aware Neural Language Models
Y Kim, Y Jernite, D Sontag, AM Rush
AAAI 2016, 2015
2123, 2015
Sequence-Level Knowledge Distillation
Y Kim, AM Rush
EMNLP 2016, 2016
1047, 2016
Structured Attention Networks
Y Kim, C Denton, L Hoang, AM Rush
ICLR 2017, 2017
591, 2017
Adversarially Regularized Autoencoders
J Zhao, Y Kim, K Zhang, AM Rush, Y LeCun
ICML 2018, 2017
414*, 2017
Temporal Analysis of Language through Neural Language Models
Y Kim, YI Chiu, K Hanaki, D Hegde, S Petrov
Proceedings of the ACL 2014 Workshop on Language Technologies and …, 2014
412, 2014
Semi-Amortized Variational Autoencoders
Y Kim, S Wiseman, AC Miller, D Sontag, AM Rush
ICML 2018, 2018
281, 2018
Parameter-Efficient Transfer Learning with Diff Pruning
D Guo, AM Rush, Y Kim
ACL 2021, 2020
279, 2020
Large language models are few-shot clinical information extractors
M Agrawal, S Hegselmann, H Lang, Y Kim, D Sontag
EMNLP 2022, 2022
229, 2022
Avoiding Latent Variable Collapse With Generative Skip Models
AB Dieng, Y Kim, AM Rush, DM Blei
AISTATS 2019, 2018
218, 2018
DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings
YS Chuang, R Dangovski, H Luo, Y Zhang, S Chang, M Soljačić, SW Li, ...
NAACL 2022, 2022
162, 2022
Compound Probabilistic Context-Free Grammars for Grammar Induction
Y Kim, C Dyer, AM Rush
ACL 2019, 2019
154, 2019
Latent Alignment and Variational Attention
Y Deng, Y Kim, J Chiu, D Guo, AM Rush
NeurIPS 2018, 2018
150, 2018
Unsupervised Recurrent Neural Network Grammars
Y Kim, AM Rush, L Yu, A Kuncoro, C Dyer, G Melis
NAACL 2019, 2019
144, 2019
Sequence-level Mixed Sample Data Augmentation
D Guo, Y Kim, AM Rush
EMNLP 2020, 2020
84, 2020
DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models
YS Chuang, Y Xie, H Luo, Y Kim, J Glass, P He
ICLR 2024, 2023
75, 2023
Multitask Prompt Tuning Enables Parameter-Efficient Transfer Learning
Z Wang, R Panda, L Karlinsky, R Feris, H Sun, Y Kim
ICLR 2023, 2023
69, 2023
Reasoning or Reciting? Exploring the Capabilities and Limitations of Language Models Through Counterfactual Tasks
Z Wu, L Qiu, A Ross, E Akyürek, B Chen, B Wang, N Kim, J Andreas, ...
NAACL 2024, 2023
61, 2023
Adapting Sequence Models for Sentence Correction
A Schmaltz, Y Kim, AM Rush, SM Shieber
EMNLP 2017, 2017
61, 2017
Articles 1–20