Yixuan Su
Research Scientist@Cohere
Verified email at cohere.com - Homepage
Title
Cited by
Year
Multi-Task Pre-Training for Plug-and-Play Task-Oriented Dialogue System
Y Su, L Shu, E Mansimov, A Gupta, D Cai, YA Lai, Y Zhang
ACL'22, 2021
162 · 2021
PandaGPT: One Model to Instruction-Follow Them All
Y Su, T Lan, H Li, J Xu, Y Wang, D Cai
TLLM'23, 2023
160 · 2023
A Contrastive Framework for Neural Text Generation
Y Su, T Lan, Y Wang, D Yogatama, L Kong, N Collier
NeurIPS'22 (Spotlight), 2022
156* · 2022
A survey on retrieval-augmented text generation
H Li, Y Su, D Cai, Y Wang, L Liu
arXiv preprint arXiv:2202.01110, 2022
146* · 2022
Language models can see: plugging visual controls in text generation
Y Su, T Lan, Y Liu, F Liu, D Yogatama, Y Wang, L Kong, N Collier
arXiv preprint arXiv:2205.02655, 2022
95* · 2022
Plan-then-Generate: Controlled Data-to-Text Generation via Planning
Y Su, D Vandyke, S Wang, Y Fang, N Collier
EMNLP'21-Findings, 2021
67 · 2021
StarCoder 2 and The Stack v2: The Next Generation
A Lozhkov, R Li, LB Allal, F Cassano, J Lamy-Poirier, N Tazi, A Tang, ...
arXiv preprint arXiv:2402.19173, 2024
53 · 2024
TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning
Y Su, F Liu, Z Meng, L Shu, E Shareghi, N Collier
NAACL'22-Findings, 2021
50 · 2021
Non-autoregressive text generation with pre-trained language models
Y Su, D Cai, Y Wang, D Vandyke, S Baker, P Li, N Collier
EACL'21, 2021
45 · 2021
Dialogue Response Selection with Hierarchical Curriculum Learning
Y Su, D Cai, Q Zhou, Z Lin, S Baker, Y Cao, S Shi, N Collier, Y Wang
ACL'21, 2021
41 · 2021
Contrastive search is what you need for neural text generation
Y Su, N Collier
TMLR'23, 2022
38 · 2022
Rewire-then-Probe: A Contrastive Recipe for Probing Biomedical Knowledge of Pre-trained Language Models
Z Meng, F Liu, E Shareghi, Y Su, C Collins, N Collier
ACL'22, 2021
38* · 2021
Prototype-to-style: Dialogue generation with style-aware editing on retrieval memory
Y Su, Y Wang, D Cai, S Baker, A Korhonen, N Collier
TASLP'21, 2021
36 · 2021
Few-Shot Table-to-Text Generation with Prototype Memory
Y Su, Z Meng, S Baker, N Collier
EMNLP'21-Findings, 2021
29 · 2021
Keep the Primary, Rewrite the Secondary: A Two-Stage Approach for Paraphrase Generation
Y Su, D Vandyke, S Baker, Y Wang, N Collier
ACL'21-Findings, 2021
18 · 2021
Sparkles: Unlocking chats across multiple images for multimodal instruction-following models
Y Huang, Z Meng, F Liu, Y Su, N Collier, Y Lu
arXiv preprint arXiv:2308.16463, 2023
16* · 2023
Replacing Judges with Juries: Evaluating LLM Generations with a Panel of Diverse Models
P Verga, S Hofstatter, S Althammer, Y Su, A Piktus, A Arkhangorodsky, ...
arXiv preprint arXiv:2404.18796, 2024
15* · 2024
Exploring dense retrieval for dialogue response selection
T Lan, D Cai, Y Wang, Y Su, H Huang, XL Mao
ACM Transactions on Information Systems 42 (3), 1-29, 2024
12* · 2024
Specialist or Generalist? Instruction Tuning for Specific NLP Tasks
C Shi, Y Su, C Yang, Y Yang, D Cai
EMNLP'23, 2023
9* · 2023
An empirical study on contrastive search and contrastive decoding for open-ended text generation
Y Su, J Xu
arXiv preprint arXiv:2211.10797, 2022
7 · 2022
Articles 1–20