Machel Reid
Research Scientist, Google DeepMind
Verified email at google.com - Homepage
Title
Cited by
Year
Large Language Models are Zero-Shot Reasoners
T Kojima, SS Gu, M Reid, Y Matsuo, Y Iwasawa
NeurIPS 2022, 2022
1773 · 2022
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
348 · 2023
Can Wikipedia help offline reinforcement learning?
M Reid, Y Yamada, SS Gu
arXiv preprint arXiv:2201.12122, 2022
86 · 2022
LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer
M Reid, V Zhong
Findings of the Annual Meeting of the Association for Computational …, 2021
59 · 2021
A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
DI Adelani, JO Alabi, A Fan, J Kreutzer, X Shen, M Reid, D Ruiter, ...
NAACL 2022, 2022
36* · 2022
DiffusER: Diffusion via edit-based reconstruction
M Reid, VJ Hellendoorn, G Neubig
The Eleventh International Conference on Learning Representations, 2022
31* · 2022
Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
M Reid, E Marrese-Taylor, Y Matsuo
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2021
31 · 2021
Learning to Model Editing Processes
M Reid, G Neubig
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2022
24 · 2022
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
M Reid, J Hu, G Neubig, Y Matsuo
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
23 · 2021
VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
M Reid, E Marrese-Taylor, Y Matsuo
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
21 · 2020
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
M Reid, M Artetxe
Conference of the North American Chapter of the Association for …, 2021
16 · 2021
M2D2: A Massively Multi-domain Language Modeling Dataset
M Reid, V Zhong, S Gururangan, L Zettlemoyer
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
12 · 2022
Low-Resource Machine Translation Using Cross-Lingual Language Model Pretraining
F Zheng, M Reid, E Marrese-Taylor, Y Matsuo
AmericasNLP Workshop, NAACL 2021, 2021
11 · 2021
On the impact of data augmentation on downstream performance in natural language processing
I Okimura, M Reid, M Kawano, Y Matsuo
Proceedings of the Third Workshop on Insights from Negative Results in NLP …, 2022
10 · 2022
mmT5: Modular multilingual pre-training solves source language hallucinations
J Pfeiffer, F Piccinno, M Nicosia, X Wang, M Reid, S Ruder
arXiv preprint arXiv:2305.14224, 2023
8 · 2023
Variational Inference for Learning Representations of Natural Language Edits
E Marrese-Taylor, M Reid, Y Matsuo
AAAI 2021, 2020
8 · 2020
On the role of parallel data in cross-lingual transfer learning
M Reid, M Artetxe
arXiv preprint arXiv:2212.10173, 2022
4 · 2022
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
M Reid, N Savinov, D Teplyashin, D Lepikhin, T Lillicrap, J Alayrac, ...
arXiv preprint arXiv:2403.05530, 2024
3 · 2024
Gemma: Open models based on Gemini research and technology
G Team, T Mesnard, C Hardin, R Dadashi, S Bhupatiraju, S Pathak, ...
arXiv preprint arXiv:2403.08295, 2024
2 · 2024
Buffet: Benchmarking large language models for few-shot cross-lingual transfer
A Asai, S Kudugunta, XV Yu, T Blevins, H Gonen, M Reid, Y Tsvetkov, ...
arXiv preprint arXiv:2305.14857, 2023
2 · 2023
Articles 1–20