Sheng Zhang
Microsoft Research
Verified email at microsoft.com - Homepage
Title
Cited by
Year
BioGPT: generative pre-trained transformer for biomedical text generation and mining
R Luo, L Sun, Y Xia, T Qin, S Zhang, H Poon, TY Liu
Briefings in bioinformatics 23 (6), bbac409, 2022
Cited by 620 · 2022
LLaVA-Med: Training a large language-and-vision assistant for biomedicine in one day
C Li, C Wong, S Zhang, N Usuyama, H Liu, J Yang, T Naumann, H Poon, ...
arXiv preprint arXiv:2306.00890, 2023
Cited by 343 · 2023
ReCoRD: Bridging the gap between human and machine commonsense reading comprehension
S Zhang, X Liu, J Liu, J Gao, K Duh, B Van Durme
arXiv preprint arXiv:1810.12885, 2018
Cited by 270 · 2018
AMR parsing as sequence-to-graph transduction
S Zhang, X Ma, K Duh, B Van Durme
arXiv preprint arXiv:1905.08704, 2019
Cited by 189 · 2019
Universal decompositional semantics on universal dependencies
AS White, D Reisinger, K Sakaguchi, T Vieira, S Zhang, R Rudinger, ...
Proceedings of the 2016 Conference on Empirical Methods in Natural Language …, 2016
Cited by 189 · 2016
Can generalist foundation models outcompete special-purpose tuning? case study in medicine
H Nori, YT Lee, S Zhang, D Carignan, R Edgar, N Fusi, N King, J Larson, ...
arXiv preprint arXiv:2311.16452, 2023
Cited by 182 · 2023
Deep generalized canonical correlation analysis
A Benton, H Khayrallah, B Gujral, DA Reisinger, S Zhang, R Arora
arXiv preprint arXiv:1702.02519, 2017
Cited by 181 · 2017
BiomedCLIP: a multimodal biomedical foundation model pretrained from fifteen million scientific image-text pairs
S Zhang, Y Xu, N Usuyama, H Xu, J Bagga, R Tinn, S Preston, R Rao, ...
arXiv preprint arXiv:2303.00915, 2023
Cited by 176* · 2023
Ordinal common-sense inference
S Zhang, R Rudinger, K Duh, B Van Durme
Transactions of the Association for Computational Linguistics, 2017
Cited by 128 · 2017
Answering natural language questions via phrasal semantic parsing
K Xu, S Zhang, Y Feng, D Zhao
CCF International Conference on Natural Language Processing and Chinese …, 2014
Cited by 122 · 2014
Broad-coverage semantic parsing as transduction
S Zhang, X Ma, K Duh, B Van Durme
arXiv preprint arXiv:1909.02607, 2019
Cited by 80 · 2019
Context-faithful prompting for large language models
W Zhou, S Zhang, H Poon, M Chen
arXiv preprint arXiv:2303.11315, 2023
Cited by 71 · 2023
UniversalNER: Targeted distillation from large language models for open named entity recognition
W Zhou, S Zhang, Y Gu, M Chen, H Poon
arXiv preprint arXiv:2308.03279, 2023
Cited by 69 · 2023
A whole-slide foundation model for digital pathology from real-world data
H Xu, N Usuyama, J Bagga, S Zhang, R Rao, T Naumann, C Wong, ...
Nature, 1-8, 2024
Cited by 45 · 2024
Optimizing bi-encoder for named entity recognition via contrastive learning
S Zhang, H Cheng, J Gao, H Poon
arXiv preprint arXiv:2208.14565, 2022
Cited by 44 · 2022
An Evaluation of PredPatt and Open IE via Stage 1 Semantic Role Labeling
S Zhang, R Rudinger, B Van Durme
IWCS 2017—12th International Conference on Computational Semantics—Short …, 2017
Cited by 43 · 2017
MT/IE: Cross-lingual open information extraction with neural sequence-to-sequence models
S Zhang, K Duh, B Van Durme
Proceedings of the 15th Conference of the European Chapter of the …, 2017
Cited by 41 · 2017
Knowledge-rich self-supervision for biomedical entity linking
S Zhang, H Cheng, S Vashishth, C Wong, J Xiao, X Liu, T Naumann, ...
arXiv preprint arXiv:2112.07887, 2021
Cited by 31 · 2021
The universal decompositional semantics dataset and decomp toolkit
AS White, E Stengel-Eskin, S Vashishtha, V Govindarajan, DA Reisinger, ...
arXiv preprint arXiv:1909.13851, 2019
Cited by 31 · 2019
Neural-Davidsonian semantic proto-role labeling
R Rudinger, A Teichert, R Culkin, S Zhang, B Van Durme
arXiv preprint arXiv:1804.07976, 2018
Cited by 29 · 2018