Jung-Woo Ha
Head@NAVER AI Lab, Co-Director@SNU-NAVER Hyperscale / KAIST-NAVER Hypercreative AI Centers
Verified email at navercorp.com - Homepage
Title
Cited by
Year
StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation
Y Choi, M Choi, M Kim, JW Ha, S Kim, J Choo
CVPR 2018, 2018
2813, 2018
StarGAN v2: Diverse Image Synthesis for Multiple Domains
Y Choi, Y Uh, J Yoo, JW Ha
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
684, 2020
Hadamard product for low-rank bilinear pooling
JH Kim, KW On, J Kim, JW Ha, BT Zhang
ICLR 2017, 2017
626, 2017
Dual attention networks for multimodal reasoning and matching
H Nam, JW Ha, J Kim
CVPR 2017, 2017
617, 2017
Overcoming Catastrophic Forgetting by Incremental Moment Matching
SW Lee, JW Kim, JH Jeon, JW Ha, BT Zhang
NIPS 2017, 2017
450, 2017
Multimodal residual learning for visual qa
JH Kim, SW Lee, D Kwak, MO Heo, J Kim, JW Ha, BT Zhang
Advances in Neural Information Processing Systems, 361-369, 2016
295, 2016
Photorealistic Style Transfer via Wavelet Transforms
J Yoo, Y Uh, S Chun, B Kang, JW Ha
arXiv preprint arXiv:1903.09760 (ICCV 2019), 2019
189, 2019
Phase-Aware Speech Enhancement with Deep Complex U-Net
HS Choi, J Kim, J Huh, A Kim, JW Ha, K Lee
ICLR 2019 (to appear), 2019
182, 2019
DialogWAE: Multimodal Response Generation with Conditional Wasserstein Auto-Encoder
X Gu, K Cho, JW Ha, S Kim
arXiv:1805.12352 (ICLR 2019), 2019
125, 2019
AdamP: Slowing down the weight norm increase in momentum-based optimizers
B Heo, S Chun, SJ Oh, D Han, S Yun, Y Uh, JW Ha
arXiv preprint arXiv:2006.08217 (ICLR 2021), 2021
83*, 2021
Nsml: Meet the mlaas platform with a real-world case study
H Kim, M Kim, D Seo, J Kim, H Park, S Park, H Jo, KH Kim, Y Yang, Y Kim, ...
arXiv preprint arXiv:1810.09957, 2018
80, 2018
Reinforcement learning based recommender system using biclustering technique
S Choi, H Ha, U Hwang, C Kim, JW Ha, S Yoon
arXiv preprint arXiv:1801.05532, 2018
62, 2018
NSML: A Machine Learning Platform That Enables You to Focus on Your Models
N Sung, M Kim, H Jo, Y Yang, J Kim, L Lausen, Y Kim, G Lee, D Kwak, ...
arXiv preprint arXiv:1712.05902, https://arxiv.org/abs/1712.05902, 2017
60, 2017
Large-Scale Item Categorization in e-Commerce Using Multiple Recurrent Neural Networks
JW Ha, H Pyo, J Kim
The 22nd ACM SIGKDD Conference on Knowledge Discovery and Data Mining 2016 …, 2016
56, 2016
KLUE: Korean Language Understanding Evaluation
S Park, J Moon, S Kim, WI Cho, J Han, J Park, C Song, J Kim, Y Song, ...
arXiv preprint arXiv:2105.09680 (NeurIPS 2021 Datasets and Benchmarks Track), 2021
55, 2021
Representation Learning of Music Using Artist Labels
J Park, J Lee, J Park, JW Ha, J Nam
arXiv preprint arXiv:1710.06648, ISMIR 2018 (to appear), 2017
55, 2017
Rainbow Memory: Continual Learning with a Memory of Diverse Samples
J Bang, H Kim, YJ Yoo, JW Ha, J Choi
arXiv preprint arXiv:2103.17230 (CVPR 2021), 2021
48, 2021
Evolutionary hypernetwork models for aptamer-based cardiovascular disease diagnosis
JW Ha, JH Eom, SC Kim, BT Zhang
Proceedings of the 9th annual conference companion on Genetic and …, 2007
48, 2007
Automated construction of visual-linguistic knowledge via concept learning from cartoon videos
JW Ha, KM Kim, BT Zhang
Twenty-Ninth AAAI Conference on Artificial Intelligence, 2015
43, 2015
Sparse Population Code Models of Word Learning in Concept Drift.
BT Zhang, JW Ha, M Kang
CogSci 2012, 2012
40, 2012
Articles 1–20