Murtaza Bulut
Cited by
IEMOCAP: Interactive emotional dyadic motion capture database
C Busso, M Bulut, CC Lee, A Kazemzadeh, E Mower, S Kim, JN Chang, ...
Language resources and evaluation 42, 335-359, 2008
Analysis of emotion recognition using facial expressions, speech and multimodal information
C Busso, Z Deng, S Yildirim, M Bulut, CM Lee, A Kazemzadeh, S Lee, ...
Proceedings of the 6th international conference on Multimodal interfaces …, 2004
Emotion recognition based on phoneme classes
CM Lee, S Yildirim, M Bulut, A Kazemzadeh, C Busso, Z Deng, S Lee, ...
Eighth international conference on spoken language processing, 2004
An acoustic study of emotions expressed in speech
S Yildirim, M Bulut, CM Lee, A Kazemzadeh, Z Deng, S Lee, S Narayanan, ...
Eighth International Conference on Spoken Language Processing, 2004
Expressive speech synthesis using a concatenative synthesizer
M Bulut, SS Narayanan, AK Syrdal
Expressive facial animation synthesis by learning speech coarticulation and expression spaces
Z Deng, U Neumann, JP Lewis, TY Kim, M Bulut, S Narayanan
IEEE transactions on visualization and computer graphics 12 (6), 1523-1534, 2006
Toward effective automatic recognition systems of emotion in speech
C Busso, M Bulut, S Narayanan, J Gratch, S Marsella
Social emotions in nature and artifact: emotions in human and human-computer …, 2013
Limited domain synthesis of expressive military speech for animated characters
WL Johnson, S Narayanan, R Whitney, R Das, M Bulut, C LaBore
Proceedings of 2002 IEEE Workshop on Speech Synthesis, 2002., 163-166, 2002
On the robustness of overall F0-only modifications to the perception of emotions in speech
M Bulut, S Narayanan
The Journal of the Acoustical Society of America 123 (6), 4547-4558, 2008
Constructing emotional speech synthesizers with limited speech database
R Tsuzuki, H Zen, K Tokuda, T Kitamura, M Bulut, S Narayanan
Proc. ICSLP 2, 1185-1188, 2004
Method and system for assisting patients
RS Jasinschi, M Bulut, L Bellodi
US Patent 9,747,902, 2017
Investigating the role of phoneme-level modifications in emotional speech resynthesis
M Bulut, C Busso, S Yildirim, A Kazemzadeh, CM Lee, S Lee, ...
Ninth European Conference on Speech Communication and Technology, 2005
Camera-based heart rate monitoring in highly dynamic light conditions
V Jeanne, M Asselman, B den Brinker, M Bulut
2013 International Conference on Connected Vehicles and Expo (ICCVE), 798-799, 2013
Device, system and method for obtaining vital sign information of a subject
M Bulut
US Patent App. 15/761,821, 2018
Automatic dynamic expression synthesis for speech animation
Z Deng, M Bulut, U Neumann, S Narayanan
Proc. of IEEE Computer Animation and Social Agents 2004, 267-274, 2004
Enhancing physical activity through context-aware coaching
S Van Dantzig, M Bulut, M Krans, A Van Der Lans, B De Ruyter
Proceedings of the 12th EAI International Conference on Pervasive Computing …, 2018
Stress-measuring system
AAML Bruekers, M Bulut, V Mihajlovic, M Ouwerkerk, JHDM Westerink
US Patent 10,758,179, 2020
A statistical approach for modeling prosody features using POS tags for emotional speech synthesis
M Bulut, S Lee, S Narayanan
2007 IEEE International Conference on Acoustics, Speech and Signal …, 2007
Recognition for synthesis: Automatic parameter selection for resynthesis of emotional speech from neutral speech
M Bulut, S Lee, S Narayanan
2008 IEEE International Conference on Acoustics, Speech and Signal …, 2008
Signal selection for obtaining a remote photoplethysmographic waveform
AC Den Brinker, M Bulut, V Jeanne
US Patent 9,907,474, 2018