Detecting emotional situations using convolutional neural networks and distributed models of human brain activity
Emotions are thought to be canonical responses to situations ancestrally linked to survival or the well-being of an organism. Although sensory elements do not fully determine the nature of emotional responses, they should be sufficient to convey the schema or situation to which an organism must respond. However, few computationally explicit models describe how combinations of stimulus features come to evoke different types of emotional responses, and it remains unclear whether activity in sensory (e.g., visual) cortex contains rich, distinct codes for multiple classes of emotional responses. In this talk I will present research 1) developing convolutional neural networks that identify different kinds of emotional situations, and 2) using human neuroimaging to understand how model representations of these situations are reflected in distributed patterns of activity in the human visual system. I will conclude by discussing future directions for using machine learning approaches to understand emotional phenomena.
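The abstract does not specify the network architecture or the emotion-situation categories. As a minimal sketch of the kind of convolutional classifier described, the NumPy code below runs a single convolutional layer with random (untrained) filters, global average pooling, and a softmax readout over a set of hypothetical category labels; all names and shapes here are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Hypothetical emotion-situation categories (not specified in the abstract).
CATEGORIES = ["threat", "nurturance", "loss", "achievement"]

def conv2d(image, kernels):
    """Valid-mode 2D convolution: (H, W) image x (K, kh, kw) kernels -> (K, H', W')."""
    K, kh, kw = kernels.shape
    H, W = image.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kernels[k])
    return out

def classify(image, kernels, weights, bias):
    """Conv -> ReLU -> global average pool -> linear -> softmax probabilities."""
    feats = np.maximum(conv2d(image, kernels), 0.0)  # ReLU feature maps
    pooled = feats.mean(axis=(1, 2))                 # (K,) pooled features
    logits = weights @ pooled + bias                 # (C,) class scores
    exp = np.exp(logits - logits.max())              # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
image = rng.standard_normal((16, 16))       # stand-in for a visual scene
kernels = rng.standard_normal((8, 3, 3))    # 8 filters; random here, learned in practice
weights = rng.standard_normal((len(CATEGORIES), 8))
bias = np.zeros(len(CATEGORIES))

probs = classify(image, kernels, weights, bias)
print(dict(zip(CATEGORIES, np.round(probs, 3))))
```

A trained version of such a model would fit the filters and readout weights to labeled images of emotional situations; its internal feature maps are the kind of model representation that could then be compared against distributed patterns of visual-cortex activity.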