Tuesday, October 31, 3:30-5:00 in MUEN D430
Paul G. Allen School of Computer Science & Engineering
University of Washington
Title: From Naive Physics to Connotation: Modeling Commonsense in Frame Semantics
Abstract: Intelligent communication requires reading between the lines, which, in turn, requires rich background knowledge about how the world works. However, learning unspoken commonsense knowledge from language is nontrivial, as people rarely state the obvious, e.g., "my house is bigger than me." In this talk, I will discuss how we can recover this trivial everyday knowledge from language alone, without an embodied agent. A key insight is this: the implicit knowledge people share and assume systematically influences the way people use language, which provides indirect clues for reasoning about the world. For example, if "Jen entered her house," it must be that her house is bigger than she is. I will first present how we can organize various aspects of commonsense — ranging from naive physics knowledge to more pragmatic connotations — by adapting representations from frame semantics. I will then discuss neural network approaches that complement the frame-centric approaches. I will conclude the talk by discussing the challenges in current models and formalisms, pointing to avenues for future research.
About the Speaker: Yejin Choi is an associate professor in the Paul G. Allen School of Computer Science & Engineering at the University of Washington. Her recent research focuses on integrating language and vision, learning knowledge about the world from text and images, modeling richer context for natural language generation, and modeling nonliteral meaning of text using connotation frames. She was among the IEEE's AI Top 10 to Watch in 2015 and a co-recipient of the Marr Prize at ICCV 2013. Her work on detecting deceptive reviews, predicting literary success, and learning to interpret connotation has been featured by numerous media outlets including NBC News for New York, NPR Radio, the New York Times, and Bloomberg Business Week. She received her Ph.D. in Computer Science from Cornell University.
Thursday, November 9, 3:30-5:00 in ECCR 265
Title: Expanding the Horizons of NLP: Opportunities and Challenges
Abstract: Natural language processing (NLP) has the potential to be employed in a broad array of user-facing applications. To realize this potential, however, we need to address several challenges related to representations, data availability, and scalability. In this talk, I will discuss these concerns and how we may overcome them. First, as a motivating example of the broad reach of NLP, I will present our recent work on using language technology to improve mental health treatment. Then, using the task of reading comprehension, I will show how the choice of representation can make a big difference in our ability to reason about text. Finally, using applications from semantics as examples, I will present a general strategy for training classifiers for linguistic structure using partially annotated examples spread across multiple datasets.
About the Speaker: Vivek Srikumar is an assistant professor in the School of Computing at the University of Utah. His research lies in the areas of natural language processing and machine learning and has primarily been driven by questions arising from the need to learn structured representations of text using little or indirect supervision, and to scale inference to large problems. His work has been published in various AI, NLP, and machine learning venues and recently received a best paper award at EMNLP. He obtained his Ph.D. from the University of Illinois at Urbana-Champaign in 2013 and was previously a post-doctoral scholar at Stanford University.
Interactive Cognition Lab
University of California, Merced
Abstract: One of the key findings from studies on the interaction of conceptual metaphor with grammatical constructions is that generalizations exist in how the metaphoric source and target domains are expressed in argument structure constructions (Sullivan 2013, David 2016). For instance, in transitive constructions, it is the verb that evokes the (concrete) metaphoric source domain, while the direct object evokes the (abstract) metaphoric target domain, and not the reverse. Examples of this include crush someone's spirit and tackle poverty. Indeed, as I will argue in this talk, when analyzing the full range of argument structure constructions, some cross-constructional as well as cross-linguistic generalizations begin to emerge. First, I present the links between metaphor and grammar within the architecture of Embodied Construction Grammar. ECG is a version of construction grammar that formalizes the meaningfulness of grammatical structures as hierarchically organized image schemas (Feldman, Dodge & Bryant 2009). Then, I show how establishing such links has been fruitful in MetaNet, an automated metaphor identification system (Stickles et al. 2016). Finally, I show how the metaphorical construction model is useful for annotating metaphor in corpora. I introduce a team-based online annotation tool I developed – Constructional Annotation Net – to track metaphor use in a corpus of cancer patient blogs (David & Matlock in prep). These corpus and computational implementations of metaphoric constructions show that establishing the metaphor-grammar link can yield immediate computational benefits in conceptual metaphor research.
About the Speaker: Dr. Oana David is a Chancellor's post-doctoral fellow working in the Interactive Cognition Lab at the University of California, Merced. Dr. David's research, which uses psycholinguistic and computational tools, concerns applying theories and methods from metaphor research to achieve a better understanding of critical social issues. She will give a series of talks in the Linguistics department on Wednesday, November 8: she will address the Computational Semantics group in the morning, followed by the CLASP brown bag forum talk at noon, culminating in a Linguistics Circle talk at 4pm. We will post more information about these events as soon as they are scheduled.