We are happy to announce that 19 members of the ATLAS community contributed to work accepted for the 2023 ACM CHI Conference on Human Factors in Computing Systems, taking place in Hamburg, Germany, April 23–28.
Accepting fewer than 25 percent of submissions, CHI is the premier international conference on human-computer interaction (HCI), attracting researchers and practitioners from around the world.
A special shout-out goes to Laura Devendorf, Etta Sandry and Emma Goodwill, who were awarded an Honorable Mention (top 5% of submissions) for their paper, "AdaCAD: Parametric Design as a New Form of Notation for Complex Weaving."
Details of all accepted work by members of the ATLAS community, which includes faculty with tenure homes in the College of Engineering and Applied Science and College of Media, Communication and Information, are listed below.
ATLAS @ CHI 2023
AdaCAD: Parametric Design as a New Form of Notation for Complex Weaving
Laura Devendorf (ATLAS Unstable Design Lab Director, Information Science faculty member), Kathryn Walters, Marianne Fairbanks, Etta Sandry (ATLAS Unstable Design Lab weaving resident), Emma R. Goodwill (ATLAS Unstable Design Lab member, undergraduate student)
Woven textiles are increasingly a medium through which HCI is inventing new technologies. Key challenges in integrating woven textiles in HCI include the high level of textile knowledge required to make effective use of the new possibilities they afford and the need for tools that bridge the concerns of textile designers and those of HCI researchers. This paper presents AdaCAD, a parametric design tool for designing woven textile structures. Through our design and evaluation of AdaCAD we found that parametric design helps weavers notate and explain the logics behind the complex structures they generate. We discuss these findings in relation to prior work in integrating craft and/or weaving in HCI, histories of woven notation, and boundary object theory to illuminate further possibilities for collaboration between craftspeople and HCI practitioners.
Peter Gyory (ATLAS ACME Lab member, PhD candidate), S. Sandra Bae (ATLAS ACME Lab member, PhD student), Ruhan Yang (ATLAS ACME Lab member, PhD student), Ellen Yi-Luen Do (ATLAS ACME Lab Director, Computer Science faculty member), Clement Zheng (PhD alumnus, ATLAS ACME Lab)
The electronics-centered approach to physical computing presents challenges when designers build tangible interactive systems due to its inherent emphasis on circuitry and electronic components. To explore an alternative physical computing approach we have developed a computer vision (CV) based system that uses a webcam, computer, and printed fiducial markers to create functional tangible interfaces. Through a series of design studios, we probed how designers build tangible interfaces with this CV-driven approach. In this paper, we apply the annotated portfolio method to reflect on the fifteen outcomes from these studios. We observed that CV markers offer versatile materiality for tangible interactions, afford the use of democratic materials for interface construction, and engage designers in embodied debugging with their own vision as a proxy for CV. By sharing our insights, we inform other designers and educators who seek alternative ways to facilitate physical computing and tangible interaction design.
Designing with living organisms can offer new perspectives to design research and practices in HCI. In this work, we explore first-person perspectives through design research with Kombucha Scoby, a microbial biofilm. We began with a material design exploration, producing digitally fabricated and crafted samples with Scoby. As we noticed our felt experiences while growing and working with Kombucha Scoby, we shifted towards a reflective autoethnographic study. Through reflective writings, we followed sensory experiences such as hearing the Kombucha fermentation, touching the Scoby while harvesting it, and watching the slow growth of layers over time. Subsequently, we designed "sensory engagement probes": designed experiments that bring forward new connections and communicate our process, motivations, and tensions that emerged while engaging with the organism. Lastly, we discuss how such design research can inform material design with living matter by creating space to contemplate "life as shared experience" and more-than-human design perspectives.
Data is everywhere, but may not be accessible to everyone. Conventional data visualization tools and guidelines often do not actively consider the specific needs and abilities of people with Intellectual and Developmental Disabilities (IDD), leaving them excluded from data-driven activities and vulnerable to ethical issues. To understand the needs and challenges people with IDD have with data, we conducted 15 semi-structured interviews with individuals with IDD and their caregivers. Our algorithmic interview approach situated data in the lived experiences of people with IDD to uncover otherwise hidden data encounters in their everyday life. Drawing on findings and observations, we characterize how they conceptualize data, when and where they use data, and what barriers exist when they interact with data. We use our results as a lens to reimagine the role of visualization in data accessibility and establish a critical near-term research agenda for cognitively accessible visualization.
Keke Wu (recent ATLAS PhD student)
Visualization amplifies cognition and helps a viewer see the trends, patterns, and outliers in data. However, conventional visualization tools and guidelines do not actively consider the unique needs and abilities of people with Intellectual and Developmental Disabilities (IDD), leaving them excluded from data-driven activities and vulnerable to ethical issues in everyday life. My dissertation work explores the challenges and opportunities of cognitively accessible visualization. Through mixed-method approaches and close collaboration with people with IDD, my team and I ran experiments and developed guidelines to improve current visualizations; we interviewed people with IDD and gained an initial understanding of their daily data experiences; and we are currently in the process of revising a participatory design workshop to create accessible visualizations for and with this population. For the remaining dissertation work, I hope to further expand our knowledge of cognitively accessible visualization, translating what I have learned from these experiences into a graphical user interface that supports people with IDD with their self-advocacy and self-expression using personally relevant data. My ultimate career goal is to theorize cognitively accessible visualization and empower people with IDD to make informed decisions and generate meaningful discoveries through accessible visual analytics.
Glazed ceramic is a versatile material that we use every day. In this paper, we present a new approach that instruments existing glazed ceramic ware with interactive electronic circuits. We informed this work by collaborating with a ceramics designer and connected his craft practice to our experience in physical computing. From this partnership, we developed a systematic approach that begins with the subtractive fabrication of traces on glazed ceramic surfaces via the resist-blasting technique, followed by applying conductive ink into the inlaid traces. We capture and detail this approach through an annotated flowchart for others to refer to, as well as externalize the material insights we uncovered through ceramic and circuit swatches. We then demonstrate a range of interactive home applications built with this approach. Finally, we reflect on the process we took and discuss the importance of collaborating with craftspeople for material-driven research within HCI.
Ran Zhou (ATLAS THING Lab member, PhD student), Zachary Schwemler (ATLAS MS alumnus), Akshay Baweja, Harpreet Sareen, Casey Lee Hunt (ATLAS THING Lab member, PhD student), Daniel Leithinger (ATLAS THING Lab Director, Computer Science faculty member)
Emerging research has demonstrated the viability of emotional communication through haptic technology inspired by interpersonal touch. However, the meaning-making of artificial touch remains ambiguous and contextual. We see this ambiguity caused by robotic touch’s "otherness" as an opportunity for exploring alternatives. To empower emotional haptic design in longitudinal out-of-lab exploration, we devise TactorBots, a design toolkit consisting of eight wearable hardware modules for rendering robotic touch gestures controlled by a web-based software application. We deployed TactorBots to thirteen designers and researchers to validate its functionality, characterize its design experience, and analyze what, how, and why alternative perceptions, practices, contexts, and metaphors would emerge in the experiment. We provide suggestions for designing future toolkits and field studies based on our experiences. Reflecting on the findings, we derive design implications for further enhancing the ambiguity and shifting the mindsets to expand the design space.
Note: This team will also lead an Interactivity session, "Demonstrating TactorBots: A Haptic Design Toolkit for Exploration of Emotional Robotic Touch."
ATLAS will also be represented at the Electrofab 2023 workshop during CHI. This year's theme is "Beyond Prototyping Boards: Future Paradigms for Electronics Toolkits," and the workshop will feature two papers authored by ATLAS members.
This paper introduces a new method of paper circuit fabrication that overcomes design barriers and increases flexibility in circuit design. Conventional circuit boards rely on thin traces, which limits the complexity and accuracy when applied to paper circuits. To address this issue, we propose a method that uses large conductive zones in paper circuits and performs subtractive processing during their fabrication. This approach eliminates design barriers and allows for more flexibility in circuit design. We introduce PaperCAD, a software tool that simplifies the design process by converting traditional circuit design to paper circuit design. We demonstrate our technique by creating two paper circuit boards. Our approach has the potential to promote the development of new applications for paper circuits.
The electronics-centered approach to physical computing presents challenges when designers build tangible interactive systems due to its inherent emphasis on circuitry and electronic components. To explore an alternative physical computing approach we have developed a computer vision (CV) based system that uses a webcam, computer, and printed fiducial markers to create functional tangible interfaces. Over the last three years, we ran a series of studios with design participants to investigate how CV markers can participate in physical computing and the construction of physical interactive systems. We observed that CV markers offer versatile materiality for tangible interactions, afford the use of democratic materials for interface construction, and engage designers in embodied debugging with their own vision as a proxy for CV. Taking these insights, we are developing a visual editor that enables designers to easily program marker behavior and connect it to keyboard events. We believe that such a platform will enable designers to develop physical and digital interfaces concurrently while minimizing the complexity of integrating both sides. In addition, this platform can also facilitate the construction of many alternative interfaces for existing software that cater to different people. We discuss our motivation, progress, and future work of this research here.
Two ATLAS community members also co-organized a workshop in the Extended Abstracts portion of CHI 2023.
Jack Forman, Pat Pataranutaporn, Phillip Gough, Raphael Kim, Fiona Bell (PhD Candidate), Netta Ofer (ATLAS Living Matter Lab member, PhD student), Jasmine Lu, Angela Vujic, Muqing Bai, Pattie Maes, Hiroshi Ishii, Misha Sra
As knowledge around bio-digital interaction continues to unfold, there are new opportunities for HCI researchers to integrate biology as a design and computational material. Our motivation for the workshop is to bring together interdisciplinary researchers with interest in exploring the next generation of biological HCI and exploring novel bio-digital interfaces implicating diverse contexts, scales, and stakeholders. The workshop aims to provide a space for interactive discussions, presentations, and brainstorming regarding opportunities and approaches for HCI around bio-digital interfaces. We invite researchers from both academia and industry to submit a short position paper in the following areas: Synthetic Biology, Biological Circuits, Do-It-Yourself Biology (DIYBio), Biomimetic Interfaces, Living Interfaces, Living Artefacts, and Bio-ethics. We will evaluate submissions on fit, ability to stimulate discussion, and contribution to HCI. On our website we have included examples of past work in this area to help inspire and inform position papers. Our website will host a recording of the entire workshop session with accepted papers to support asynchronous viewing for participants who are unable to attend in-person or synchronously.