ATLAS Institute opened its doors on Friday, October 19 to share its latest research and creative work, treating more than 200 visitors to laboratory and studio tours, project demonstrations and poster presentations. (See photo album from event.)
Held in partnership with CU Boulder’s Research and Innovation Week, the second annual ATLAS Research Showcase drew an enthusiastic group of visitors, eager to explore a broad range of work in human-robot interaction, tech tattoos, typographic art and design, tangible computer interfaces, computer science education, smart textiles and more.
Miss this year's Research Showcase? Keep reading to learn more about projects and research showcased at the event!
Projects are listed under their associated labs, where applicable. Lab directors collaborate on all projects in their respective labs, so for brevity, only students and other affiliated researchers are identified in individual project descriptions.
The A Creativity Machine Environment Lab develops computational tools for design, with a focus on sketching, creativity and design cognition. Its work spans creativity support tools and design studies, tangible and embedded interaction and, most recently, computing for health and wellness.
Clement Zheng, PhD student; HyunJoo Oh, PhD alumna
Paper is a modest material with rich affordances for craft and design. In this work, we explore shaping carbon-coated paper with manual and digital fabrication processes to create functional and aesthetic physical interfaces.
Peter Gyory, MS student; Clement Zheng, PhD student
This project explores a mechanism for hot-swappable physical inputs on a game controller. In this iteration, we showcase a prototype controller for a game that requires the player to reconfigure the controller's inputs during gameplay.
Clement Zheng, PhD student; Amy Banic, visiting professor
This project explores the feasibility and implementation of bare-skin tracking of hand and body joints for virtual environments. Kinetic Skin is a technique that extends the methodology of Delicate Paper Interfaces: input designs, typically printed on paper, are instead printed as temporary water-based tattoos. Each tattoo sensor is worn across a joint, such as a finger, wrist or elbow. Currently, single- and multiple-joint tracking can control a virtual environment, and we plan to combine the technique with other cameraless tracking methods for richer 3D input. Potential applications include long-term tracking of body movements to inform rehabilitation; lightweight tracking for dance, music, other performing arts and gaming; and other 3D user interaction in virtual and augmented reality applications.
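To give a flavor of how a tattoo sensor worn across a joint might drive a virtual environment, here is a minimal sketch that converts a raw sensor reading into an estimated bend angle. The calibration values and the linear model are illustrative assumptions, not the lab's actual method.

```python
# Hypothetical sketch: map a resistive tattoo sensor's raw reading to a joint
# bend angle by linear interpolation between two calibration readings.
# "flat" and "bent" calibration values are invented for illustration.
def bend_angle(reading, flat=1000.0, bent=1800.0, max_angle=90.0):
    """Estimate joint flexion in degrees from a raw sensor reading.

    flat: reading when the joint is straight (0 degrees)
    bent: reading when the joint is fully flexed (max_angle degrees)
    """
    t = (reading - flat) / (bent - flat)   # normalize to 0..1
    return max(0.0, min(max_angle, t * max_angle))  # clamp to valid range

print(bend_angle(1400))  # halfway between calibration points -> 45.0
```

In a real system, a reading like this would be sampled continuously per joint and streamed into the virtual environment as an input channel.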
The BTU Lab is a collaborative hackerspace supporting fabrication, experimentation and design. While not a research lab, it’s a vibrant and dynamic hub for the ATLAS Institute student community, fostering a generation of creative technologists and inventors.
Ria Kahn, MS student
Join our interactive soldering workshop and learn a new skill.
The Interactive Robotics and Novel Technologies Lab explores human-centered principles for developing novel sensing, interactive and robotic technologies. Blending methods and techniques from computer science, design, engineering and the social sciences, the lab creates technologies that enable new forms of human assistance in applications, including collaborative work, education and space exploration.
Connor Brooks, PhD student
This project investigates control systems that help people operate complex robots. We researched different strategies for providing assistance to robot operators, including learning more about operators' goals so that robots can be more helpful.
Darren Guinness, PhD student; Annika Muehlbradt, PhD student; Daniel Szafir, Assistant Professor; Shaun Kane, Associate Professor
Off-the-shelf robots and touchscreens are used to provide tactile cues representing graphical information for individuals with vision impairments.
Michael Walker, PhD student; Hooman Hedayati, PhD student
A design for two novel interfaces for robot teleoperation using augmented reality: one focused on real-time control and one inspired by waypoint delegation. These designs increase operation effectiveness, support concurrent work and lower stress.
The Emergent Nanomaterials Lab manipulates matter on the smallest of scales to create materials with emergent properties, characterized by novel and sometimes surprising features arising from the interactions of multiple bodies. By synthesizing, assembling, combining, and organizing nanoscale building blocks, we design technologies that enhance the quality of human lives in the domains of health, energy, sensory augmentation, and self-expression.
Jesse Butterfield, PhD student; Phillip Vo, undergraduate student
This project explores how nano-engineered tattoo inks that impart specialized conductive and sensing properties can be used to permanently embed useful technologies in the skin. Potential applications include biomedical devices and wearable technologies that monitor and diagnose health issues.
The Laboratory for Playful Computation designs new programmable technologies and playful experiences to empower young people to learn through creative design.
Lila Finch, PhD student; Celeste Moreno, MS student
Art, science and computer science education are blended with an interdisciplinary curriculum that involves creating animated, programmed lanterns that visualize dynamic sensor data from hydroponic gardens and other sources. Curriculum is co-designed with teachers, who implement projects in their classrooms.
Annie Kelly, PhD student
A beginner-friendly toolkit for live performances, including programming light shows, with instructions for building inexpensive tangible controllers from popular DIY maker technologies.
Abigail Zimmermann-Niefield, PhD student
A machine learning toolkit for creating custom models of gestures and other embodied actions. Using a smartphone, the platform collects data from sensors via Bluetooth and provides feedback on activities ranging from physical therapy to improving one's golf chip.
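The core idea of a custom gesture model can be sketched very simply: record one or more template sensor windows per gesture, then label a new window by its closest template. This nearest-template approach, and all names and data below, are illustrative assumptions rather than the toolkit's actual implementation.

```python
# Illustrative sketch (not the lab's toolkit): classify an accelerometer
# window by mean squared distance to user-recorded gesture templates.
def msd(a, b):
    """Mean squared distance between two equal-length sample windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def classify(window, templates):
    """templates: dict mapping gesture label -> recorded sample window."""
    return min(templates, key=lambda label: msd(window, templates[label]))

# Hypothetical single-axis accelerometer windows recorded by the user.
templates = {
    "wave": [0.1, 0.9, -0.8, 0.7, -0.6],
    "chip": [0.0, 0.2, 0.5, 0.9, 1.2],
}
print(classify([0.0, 0.3, 0.4, 0.8, 1.1], templates))  # prints "chip"
```

A production system would add windowing of the live Bluetooth stream, multiple axes and more robust distance measures, but the train-by-example workflow is the same.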
The Transformative Human Interfaces for the Next Generation (THING) Lab employs shape-changing materials, novel sensors and unique design methods to make digital information tangible, paving the way for a new generation of interactivity that goes beyond sight and sound.
Purnendu, PhD student
Feel the power of sound waves and watch them levitate small objects. First in a line of projects aimed at harnessing the full potential of ultrasound for computer interfaces.
Lianne Gill, undergraduate student
Virtual reality has the potential to transform computer-aided design into a physical, embodied activity. We envision new VR controllers that support physical measuring, sketching and designing in VR. The first in a set of tools is a paintbrush with smart bristles that senses the objects it touches and creates virtual copies.
Peter Gyory, MS student; Clement Zheng, PhD student
Haptic VR Wizard is a Virtual Reality prototyping environment for physical objects. With this tool, designers can mock-up and test haptic feedback for appliances, furniture and robots through a combination of cardboard prototyping, 3D printed MechaMagnets and Wizard of Oz techniques.
Sydney Levy, undergraduate student; Toby Wu, undergraduate student; Mel Plett, MS student
Pneumatics enable a new generation of soft, deformable and shape-changing computer input devices, but designers and educators lack the tools to easily create and control inexpensive pneumatic interfaces. Our open-source toolkit fills this gap through a mix of hardware, software and instructions.
Ethan Choe, undergraduate student; Corbin Peters, undergraduate student
To address information overload from graphics and sound, we research different ways of presenting critical driving data through touch. Our vehicle haptics prototype renders sensor data, infotainment information and alerts through vibration and airflow.
Joel Swanson is an artist and writer who explores the relationship between language and technology. His work playfully subverts the technologies, materials and underlying structures of language to reveal its idiosyncrasies and inconsistencies. His work ranges from interactive installations to public sculptures that playfully and powerfully question words and their meanings.
Plans and preliminary conceptual work for upcoming public art installations in Minneapolis, Chicago and San Francisco, as well as an upcoming solo show at the Republic Plaza Building in downtown Denver.
The Unstable Design Lab weaves anthropology, art, design and engineering to imagine the future of human-technology relationships. We explore how instability—the idea that technology may challenge us, or not work as we expect—can be embraced through design to help us live more humanely, creatively, and sustainably with technology.
Josephine (Jolie) Klefeker, undergraduate student
Inspired by autonomous sensory meridian response (ASMR) media, the goal of this research is to construct a sonic toolkit for the recording and collection of sounds to support new and unconventional perceptions of one's environment.
Mikhaila Friske, PhD student
AdaCAD is a web-based tool for designing weave drafts for smart textiles. By blending conventions from circuit design and textile design, it lets a designer simultaneously view a smart textile in terms of its aesthetics and its functional and structural properties.
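Under the hood, a weave draft can be thought of as a binary grid, where each cell says whether the warp thread lifts over the weft at that intersection. The sketch below generates a plain (over/under) weave in that representation; AdaCAD's actual data model is richer, so this is an assumption for illustration only.

```python
# Hedged sketch: a weave draft as a binary grid, 1 = warp lifts over weft.
# plain_weave produces the simplest over/under interlacement pattern.
def plain_weave(rows, cols):
    return [[(r + c) % 2 for c in range(cols)] for r in range(rows)]

draft = plain_weave(4, 4)
for row in draft:
    # Render in the X/dot notation common to drafting software.
    print("".join("X" if cell else "." for cell in row))
```

A smart-textile tool then layers functional annotations, such as which rows carry conductive yarn, on top of a structural grid like this one.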
Andrea Chamorro, undergraduate student
Finds eye-catching, insightful patterns in biodata and other sources, and uses them to generate patterns for e-textiles.
Shanel Wu, PhD student; Mikhaila Friske, PhD student; Chad DiLauro, undergraduate student
Various smart textiles created using a standard handweaver's floor loom and conductive yarns able to sense touch, change color and integrate with other electronic systems, such as Arduino.
Laura Devendorf, Assistant Professor; Rachel Bork, undergraduate student
Weaving a textile that is partly determined by environmental conditions. Every fourth weft thread was set based on current wind speeds.
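The rule described above, with every fourth weft thread determined by current wind conditions, can be sketched as a simple mapping from wind speed to yarn choice. The threshold and yarn names below are invented for illustration, not the weaving's actual parameters.

```python
# Illustrative sketch of the rule above: every fourth weft row's yarn is
# chosen from the current wind speed. The 10 mph threshold and the yarn
# names are hypothetical.
def choose_weft(row, wind_mph, calm_yarn="wool", windy_yarn="silver"):
    if row % 4 != 0:
        return calm_yarn  # ordinary pick rows are unaffected by weather
    return windy_yarn if wind_mph >= 10 else calm_yarn

picks = [choose_weft(r, wind_mph=12) for r in range(8)]
print(picks)
# -> ['silver', 'wool', 'wool', 'wool', 'silver', 'wool', 'wool', 'wool']
```

The result is a cloth whose structure records the environmental conditions present while it was woven.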
Whaaat!? is the reaction to an amazing interaction or novel experience—the feeling when the reality in front of you fails to match up with your expectations. The Whaaat!? Lab explores new ways to create this feeling and bring more delight to the world through games and experimental interactions.
Matthew Bethancourt, Lisa Bethancourt, Danny Rankin
This interactive, story-driven game is experienced through an analog telephone: users interact with and speak into the phone to solve puzzles, find clues and advance the story.
Amanda McAndrews, Faculty Services Portfolio Manager; Caroline Sinkinson, Associate Professor; Blair Young, MS student
Originating at the University of Mary Washington, DoOO encompasses a technology platform and the pedagogical support systems to help students gain digital literacies and skills, and develop their digital identities in a supportive environment.
Layne Hubbard, MS student; Samuel Nesmith, undergraduate student; Josh Brown, MS student; Tom Yeh, Assistant Professor
MindScribe robots enable preschool children to tell stories about their creative inventions. Through voice interaction, MindScribe stuffed animals ask preschoolers questions about their creations and guide them in improvisational storytelling.
Keke Wu, MS student; Matt Whitlock, PhD student
A novel system that integrates mobile data collection and cloud-based data fusion with mobile/AR visualization for in-situ field analysis.
Annie Bruns, Instructor; Amanda Rodriquez-Espinola, PhD student
Incorporates mindfulness practices and changes to smartphone layout, encouraging students to consume social media/internet content more mindfully.