Furniture that robots digitally reconfigure to match virtual reality, tattoos that serve medical and aesthetic functions, circuits woven into smart textiles: These are just a few of the projects that will be demonstrated at ATLAS Institute's third annual Research Showcase, held in partnership with CU Boulder’s Research & Innovation Week.

The showcase features laboratory and studio tours, demos and poster presentations, as well as opportunities to connect with the institute's thriving community of creative technologists, inventors and artists working across and between multiple disciplines. Showcase visitors can explore work in human-robot interactions, tech tattoos, tangible computer interfaces, computer science education, smart textiles and much more. 
DOWNLOAD EVENT PROGRAM 

If you go

Who: Everyone

What: ATLAS 3rd annual Research Showcase

Where: Roser ATLAS Center

When: Oct. 18, 3:30-5:30 p.m.

Cost: Free. REGISTER NOW 

 

Here are some of the projects attendees can visit in this year's showcase:

  • LEGO ChemBot: Constructed of thousands of LEGO pieces, this automated robot performs repetitive lab work such as weighing and combining compounds and liquids, temperature control, and remote sensing and actuation, providing functionality similar to commercially available robotic lab equipment at a fraction of the cost.
  • HOT SWAP: All Hands On Deck: Winner of the 2019 IndieCade "Innovation in Interaction Design" award for its novel interchangeable controllers, HOT SWAP is a collaborative naval combat game with five shared controllers that must be constantly swapped if the team is to survive ongoing attacks from a fleet of pirate ships. This is the first time HOT SWAP has been on display since winning the prestigious award. 
  • AdaCAD:  A web application to aid in the design of smart textiles, including features that help the user define large patterns, predict connectivity and visualize embedded circuits.   
  • Shapebots: Shape-changing swarm robots are a new type of computer interface that consists of a swarm of self-transformable robots. These robots can both individually and collectively change their shape to display information, actuate objects, act as tangible controllers and visualize data. 

About ATLAS

Part of CU Boulder’s College of Engineering & Applied Science, ATLAS is an interdisciplinary institute for radical creativity and invention. The institute has rapidly expanded its research activities over the last five years, attracting talented new faculty members and students, and establishing new research labs and design studios. If you haven't seen the new ATLAS, this is your opportunity!

Participating Labs and Studios 

ACME Lab 
Director: Ellen Do
The ACME (A Creativity Machine Environment) Lab works on computational tools for design, especially sketching, creativity and design cognition, including creativity support tools and design studies, tangible and embedded interaction and, most recently, computing for health and wellness. 

Identity Lab
Director: Jed Brubaker, Department of Information Science
The Identity Lab explores how identity is designed, represented and experienced through technology.

IRON Lab
Director: Dan Szafir
The Interactive Robotics and Novel Technologies Lab explores human-centered principles for novel sensing, interactive and robotic technologies. Blending techniques from computer science, design, engineering and the social sciences, the lab creates technologies that enable more nuanced and intuitive forms of robotic assistance in collaborative work, education and space exploration.

Joel Swanson Studio
Joel Swanson is an artist and designer who explores how technology influences and shapes language by critically subverting the related technologies, materials and structures to reveal idiosyncrasies and inconsistencies. Shown nationally and internationally, Swanson's work ranges from interactive installations to public sculptures that playfully and powerfully question words and their meanings.

Laboratory for Emergent Nanomaterials 
Director: Carson Bruns
The Emergent Nanomaterials Lab manipulates matter on the smallest of scales to create materials with emergent properties, characterized by novel and sometimes surprising features arising from the interactions of multiple bodies. By synthesizing, assembling, combining, and organizing nanoscale building blocks, researchers design technologies that enhance the quality of human lives in the domains of health, energy, sensory augmentation and self-expression.

Living Matter Lab
Director: Mirela Alistar
The Living Matter Lab looks to move complex diagnostics out of the lab and into the home through pioneering new technologies, in particular compact and highly configurable digital microfluidic biochips.

MettaCognition Lab
Director: Annie Bruns
The MettaCognition Lab investigates how mindfulness and loving-kindness (metta) meditation cultivates critical skills for resilience, emotional intelligence and attentional control.

THING Lab 
Director: Daniel Leithinger
The Transformative Human Interfaces for the Next Generation (THING) Lab employs shape-changing materials, novel sensors and unique design methods to make digital information tangible, paving the way for a new generation of interactivity that goes beyond sight and sound.

Unstable Design Lab  
Director: Laura Devendorf
The Unstable Design Lab interweaves anthropology, art, design and engineering to imagine the future of human-technology relationships. We explore how instability—the idea that technology may challenge us, or not work as we expect—can be embraced through design to live more humanely, creatively and sustainably with technology.

VisuaLab
Director: Danielle Szafir
The VisuaLab works at the intersection of data science, visual cognition, and computer graphics. We quantify how people make sense of visual information to develop novel visualization techniques, interactive systems, and computational models for working with data across domains.

Whaaat!? Lab
Directors: Matt Bethancourt & Danny Rankin
The Whaaat!? Lab explores and designs games and experimental interactions that aim to surprise and delight in ways that make the player say, “whaaat!?”

 

Projects

AR/VR and Robotics  
Michael Walker, Hooman Hedayati, Midhun Sreekumar Menon
IRON Lab, ATLS 234
This research shows that augmented and virtual reality can significantly improve users' performance and experience with robots as advances in robotic hardware and software bring robots into workplaces and homes.

Augmented Reality Remote Assistance (ARRA)  
Peter Gyory, Kyle Neubarth, Chad Lewis, Blake Hampton, Gabriel Chapel, Hyerin Seok, Dan Szafir, Ellen Do, Daniel Leithinger, Per Karlsson
ACME Lab, THING Lab, ATLS 2B29–MoCap Studio
ARRA improves remote collaboration and guidance for tasks like device repair and search and rescue by streaming video feeds of remote collaborators along with scanned 3D models of their environments. Experts can then navigate and annotate these 3D models in virtual reality. 

Autonomous Forest Trail Navigation  
Ashwin Vasan, Michael Walker
IRON Lab, ATLS 234
Using simulated 3D environments, we train a quadcopter to fly autonomously and navigate a forest trail environment. Using a VR headset, we can visualize and navigate this forest-based 3D simulation.

:: Body   
Armon Naeini
Open Access Week Installation
An interactive augmented reality installation selected for CU Boulder's Open Access Week, Oct. 21-24.

Delete After Death: Improving Facebook’s Postmortem Options
Katie Z. Gach, Facebook Memorialization Team
Identity Lab (Dept. of Information Science, CMCI), ATLS 225
Facebook is often legally obliged to delete accounts when it learns of a user's death. Erasing years of personal online reflections and photos can be deeply painful for surviving loved ones. This research led to revised postmortem account management options for all Facebook users worldwide.

Effective Highlight Colors for Visualizations  
Supriya Naidu, Danielle Szafir, 
VisuaLab (Department of Information Science, CMCI), ATLS 208
This project measures how effectively different highlight colors support search and positive aesthetics when quantitative data is visually represented.

Fabricating Soft Actuators  
Purnendu, Daniel Leithinger, Eric Acome, Christoph Keplinger
THING Lab, ATLS 231
Thin polymer sheets filled with vegetable oil and configured using a modified CNC machine show promise for the development of flexible pumps, valves and actuators, paving the way for a new approach to shape-changing interfaces that are low-cost, modular, soft and conformable. (A collaboration with the Keplinger Research Group in Mechanical Engineering at CU Boulder.) 

Furniture Music  
M Bethancourt
Whaaat!? Lab, ATLS 105
An interactive experience combining furniture placement with soundscape manipulation.

HOT SWAP: All Hands on Deck  
Peter Gyory, Clement Zheng, Ellen Do, Daniel Leithinger
ACME Lab, ATLS 229
HOT SWAP is an award-winning, collaborative naval combat game with five shared controllers that must be constantly swapped for the team to survive relentless attacks from pirate ships. HOT SWAP won the 2019 IndieCade "Innovation in Interaction Design" award for its use of novel interchangeable controllers.

Immersive Analytics  
Matt Whitlock, Keke Wu, Danielle Szafir
VisuaLab, ATLS 208
This project examines how to use immersive analytics most effectively and which domains would most benefit from applying virtual and augmented reality to data visualization and interaction.

LEGO ChemBot 
Kailey Shara
Laboratory for Emergent Nanomaterials, ATLS 204
Constructed of thousands of LEGO pieces, this robot performs repetitive chemistry lab work, such as weighing and combining compounds and liquids, temperature control, and remote sensing and actuation, providing functionality comparable to commercially available robotic lab equipment at a fraction of the cost. 

Mechamarkers  
Clement Zheng, Peter Gyory, Farjana Ria Khan, Daniel Leithinger, Ellen Do

ACME Lab, ATLS 229
Mechamarkers is a system for interaction designers that facilitates making and sensing low-cost physical inputs for 3D interfaces. 

Mindfulness in the Classroom
Annie Bruns, Autumn Stevens
MettaCognition Lab, ATLS 225
This study gauges the impact on the wellbeing of 100 college students who engage in a three-week social media fast combined with self-reflection and journaling practices.

Motion Capture Studio  
Peter Gyory, Daniel Leithinger, Ellen Do, Mark Gross
CMAP, ATLS 2B Mo Cap Studio
The Motion Capture Studio's state-of-the-art facility allows for live capture of actors and objects, enabling research in virtual reality, design, assistive technologies, animation, game design, theater and more.  

Personal Biochips  
Mirela Alistar
Living Matter Lab, ATLS 206
This work explores whether new programmable microfluidic biochips, which use electrical voltage to transport and combine droplets of fluid across an array of small electrodes, can replace the lengthy and expensive medical diagnostic tests currently performed by hand.

REFORM: Recognizing F-formations for Social Agents  
Hooman Hedayati, Annika Muehlbradt, Dan Szafir
IRON Lab, ATLS 234
This project introduces an algorithm that helps robots recognize and avoid interrupting focused, conversational groups, or F-formations, based on relative positions and orientations of participants. 

RoboGraphics: Dynamic Tactile Graphics Powered by Mobile Robots
Darren Guinness, Annika Muehlbradt, Daniel Szafir, and Shaun K. Kane
IRON Lab, Superhuman Computing Lab, ATLS 234
RoboGraphics combines a touchscreen tablet with static tactile overlays and small mobile robots to create dynamic tactile graphics, helping visually impaired users explore data quickly and accurately.

Robot Manipulation Using VR  
Arth Beladiya, Michael Walker
IRON Lab, ATLS 234
This project involves controlling a one-armed manipulator robot using virtual reality controller inputs. The arm follows the controller in real life, enabling simple, intuitive teleoperation of the robot.

RoomShift  
Ryo Suzuki, Hooman Hedayati, James L Bohn, Clement Zheng, Daniel Szafir, Ellen Yi-Luen Do, Mark D Gross, Daniel Leithinger
THING Lab, ATLS 231
RoomShift is a room-scale dynamic haptic environment for virtual reality, based on a small swarm of adapted Roomba robots capable of moving furniture around a room. 

ShapeBots  
Ryo Suzuki, Clement Zheng, Yasuaki Kakehi, Tom Yeh, Ellen Yi-Luen Do, Mark D. Gross, Daniel Leithinger
THING Lab, ATLS 231
These shape-changing swarm robots are a new type of computer interface consisting of a swarm of self-transformable devices that display information, actuate objects, act as tangible controllers, visualize data and more. 

Shape and Size Perception
Danielle Szafir, David Burlinson
VisuaLab, Information Science, CMCI, ATLS 208
This project explores how people see shapes, structures and patterns, and how these elements can be used to create better charts and graphs. 

Tech Tattoos  
Carson Bruns, Jesse Butterfield, Sean Keyser
Laboratory for Emergent Nanomaterials, ATLS 204
Tattoos of the future could give you real-time information about your physiology or environment. The Laboratory for Emergent Nanomaterials is one of the first research groups in the world to begin developing the unique inks and compounds needed to realize this vision. 

Unfabricate: Designing Smart Textiles for Disassembly  
Shanel Wu, Laura Devendorf
Unstable Design Lab, ATLS 207
With the e-textile industry still in its infancy, an opportunity exists to establish design standards that facilitate disassembly, recycling and reuse of used yarn and conductive thread. Unfabricate demonstrates one such approach with a design that makes unraveling for reuse quick and easy.

Visualization for People with Cognitive Disabilities  
Keke Wu, Emily Shea Tanis, Danielle Albers Szafir
VisuaLab, ATLS 208
To help individuals with cognitive disabilities make sense of budgetary data and become better self-advocates, this project explores how different visual design elements, such as chart types, chart embellishments and data continuity, impact visual communication. (A collaboration with the Coleman Institute for Cognitive Disabilities.)

Wearable Friends  
Sasha de Koninck 
Unstable Design Lab, ATLS 207
A wearable textile composed of tactile knot structures is combined with embedded interactive technology to provide soothing sensations to the wearer when under stress.

REGISTER NOW