Who: Open to the public
What: ATLAS Expo 2017
When: Wednesday, May 3, 5 to 6:30 p.m.
Where: Roser ATLAS Center lobby
Would a room-sized, working model of the Internet—complete with moving ping pong balls—help you understand how it works? How might an event-specific social media platform work? Interested in gaming live with friends on your phone over Bluetooth?
Come to the ATLAS Expo this Wednesday, May 3, and you can explore all of this and more. It’s a colorful kaleidoscope of 100-plus student projects in virtual reality, physical computing, mobile apps, human-computer interaction, design and creative technologies of all kinds.
Over the course of the semester, students have dreamed big, ventured outside the box and mashed disciplines to create useful, beautiful, striking, informative and surprising devices and objects of all kinds. For a sense of what you might see, check out the fall 2016 student Expo projects.
The 90-minute event kicks off at 5 p.m. and is free and open to the public. Below are a handful of projects you’ll find there. And if you come, make sure you download the event-specific social media app to discover what’s hot. Visitors will receive a link to engage with the web-based app when they arrive.
Luminous Science by Lila Finch
This 9-foot "garden spirit" paper-and-wire sculpture, illuminated by 240 individually programmed LEDs, is both a thing of beauty and a lesson in biology. Linked wirelessly to data from sensors in a nearby hydroponic garden, the lights in the sculpture will illustrate the biological processes actually taking place in plants.
When the lights are on, LEDs on the sculpture’s leaves twinkle to indicate photosynthesis, and pulsing lights indicate how sugar is being transported down into the plant. When conditions in the garden change (temperature, moisture, light, nutrient concentration, etc.) the sculpture responds accordingly.
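The article doesn't describe the sculpture's software, but the idea of mapping garden sensor readings to light behaviors can be sketched roughly like this. The sensor names, thresholds, and animation parameters below are illustrative assumptions, not the actual Luminous Science code:

```python
# Hypothetical sketch: choose LED behaviors from hydroponic-garden sensor data.
# All names, thresholds, and rates are assumptions for illustration.

def light_pattern(light_level, sugar_flow):
    """Pick an LED animation from two sensor-derived values.

    light_level: 0.0-1.0, ambient light at the garden.
    sugar_flow:  0.0-1.0, a stand-in for modeled sugar transport rate.
    """
    pattern = {}
    # Twinkling leaves indicate photosynthesis, which needs light.
    pattern["leaf_twinkle"] = light_level > 0.3
    # Pulsing lights trace sugar moving down into the plant;
    # pulse faster when the modeled transport rate is higher.
    pattern["pulse_hz"] = 0.5 + 2.0 * sugar_flow
    return pattern

daytime = light_pattern(light_level=0.8, sugar_flow=0.6)
night = light_pattern(light_level=0.05, sugar_flow=0.1)
```

In a sketch like this, a change in any garden condition flows straight through to a visible change in the sculpture, which is the behavior the project describes.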
Finch is an ATLAS doctoral student working in Ben Shapiro’s Laboratory for Playful Computation. Before earning an MS in chemistry from Caltech, she taught high school art and chemistry.
"I am interested in more creative ways for students to learn science and represent scientific ideas," Finch said.
She hopes this approach can engage students in similar projects that provide rich interdisciplinary learning opportunities. For example, students would choose a science topic to investigate, create a sculpture using nebuta (a form of Japanese lantern-making), learn basic programming skills and electronics, and build circuits.
No doubt, Luminous Science has the potential to light up a lot of young minds.
SwimSense by Varsha Koushik and Annika Muehlbradt
SwimSense is a wearable device that helps blind and visually impaired swimmers swim independently in a pool by warning them when they are about to reach the wall.
The device, which attaches to a swimmer’s thigh with Velcro, uses computer vision to track the black line on the bottom of the pool and provides audio feedback when the swimmer reaches the T at the end of the lane.
Currently, the device includes a Raspberry Pi computer and small camera. Koushik and Muehlbradt, both computer science majors, completed the project for their class, Inclusive Design and Adaptive Technologies, taught by Shaun Kane, assistant professor of computer science.
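The article only says the device tracks the black lane line, so the detection details are not public. As a toy illustration of the idea, a camera frame could be thresholded and the wide crossbar of the T flagged when dark pixels fill the bottom of the view; the frame data, threshold, and ratio below are made-up assumptions:

```python
# Hypothetical sketch of T detection: threshold one grayscale row and flag
# the wide crossbar of the T. Not SwimSense's actual pipeline.

def dark_fraction(row, threshold=80):
    """Fraction of pixels in a grayscale row darker than the threshold."""
    return sum(1 for px in row if px < threshold) / len(row)

def at_wall(frame, crossbar_ratio=0.6):
    """True when the bottom row of the frame is mostly dark line,
    i.e. the lane line's T crossbar fills the camera's view."""
    return dark_fraction(frame[-1]) >= crossbar_ratio

# A narrow line down the middle of the frame: keep swimming.
mid_lane = [[200] * 4 + [10] * 2 + [200] * 4 for _ in range(6)]
# Bottom row almost entirely dark: the T crossbar, cue the audio warning.
near_wall = mid_lane[:-1] + [[10] * 9 + [200]]

print(at_wall(mid_lane))   # not at the wall yet
print(at_wall(near_wall))  # warn the swimmer
```

On real hardware, frames would come from the Raspberry Pi's camera rather than from hand-built lists, and the thresholds would need tuning for pool lighting.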
ATLAS 360 Tour by Ian Smith
Imagine touring college campuses without expensive plane tickets or the environmental impact of long-distance travel. Or finding out where next semester’s classes are located without leaving your dorm room. Or virtually touring a building to help you navigate your surroundings before you even visit.
This project, created in the Unity game engine, takes users on a Google Cardboard-powered tour of the ATLAS building. Smith is a graduate student in the Creative Technology and Design track of the ATLAS MS program.
Heartbeat Visualizer by Ryan Doner
Learn to control your heart rate using biofeedback in virtual reality (VR). Users place an index finger on a pulse sensor, put on the VR headset and can then navigate around and get up close to a model of a heart beating in sync with their own pulse.
"Meditation is a really important aspect of my life," Doner said. "I believe that using virtual reality to teach people meditation can be a powerful tool for behavior change."
Inspired by Stanford's Virtual Human Interaction Lab, Doner hopes to study how this technology can improve environmental and health behaviors. The project uses an Arduino Nano and HTC Vive and is built in the Unity game development platform.
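The article names the hardware (Arduino Nano, HTC Vive, Unity) but not the code, so as a rough sketch, syncing the virtual heart to the user's pulse comes down to turning beat timestamps from the sensor into a rate. The standard inter-beat-interval calculation, with made-up sample data, looks like this:

```python
# Hypothetical sketch: estimate heart rate from pulse-sensor beat timestamps
# so a VR heart model could animate at the user's own pace.

def bpm_from_beats(beat_times_s):
    """Estimate beats per minute from a list of beat timestamps (seconds)."""
    if len(beat_times_s) < 2:
        return None  # need at least one interval
    intervals = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

# Beats 0.8 s apart -> 75 BPM; the virtual heart would beat at this rate.
beats = [0.0, 0.8, 1.6, 2.4, 3.2]
print(bpm_from_beats(beats))  # 75.0
```

As the user calms down through biofeedback, the intervals lengthen and the computed rate, and with it the virtual heart, slows.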
Doner is a graduate student in the Information and Communication Technology for Development track of the ATLAS MS program.
SMART Tentacle by Adam Siefkas, Surjith Singh, Byron Becker, Ryan Riley, Vidur Sarin, Stan James
SMART Tentacle is a pool noodle transformed into a robotic tentacle that mimics a subject’s arm movement. The project is built from everyday materials, including paracord and laser-cut sections of plywood.
The tentacle is actuated by cords and servo motors that substitute for tendons and muscles. Arm movement is detected by a device worn on the forearm called a Myo Band. Packed with a variety of sensors, the Myo Band provides data that is used to actuate the appropriate servo motors on the tentacle, making the flexible arm mimic the physical movement of the person wearing the band.
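The sensing-to-actuation step described above can be sketched very loosely: the Myo Band reports an arm angle, and each tentacle segment's servo winds its cord to bend toward the matching angle. The segment count, travel limit, and simple linear mapping here are illustrative assumptions, not the team's implementation (which involves machine learning on the band's sensor data):

```python
# Hypothetical sketch: split one sensed arm-bend angle across the tentacle's
# servos, clamped to each servo's travel. All constants are assumptions.

SEGMENTS = 3          # assumed number of tentacle segments
MAX_BEND_DEG = 90.0   # assumed per-segment servo travel

def servo_angles(arm_bend_deg):
    """Map a sensed arm-bend angle to one target angle per segment servo."""
    per_segment = arm_bend_deg / SEGMENTS
    clamped = max(-MAX_BEND_DEG, min(MAX_BEND_DEG, per_segment))
    return [clamped] * SEGMENTS

print(servo_angles(60.0))  # each of the 3 servos bends 20 degrees
```

A real controller would stream these targets to the servos continuously, so the tentacle follows the wearer's arm in real time.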
Building the robotics and ensuring the data resulted in the correct movement were two distinct challenges, so the team drew students from two different classes: Soft Robotics, taught by ATLAS Director Mark Gross, and Machine Learning, taught by Ben Shapiro.