Spatial Sonification & Visualization of Genetic Data for Rapid Identification of Cancer-Causing Mutations

This project will build and study an immersive multimodal system for representing and interacting with human genome data to discover genetic mutations that cause cancer. In conducting this research, we seek two outcomes that will transform cancer data exploration. The first is to enable free and creative exploration of massive genetic datasets, unconstrained by the scope and biases of other analysis methods. The second is to provide an interface that extends a user’s ability to process massive amounts of data quickly and with high sensitivity. To give future explorations maximum flexibility, we propose connecting users directly to the raw genome alignment data generated for hundreds of thousands of patient tumors, cancer cell lines, and healthy cohorts.
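
As a rough illustration of what connecting users to raw alignment data might involve, the sketch below streams per-base read depth from an indexed BAM file with pysam and maps each depth to a MIDI pitch, one simple sonification channel. The file path, genomic region, and depth-to-pitch scaling are hypothetical placeholders, not details of the project.

```python
import pysam


def depth_to_midi(depth, min_note=48, max_note=84, max_depth=100):
    """Clamped linear mapping from read depth to a MIDI note number."""
    d = min(depth, max_depth)
    return min_note + round((max_note - min_note) * d / max_depth)


def sonify_region(bam_path, contig, start, stop):
    """Yield (position, depth, midi_note) for each base in a region."""
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        # count_coverage returns four per-base count arrays (A, C, G, T);
        # their sum is the read depth at each position.
        a, c, g, t = bam.count_coverage(contig, start, stop)
        for i in range(stop - start):
            depth = a[i] + c[i] + g[i] + t[i]
            yield start + i, depth, depth_to_midi(depth)


if __name__ == "__main__":
    # Hypothetical inputs: any indexed BAM and a small region of interest.
    for pos, depth, note in sonify_region("tumor.bam", "chr17", 7_670_000, 7_670_050):
        print(f"{pos}\t{depth}\t{note}")
```

In a full system, the note stream would feed a synthesizer in real time rather than printing, and other read-level features (mapping quality, mismatches, split reads) could plausibly drive timbre or spatial position.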

Multimodal natural user interfaces will let users engage fluently and quickly with our system. Gestures such as multi-touch and dragging can select, move, and manipulate data; hand poses can execute commands; and embodied tilting or rotating can guide navigation. This multifaceted interaction streamlines the perception-interpretation-action-decision cycle and gives users the flexibility to explore continuously, jump to varied sections, or analyze different segments simultaneously for comparison.
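
To make this interaction loop concrete, here is a minimal sketch of how recognized gesture events might be routed to viewport commands over genome coordinates. The gesture names (drag, pinch, tilt), the GenomeView state, and the magnitude conventions are all assumptions for illustration; a real deployment would sit behind an actual gesture-recognition pipeline.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class GenomeView:
    """Minimal viewport state over a linear genome coordinate system."""
    contig: str = "chr1"
    start: int = 0
    width: int = 10_000

    def pan(self, fraction: float) -> None:
        # Shift the window by a fraction of its own width.
        self.start = max(0, self.start + int(self.width * fraction))

    def zoom(self, factor: float) -> None:
        # Rescale the window about its center, never below 100 bp.
        center = self.start + self.width // 2
        self.width = max(100, int(self.width * factor))
        self.start = max(0, center - self.width // 2)


@dataclass
class GestureRouter:
    """Dispatch named gesture events to view commands."""
    view: GenomeView
    handlers: Dict[str, Callable[[float], None]] = field(init=False)

    def __post_init__(self):
        self.handlers = {
            "drag": self.view.pan,                     # continuous panning
            "pinch": self.view.zoom,                   # pinch to zoom
            "tilt": lambda a: self.view.pan(a * 0.1),  # embodied tilt: slow scroll
        }

    def dispatch(self, gesture: str, magnitude: float) -> None:
        handler = self.handlers.get(gesture)
        if handler:
            handler(magnitude)


if __name__ == "__main__":
    router = GestureRouter(GenomeView())
    router.dispatch("pinch", 0.5)  # zoom in to half the width
    router.dispatch("drag", 1.0)   # pan one full viewport to the right
    print(router.view)
```

Keeping the dispatch table declarative makes it straightforward to bind the same commands to hand poses, touch, or tilt, which is what would let users mix modalities within a single perception-action cycle.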

ACME Lab
Brain Music Lab

Associated Researchers

Additional Researcher

Ryan Layer