My Virtual Dream publishes PLOS study on Nuit Blanche results


“My Virtual Dream” is an innovative and interactive live performance experience of The Virtual Brain technology developed by researchers at Baycrest Health Sciences Centre in partnership with neuroscience experts around the world.

Baycrest Health Sciences’ “My Virtual Dream,” an innovative and interactive live performance experience at the intersection of science, art and music, is currently touring, with an appearance in Amsterdam in May and more scheduled for Irvine, California, in October.

MaRS Innovation is working with the Baycrest team to commercialize the technology behind the demonstration, known as The Virtual Brain.

My Virtual Dream was featured in TechVibes on July 8 and in a PLOS blog published on August 14, 2015.

“The Virtual Dream tour is a ‘living lab’ that engages the public, fuels science, creates art and educates while it entertains,” says Richard Tavener, executive producer of the Virtual Dream tour.

The exhibition and research project were originally mounted in partnership with the University of Toronto, Nuit Blanche 2013 and InteraXon, and also made a January 2015 appearance at the Ontario Science Centre.

Participants wear the Muse, a brain-computer interface headset provided by InteraXon, and use states of focus and mental relaxation to complete a science game and create a stunning array of visuals and music.

The brain data collected at Nuit Blanche has yielded insights into how the brain learns, reported in a science paper about this massive, one-night neuroscience experiment. The paper, which appeared in the July issue of PLOS ONE, found that:

Besides validating robust time-of-night effects, gender differences and distinct spectral power patterns for the two mental states, our results also show differences in neurofeedback learning outcome. The unusually large sample size allowed us to detect unprecedented speed of learning changes in the power spectrum (~ 1 min). Moreover, we found that participants’ baseline brain activity predicted subsequent neurofeedback beta training, indicating state-dependent learning. Besides revealing these training effects, which are relevant for BCI applications, our results validate a novel platform engaging art and science and fostering the understanding of brains under natural conditions.

Dr. Stacey Ivanchuk and Fanny Sie are the commercialization leads for The Virtual Brain at MaRS Innovation.