February 28, 2017
Neuroscience, for all its breadth, has often focused on the brain activity of individual subjects studied in isolation in order to understand how different regions of the brain respond to a given set of stimuli.
But what if the stimulus is a dialogue with another person or a storyteller? Assuming both brains are processing roughly the same information, is it reasonable to expect their neural activity to be closely mirrored?
Biomedical engineers at Drexel University, building on previous findings out of Princeton University, are attempting to answer new questions at the forefront of social neuroscience, an interdisciplinary field that examines how interpersonal ties and communication impact the operations of the brain and body.
The research team, whose findings were published this week in Scientific Reports, sought to improve on current brain imaging methods by creating a device to collect data outside the lab setting. Princeton researchers, using functional MRI (fMRI), previously demonstrated that a listener's brain activity does mirror the brain of a speaker telling a story about a real-life experience. The better the story is understood, the more this coupling phenomenon can be observed.
Unfortunately, fMRI doesn't allow researchers to measure brain activity in natural environments and contexts. That's where the Drexel team stepped in, creating a wearable headband that uses functional near-infrared spectroscopy (fNIRS), a light-based technique that measures the blood-oxygenation changes in the cortex that accompany neural activity.
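At its core, this kind of speaker-listener coupling analysis compares the storyteller's and the listener's hemodynamic time series. The sketch below is a minimal illustration of that general idea, assuming two equal-length, preprocessed oxygenated-hemoglobin signals sampled at 10 Hz and a simple time-lagged Pearson correlation; the function name, sampling rate, and lag window are illustrative assumptions, not the published study's actual analysis pipeline.

```python
import numpy as np

def lagged_coupling(speaker, listener, fs=10.0, max_lag_s=6.0):
    """Illustrative speaker-listener coupling measure (not the study's pipeline).

    speaker, listener: equal-length 1-D arrays of preprocessed oxygenated-
    hemoglobin signals from comparable channels, sampled at `fs` Hz.
    Returns the strongest Pearson correlation found within +/- max_lag_s
    seconds and the lag (in seconds) at which it occurs; a positive lag
    means the listener's response follows the speaker's.
    """
    # z-score both signals so the correlation is scale-free
    x = (speaker - speaker.mean()) / speaker.std()
    y = (listener - listener.mean()) / listener.std()

    best_r, best_lag = 0.0, 0
    max_lag = int(max_lag_s * fs)
    for lag in range(-max_lag, max_lag + 1):
        # shift the listener signal relative to the speaker signal
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:len(y) + lag]
        r = np.corrcoef(a, b)[0, 1]
        if abs(r) > abs(best_r):
            best_r, best_lag = r, lag
    return best_r, best_lag / fs

# Hypothetical usage: synthetic signals where the "listener" trails the
# "speaker" by two seconds plus noise.
rng = np.random.default_rng(0)
speaker = rng.standard_normal(3000)                     # 5 minutes at 10 Hz
listener = np.roll(speaker, 20) + 0.5 * rng.standard_normal(3000)
print(lagged_coupling(speaker, listener))               # peak near lag = +2.0 s
```

Searching over a window of lags simply allows for the possibility that the listener's hemodynamic response trails the speaker's by a few seconds rather than tracking it instantaneously.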
"We live in a social world where everybody is interacting," study leader Hasan Ayaz, an associate professor in Drexel's School of Biomedical Engineering, told DrexelNOW. "And we now have a tool that can give us richer information about the brain during everyday tasks — such as natural communication — that we could not receive in artificial lab settings or from single brain studies."
The current study recorded brain responses to three unrehearsed, real-life stories, one told by a native English speaker and two told by native Turkish speakers. As the researchers predicted, the prefrontal and parietal brain activity of the 15 English-speaking listeners who heard all three stories mirrored the storyteller's only when the story was told in English, the language they understood.
Based on these findings, future applications of fNIRS could include brain synchronization studies in environments from classrooms and business meetings to hospitals and protests.
"Now that we know fNIRS is a feasible tool, we are moving into an exciting era when we can know so much more about how the brain works as people engage in everyday tasks," Ayaz added.