Tag Archives: Daniel J. Levitin

The song is you: a McGill University, University of Cambridge, and Stanford University research collaboration

These days I’m thinking about sound, music, spoken word, and more as I prepare for a new art/science piece. It’s at a very early stage, so I don’t have much more to say about it, but along those lines of thought, a recent piece of research on music and personality caught my eye. From a May 11, 2016 news item on phys.org,

A team of scientists from McGill University, the University of Cambridge, and Stanford Graduate School of Business developed a new method of coding and categorizing music. They found that people’s preference for these musical categories is driven by personality. The researchers say the findings have important implications for industry and health professionals.

A May 10, 2016 McGill University news release, which originated the news item, provides some fascinating suggestions for new categories for music,

There are a multitude of adjectives that people use to describe music, but in a recent study to be published this week in the journal Social Psychological and Personality Science, researchers show that musical attributes can be grouped into three categories. Rather than relying on the genre or style of a song, the team of scientists led by music psychologist David Greenberg with the help of Daniel J. Levitin from McGill University mapped the musical attributes of song excerpts from 26 different genres and subgenres, and then applied a statistical procedure to group them into clusters. The study revealed three clusters, which they labeled Arousal, Valence, and Depth. Arousal describes intensity and energy in music; Valence describes the spectrum of emotions in music (from sad to happy); and Depth describes intellect and sophistication in music. They also found that characteristics describing music from a single genre (both rock and jazz separately) could be grouped in these same three categories.

The findings suggest that this may be a useful alternative to grouping music into genres, which is often based on social connotations rather than the attributes of the actual music. It also suggests that those in academia and industry (e.g. Spotify and Pandora) that are already coding music on a multitude of attributes might save time and money by coding music around these three composite categories instead.
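To make the idea of “coding music around these three composite categories” concrete, here is a minimal sketch in Python. It is not the researchers’ actual coding scheme; the low-level attribute names and their assignment to clusters are purely illustrative. The sketch simply averages per-attribute ratings within each cluster to produce composite Arousal, Valence, and Depth scores for a song.

```python
# Illustrative sketch only: attribute names and cluster assignments are
# hypothetical, not taken from the published study.
from statistics import mean

# Hypothetical mapping of low-level musical attributes to the three
# composite categories identified in the study.
CLUSTERS = {
    "arousal": ["intense", "energetic", "loud"],
    "valence": ["happy", "fun", "upbeat"],
    "depth":   ["sophisticated", "complex", "thoughtful"],
}

def composite_scores(ratings):
    """Collapse per-attribute ratings (e.g. on a 1-7 scale) into the
    three composite category scores by simple averaging."""
    return {cluster: mean(ratings[attr] for attr in attrs)
            for cluster, attrs in CLUSTERS.items()}

# Ratings for one (made-up) song excerpt.
song = {"intense": 6, "energetic": 7, "loud": 5,
        "happy": 2, "fun": 3, "upbeat": 2,
        "sophisticated": 5, "complex": 6, "thoughtful": 6}

print(composite_scores(song))
# High arousal and depth, low valence: an intense, sad, sophisticated track.
```

A service that already rates songs on dozens of attributes could collapse them this way into three scores per track, which is the time-and-money saving the researchers point to.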

The researchers also conducted a second study of nearly 10,000 Facebook users who indicated their preferences for 50 musical excerpts from different genres. The researchers were then able to map preferences for these three attribute categories onto five personality traits and 30 detailed personality facets. For example, they found people who scored high on Openness to Experience preferred Depth in music, while Extraverted excitement-seekers preferred high Arousal in music. And those who scored high on Neuroticism preferred negative emotions in music, while those who were self-assured preferred positive emotions in music. As the title from the old Kern and Hammerstein song suggests, “The Song is You”. That is, the musical attributes that you like most reflect your personality. It also provides scientific support for what Joni Mitchell said in a 2013 interview with the CBC: “The trick is if you listen to that music and you see me, you’re not getting anything out of it. If you listen to that music and you see yourself, it will probably make you cry and you’ll learn something about yourself and now you’re getting something out of it.”

The researchers hope that this information will be helpful not only to music therapists but also to health care professionals and even hospitals. For example, recent evidence has shown that listening to music can speed recovery after surgery. The researchers argue that information about music preferences and personality could inform a post-surgery music listening protocol to boost recovery rates.

The article is another in a series of studies that Greenberg and his team have published on music and personality. This past July [2015], they published an article in PLOS ONE showing that people’s musical preferences are linked to thinking styles. And in October [2015], they published an article in the Journal of Research in Personality identifying the personality trait Openness to Experience as a key predictor of musical ability, even in non-musicians. This series of studies tells us that there are close links between our personality and musical behavior that may be beyond our control and awareness.

Readers can find out how they score on the music and personality quizzes at www.musicaluniverse.org.

David M. Greenberg, lead author from Cambridge University and City University of New York, said: “Genre labels are informative but we’re trying to transcend them and move in a direction that points to the detailed characteristics in music that are driving people’s preferences and emotional reactions.”

Greenberg added: “As a musician, I see how vast the powers of music really are, and unfortunately, many of us do not use music to its full potential. Our ultimate goal is to create science that will help enhance the experience of listening to music. We want to use this information about personality and preferences to increase the day-to-day enjoyment and peak experiences people have with music.”

William Hoffman, in a May 11, 2016 article for Inverse, describes the work in connection with Radiohead’s recently released music and an upcoming release from Chance the Rapper (along with a brief mention of Drake). Note: Links have been removed,

Music critics regularly scour Thesaurus.com for the best adjectives to throw into their perfectly descriptive melodious disquisitions on the latest works from Drake, Radiohead, or whomever. And listeners of all walks have, since the beginning of music itself, been guilty of lazily pigeonholing artists into numerous socially constructed genres. But all of that can be (and should be) thrown out the window now, because new research suggests that, to perfectly match music to a listener’s personality, all you need are these three scientific measurables [arousal, valence, depth].

This suggests that a slow, introspective gospel song from Chance The Rapper’s upcoming album could have the same depth as a track from Radiohead’s A Moon Shaped Pool. So a system of categorization based on Greenberg’s research would, surprisingly but rightfully, place the rap and rock works in the same bin.

Here’s a link to and a citation for the latest paper,

The Song Is You: Preferences for Musical Attribute Dimensions Reflect Personality by David M. Greenberg, Michal Kosinski, David J. Stillwell, Brian L. Monteiro, Daniel J. Levitin, and Peter J. Rentfrow. Social Psychological and Personality Science, 1948550616641473, first published on May 9, 2016

This paper is behind a paywall.

Here’s a link to and a citation for the October 2015 paper

Personality predicts musical sophistication by David M. Greenberg, Daniel Müllensiefen, Michael E. Lamb, and Peter J. Rentfrow. Journal of Research in Personality, Volume 58, October 2015, Pages 154–158. doi:10.1016/j.jrp.2015.06.002 Note: A Feb. 2016 erratum is also listed.

The paper is behind a paywall and it looks as if you will have to pay for it and for the erratum separately.

Here’s a link to and a citation for the July 2015 paper,

Musical Preferences are Linked to Cognitive Styles by David M. Greenberg, Simon Baron-Cohen, David J. Stillwell, Michal Kosinski, and Peter J. Rentfrow. PLOS ONE [Public Library of Science ONE], http://dx.doi.org/10.1371/journal.pone.0131151 Published: July 22, 2015

This paper is open access.

I tried out the research project’s website, The Musical Universe, by filling out the Musical Taste questionnaire. Unfortunately, I did not receive my results. Since the team’s latest research has just been reported, I imagine there are many people trying to do the same thing. It might be worth your while to wait a bit if you want to try this out, or you can fill out one of their other questionnaires. Oh, and you might want to allot at least 20 minutes.