Facebook's Mood Manipulation Study: What Does It Mean To Brands?

By now, most of you will have read or seen multiple media stories about Facebook's recently published mood manipulation study. There's a lot of debate about the ethical implications of the research, and several European data protection agencies have already announced investigations into whether Facebook violated local privacy laws with the study.

But we think the questions for marketers go deeper: how will this research, and user response to it, affect how brands are able to engage with their customers on Facebook? My colleague Nate Elliott and I have just published a Quick Take on the subject. Our high-level assertions: 

  • While Facebook’s study crosses ethical lines, the data use is likely legitimate. Consumers are understandably outraged by what they perceive as an abuse of their postings. But Facebook’s Data Use Policy explicitly allows the firm to use data for internal research purposes. Still, the potential for users to abandon Facebook is real.
  • Facebook has novel data to analyze, and long term, that could change marketing practices significantly. The kinds of data that Facebook is starting to exploit are unique. It could actually combine evergreen affinities with contextually specific emotional states to change how brands buy media and measure performance.
  • But the short-term implications may cut its opportunities off at the knees. If Facebook, with all of its research and experimentation, causes users to feel like lab rats, it’s possible that they will leave the site in droves. That outcome could severely limit brand reach, and that in turn could drive away Facebook’s marketing customers, especially given today’s already reduced organic reach.

We'd love to hear your thoughts. Has Facebook cut off its nose to spite its face by publishing the study? Or is it simply on the bleeding edge of tomorrow's hyper-individualized advertising?

Comments

Outrage du jour

It seems that the only people who are really outraged by this study are those who don't really understand how Facebook works to begin with. ("Breaking: Facebook manipulates users' News Feeds to get them to look at ads, click and Like!") It's not great PR, no - but I doubt that the vast majority of FB users care about this (or have even heard of it).

With A/B and MV testing becoming more and more widespread, a wide swath of digital interactions will increasingly fit the definition of "experiments." And let's remember that, at any given time, FB is probably conducting dozens, if not hundreds, of similar studies. I don't agree that any of this crosses ethical lines. While FB could have handled the PR around this study a little more artfully, I highly doubt that this will have one iota of measurable effect on FB user growth.

Facebook Brand Contagion & More

Good that you folks are raising this question. It's a HUGE deal and points to the dangerous trend of digital 'Skinner box marketing' raised more than a year ago by veteran VC Bill Davidow.

In the post linked below, I point out that above and beyond the Facebook experiment blowback itself, we are seeing a convergence of three things.

1. Emotional manipulation as 'strategy'
2. The role of 'martech' or marketing technology as an enabler of Skinner box marketing
3. The role of big data (I point to Forrester's own advocacy of 'customer context') that feeds the emotional manipulation marketing 'drone' to drive lifetime customer monetization.

I think we are now in the top of the first inning on this issue, and we will see Facebook's competitors begin treating the backlash against unethical Skinner box marketing as an opportunity window to differentiate themselves.

Bottom line: the digital marketing industry has lost its 'moral compass,' if it ever had one to begin with. It needs an intervention and a new framework for non-manipulative engagement.

Link below to 'Digital Skinner Box Marketing: The New Brand Contagion'
https://medium.com/@Platformula1/digital-skinner-box-marketing-196641c08678

Internal use

Facebook's terms of use policy does indeed specify "internal" use, and everyone should expect that the company is playing around with its algorithms to maximize revenue. However, a psychological experiment that is then published in a peer-reviewed journal is decidedly not "internal" use and moves beyond the implied consent of accepting those terms of use.

This controversial paper

This controversial paper found a trivially small effect, one that was not even necessarily related to changes in mood (the study never measured mood directly). So this seems to be hype about Facebook and big data more than it is anything else.
