Run Facebook's mood-altering experiment on your own News Feed



It has forced Facebook to issue an embarrassing apology and shown just how much social networks can affect our mood. 

Facebook's mass manipulation of almost 700,000 users has caused uproar - and today Sheryl Sandberg was forced to admit it was a mistake.

However, a Brooklyn-based artist and programmer has built a tool that allows you to see it for yourself - and run Facebook's controversial mood experiment on your own news feed.


Sort it out: The Facebook Mood Manipulator allows you to filter the posts that appear in your feed based on the emotions they evoke


HOW TO DO IT

The plugin for Google's Chrome browser can be installed here.


The 'Facebook Mood Manipulator' is a simple browser extension that lets you select how you want to feel and filters your Facebook Feed accordingly.

Lauren McCarthy, the mastermind behind the project, built the tool in response to Facebook's research into massive-scale emotional contagion.

 

When the Facebook Mood Manipulator is installed, a small white box will appear in the top right corner of the News Feed.

It contains a set of sliding scales labeled with four emotions: positive, emotional, aggressive, and open.

Above the scales the Facebook Mood Manipulator prompts, 'How would you like to feel?'

Users can slide the scales back and forth and watch how their feed is affected in real time.

For instance, sliding the 'Emotional' scale all the way to the right yields more photos of weddings and babies. Maxing out the 'Aggressive' filter can lead to a feed peppered with politically oriented posts and news of a gun showdown in Georgia.
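A rough idea of how a content script for this kind of filtering might work is sketched below. It is illustrative only: the post selector, word lists and threshold rule are assumptions, not the extension's actual internals.

// Hypothetical content-script sketch - not the Mood Manipulator's actual code.
// The word lists, the '[role="article"]' selector and the threshold rule are assumptions.

type Emotion = 'positive' | 'emotional' | 'aggressive' | 'open';

const WORD_LISTS: Record<Emotion, Set<string>> = {
  positive: new Set(['happy', 'great', 'love', 'wonderful']),
  emotional: new Set(['wedding', 'baby', 'miss', 'cry']),
  aggressive: new Set(['fight', 'outrage', 'gun', 'attack']),
  open: new Set(['maybe', 'curious', 'wonder', 'imagine']),
};

// Fraction of a post's words that appear in the given emotion's word list.
function emotionScore(text: string, emotion: Emotion): number {
  const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
  if (words.length === 0) return 0;
  const hits = words.filter((w) => WORD_LISTS[emotion].has(w)).length;
  return hits / words.length;
}

// Hide any post that scores below what the sliders ask for.
// `sliders` holds the four slider positions, each normalized to the range 0..1.
function filterFeed(sliders: Record<Emotion, number>): void {
  const posts = document.querySelectorAll<HTMLElement>('[role="article"]');
  posts.forEach((post) => {
    const keep = (Object.keys(sliders) as Emotion[]).every(
      (emotion) =>
        sliders[emotion] === 0 ||
        emotionScore(post.innerText, emotion) >= sliders[emotion] * 0.05,
    );
    post.style.display = keep ? '' : 'none';
  });
}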

'You adjust the settings and it's not immediately clear what's changed,' McCarthy explained to Mail Online.

'However, you slowly realize that these posts do have an effect on you.

'It's not about turning your feed into a sunshine stream.

'But you can set this button, totally forget about it, and have a totally different experience on Facebook that's not explicitly clear to you, but is implicitly affecting you.'

Happy Facebook News Feed: Altering the filter ratios will display different types of posts. Maxing out the 'Aggressive' and 'Emotional' filters can lead to a feed peppered with politically oriented posts and sad news


Sad Facebook feed: Altering the filter ratios will display different types of posts. Maxing out the 'Aggressive' and 'Emotional' filters can lead to a feed peppered with politically oriented posts and sad news


The Facebook Mood Manipulator even uses Linguistic Inquiry and Word Count (LIWC), the same system used in the Facebook study, to evaluate phrasing in each Facebook post.
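LIWC works by counting what proportion of a text's words fall into predefined categories such as positive and negative emotion. The sketch below shows that style of scoring with tiny invented dictionaries; the real LIWC dictionaries are proprietary and far larger.

// LIWC-style scoring sketch: percentage of words falling into each category.
// These tiny dictionaries are invented for illustration only.

const LIWC_CATEGORIES: Record<string, Set<string>> = {
  posemo: new Set(['love', 'nice', 'sweet', 'great']),
  negemo: new Set(['hurt', 'ugly', 'nasty', 'hate']),
};

function liwcScores(text: string): Record<string, number> {
  const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
  const scores: Record<string, number> = {};
  for (const [category, dictionary] of Object.entries(LIWC_CATEGORIES)) {
    const hits = words.filter((w) => dictionary.has(w)).length;
    scores[category] = words.length ? (100 * hits) / words.length : 0;
  }
  return scores;
}

// Example: liwcScores('I love this great sunny day') -> roughly { posemo: 33.3, negemo: 0 }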

McCarthy says she saw the Facebook study, but wanted to make the implications more tangible. 'Why should Zuckerberg get to decide how you feel?' she writes on her website. 'Take back control… manipulate your emotions on your terms.'

Facebook has publicly apologized for such manipulations.

FACEBOOK'S FEED EXPERIMENT

The California-based firm carried out the experiment during a week in 2012.

During that time, negative posts were deprioritized in the data feeds of 689,003 users, to see if it generated a more positive response.

Posts were determined by the experiment to be positive or negative if they contained at least one positive or negative word.

The experiment affected around 0.04 per cent of users - or 1 in 2500.

According to Facebook, nobody's posts were 'hidden,' they just didn't show up on some feeds.

It found that negative posts elicited a swell of positive responses, but also that a reduction in positive news led to more negative posts.


The company issued a statement last week claiming it 'never meant to upset anyone' by altering the feeds of almost 700,000 users, adding, 'the research benefits of the paper may not have justified all of this.'

Today, while on a visit to New Delhi, Facebook COO Sheryl Sandberg admitted that the experiment was 'poorly communicated,' marking the first public statement on the study by a Facebook executive since the outrage erupted.

'This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated, and for that communication we apologize. We never meant to upset you,' she told the Wall Street Journal. 

During one week in 2012, Facebook manipulated feeds of just over 689,000 users to highlight either positive or negative items, and then monitored responses over the course of a random week. The site has since apologised for the way the paper described the research, and any anxiety that was caused


'The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,' said Facebook data scientist Adam D. I. Kramer.

'We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.

'At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.

'We didn't clearly state our motivations in the paper.

'Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.'

During the experiment, Facebook deprioritized content in users' News Feeds, based on whether there was an emotional word in the post.

Tests affected around 0.04 per cent of users - or 1 in 2500 - for a week, in early 2012.

According to Kramer, no posts were 'hidden,' they just didn't show up in certain users' feeds.

'Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads,' he explained. 
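To make the mechanism concrete, the sketch below shows one way such a rule could look: a post counts as positive or negative if it contains at least one word from a list, and targeted posts are omitted from a single feed load with some probability while staying eligible for later loads. It is an illustration only - the word lists and probability are invented, and this is not Facebook's code.

// Illustrative sketch of the deprioritization rule described above.
// Not Facebook's code; word lists and the omission probability are invented.

const POSITIVE_WORDS = new Set(['happy', 'great', 'love']);
const NEGATIVE_WORDS = new Set(['sad', 'awful', 'hate']);

interface Post { id: string; text: string; }

// A post counts as positive or negative if it contains at least one such word.
function containsWordFrom(text: string, words: Set<string>): boolean {
  return (text.toLowerCase().match(/[a-z']+/g) ?? []).some((w) => words.has(w));
}

// Build one feed load: targeted emotional posts are skipped with probability p,
// but remain in the pool and can still surface on a later load.
function buildFeedLoad(pool: Post[], suppress: 'positive' | 'negative', p = 0.1): Post[] {
  const targetWords = suppress === 'positive' ? POSITIVE_WORDS : NEGATIVE_WORDS;
  return pool.filter((post) => {
    if (containsWordFrom(post.text, targetWords) && Math.random() < p) {
      return false; // omitted from this load only, not permanently hidden
    }
    return true;
  });
}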

The study found that negative posts elicited a swell of positive responses, but also that a reduction in positive news led to more negative posts. 

'When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,' said the researchers.

'These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.'

Of the millions of posts analyzed, 4 million were found to be positive and 1.8 million were determined to be negative.

The findings led the team to conclude that 'in-person interaction and nonverbal cues are not strictly necessary for emotional contagion.'

The experiment was limited to users who viewed Facebook in English, but it is not known which geographic regions were included.

'At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it - the result was that people produced an average of one fewer emotional word, per thousand words, over the following week,' continued Kramer.

'I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.'

Commenting on the reports, Brett Dixon, director of the digital marketing agency DPOM, said: 'Despite Facebook's insistence this was merely an academic experiment, it sails perilously close to the illegal world of subliminal advertising.

'There's a reason this insidious form of manipulation is banned - it is an abuse of people's freedom to choose.'

'But let's keep some perspective,' he added. 'This was a research project, not the birth of some social media thought police.'


