Facebook under investigation for News Feed Experiment: Watchdog to assess if the firm breached data regulations



Facebook may have broken data protection laws when it altered the news feeds of almost 700,000 users back in 2012.

The so-called 'emotion contagion' experiment, in which the social media giant edited feeds to highlight either positive or negative items, has caused outrage among users.

And now the Information Commissioner's Office (ICO) is said to be investigating whether the site mishandled the private data of those affected.


Facebook is under investigation over its so-called 'emotion contagion' experiment, in which it altered the news feeds of almost 700,000 users back in 2012. The Information Commissioner's Office (ICO) is said to be looking into whether the site breached regulations about the use of its users' data


FACEBOOK'S FEED EXPERIMENT

The California-based firm carried out the experiment during a week in 2012.

During that time, negative posts were deprioritised in the news feeds of 689,003 users, to see whether this generated a more positive response.

Posts were determined by the experiment to be positive or negative if they contained at least one positive or negative word.

The experiment affected around 0.04 per cent of users - or 1 in 2500.

According to Facebook, nobody's posts were 'hidden'; they simply did not appear in some feeds.

It found that a reduction in negative posts led to more positive responses, while a reduction in positive posts led to more negative ones.

'When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,' said the researchers.

The experiment's results were revealed in a paper published in the PNAS journal at the weekend.

Facebook has since apologised for the way the paper described the research, and any anxiety that was caused, adding, 'the research benefits of the paper may not have justified all of this.'


A spokesman from the ICO told the Financial Times it was too early to tell what part of the law Facebook might have infringed.

The paper added that the Data Protection Commissioner in Ireland will also be contacted as the technology giant's European headquarters are in Dublin.

Facebook's director of policy in Europe, Richard Allan, told MailOnline: 'It's clear that people were upset by this study and we take responsibility for it.

'We want to do better in the future and are improving our process based on this feedback.

'The study was done with appropriate protections for people's information and we are happy to answer any questions regulators may have.'

An official statement from the site added: 'Our academic research and publication proposals are reviewed internally when proposals are made and before they are submitted for publication.

A spokesman from the ICO told the Financial Times it was too early to tell what part of the law Facebook might have infringed. The paper added the Data Protection Commissioner in Ireland will also be contacted as the site's European headquarters (pictured) are in Dublin


'This review analyses the impact on user data, including appropriate safeguards to help ensure the research does not disclose information associated with a specific person's Facebook account and that it uses the minimum amount of data required for the study.'

The experiment was carried out in one week during January 2012 in collaboration with Cornell University and the University of California.

The aim of the government-sponsored study was to see whether positive or negative words in messages would lead to positive or negative content in status updates.

Many users reacted angrily following online reports of the findings, with some referring to it as 'creepy', 'evil', 'terrifying' and 'super disturbing'.

'The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,' said Facebook data scientist Adam D. I. Kramer.

'We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.

During one week in 2012, Facebook manipulated feeds of just over 689,000 users to highlight either positive or negative items, and then monitored responses over the course of a random week. The site has since apologised for the way the paper described the research, and any anxiety that was caused


'At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

'Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.'

The experiment was limited to users who viewed Facebook in English, but it is not known which countries were included.

'At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it - the result was that people produced an average of one fewer emotional word, per thousand words, over the following week,' continued Kramer.

'I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.

Facebook data scientist Adam D. I. Kramer issued a statement over the weekend (pictured). He said: 'The reason we did this research is because we care about the emotional impact of Facebook. Having written and designed this experiment myself, I can tell you our goal was never to upset anyone'


'In hindsight, the research benefits of the paper may not have justified all of this anxiety.'

Commenting on the reports, Brett Dixon, director of the digital marketing agency DPOM, said: 'Despite Facebook's insistence this was merely an academic experiment, it sails perilously close to the illegal world of subliminal advertising.

'There's a reason this insidious form of manipulation is banned - it is an abuse of people's freedom to choose.

'But let's keep some perspective. This was a research project, not the birth of some social media thought police.'


