Some Facebook users have always been concerned about privacy and digital manipulation, and their fears may have come true. Facebook recently admitted to manipulating the emotions of nearly 700,000 users.
Studying Users’ Emotions
GoodTherapy.org recently reported on the published study. To study the effects of “emotional contagion” on social networking sites, Facebook removed some status updates from users’ news feeds and added others. All of the statuses were real, but the manipulation meant that some users saw larger quantities of negative content while others saw larger quantities of positive content. Researchers who analyzed the results found that users exposed to negative statuses were more likely to post negative messages of their own. The effect was the same for positive statuses, suggesting that emotions are “contagious” on Facebook.
Did Facebook Behave Unethically?
Because the study demonstrated the effects of emotional contagion, some Facebook users have suggested that Facebook toyed with users’ emotions. A few have even proposed that Facebook compensate users for its behavior by donating money to organizations that focus on mental health issues. Facebook has pointed out that the statuses might have appeared in a user’s news feed anyway, and that the research was anonymous, so the company didn’t gain any information about identifiable individuals.
Even if the experiment caused no actual harm and even if user privacy was protected, though, this research deviates from standard ethical norms for scientific research. Informed consent is almost always required, with limited exceptions for studies examining events that would happen to participants anyway. Facebook’s argument is that users would have seen positive and negative statuses no matter what. Because the manipulation exposed users to statuses they might not otherwise have seen, though, critics have argued that Facebook directly manipulated users’ emotions.

In a statement to the Wall Street Journal, one of the data scientists who worked on the project argued that the impact on users was minimal. He did, however, apologize for the wording of the study, which he admitted might have made it seem “sinister.”
References:
- Albergotti, R. (2014, June 30). Furor Erupts Over Facebook’s Experiment on Users. Wall Street Journal. Retrieved from http://online.wsj.com/articles/furor-erupts-over-facebook-experiment-on-users-1404085840
- Gray, S. (2014, June 29). Facebook admits manipulating 689,003 users’ emotions in psychology experiment. Salon. Retrieved from http://www.salon.com/2014/06/29/facebook_admits_it_manipulated_689003_users_emotions_in_psychology_experiment/
© Copyright 2014 GoodTherapy.org. All rights reserved.