Facebook's Sheryl Sandberg Apologizes For Studying Users Without Their Permission
Facebook tracked the behavior of nearly 700,000 of its users in a psychological study conducted without their knowledge in 2012, and the company is now apologizing for its lack of communication. The research was published in the June 2014 issue of the Proceedings of the National Academy of Sciences with the goal of determining how people reacted when one group was shown a more negative news feed and another group a more positive one.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Facebook’s Chief Operating Officer Sheryl Sandberg said while in New Delhi promoting her book Lean In. “And for that communication we apologize. We never meant to upset you.”
Facebook filtered some users’ news feeds to hide their friends’ happy news, statuses, and photos, while limiting how much negativity other users saw. The psychological experiment observed users for one week in 2012, and the collected data were analyzed with an algorithm to determine whether increasing or decreasing negativity in a feed could encourage or discourage positive posting among users.
“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook,” lead researcher and Facebook data scientist Adam Kramer wrote.
Sandberg is a technology executive and an advocate for women in business. In 2012, she was named to the TIME 100, and after her book was released in 2013, she came to be seen as one of the most prominent rising women leaders in the world. She served as vice president of global online sales and operations at Google until she joined Facebook in 2008. As Facebook’s second-in-command, she has apologized for the lack of communication, though many users remain unhappy about the study.
“We take privacy and security at Facebook really seriously because that is something that allows people to share,” Sandberg said.
Facebook’s secretive study was just “a glimpse into a wide-ranging practice,” Kate Crawford, a visiting professor at the Massachusetts Institute of Technology’s Center for Civic Media and a principal researcher at Microsoft Research, told The Wall Street Journal. Companies “really do see users as a willing experimental test bed.”
Many users weren’t happy to learn they had been treated like unwitting mice in a vast maze of experiments run over their heads. However, “like” it or not, there is little they can do about it: anyone with a Facebook account has already agreed to the company’s terms.
“As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research,” read the study.
Only those who took the time to decipher the policy’s jargon would have understood that it meant their private information and news feeds could be manipulated and used without their knowledge or permission at any point.