Facebook has come under intense scrutiny lately after it was revealed that the social media giant has conducted research on the billions of people who routinely log in to the website. Let’s think about that for a minute. Facebook has gathered data on its users without their knowledge or consent. Though the company has now stated that it will no longer do so without close scrutiny from top managers, can we really trust a site that has already gone behind our backs to gather information?
Facebook has said that it researches many different things and has no intention of stopping. It will, however, try to be more responsible. PC World reports that “Facebook is making a series of changes, like enhanced reviews of research projects that study emotions or affect specific groups of people.” This means that the study conducted in July on how people react to positive or negative posts appearing in their feeds would have undergone extra examination before being carried out.
Facebook has also said that future research would be “subjected to greater internal scrutiny from top managers” according to The New York Times, and would “train all of its engineers in research ethics.” Despite these promises, though, the social media giant continues to keep its research endeavors secret and stated that “no outsiders would be invited to review Facebook’s research projects.” In addition, “the company declined to disclose what guidelines it would use to decide whether research was appropriate. Nor did it indicate whether it would seek consent from users for projects like the emotion study.”
Users of Facebook have mixed feelings about the studies being conducted by the social media network. Most people “weren’t at all shocked by Facebook studying their data for research purposes – they were mad that the network was manipulating their experience to achieve a specific result,” says PC World. Facebook’s chief technology officer, Mike Schroepfer, said that the backlash was unexpected and forced the company to look closely at its policies. As PC World notes, “nowhere in its new framework did Facebook mention allowing people to opt in to these experiments, which is essential when you’re manipulating a user’s experience. The network should make participation in research optional – a privacy setting you can change like any other.”
Coming forward and admitting the error is a good start for Facebook, but continuing to hide some of its research will not help the network regain the trust of users. Full transparency about research is what Facebook needs. The company may have learned from this mistake and tried to fix its policies, but how long will it be before it crosses the line again?