Don’t expect much self-regulation from Facebook, even as scrutiny of the tech giant mounts. And no matter what happens between social media sites and Congress, don’t expect it to change the newspaper industry’s way – or at least this company’s way – of conducting business.
Controversy surrounds Facebook and Instagram, the site it owns, over accusations that the two spread misinformation, exploit users’ data and contribute to depression among teens.
And now a whistleblower has come forward to say Facebook knows these accusations are true yet continues to distribute harmful content.
Why does Facebook let it continue? Because it means more money, according to the whistleblower.
The algorithms associated with Facebook and Instagram track what we view on those sites. Once the algorithms detect a viewer’s interest, the company intentionally feeds the viewer more of that content. An engaged viewer means more time spent on the site. That means more views of advertisements and more money for Facebook. Divisive content, the company knows, creates more engagement, according to the whistleblower.
The whistleblower has revealed herself to be Frances Haugen, a data scientist and computer engineer. On the CBS program “60 Minutes” Sunday, she said Facebook consistently makes decisions that are about profit rather than public good. To back her point, Haugen leaked thousands of internal Facebook documents to the media and law enforcement.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” she told “60 Minutes.”
It could simply be chalked up to a company doing what companies do – making money. But there is a key difference: Facebook is a publicly traded company, and it is not allowed to mislead shareholders about its business practices.
This week, Haugen is scheduled to testify before a Senate committee; the discussions are expected to focus on Facebook and Instagram’s effect on children’s mental health. The discussions likely will seek to uncover whether Facebook knows it is harming its users, as many claim.
Greater federal regulation could eventually be the result, and apparently even Facebook is open to it, likely because self-regulation would put the company at a disadvantage against its competitors. Federal oversight, by contrast, would limit them all equally.
Whatever happens between federal lawmakers and big tech, don’t expect it to affect the way newspapers do business.
Remember: traditional media is considered a “publisher,” while social media is a “platform.” Publishers are legally responsible for what they print and are held to stricter standards, while platforms – treated as distributors, not creators, of content – are shielded by federal law, specifically Section 230 of the Communications Decency Act.
Our goal, as a publisher, is to protect our communities’ rights to free speech. We know it comes with limits – newspaper publishers and editors are tasked with monitoring what is or isn’t decent or offensive, and we’ve been doing it for our entire existence.
Whatever happens to the big-tech companies, so be it. Google, Facebook, Instagram, Twitter and the like can do as they please, but they created their own mess and should be held accountable for the national discord, depression and division they have helped sow. If that means more federal oversight, they made their bed.
As for us, we know our role is to encourage, not limit, useful dialogue and to be a good steward of important community conversations.