Facebook Inc (Nasdaq: FB) acknowledged that it held back from publishing a report about the most widely viewed posts on its social media platform from January through March 2021.
According to The New York Times, which obtained a copy of the report before Facebook released it, the most shared external link on Facebook during the first quarter of this year was a since-updated article that suggested a Florida doctor’s death was linked to the coronavirus vaccine.
Facebook executives debated publishing the report but decided to withhold it over concerns it would reflect poorly on the company, The Times reported.
In a tweet on Saturday, Facebook spokesperson Andy Stone said the social media giant prepared the report but decided not to publish it “because there were key fixes to the system we wanted to make.”
Stone also said the company ultimately decided to release the quarterly report because of the interest it had generated following media reports.
“We’re guilty of cleaning up our house a bit before we invited company. We’ve been criticized for that; and again, that’s not unfair,” said Stone, who went on to emphasize “just how difficult it is to define misinformation.”
Stone wrote, “News outlets wrote about the south Florida doctor that died. When the coroner released a cause of death, the Chicago Tribune appended an update to its original story; NYTimes did not. Would it have been right to remove the Times story because it was COVID misinfo? Of course not. No one is actually suggesting this and neither am I.”
The New York Times first reported on the shelved report Friday, two days after Facebook published a similar report for the second quarter of 2021 that offered a far rosier picture of what is trending on the popular platform.
In recent months, the White House has grown increasingly frustrated with how Facebook and other social media companies handle misinformation, and with how that handling could complicate efforts to curb the spread of the virus.
In its community standards enforcement report released last week, Facebook said it had removed more than 20 million posts from the platform for violating policies on COVID-19-related misinformation since the onset of the pandemic.
In addition, the company said it had deleted more than 3,000 accounts, pages, and groups from its Facebook and Instagram platforms for repeatedly violating its rules against spreading false information, and had placed warnings on more than 190 million posts.
Source: Equities News