False Rumors Often Start at the Top

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

We know that false information spreads online like the world’s worst game of telephone.

But we don’t talk enough about the role of people in charge who say too little or the wrong things at critical moments, creating conditions for misinformation to flourish.

Consider the recent rumors and outrage that flew around President Trump’s health, the wildfires in Oregon and the message of a Netflix movie. Ill-considered communication from those at the top, including the president himself, made cycles of bogus information and misplaced anger even worse.

Every word that powerful people say matters. It may not be fair, but they must now anticipate how their words might be twisted, intentionally or not, into weapons in the online information war.

In short, Netflix’s communication projected the idea that its movie “Cuties” was the opposite of what it really was. Some politicians, parents and a Texas prosecutor called the film child pornography and pushed Netflix to ban it. Outcry about the movie has been amplified by supporters of the QAnon conspiracy theory, the false idea that top Democrats and celebrities are behind a global child-trafficking ring.

I want to be clear: There are always people who twist information to their own ends. People might have misplaced blame for the wildfires or dumbed down the complexities of “Cuties” even if official communications had been perfectly clear from the jump. But by not choosing their words and images carefully, the people in charge provided fuel for misinformation.

We see over and over again that unclear, wrong or not enough information from the beginning can be hard to overcome.

Conspiracy theories about President Trump’s coronavirus diagnosis and health condition in the last week were fueled by people close to the president misspeaking or obfuscating what was happening. And the White House’s history of spreading false information contributed to a lack of trust in the official line. (My colleague Kevin Roose also wrote about this fueling wild speculation about the president’s health.)

Nature abhors a vacuum, and the internet turns a vacuum into conspiracies. All of us have a role to play in not contributing to misinformation, but experts and people in positions of power shoulder even more responsibility for not creating the conditions for bogus information to go wild.


Facebook is expanding a blackout period for political and issue-related ads in the United States for days or longer after Election Day — a period in which officials might still be counting votes in the presidential election and other contests.

I want to make two points. First, Facebook’s ads blackout might be smart or it might be ineffectual, but it is definitely small fish.

Look at your Facebook feed. A lot of the overheated and manipulative garbage you see did not pay to be there. Those posts are there because they make people angry or happy, and Facebook’s computer systems circulate the stuff that generates an emotional reaction.

Yes, it’s extra galling if Facebook makes money directly from lies and manipulations. That’s a big reason some civil rights groups and company employees have called on internet companies to take a hard line against political ads or to ban them. But I suspect that most of the stuff that might rile people up if votes are still being counted after Election Day will be unpaid posts, including from President Trump — not ads.

Second, I am going to say something nice about Facebook. With the company’s ban on groups or pages that identify with the QAnon conspiracy announced this week and its gradually broadening crackdown on attempted voter intimidation and premature declarations of election victory, Facebook is showing courage in its convictions.

This is different. Too often the company myopically fixates on technical rules, not principles, and caves to its self-interest.

Facebook is taking a different tack in part because it doesn’t want to be blamed — as the company was four years ago — if there is confusion or chaos around the election. I love that Facebook is a little bit afraid.

It’s healthy for the company to ask itself: What if things go wrong? That’s something Facebook has often failed to do with disastrous consequences.

