Will WhatsApp’s Misinfo Cure Work for Facebook Messenger?

To protect the election, the platform will limit message forwarding to five people at a time.
Photograph: Jorg Greuel/Getty Images

On Thursday morning, Facebook announced several new policies to wrangle misinformation on its platforms ahead of the November election. Among them: limiting the number of people or groups you can forward a message to at one time on Messenger. For a glimpse of whether that might work—and how well—you needn’t look further than another Facebook-owned company: WhatsApp.

Restricting Messenger forwards is just one of several tools that Facebook has rolled out to combat misinfo, and it barely made an appearance in the company’s press release. But it’s also one of the only measures with an established track record, albeit an opaque one. More important, it’s one of the few steps Facebook can take without sparking accusations of political bias from either side.

In 2018, misinformation ran rampant on WhatsApp, and it was linked to deadly consequences in countries like India, where the messaging app is the de facto means of online communication. Because WhatsApp is end-to-end encrypted by default, the platform can’t know the contents of messages as they propagate throughout its ecosystem. But it could at least slow the spread. That July, WhatsApp reduced the number of accounts that you could forward a message to, from 256 to 20 for most people. In January 2019, it trimmed that number again, to 5.

That’s the playbook Facebook is emulating with Messenger, lopping the maximum number of forward recipients from 150 down to 5. “We've already implemented this in WhatsApp during sensitive periods,” Mark Zuckerberg wrote in a Facebook post outlining Thursday's changes, “and have found it to be an effective method of preventing misinformation from spreading in many countries.”

Which is probably the case! WhatsApp did manage to cut the total number of forwarded messages on its platform globally by 25 percent after that first round of changes. And stricter limits, instituted in April, on “highly forwarded messages”—anything that routed through five or more people before it got to you—have curtailed those nuclear-grade viral chains by 70 percent. “The limits we have put in place at WhatsApp over the last two years have certainly reduced the spread of forwarded messages,” says WhatsApp spokesperson Carl Woog. “It would be difficult for us to say with certainty it reduces misinformation ‘only’—the user feedback we’ve gotten is that it also reduces sharing of harmless memes like ‘good morning’ messages.”

In other words, limiting forwards is a blunt instrument. “Measuring the impact of misinformation and disinformation on messaging apps with accuracy is close to impossible at the moment,” says Irene Pasquetto, cofounder of the Harvard Kennedy School Misinformation Review. “Especially on WhatsApp, given that all content is encrypted and we have no access to the data.”

That encryption has unquestionable, and essential, benefits for the privacy and security of billions of people. It also contributes to what Rutgers professor Britt Paris has coined “hidden virality”: content that gets passed around in private groups and messages outside of the public eye. “The little data we have on misinformation is what we get from publicly available and open source intelligence,” says Cristina Lopez, a senior research analyst at the nonprofit Data & Society who focuses on disinformation. “When you think about ‘Plandemic,’ and the way that was amplified so quickly publicly, it makes me shudder to think what that looked like privately. We were not able to measure that scale; there’s a chance that privately the spread started way before we were able to notice.”

Limiting Messenger forwards won’t shed any more light on what kind of content traverses those corridors, or how it spreads. It’s just playing the odds that it’ll slow the process down. At least one recent study indicates that it’ll work. Last fall, researchers from the Federal University of Minas Gerais in Brazil used data sets comprising posts from public WhatsApp groups in India, Indonesia, and Brazil to track the spread of messages and images—and to model what impact forwarding limits have on their spread.

“The finding we had is that it works quite well,” says Kiran Garimella, who worked on the study and is currently a postdoctoral fellow at MIT’s Institute for Data, Systems, and Society. “It reduces the speed of the spread of misinformation.”

In fact, the study found that limits on forwarding can reduce “velocity of dissemination” by an order of magnitude. That plays out in practice, as well, at least anecdotally. “Talking from a purely personal perspective, I’ve felt that measure,” says Lopez. “My family is Salvadorean. In Latin America, WhatsApp is the only app, pretty much. It did make the platform less spammy.”

Which is not to say that it’s a panacea. While a recent Harvard Kennedy School Shorenstein Center survey found some evidence that misinformation spreads widely on messaging platforms, the nature of hidden virality means that no one really knows how much it propagates there compared with private groups and other online vectors. And though the Minas Gerais study found a reduction in the speed of spread, Garimella also says a story that’s viral enough will still find its audience eventually.

There’s also the question of whom exactly it limits most effectively. “I think this works for political operatives and organized groups, meaning those who tend to share the same piece of content with many, many people, often using automated means,” Pasquetto says. “But it does nothing to the average user, who normally shares content with a couple of closed groups and a few selected people on a daily basis.”

Still, while limiting message forwards might be a limited tool, it’s at least a relatively unassailable one. As a universally applied standard, it should be inoculated from the claims of bias that have cowed Facebook in the past. “I’m a big fan of defined measures that are neutral to the type of content,” Lopez says. “Enforcement on specific content leads to political problems.”

Or it leads to lack of enforcement altogether. Mere hours after Facebook rolled out its latest policies, Donald Trump posted clear and present voting misinformation to his page. Rather than take it down, the company labeled it with a gentle corrective, so there it will stay. At least on Messenger its spread will be slow.


Brian Barrett is the executive editor of WIRED. Previously he was the editor in chief of the tech and culture site Gizmodo and was a business reporter for the Yomiuri Shimbun, Japan’s largest daily newspaper.