Why Facebook couldn’t stop the New Zealand shooter from streaming his attack
The internet — including Facebook — is built for sharing.
“We don’t check what people say before they say it, and frankly, I don’t think society should want us to. Freedom means you don’t have to ask for permission first, and by default, you can say what you want.”
I’m assuming he still stands by those comments today, after a gunman used Facebook to live-stream his slaughter of dozens of people in New Zealand, though I’ve asked Facebook PR to confirm.
But regardless of how Facebook responds to criticism about its role today, along with Twitter, YouTube, Reddit, and the other platforms that helped spread images and videos from yesterday’s mass shooting in Christchurch, which authorities have deemed a terrorist attack, the key thing to remember is that the platforms did exactly what they’re designed to do: allow humans to share whatever they want, whenever they want, with as many people as they want.
I don’t want to be facile about this: Of course Facebook doesn’t want killers to live-stream their crimes worldwide. But the company built a tool that allows them to do exactly that. And it sits on a platform that is fundamentally built to let people say whatever they want, without asking for permission first.
As I wrote in 2017, this platform structure is key to Facebook’s enormous success as a company — users supply the content, and Facebook’s software spreads it around the globe, instantly, with as little friction as possible:
Facebook only works as a giant, billion-person-plus business because it allows users and advertisers to upload whatever they want to its platform, without human intervention. And the fact that Facebook doesn’t vet people’s comments, ads or (almost) anything else before it goes up is also what gives it a great deal of legal protection, particularly in the U.S.: If there’s something unpleasant or illegal up on Facebook, it’s not because Facebook put it there — someone put it on Facebook.
This set-up isn’t unique to Facebook. All of the giant consumer platforms that have sprung out of Silicon Valley in the last decade or so work the same way: YouTube and Twitter don’t sign off on your comments or videos before you upload them, and Airbnb doesn’t vet you before you rent space in your house.
As Zuckerberg noted in 2017, Facebook does want to remove objectionable content after it has gone up, and the company says it took down the shooter’s account shortly after the live stream. It also says it will spend billions of dollars on a combination of software and human beings to combat abuse in the future.
Last week, Zuckerberg announced plans to shift Facebook’s focus away from a public newsfeed and toward more personal, encrypted communication. But at the end of Facebook’s planned pivot, it would still allow the New Zealand shooter to do exactly what he did yesterday.
It’s possible that Facebook’s shift would reduce the virality of shooting footage or other horrific stuff, but it wouldn’t prevent that stuff from going up on the platform. It’s also possible that Facebook would have a much harder time policing it, since the company plans to provide full encryption for the messages people pass back and forth.
But per Zuckerberg, and again because Facebook is built this way, the company will police abuse on its platform only after it has happened, the same way police respond to a crime once they know about it.
Here’s a larger passage from the Zuckerberg essay I quoted at the beginning of this story:
Now, I’m not going to sit here and tell you we’re going to catch all bad content in our system. We don’t check what people say before they say it, and frankly, I don’t think our society should want us to. Freedom means you don’t have to ask permission first, and that by default you can say what you want. If you break our community standards or the law, then you’re going to face consequences afterwards.
It’s hard to imagine what consequences Facebook can impose on a person who killed dozens of people Friday. And it’s hard to imagine that this won’t happen again.