The university’s View from the North blog.
As the controversy about beheading videos on Facebook restarted this week, I wrote a post about it on the University of Huddersfield’s View from the North blog, on which academics write about assorted current events. I’ve reproduced the post below.
WATCHING VIDEOS of people being beheaded is not a pleasant experience. I remember once inadvertently seeing unedited footage of a beheading in Iraq, as it came into the newsroom of a TV channel I was working for. The main lesson I took from it was to do whatever I could to avoid seeing another one.
But if I really wanted to, I could now satisfy my curiosity by visiting Facebook. The social network has quietly reversed its previous ban on the posting of beheading videos. Quietly, that is, until today, when the change attracted the full attention of the media. David Cameron even used Twitter, Facebook’s bitter rival, to condemn the decision as “irresponsible”.
It’s not especially controversial to say that beheading videos are bad in general, and that watching them is probably bad for us too. But the dilemma facing Facebook is more complicated than that. It comes down to this: is Facebook a publisher, or a platform? Or put another way: is it more like ITV, or a simple transmitter?
If ITV broadcast the beheading video currently being shared on Facebook, it would face potential sanctions from its regulator, Ofcom. But in Facebook’s case, there is no regulator. Nobody can fine it or take away its licence, even though members of the public have accessed the video through Facebook as surely as a hypothetical TV viewer might access it through a particular channel.
The argument made by social networks that they are merely platforms for others to post content is fine up to a point. But where Facebook in particular gets onto shaky ground is when it refuses to censor beheading videos on the one hand, but steps in to enforce its own ‘Community Standards’ on the other.
It rules all sorts of things out of bounds, from fake accounts to pictures of self-harm. You can understand why. But Facebook knows that the more it intervenes, the further it edges away from transmitter and towards publisher. That could mean extra responsibility for proactively policing material across its one billion users, which would be extremely costly in time and money. Facebook would much rather leave it to us.