When a member of the Aftenposten’s staff wanted to include Nick Ut’s photograph ‘The Terror of War’ in an article, it fell to the newspaper’s editor-in-chief, Espen Egil Hansen, to decide whether the image was too graphic to publish.
“The media have a responsibility to consider publication in every single case. This may be a heavy responsibility. Each editor must weigh the pros and cons,” Hansen wrote online, explaining how he came to choose to run the famous photograph in his publication.
But soon after the article was shared on Facebook, the writer’s own account was suspended from the social network, and the editorial staff later received an email from Facebook explaining that the war photo violated the site’s publication rules. Facebook also removed an article by the Aftenposten from its network.
Hansen wrote an open letter to Mark Zuckerberg, describing how he felt that Facebook’s actions had compromised the editor’s authority to decide what should be published: “This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California.” Facebook sits between readers and writers. Hansen declared Mark Zuckerberg “the world’s most powerful editor”, adding: “I think you are abusing your power, and I find it hard to believe that you have thought it through thoroughly.”
Even before the Aftenposten controversy, Mark Zuckerberg was already being forced to defend and clarify Facebook’s role as a news distributor. “We are a technology company, not a media company,” Zuckerberg had previously stated when asked about the matter. In reality, the Aftenposten scandal is further proof that technology companies and media companies aren’t so different after all – and that Facebook really is both.
Earlier in the summer, a Gizmodo report raised serious questions about how Facebook calculated the ‘trending topics’ it chose to display in the upper right-hand corner of its homepage.
After the publication of the report, the head of the project at Facebook, Tom Stocky, detailed how the service worked: “Popular topics are first surfaced by an algorithm, then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers.”
It was these “review team members” that the report cast doubt upon. Gizmodo’s sources detailed how bias could seep into the team’s output. It was suggested that the group was asked to ignore trending news about Facebook itself when producing the final lists, but one interviewee also related how conservative news stories were often mysteriously excluded by their colleagues.
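To see how a system like this can look neutral while leaving room for judgment, consider a deliberately minimal sketch of a surface-then-review pipeline. Every name and number here is invented for illustration – this is a reconstruction of the process as described, not Facebook’s actual code:

```python
# Hypothetical sketch of a surface-then-review trending pipeline.
# None of these names come from Facebook; they only illustrate where
# human judgment enters a system that looks purely algorithmic.

from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    engagement: float  # e.g. shares + comments over a time window

def surface_candidates(topics: list[Topic], limit: int = 20) -> list[Topic]:
    """Stage 1: the algorithm ranks topics by raw engagement."""
    return sorted(topics, key=lambda t: t.engagement, reverse=True)[:limit]

def human_review(candidates: list[Topic], suppressed: set[str]) -> list[Topic]:
    """Stage 2: reviewers confirm topics and drop anything on a
    suppression list -- the step where individual bias can seep in."""
    return [t for t in candidates if t.name not in suppressed]

topics = [Topic("election-debate", 9_400.0),
          Topic("celebrity-breakup", 12_000.0),
          Topic("facebook-outage", 8_100.0)]

# A reviewer's judgment call changes the output without touching the algorithm.
print(human_review(surface_candidates(topics), suppressed={"facebook-outage"}))
```

The algorithmic stage is identical on every run; it is the review stage, invisible to readers, where a single suppression decision reshapes what ‘trends’.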
In short, the ‘trending topics’ system suggested a computer-powered neutrality – but it actually relied heavily on humans, with all their inherent biases. Confidence in the system has since been so shaken that Facebook has laid off the editorial staff and changed the way the service works. Look at the ‘trending topics’ box on the homepage now: where there used to be hard news and brief descriptions of world events, there is rather more celebrity gossip, and no context is given for each news item.
But here’s the thing: even after the removal of the editorial staff, ‘trending topics’ is still biased. Our trust in computer algorithms as an expression of pure objectivity is misplaced. The set of instructions that makes up an algorithm has to be designed by a group of people first. Take a hypothetical situation: it may seem like a purely technical decision whether to take a user’s location into account when serving them news. If a system knows which area a reader is based in, it can mix in relevant local news. But this is actually an editorial decision: it is the difference between the New York Times and the International New York Times – it is choosing whether to prioritise major global events or more immediate local happenings. Remember what Zuckerberg said: “We are a technology company, not a media company.” Whether media is distributed by algorithm or curated by humans, every choice is both a technical and an editorial decision.
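To make that concrete, here is a tiny sketch of such a ranking function (the names and numbers are invented, not anything Facebook ships): a single “technical” parameter – a locality weight – is enough to flip the front page from global to local news.

```python
# A deliberately tiny ranking function: the "purely technical" choice of
# how much to weight locality is, in effect, an editorial line. All names
# and numbers here are hypothetical.

def score(story_importance: float, is_local: bool, locality_weight: float) -> float:
    """Blend global importance with a bonus for local stories.

    locality_weight = 0.0 reads like an international front page;
    a high value reads like a local paper. Choosing it is an
    editorial decision, even though it ships as a config value.
    """
    return story_importance + (locality_weight if is_local else 0.0)

# The same two stories rank differently under different "technical" settings.
global_story, local_story = 0.9, 0.6
for weight in (0.0, 0.5):
    ranked = sorted(
        [("global summit", score(global_story, False, weight)),
         ("city council vote", score(local_story, True, weight))],
        key=lambda pair: pair[1], reverse=True)
    print(weight, [name for name, _ in ranked])
```

Nothing about that parameter looks editorial in the code, yet tuning it is exactly the judgment a front-page editor makes.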
This is what’s crucial: Facebook seems to want to be a source for news. The company developed a news format called Instant Articles, through which, Zuckerberg predicted, people “will naturally read a lot more news.” Furthermore, when Facebook launched its Live Video platform, it paid companies like BuzzFeed and Vox to create content for the new format – actively commissioning journalists and videographers. This makes it all the more maddening that Facebook refuses to perceive itself as a media company. Facebook fails to recognise that it is inescapably a product of editorial decisions. It further fails to recognise that, as a news outlet, it must abide by a moral code. And what it fails to recognise most is how dangerous its attitude is. It feels like Facebook wants to be the world’s biggest news source, but it doesn’t fully understand the responsibilities that come with that position.