Facebook is taking additional steps to lower the ranking of clickbait articles in the News Feed, something it began explicitly targeting last year. In the past, it has used a combination of signals, including the ratio of reads to shares, to determine whether an article over-promises and under-delivers, and has down-ranked sites and domains which persistently post clickbait. But it’s now examining the actual content of headlines, both for withheld information and for exaggeration, and lowering the ranking of pieces which exhibit those characteristics. On the one hand, this is a good thing: less of this content on Facebook means we’re all more likely to read worthwhile stories that actually tell us something useful or meaningful. On the other hand, this stuff has always existed, and no one has ever attempted to regulate it in the way Facebook now is. Unlike fake news, which has the power to sway elections and cause other significant harm, clickbait has far less real-world impact. And if people continue to click on these headlines, it suggests they’re interested in reading the contents whether or not the headlines are misleading or manipulative. The stuff wouldn’t be shared by users on Facebook or show up in the News Feed in the first place if it weren’t popular, which means Facebook is making value judgments here which not all of its users would agree with. As with Google’s frequent tweaking of its search algorithms to suppress sites whose behaviors it disapproves of, I always feel this is dangerous territory.
This is a great idea, and I hope we’ll see a lot more of this kind of innovation around news – we need it. One of the things I’m most struck by almost daily is the different universes I’m a part of on Twitter and Facebook – during the day, I’m surrounded by mostly very liberal perspectives among the coastal tech people I follow on Twitter, while in the evenings and at weekends I spend more time on Facebook, where the people I’m connected to tend to be more conservative. I suspect many of us inhabit mostly one or the other of these worlds, or tend to shut out perspectives different from our own on social media, which reinforces our perceptions and prejudices. Not everyone will go for this kind of experiment – some may choose to continue to see a narrower view of the world – but we could all benefit from putting ourselves in others’ shoes and seeing the news through lenses other than our own.
Fake news and the related topic of filter bubbles are areas BuzzFeed has been particularly strong on in recent months (abuse on Twitter is another). This analysis is fascinating, and shows how even the experience of watching video on Facebook can be colored by the outlets a user chooses to follow. This isn’t quite the same as Facebook’s algorithms showing users different things – in this experiment, the user consciously chose to watch either a Fox News or Fusion live video stream. But it’s a great illustration of how users on Facebook can have completely different experiences even when engaging with the same underlying content.
This is one of two bits of news from Facebook today (the other concerns metrics), this one about dealing with fake news (though that’s a term Facebook continues to eschew in favor of talking about genuineness and authentic communication). Facebook is again tweaking its algorithms to produce better feeds, with fewer sensationalist or inaccurate news reports. It looks like this is mostly about ordering within the feed rather than whether something appears there at all, which is a nice way of avoiding perceptions of outright censorship, though of course the lower something appears in the feed, the less likely people are to see it. It’s good to see that Facebook continues to tweak its strategy for dealing with fake news, and as with previous moves around news it’ll be very interesting to see how it’s perceived by users and publications.
Continuing Our Updates to Trending – Facebook (Jan 25, 2017)
It’s a big day for Facebook news – I’ve already covered the new Facebook Stories feature and ads in Messenger, both of which are being tested. This is the only one that’s been publicly announced by Facebook, however, and it concerns Trending Topics, which appear on the desktop site. The changes are subtle but important – each topic will now come with a headline and a base URL such as foxnews.com, topics will be identified based on broad engagement by multiple publications and not just one, and the same topics will be shown to everyone in the same region rather than personalized. Though Facebook doesn’t explicitly say so (perhaps because it fears a backlash, perhaps because it would be a further acknowledgement of a thorny issue), all of these changes can be seen as partial solutions to the fake news issue. Citing specific headlines and publications allows users to see the source and judge whether it’s a reliable one, prioritizing broad engagement will surface stories that are widely covered rather than promoted by a single biased source, and showing the same topics to all users could be seen as an attempt to break through the filter bubble. These all seem like smart changes, assuming Facebook can deliver better on these promises than on some of its abortive previous changes to Trending Topics.
How Facebook actually isolates us – CNN (Jan 23, 2017)
This isn’t a new idea – it’s been around at least since Eli Pariser’s The Filter Bubble was published in 2011. But this study dives a little deeper and provides a scientific foundation for the claims made. It also demonstrates how much of the filtering and bubble behavior on sites like Facebook really taps into deeper human tendencies like confirmation bias, of which content shared through the mechanism of a social network is a massive enabler. Though the article doesn’t mention Facebook beyond the headline, the study itself was focused on Facebook, so these findings are about that network specifically, though the patterns would largely apply to others too. Because so many of these behaviors are grounded in fundamental human tendencies, they’re very tough to change. Facebook may share some blame for enabling rather than challenging those tendencies, but changing them would require a very deliberate attempt to break up the filter bubbles and actively confront users with information that contradicts their existing views, which seems very unlikely.
This news (FB’s own blog post here) should obviously be taken together with the hiring of Campbell Brown as head of news partnerships at Facebook, announced last week. It’s easy to see this as being about the whole fake news story, and there’s an element of that, but it goes much further. What’s interesting is the number of value judgments in Facebook’s own post about this – it isn’t neutral here when it comes to fostering news sites, and local news in particular. That’s clearly in its interests, but it extends beyond them too. Facebook is also very sensibly looking at business models beyond display ads for monetizing news content on its platform, something the industry needs as Facebook becomes the place where many readers consume publishers’ content.
Although Zuckerberg sets himself a personal goal every year, this one feels like a more corporate one than those he’s set in the past, and it’s hard not to read it as an attempt to understand and assuage concerns about Facebook’s increasing power and its role in our lives. I’m curious to see how Zuck goes about connecting with ordinary people and what he hears from them (and who else will be present to hear that feedback). It’s hard to tell at the outset whether this will be more of a stunt or PR exercise or a true listening tour, but Facebook and Zuckerberg definitely need to do more of the latter.
This is a fantastic post about how tech companies hide behind that identity, and shouldn’t. Facebook is the obvious example that springs to mind, and it does seem to be coming around on this point, but it applies to others too. Many tech companies abdicate responsibility, because responsibility means an imperative to act and self-examine, and most importantly to question the assumption that tech is always a force for good. We need more of that questioning in 2017.
BuzzFeed editor makes 2017 predictions – Business Insider (Dec 30, 2016)
Fake news is one of several areas where BuzzFeed has done excellent investigative work this year (harassment on Twitter is another), and editor-in-chief Ben Smith thinks we’re in for more in 2017 (I agree). The big question is whether 2017 will eventually see some sort of return to normalcy or a growing divide between the realities embraced by different groups of people – sadly, I believe it will be the latter.
This is a narrative that gained significant steam during the course of 2016 – the idea that Facebook is becoming incredibly powerful as a filter through which people experience the Internet and the world, and that this much power is dangerous. That danger is arguably heightened by the incredible power Mark Zuckerberg still has as CEO to single-handedly shape policy for the company. I suspect we’ll see a lot more of this kind of thing in 2017.
Mark Zuckerberg says it’s ‘extremely unlikely’ fake news on Facebook changed the election outcome – Recode (Nov 13, 2016)
Mark Zuckerberg has continued to resist calls for Facebook to see itself as a media company, and to accept the editorial responsibilities that come with that role. This puts him in conflict not only with much of the rest of the industry and its commentariat but also with many users, and it’s a tension that can only be resolved as Zuckerberg and Facebook recognize the product’s evolution and take steps to improve the user experience while reassuring users Facebook won’t abuse its power. That’s a really tough line to walk.
Mark Zuckerberg and Facebook in general have long strenuously resisted the media company label, not least because media companies are valued much lower than tech companies. And yet Facebook has become arguably the most influential media company in the world over the past few years, a fact that’s only become clearer as 2016 has gone on. This identity crisis also makes it harder for Facebook to make smart decisions about how to manage problems like fake news on the site – the sooner it reaches some conclusions, the better.
The article has several statements from Facebook itself at the end, which deny the main points of the article. However, this article helped feed a narrative which was already emerging, that Facebook was deliberately or otherwise suppressing trending topics with a conservative bent. It also played into the larger narrative that Facebook has too much power over what its users read and see of the world, a narrative that gained a lot of steam in 2016.
Free Basics is an initiative Facebook cares about, but it isn’t massively important to its overall performance, financially or otherwise. As such, this is an emotional setback, but not necessarily a material one. However, the reasons for the decision point to something Facebook should take far more seriously: increasing worries about its power and the ways in which it shapes users’ experience of the Internet.