Narrative: Facebook's Power

Each narrative page (like this one) opens with a description and evaluation of the narrative, followed by all the posts on the site tagged with that narrative. Scroll down past the introduction to see the posts.

Each post below is tagged with
  • Company/Division names
  • Topics
  • Narratives
as appropriate.
    Narrative: Facebook’s Power (Jan 24, 2017)

    Updated: July 1, 2017

    This narrative was the subject of the Weekly Narrative Video for the week of June 26-30, 2017. You can see the video on YouTube here, and it’s also embedded at the bottom of this page. 

    Facebook is one of the companies with the broadest reach in the world – the week this essay was updated in late June 2017, it announced that 2 billion people (or 27% of the world’s population) use it at least monthly, and well over a billion use it every single day. Only Microsoft, with Windows, and Google, with Android and several of its websites, can claim similar scale. The big difference between Windows and Android on the one hand and Facebook on the other, though, is that the former merely offer a framework for a variety of individual experiences presented through apps and websites, while Facebook is an experience in its own right.

    Facebook controls the time spent in its apps to a far greater extent than Microsoft or Google do within their respective operating systems. It controls the content people see, writing algorithms that determine which of the trillions of pieces of content available should be presented to users, and in what order. It is the funnel through which many users now get much of their news content, showing those users more of the things they click on, comment on, and share, and showing them less of the things they don’t respond to. And increasingly it is a home for that content itself, through its own live and recorded video services and other formats like Instant Articles.

    All of this gives Facebook enormous power to shape our world and our views of it. None of this is intended to narrow our view of the world, to shape our opinions, or to reinforce prejudices or stereotypes, but by seeking the objective of maximum engagement and time spent, Facebook tends to do all of those things anyway. However, it has shied away from accepting its role as a shaper of our world or even as a media company at all until very recently. Now, it finally seems to be taking its role as a filter more seriously, and is working with news organizations more proactively both to help them find sustainable business models (in response to lower monetization on its platform) and to combat fake news.

    However, these efforts place Facebook in a new role – one where it must make decisions about what’s true and what’s false, what’s good for journalism and what’s bad for it – and that will bring it into conflict with some powerful forces. The Donald Trump presidency has already been characterized by distortions of the truth and outright lies, and his election was partly enabled by an alternative view of reality created and fomented by a variety of online publications with various agendas, some of them, like Facebook’s own, only incidentally fostering these outcomes. Setting itself up as an arbiter of fake news will put Facebook in opposition to some of these groups and their acolytes, and that may well make Facebook’s role in all this increasingly complex.

    The fact remains that Facebook has a very powerful position in our popular culture, helping to shape opinions and enabling our inherent confirmation bias. Better that it stop denying this role and begin thinking about concrete ways to mitigate both the real and perceived effects that role has on our society.

    Facebook Launches Basic Food Ordering Feature Through Partners (Oct 13, 2017)

    Facebook has announced that it now offers an option to order food through its app, the result of integrations with a range of delivery partners and individual restaurant chains. The “Order Food” option is so far buried deep in the “Explore” tab, where few people are likely to find it, though I’m guessing users who make use of it frequently may see it move into their core navigation over time. The integration is basic, mostly launching an in-app browser aimed at the website of either a delivery service or a restaurant chain, with the user having to fill in all relevant information such as credit card, address, and other contact details if they don’t have an existing account. As such, this integration feels like it adds little value beyond the minimal utility of ordering food within the Facebook app rather than in a separate one. It’s a good reminder that, for all Facebook’s reach and power, its new feature launches are often underwhelming and unlikely to gain much traction, at least in their original form.

    via Facebook

    Facebook Cites Fake News Flagging Progress, Sandberg Discusses Russian Ads (Oct 12, 2017)

    Facebook’s COO Sheryl Sandberg was interviewed today by Axios’s Mike Allen on the subject of Russian election interference and other topics, while Facebook also issued some data about the effectiveness of its program to flag fake news on the platform. At the same time, the Washington Post reports that Facebook has removed a set of data from its site and tools which allowed for analysis of Russian-related postings.

    The Sandberg interview shed little additional light on the topic, with the only real news being that Facebook is sharing both the ads bought by Russian-backed entities and additional details associated with them with Congress, which in turn may share them publicly. However, she was also asked whether Facebook was a media company, a characterization she pushed back on, leading to several articles from actual media companies arguing that she’s wrong. There’s something of an ulterior motive for these stories given the tense relationship between Facebook and the media, but I continue to believe that these characterizations are wrong. To my mind, Facebook is much more like a cable TV company than a TV programmer, putting together a package of content for users while mostly not producing that programming itself, and selling ads that appear within the programming. I don’t think most people would argue that cable TV companies are media companies or that they’re responsible for the specific content of the programming, though they are responsible for establishing general policies and rules about what will run on their platforms.

    The data Facebook shared on its fake news flagging effort suggests that a fake news label, applied after fact checking by third parties, effectively reduces sharing and views, but the problem has always been that it takes several days for the label to be applied, which means most of the views have already happened by the time it takes effect. Facebook shared the data with its fact checking partners as a way to incentivize them to do better (something they’ve been asking for), but without massive new resources from Facebook or elsewhere, it’s not clear how those organizations will be able to work faster or cover more ground. That, in turn, will continue to limit the effectiveness of the program.

    Lastly, Facebook says the data it has pulled from its site with regard to Russian accounts should never have been available in the first place, and its disappearance therefore reflects the squashing of a bug rather than a decision to pull otherwise public information. Whether you believe that or not likely depends on your view of Facebook’s overall level of transparency in relation to the Russia story, which has clearly been limited. It appears Facebook at a corporate level is still desperate to control the flow of information about Russian influence on the platform, which likely isn’t helping its PR effort here – better to be as transparent as possible so that all possible bad news can come out quickly rather than continuing to trickle out.

    via Recode (Sandberg), BuzzFeed (fake news), Washington Post (data removal)

    Facebook Tightens Ad Review, Security Exec Criticizes Media Coverage (Oct 9, 2017)

    I’m actually tying three stories together here, only two of them referenced in the headline. The first is news that Facebook is tightening the review process for ads that seek to target by politics, religion, ethnicity, or social issues, requiring human approval before these ads can be shown to users. Secondly, Facebook’s Chief Security Officer, Alex Stamos, went on something of a Twitter rant on Saturday in which he complained about what he described as overly simplistic coverage of complex issues by the media. And thirdly, CBS aired an interview on Sunday with the Trump campaign’s digital director, who claims that the campaign worked in very direct and sophisticated ways with Facebook to micro-target its ads, including having Trump-sympathetic members of the Facebook staff working directly with the campaign in its offices.

    The ad review change is a sensible one in response to recent revelations about how these tools were used in the past, but is likely to catch lots of entirely innocent activity too – e.g. someone targeting members of a particular religion with products or services relevant to them – and will likely slow down the approval process for those ads. It will also slow down the approval process for political ads during campaigns, when the volume of ads tends to rise dramatically, and the review team will need to be augmented significantly. That delay could prove costly as campaigns become more nimble in responding to news in real time and want to target ads immediately. We won’t know the impact of that until next year, as mid-term campaigns ramp up.

    The Stamos rant garners some sympathy from me, because I agree that some of what’s been in the press has assumed that Facebook should have been aware of these attempts to game its systems at a time when the US government and security agencies hadn’t yet addressed the issues at all in public. But the rant is also indicative of what appears to be a split between the security and engineering teams at Facebook, which clearly want to speak out more, and the PR and broader senior management team, which seem to want to say as little as possible – several reporters I follow on Twitter responded to the thread with frustration over the fact that Facebook hasn’t made people available to talk about the details here.

    Lastly, the CBS story doesn’t seem to have been picked up widely and may be partly exaggeration on the part of the source, but there’s no doubt that the Trump campaign did use the tools Facebook offers extremely effectively, and that this use played an important role in the outcome. What’s important here is that the campaign’s uses were all legitimate, in contrast to the use of Facebook by Russian actors claiming to represent US interests, but the effects and even the techniques used were in many ways similar. Even as Facebook clamps down on one type of influence, the broad patterns will remain similar, and as long as foreign actors can find US-based channels willing to act as fronts, it’s going to be extremely difficult to shut down this type of activity entirely.

    via Axios (ad review changes), Twitter (Stamos), CBS (Trump campaign)

    Research Suggests Reach of Russian Facebook Posts Much Larger Than Ads Alone (Oct 6, 2017)

    Facebook still hasn’t shared with Congress all of the details of the ads bought by Russian agents on the platform over the last few years, and hasn’t really shared any of the details with the general public. However, some of the details have emerged regardless, and one researcher has used that information to analyze the reach of some of the posts on the accounts controlled by entities tied to the Kremlin. What he found is that the organic reach of those posts has been enormous, much larger than the reach of the ads alone as reported by Facebook, suggesting that Facebook is using the narrowest possible definitions of reach in its reporting and thereby downplaying the impact.

    Until Facebook releases the full details of the Russian operations, we can’t know the true reach for sure, and this analysis is merely indicative of the organic reach achieved by half a dozen of the biggest accounts we do know about. But it’s clear that the operation was both sophisticated and very effective in reaching large numbers of people, leveraging many of the same techniques used by legitimate news organizations and others on Facebook. Given that these techniques are available to anyone who uses Facebook, the only way they could have been stopped is if there had been clear evidence much earlier in the process that the accounts behind them were “inauthentic” (to use Facebook’s terminology). And given that neither Facebook nor the US government was actively investigating that possibility during the election, that was never likely to happen. It’s also not clear how Facebook would go about policing this kind of thing going forward.

    via The Washington Post

    Instagram Offers Cross-Posting of Stories to Facebook (Oct 5, 2017)

    Facebook seems determined to keep trying to make the Stories format a success in its core app, even as all the evidence shows that hardly anyone is using it. The latest push is a feature which enables users of Instagram’s Stories feature to cross-post a Story created in that app to the Facebook equivalent as well. That will certainly provide a low-friction way to get people to create content for the Facebook Stories feature, and will therefore likely lead to at least a small increase in usage. But the big difference between Instagram and Facebook is often the size and nature of the audience. Yes, some people have big followings in both places, and for them cross-posting will be natural and even useful, but for many others the appeal of Instagram is the smaller, more intimate audience they publish to there, in contrast to the mishmash of people known well and not so well that clutters many people’s Facebook networks. As such, the appeal and usage of the feature is likely to be somewhat limited, for all the same reasons that Facebook’s Stories feature in general hasn’t taken off.

    via TechCrunch

    Facebook to Share Russian Ads with Congress, Hire 1000 to Review Ad Submissions (Oct 2, 2017)

    Facebook has made yet another announcement in what’s rapidly becoming the saga of Russian ad-buying on the platform and the ongoing fallout from it. This time around, it says it’s going to share the details of the 3000 suspicious ads placed on the platform with the US Congress, and it’s also going to hire a thousand additional people for its ad review team to ensure that inappropriate ads don’t get through. The rest of the announcement focuses mostly on fleshing out promises made over the last couple of weeks, though there’s still relatively little transparency about what’s actually going to change, and in some cases when. Over the weekend, Mark Zuckerberg also personally apologized for any role Facebook may have had in sowing divisions in the world and promised to work to make things better in future, as part of a post relating to the Jewish holiday of Yom Kippur. It’s clear that he’s been taking all of this far more seriously, and increasingly personally, over recent months, though many still want him and Facebook to do far more to increase transparency about how Facebook has been used for ill and how it will change as a result.

    via Recode

    Mark Zuckerberg Sought to Use Facebook Resources For His Foundation’s Projects (Sep 29, 2017)

    Even though the shareholder lawsuit over plans to create a new class of shares at Facebook has been settled, some of the materials in the suit have just been unsealed and revealed some interesting tidbits. Business Insider has latched onto this particular one: Mark Zuckerberg wanted to have Facebook employees work on projects for his Chan Zuckerberg Initiative foundation, but got pushback from other board members, notably Marc Andreessen. On the one hand, this feels like just the kind of thing some people worried about when Zuckerberg created the foundation while still running Facebook, risking a blurring of lines between the two. It feels clearly inappropriate for him to try to swing corporate resources behind his personal projects, and it’s good that he was shot down. But I’m also reminded of the many things Google invested in during its pre-Alphabet days which were effectively personal passion projects for the two founders. Some of those things had at least tenuous connections to Google’s core business, but others felt much more disconnected, and yet the lack of real shareholder oversight over most of them meant that they happened anyway (and the founders could at least argue they were potentially commercial projects, even if some of them were years from generating revenue). The fact that Zuckerberg created a separate foundation to pursue his projects makes the separation that should exist much clearer and thereby throws these potential conflicts of interest into sharper relief.

    via Business Insider

    Facebook Provides Update on Efforts to Prevent Interference in German Elections (Sep 27, 2017)

    Facebook has provided an update on its efforts to prevent interference in the recent federal elections in Germany, and says that in the month before the election it removed tens of thousands of fake accounts in Germany (it had previously said something similar about the recent French elections). But Facebook’s post also tries to make clear a distinction which has often been lost in the context of Facebook’s influence on elections: that its algorithms and the way it treats news and other posts have not swayed elections, while its official and transparent tools for politicians and political parties absolutely have had multiple effects. That’s a theme in the post, in which Facebook talks about politicians using the platform to reach voters and a variety of tools the company set up to help voters understand the issues and so on, much as it has promoted voting and information about candidates and their policy positions in the US. I frequently see Twitter posts which suggest that Facebook claims not to influence elections at all, but that’s misleading – its point is that it has transparent and official tools for politicians and parties to legitimately influence voters, while it’s doing its best to crack down on illegitimate uses of its platforms for those purposes. However, this post reiterates a point from Mark Zuckerberg’s brief speech last week on the topic – that Facebook will never be perfect at this, and its goal is to make it harder to mislead through the platform.

    Update/Related: Later today, Mark Zuckerberg posted a response to President Trump’s critical tweet from this morning, with more of the same message and something of an apology for his initial reaction to criticisms of Facebook’s role in the election.

    via Facebook

    Facebook Signs up Walmart as Customer for Workplace Corporate Social Product (Sep 26, 2017)

    Facebook has signed up the largest private employer in the world – Walmart – as a customer for its Workplace enterprise product, something of a coup as it competes against other products which have been in the market for far longer. Of course, many of Walmart’s employees work in stores rather than at desks and so will likely never need accounts on the corporate social network, so though the company’s overall size is dazzling, the size of the deployment is likely to be rather smaller. But it appears that Facebook has been quietly making good progress signing up other businesses of various sizes, including some big names like Telenor (Norway’s incumbent telecoms operator) and Starbucks. All of this is indicative of Facebook’s enormous power to leverage its dominance in consumer social networking into other fields, whether messaging, photo sharing, or even crossing over into the enterprise, based on its familiar brand, interfaces, and tools. That’s never been a guarantee of success in any market for Facebook, as its repeated failures to compete organically with Snapchat demonstrate, but it has given the company a leg up in many areas over competitors starting from scratch, and it continues to make Facebook a formidable competitor for any company going up against it directly.

    via TechCrunch