Facebook’s Fight Against Misinformation and Protection of Election Integrity

Speaking notes for a talk by Kevin Chan, Global Director and Head of Public Policy, Canada at Facebook, delivered at the Max Bell School on September 25, 2018

Good afternoon everyone. I want to begin by thanking Professor Chris Ragan, Director of the Max Bell School of Public Policy, for inviting me here to speak about the work we are doing at Facebook to fight misinformation and protect election integrity on our platform. Before joining Facebook, I served as the Deputy Secretary-General of McGill University, and it is always a pleasure to be back on campus. Thank you again for the opportunity.

I'm going to start by talking very specifically about the cyber landscape in Canada as it pertains to election integrity, and the specific initiatives we are undertaking at Facebook to guard against bad actors during elections in Canada, including the ongoing provincial election here in Quebec. I'll then expand more generally on some broad public policy questions that have arisen over the last two years about Facebook, democracy, and the flow of information online. Serious issues have been raised, and they deserve an answer. At the end, Professor Ragan and I will sit down for a more casual discussion about these issues, and I hope that we'll get to answer as many of your questions as possible.

Election Integrity in Canada

About a year ago, when Mark Zuckerberg announced the steps we're taking to protect elections from abuse and exploitation, he started by going back to Facebook's mission, which is all about giving people a voice and bringing people closer together. “Those are deeply democratic values,” he said, “and I don't want anyone to use our tools to undermine democracy because that's not what we stand for.” When it comes to election integrity, we take our responsibility extremely seriously. And that's why we are devoting significant time, energy, and resources to this issue. In the lead-up to the U.S. presidential election in 2016, we were slow to identify the new risks and bad actors, and we were slow to act. We are determined to make this right.

As our Chief Operating Officer Sheryl Sandberg recently told Congress, “We're investing heavily in people and technology to keep our community safe and keep our service secure. This includes using artificial intelligence to help find bad content and locate bad actors. We're shutting down fake accounts and reducing the spread of false news.” We've also introduced new ad transparency features and fact-checking partnerships with leading journalistic organizations.

Ultimately, safeguarding our democratic processes is a job for all of us — for governments, civil society, and the private sector. We cannot, and do not, do this work alone, but we are committed to making Facebook a force for good in democracy.

Let's start off by talking about what we're doing in Canada and Quebec.

While our federal election isn't until 2019, we started this work in 2017 with an eye toward the number of provincial elections (including in Quebec) that would happen over the course of the next two years. The basis was a report[1] released by the Communications Security Establishment (or CSE) that outlined the potential cyber threats to Canada's next federal election. Account hacking and misinformation were identified as the biggest threats.

In response, we announced our Canadian Election Integrity Initiative[2], a package of five initiatives to help protect election integrity on Facebook. These include:

1. The launch of a new two-year partnership between Facebook Canada and MediaSmarts, to promote digital and news literacy[3];

2. The release of our “cyber hygiene” guide[4] for Canadian political parties and politicians, to support cybersecurity. I believe copies of the guide are available for distribution today;

3. The launch of an emergency Facebook cyber hotline for political parties to help address any cyber challenges in real-time, including suspected hacks;

4. The announcement of a Facebook cyber hygiene training program open to all Canadian federal political parties; and

5. An advertising transparency test in Canada enabling users to click “Info and Ads” on a Facebook Page and see all the ads that are running from that Page, whether or not the person is in the intended audience for the ad.

This last element, I'm proud to say, launched first in Canada and has only now been made available across our global network.

Since that initial announcement, we have taken additional steps specifically to address the spread of misinformation. In June we launched a third-party fact-checking partnership[5] with Agence France-Presse (AFP)[6] to engage Canadian fact-checkers to review news stories on Facebook in French and English and rate their accuracy. We don't believe that Facebook should be the arbiter of truth, which is why we are partnering with independent fact-checkers to ascertain the facts on any given issue. Stories that AFP rates as false have their distribution reduced in News Feed, dropping future views on average by more than 80%. Facebook also sends notifications to anyone who has shared, or is about to share, the false story, advising them that the story has been fact-checked.
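For the technically minded, the flow just described can be sketched in a few lines of Python. Everything here is an invented illustration — the `Story` fields, the `notify` hook, and the choice to model the roughly 80% drop as a 0.2 distribution multiplier are my own framing of the process as described, not Facebook's implementation.

```python
DEMOTION_FACTOR = 0.2  # false-rated stories lose ~80% of future views on average


class Story:
    def __init__(self, url: str):
        self.url = url
        self.fact_check_rating = None   # set by independent fact-checkers (e.g. AFP)
        self.distribution_multiplier = 1.0
        self.sharers: list[str] = []    # users who shared before any rating


def apply_fact_check(story: Story, rating: str) -> None:
    story.fact_check_rating = rating
    if rating == "false":
        # Reduce future distribution in News Feed rather than deleting the story.
        story.distribution_multiplier = DEMOTION_FACTOR
        # Retroactively notify everyone who shared it before the rating landed.
        for user in story.sharers:
            notify(user, f"A story you shared has been rated false: {story.url}")


def on_share_attempt(user: str, story: Story) -> None:
    # Warn prospective sharers before the share goes through; for simplicity,
    # this sketch assumes the user shares anyway.
    if story.fact_check_rating == "false":
        notify(user, f"Fact-checkers have rated this story false: {story.url}")
    story.sharers.append(user)


def notify(user: str, message: str) -> None:
    print(f"[notification to {user}] {message}")


# Example: a story is shared, then rated false, then shared again.
s = Story("http://example.com/dubious-story")
on_share_attempt("alice", s)   # no warning yet
apply_fact_check(s, "false")   # alice gets a retroactive notification
on_share_attempt("bob", s)     # bob is warned before sharing
```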

Through our partnership with MediaSmarts, we are working to help people spot false news so they can make more informed decisions, including about what they share. Our joint program, called “Reality Check”, currently includes videos and tip sheets in French and English.

In fact, in the lead-up to the Quebec election, MediaSmarts released a new video called Authenticity 101. It outlines five essential steps people can take to help ensure that the information they find online is accurate.

The other important step we are taking specifically in Quebec is to remind candidates and their Page admins how to keep their accounts secure during the provincial campaign. Just prior to the start of the electoral campaign:

We conducted outreach to all Quebec political party and candidate Page admins, reminding them about two-factor authentication (2FA) and ensuring they have access to our cyber threats crisis line; and

We sent in-app notifications to all political Page admins in Quebec, appearing at the top of their feeds, reminding them to turn on 2FA.

These specific efforts, all part of our Canadian Election Integrity Initiative, are a direct response to the key risks identified by the CSE — namely cybersecurity and misinformation online. Of course, our work can never be done, and we remain vigilant vis-à-vis bad actors and new and emerging cyber risks. We expect to roll out additional election integrity measures in the months leading up to the 2019 federal election.

The Broader Public Policy Issues

You may be thinking to yourselves, these specifics may all be well and good, but what about the broader public policy issues that have been raised about Facebook and democracy? Today, I'd like to address what I think are five of the most important ones in the current public discourse in Canada:

1. Doesn't Facebook's News Feed keep us in our own filter bubbles, undermining social cohesion and exacerbating polarization?

2. Shouldn't online platforms be legally liable for what appears on their platforms, similar to a newspaper or broadcaster?

3. Is the content that I see on Facebook really beyond the reach of rules and laws, as has been asserted by some commentators?

4. Isn't Facebook opposed to government regulation?

5. Are things getting better or worse in the fight against misinformation?

Misconceptions about Facebook's News Feed exacerbating polarization 

Doesn't Facebook's News Feed keep us in our own filter bubbles, undermining social cohesion and exacerbating polarization?

I want to start by clarifying a common misunderstanding about how the News Feed algorithm works — it doesn't choose what you see on Facebook, but rather orders the content from the sources you as a user have already explicitly chosen to connect with on the platform, either through friend connections or by liking and following Facebook Pages. The average person on Facebook is eligible to see thousands of posts in their News Feed on a given day, and it is just not possible for most people to go through all of this content. The purpose of the News Feed algorithm is to help sort through all of this information, prioritizing the posts that are expected to be most meaningful to you based on your past interactions on Facebook.
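To make that mechanic concrete, here is a minimal, purely illustrative Python sketch of ranking-by-predicted-relevance. The `Post` fields and scoring weights are invented placeholders, not Facebook's actual signals or model; the point is simply that the algorithm only reorders content from sources you already follow, rather than injecting content from elsewhere.

```python
from dataclasses import dataclass


@dataclass
class Post:
    author: str     # a friend or a Page the user already follows
    post_id: str
    # Illustrative engagement features; a real ranking model uses
    # far richer signals than these placeholders.
    past_interactions_with_author: int
    comment_count: int
    age_hours: float


def relevance_score(post: Post) -> float:
    """Toy scoring function: the weights are invented for illustration."""
    return (
        3.0 * post.past_interactions_with_author
        + 1.0 * post.comment_count
        - 0.5 * post.age_hours  # older posts score lower
    )


def rank_feed(eligible_posts: list[Post]) -> list[Post]:
    # The feed only reorders posts the user is already eligible to see;
    # it does not add content from unconnected sources.
    return sorted(eligible_posts, key=relevance_score, reverse=True)
```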

The filter bubble argument that has been advanced claims that individuals are trapped in a self-reinforcing echo chamber because the News Feed algorithm is serving up only content that reinforces people's pre-existing world views, leading to polarization and societal division. While this idea has certainly captured the imagination, empirical research shows us that the reality is more nuanced. Allow me to explain.

Sociologists make a distinction between the strong and weak ties we have with the many people we interact with in our lives, and it is helpful to think about Facebook as the place where you connect with your weak ties — not necessarily the close family members and friends you see all the time, but the broader array of people that you keep in touch with: work colleagues, casual acquaintances, or, if you are like me, the parents of your kids' friends in grade school. While many people in this broader constellation of connections may share views similar to your own, not everyone will, and you will see those differing views in News Feed. How much? Empirical research[7] suggests that 23%, or about a quarter, of the content you see in News Feed comes from a point of view different from your own. This is not perfect diversity, but it isn't all one-sided either. In fact, it's probably more diversity than people were exposed to on a daily basis thirty, twenty or even ten years ago.

For those who prefer not to rely on the News Feed algorithm to sort content on Facebook, you can also switch to a reverse chronological feed, where you will see the most recent posts at the top of your News Feed, regardless of their predicted relevance to you.[8] Or you can do what my wife does, which is manually prioritize the content you want to see first in News Feed using the "See First" option in the News Feed Preferences menu.[9] She is a Quebecker who lives in Ontario and wants to see Quebec media articles in her News Feed before anything else (yes, including me), so she has told News Feed to always show her posts from Radio-Canada, La Presse and Le Devoir first. Both of these methods — reverse chronological ordering and manual prioritization — are ways to actively shape and personalize your own experience.

A side note on polarization — studies show that forcing opposing views onto people may actually result in even greater polarization. It is much better to show people a variety of views and sources on a given issue and allow them the opportunity to discover these different views themselves.[10] This is an approach we have begun exploring through our third-party fact-checking partnership with Agence France-Presse, showing AFP's fact-checking articles in a News Feed unit directly below articles on Facebook that have been found to be false.

Last week, we also launched the Context Button in Canada, an icon superimposed on a link post that provides more background information on the publishers and articles people see in News Feed. This includes the publisher's Wikipedia entry, other recent stories posted by the publisher, and a "Shared by Friends" feature that shows people which of their friends have shared the article.[11]

We continue to learn from experts, and continue to refine our efforts to build an informed community on our platform.

Regulating online platforms

Shouldn't online platforms be legally liable for what appears on their platforms, similar to a newspaper or broadcaster?

Some commentators have likened online platforms to a digital variant of the traditional newspaper or broadcaster, and have thus called for them to be legally liable for content on their platforms, requiring vetting by editors in the same way as traditional media.[12]

Given that the vast majority of content on Facebook is communication from individual to individual, online platforms like Facebook are actually more like virtual extensions of real-world communication and speech between people than traditional newspapers or broadcasters. Treating Facebook, or for that matter any other social media platform, as a digital variant of a news publisher requiring oversight by editors would mean that all content would have to be reviewed and approved by those editors before anything gets posted. Aside from the practical challenges this would entail, it is not something we think is appropriate for Facebook to be doing. From the beginning, Facebook has been about giving people voice. We think people should be allowed to communicate and share information freely with each other, without review and approval by Facebook or some other intermediary, and I think most people would agree with that.

The idea that individuals should be able to post and communicate freely online is a cornerstone of good Internet policy, ranking right up there with net neutrality. As uOttawa Professor Michael Geist has noted, many experts believe that intermediary liability protections may be “the single most important legal protection for free speech on the Internet.”[13] The Oxford scholar Timothy Garton Ash, in his recent book Free Speech: Ten Principles for a Connected World, called this policy “crucial to global internet freedom.”[14] Treating Internet platforms as interchangeable with newspapers and broadcasters, imposing legal liability on them and requiring the same level of editorial control, would run counter to this policy of intermediary liability protection.

Of course, while Facebook is a platform that gives people voice, we understand that there must and should be limits to what is permissible, and I will now turn to those constraints.

Is the content that I see on Facebook really beyond the reach of rules and laws, as has been asserted?

Those who argue for editorial control over speech online tend to also make the charge that global Internet platforms are beyond the reach of rules and laws. I want to clarify Facebook's position here. We have a responsibility for what appears on our platform, and that's why we have strict rules in place, and why we adhere to local laws.

Content on Facebook is governed by our global Community Standards, which spell out in great detail what is and is not allowed on the platform.[15] Things like hate speech, fake accounts, pornography, bullying, the glorification of terrorist activity, and non-consensual intimate images all violate our Community Standards, and if they are reported to us by people, or if we discover them through our automated systems, we will remove them from our platform. To take just one example — terrorism-related content — in the first quarter of 2018 we took action on 1.9 million pieces of ISIS and al-Qaeda content, about twice as much as in the previous quarter. We took action on 99% of this content through our artificial intelligence systems before a user reported it, and the median time on platform for newly uploaded content was less than one minute.[16]
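For readers curious how figures like a 99% proactive rate or a sub-minute median are typically computed, here is one illustrative calculation over invented enforcement records; the record format and the numbers are hypothetical, chosen only to show what the two metrics measure.

```python
import statistics

# Hypothetical enforcement records: (flagged_by, seconds_on_platform),
# where flagged_by is "automated" or "user_report".
actions = [
    ("automated", 12), ("automated", 45), ("automated", 8),
    ("user_report", 3600), ("automated", 30), ("automated", 55),
]

# Proactive rate: share of actioned content found by automated systems
# before any user report.
proactive = [a for a in actions if a[0] == "automated"]
proactive_rate = len(proactive) / len(actions)

# Median time on platform across all actioned content.
median_seconds = statistics.median(t for _, t in actions)

print(f"Proactive rate: {proactive_rate:.0%}")
print(f"Median time on platform: {median_seconds}s")
```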

Over and above our Community Standards, we respect local laws and will comply with requests from lawful authorities. My wife and I just had our third baby, so I'll give you an example that is highly relevant to us right now. Some of you may remember baby walkers, basically bouncy chairs on wheels for babies and toddlers who cannot walk yet. They allow children to be autonomously mobile, which does sound kind of scary. Due to safety concerns, Canada became the first country in the world to ban this product, under the Canada Consumer Product Safety Act.[17] As it turns out, posts offering used baby walkers for sale do appear on Facebook from time to time, and Health Canada officials contact us and ask us to remove this content. We comply with these lawful requests under the Canada Consumer Product Safety Act.

Beyond our Community Standards and local laws, we are constantly seeking to better understand the cultural context for content on Facebook and to adjust our content policies as appropriate. It was in this spirit that we hosted a roundtable on Indigenous Culture and Content Online last spring with Indigenous groups and leaders.[18] We learned a lot about the different needs of Indigenous communities, such as the importance of being able to sell animal products online. We also heard loud and clear from roundtable participants how important it is to be able to use Facebook in their own Indigenous languages. In response to this feedback, we are partnering with institutions and communities in Nunavut to translate Facebook into Inuktut for 2019.[19]

Isn't Facebook opposed to government regulation?

Despite what you may have read or heard from pundits, Facebook is not opposed to regulation. Our CEO Mark Zuckerberg has long said it's not a matter of whether there will be regulation, but of what the right regulation is. That said, we are not waiting for regulation to take action on the challenges of misinformation and election integrity that exist here and now.

Take ad transparency as an example. As I noted earlier, Canada was the first country in the world where we tested our new “Info and Ads” feature, as part of our Canadian Election Integrity Initiative back in November 2017. We built this feature for all Facebook ads, not because there was government regulation requiring it, but because we believe it is the right thing to do. Everybody on Facebook should be able to see all the ads an advertiser is running, even if they are not in the intended audience for a given ad. This is a greater level of ad transparency than exists on any other platform or medium, online or offline. There is an ad transparency bill in the U.S. Congress called the Honest Ads Act, and we support it.

Some of you may be aware that there is also a piece of legislation currently before Parliament — Bill C-76, the Elections Modernization Act — which, if passed, would require organizations selling advertising space not to knowingly accept election advertisements from foreign individuals. Facebook supports this proposed government regulation.[20]

So Facebook is not opposed to regulation. And we would be pleased to work constructively with policymakers and governments, here in Canada and around the world, on other proposals that impact the digital economy and our democracy.

Are things getting better or worse in the fight against misinformation?

The battle against misinformation online is obviously much larger than Facebook, and I can only comment on what we are doing on our platform. I think it is too early for anyone to be definitive, but I am cautiously optimistic that we are moving in the right direction. Facebook is investing heavily in both technology and people. We have doubled our personnel investment and now have 20,000 people on our security team, many of whom review reports of bad content and bad actors in over 50 languages, 24 hours a day, 7 days a week. There are various degrees of sophistication when it comes to problems related to bad actors and misinformation. The most complex and harmful threats have strategic goals such as domestic or foreign interference, and whether they are run by the Internet Research Agency (IRA) out of Russia or by other bad actors, these malicious operations have the potential to cause real-world harm, and we are escalating our efforts to defeat them.

As you may have heard, following an intense investigation, we removed 32 Pages and accounts from Facebook and Instagram this past July in the U.S. because they were involved in coordinated inauthentic behaviour.[21] We removed an additional 652 Pages and accounts for coordinated inauthentic behaviour in August, with some activity originating from Russia and Iran.[22] The techniques these bad actors used were more sophisticated than what we saw from the IRA in the 2016 U.S. election. Our efforts to fight these operations will never be done, but increased investments in people and technology, and collaborations and partnerships with other technology companies, the public sector, academics and researchers, will help us face this challenge. We intend to continue our work to find and stop this behaviour.

It is worth noting that while we have not observed this type of attempted coordinated inauthentic behaviour in Canada, we remain vigilant nonetheless.

Fundamentally, these are security problems, and there is a broad team at Facebook focused on tackling them, including experts in threat intelligence, data science, product, engineering, public policy and law. We take a multi-pronged approach, combining manual investigations and automated detection to disrupt these threats. You can never fully “solve” a security problem; threat actors will constantly find new ways to cause harm. But our goal is to make it much, much harder for these actors to operate across our platforms.

A good example of this is our recent work to scale up our ability to automatically detect and remove fake accounts using artificial intelligence. We have seen repeatedly that bad actors rely heavily on fake accounts, and so we're drastically reducing their ability to use them. In the first quarter of 2018, we disabled 583 million fake accounts, the vast majority of which were removed within minutes of registration and before a human could report them.[23] Just last week in Canada, we deactivated two fake accounts involved in the Vancouver municipal election, a development that was featured on the front page of The Globe and Mail.[24]

Our work is far from done, but we believe our efforts are making a difference. We have seen promising results in elections around the world — in France, Germany, the special election in Alabama, and Mexico — where our tools were working to address misinformation and fake accounts. In Canada, we maintain open lines of communication with election officials. And we remain vigilant in Quebec in the lead-up to October 1. With every subsequent election, we are learning and refining our techniques, and we feel we are moving in the right direction.
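Before concluding, one more sketch for the technically minded: the registration-time screening described above might, in a heavily simplified form, look like the following. Every signal and threshold here is an invented placeholder — production systems rely on machine-learned models over vastly more signals — but it illustrates why most fake accounts can be blocked within minutes of creation, before any human could report them.

```python
from dataclasses import dataclass


@dataclass
class Registration:
    ip_address: str
    email_domain: str
    signups_from_ip_last_hour: int   # velocity signal
    profile_photo_is_stock: bool     # photo-reuse signal


def fake_account_risk(reg: Registration) -> float:
    """Toy risk score combining a few invented signals."""
    score = 0.0
    if reg.signups_from_ip_last_hour > 20:   # bulk-registration velocity
        score += 0.6
    if reg.email_domain in {"disposable.example", "tempmail.example"}:
        score += 0.3
    if reg.profile_photo_is_stock:
        score += 0.2
    return min(score, 1.0)


BLOCK_THRESHOLD = 0.7


def screen(reg: Registration) -> str:
    # Accounts above the threshold are disabled at signup, before a
    # human could plausibly encounter and report them.
    return "disable" if fake_account_risk(reg) >= BLOCK_THRESHOLD else "allow"
```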

In conclusion, I do want to re-emphasize the degree to which Facebook is committed to making the right investments in people, technology and time to protect election integrity on our platform in Canada and around the world. Working together with partners in government, academia and the private sector, we are determined to get this right.

Thank you for having me, and I look forward to the discussion.

[1] https://www.cse-cst.gc.ca/sites/default/files/cse-cyber-threat-assessmen...

[2] http://facebookcanadianelectionintegrityinitiative.com

[3] http://mediasmarts.ca/digital-media-literacy/digital-issues/authenticati...

[4] http://facebookcanadianelectionintegrityinitiative.com/pdfs/cyber-hygien...

[5] https://www.facebook.com/help/1952307158131536

[6] https://factcheck.afp.com

[7] http://science.sciencemag.org/content/348/6239/1130

[8] https://www.facebook.com/help/218728138156311

[9] https://www.facebook.com/help/371675846332829

[10] https://newsroom.fb.com/news/2018/01/sunstein-democracy/

[11] https://newsroom.fb.com/news/2018/04/news-feed-fyi-more-context/

[12] https://www.thestar.com/opinion/contributors/2018/08/27/we-wont-save-dem...

[13] https://www.theglobeandmail.com/report-on-business/rob-commentary/nafta-...

[14] Free Speech: Ten Principles for a Connected World, Yale University Press, page 23

[15] https://www.facebook.com/communitystandards/

[16] https://newsroom.fb.com/news/2018/04/keeping-terrorists-off-facebook/

[17] https://www.canada.ca/en/health-canada/services/science-research/activit...

[18] http://facebookcanadahardquestionsroundtables.com/roundtables/indiginous...

[19] https://www.newswire.ca/news-releases/facebook-canada-opens-inuktut-for-...

[20] http://www.ourcommons.ca/DocumentViewer/en/42-1/PROC/meeting-114/evidenc...

[21] https://newsroom.fb.com/news/2018/07/removing-bad-actors-on-facebook/

[22] https://newsroom.fb.com/news/2018/08/more-coordinated-inauthentic-behavior/

[23] https://transparency.facebook.com/community-standards-enforcement#fake-a...

[24] https://www.theglobeandmail.com/canada/british-columbia/article-bc-candi...
