Take a closer look: How to fight back against online disinformation


By Katie Compton, Policy and Politics editor

For my last post on disinformation, I spoke with researchers who study the sources and impacts of intentionally misleading information online. I came away from those interviews with a better sense of the problem’s scale and how disinformation erodes people’s trust in the media, government, and other institutions. But those discussions raised a big question: how do we address the problem of fake news and rebuild trust?

We will never be able to completely stop false information from spreading online, but we can take steps to create a healthier information landscape. The resources and research I came across in preparation for this post emphasized that we need both education and regulation to create a safer, more transparent online ecosystem.

Education: Building Canadians’ Resistance to Disinformation

MediaSmarts is a Canadian non-profit organization founded in the 1990s that works to increase digital literacy and build people’s resilience to the possible harms of online information. According to Dr. Kara Brisson-Boivin, Director of Research for MediaSmarts, recent events such as the Cambridge Analytica scandal, the COVID-19 pandemic, and the conflict in Ukraine have made policymakers and the general public more aware of the danger that online disinformation poses.

“There are specific conversations in the zeitgeist, and more importantly, between people in positions of power, recognizing things that MediaSmarts has been saying for decades,” says Dr. Brisson-Boivin. “We’ve been around for over 25 years now, and the organization has evolved greatly, as the technological and digital context has changed so dramatically over that time. But authenticating and verifying sources, and sorting facts from fiction has been a value and objective of the organization since its inception.”

MediaSmarts has developed several resources to help people authenticate and verify online information. They have even brought back a Canadian icon – the “House Hippo” – to get people thinking about sorting fact from fiction. Image from breakthefake.ca.

Through campaigns like Break the Fake and Check First, Share After, MediaSmarts is giving Canadians the tools they need to become more thoughtful about the information they read and share online. “It is not time consuming, it’s easy, and anyone can do it,” says Dr. Brisson-Boivin. “And that’s why we came up with a four-step model to tell if something is true online.” These four steps include:

  1. Use fact-checking tools. Sites like Snopes.com will tell you if a story has been debunked.
  2. Find the source. Don’t just stop at the headline; click through to see where the story comes from.
  3. Learn more about the source. MediaSmarts suggests doing a quick internet search to see if a source is real and reputable.
  4. Check other sources. See if other sites are also reporting the same story.

“Typically, you only need one of those steps to be able to figure out if something is real or true online,” says Dr. Brisson-Boivin.
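For readers who want to see the checklist spelled out step by step, here is a minimal sketch of the four-step model as a short Python script. The wording of each step is taken from the list above; the script itself, including its prompts and the simple pass count, is a hypothetical illustration for this post, not an official MediaSmarts tool.

```python
# A minimal, illustrative sketch of MediaSmarts' four-step "Break the Fake"
# checklist as an interactive script. Hypothetical example, not a MediaSmarts tool.

STEPS = [
    "Use a fact-checking tool (e.g. Snopes.com): has the story already been debunked or confirmed?",
    "Find the source: did you click past the headline to see where the story comes from?",
    "Learn more about the source: does a quick search show it is real and reputable?",
    "Check other sources: are other sites reporting the same story?",
]


def run_checklist() -> None:
    """Walk through the four steps and report how many the story passed."""
    passed = 0
    for number, step in enumerate(STEPS, start=1):
        answer = input(f"Step {number}. {step} [y/n] ").strip().lower()
        if answer.startswith("y"):
            passed += 1
    # Per Dr. Brisson-Boivin, a single confirming step is often enough to judge
    # whether something is real or true online.
    print(f"Steps passed: {passed} of {len(STEPS)}")


if __name__ == "__main__":
    run_checklist()
```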

Legislation: How can policymakers help?

Educating the public is key to solving the disinformation problem, but government and industry also have roles to play. The challenge is figuring out where and how policy solutions can help stem the flow of online disinformation.

Canadian Heritage launched the Digital Citizen Initiative to help fund efforts combatting online disinformation. One funding recipient is MediaSmarts; another is Public Policy Forum’s Digital Democracy Project, a “multi-year project to analyze and respond to the increasing amounts of disinformation and hate in the digital public sphere.”

Research by the Public Policy Forum (PPF), including insights from the Digital Democracy Project, led to the Canadian Commission on Democratic Expression, a three-year initiative to strengthen democracy by addressing digital information’s potential harms. In May 2022, the Commission published a report outlining core principles and recommendations for making the online platforms that we use every day more transparent and accountable to Canadians.

In its recent report, the Canadian Commission on Democratic Expression outlines fundamental principles and recommendations for policymakers aiming to reduce online harms. Image from https://ppforum.ca/.

The report draws on original research, input from digital media stakeholders and a representative national citizen assembly. It lays out some of the core ideals and issues that we need to consider when addressing disinformation. These principles include:

  • Free speech is fundamental to a democratic society.
  • Hatred, disinformation, politically polarizing content, conspiracies, and bullying can undermine citizens’ ability to participate in the democratic process.
  • Online platforms are not neutral disseminators of information. They must take more responsibility for reducing the harms associated with the content they distribute, but they cannot be the sole moderators.
  • Public agencies (i.e. the government) must take steps to protect democratic expression and reduce online harm.
  • Any policy solutions need to be rights-based, meaning that they safeguard against possible privacy infringements and give people greater control over their data and democratic expression.
  • Special protections are needed for people under 18, as they are particularly vulnerable to online harm.
  • Canada should look at what other countries have done right when it comes to addressing online disinformation.

In identifying these core principles, the Commission found a fundamental power imbalance at the heart of how we communicate and interact online: online platforms hold significant power yet face little scrutiny. The Commission outlined three key themes that policymakers, tech leaders, and the public should keep in mind as they work to correct this imbalance:

  1. Transparency. Legislators must require that online platforms be more open and transparent about how they operate. This transparency includes giving users, researchers, and regulators the ability to look inside these closed systems and better understand how online algorithms use people’s data.
  2. Accountability. Once we have greater transparency, we must have ways to hold digital platforms accountable for any harms they perpetuate. This could include creating new regulatory bodies and passing laws that clarify when information intermediaries, such as social media companies, are liable for online harms.
  3. Empowerment. We need ongoing efforts to educate citizens about managing their data and online presence, and the Canadian government needs to modernize its privacy laws to catch up with the tremendous technological advances of the past 20 years.

As the efforts of organizations like MediaSmarts and the PPF illustrate, addressing online disinformation is a complex challenge that requires a multifaceted approach. Fashioning comprehensive and effective solutions will take careful consideration, ongoing learning, and input from the public, government agencies, and the tech companies that collect and use our data.

Feature image: Canadian organizations are looking for ways to combat online disinformation. Photo by cottonbro. 
