
How Europe Can Tackle Influence Operations and Disinformation

Alicia Wanless

Concerns over social media’s potential to disrupt democratic societies have existed at least since the emergence of the so-called Islamic State in the first half of the 2010s.

The outcome of the UK’s Brexit referendum in 2016 and evidence of Russian influence operations during the 2016 U.S. presidential election only made such concerns more salient.

Between the rampant spread of misinformation around the coronavirus pandemic and the failed U.S. Capitol insurrection by supporters of then U.S. president Donald Trump on January 6, 2021, it might seem like we aren’t that much farther along in addressing these concerns.

When Trump was taken off several online services—including Twitter and Facebook—in the wake of the insurrection he was accused of inciting, some European officials took the opportunity to advocate for the regulation of tech platforms.

German Chancellor Angela Merkel stressed that social media platforms should not be making decisions to suspend political accounts by themselves “but according to the law and within the framework defined by legislators.”

European Commissioner for the Internal Market Thierry Breton made the case for the Digital Services Act (DSA), introduced in December 2020, which the EU hopes will “better protect consumers” online, encourage the transparency and accountability of tech companies, and foster competition.

But while there is a need for increased regulation in the short term, governments should also, over the longer term, encourage the development of mechanisms that enable researchers to better understand what works against influence operations and related tactics like disinformation.

As it is, the DSA, if adopted, will compel large social media platforms to share data with “vetted researchers” on risks such as “the dissemination of illegal content” and “inauthentic manipulation” of online services. But the DSA stops short of specifying how that will work in practice; it should lay out a concrete roadmap for better collaboration between platforms and researchers.

Moreover, it isn’t enough to compel the bigger companies to share information with researchers upon request. There is still a significant knowledge gap in terms of understanding the effects of influence operations and the efficacy of different countermeasures. New regulations like the DSA are great if they solve problems—but rules ought to be grounded in evidence that they work, not just in evidence that the problem exists.

What is needed is a permanent mechanism that facilitates collaborative research between industry and academia. To better understand influence operations and related countermeasures, researchers need more than one-time access to data; they need to regularly collect and update quantitative data to facilitate hypothesis testing and the design of intervention strategies.

Here, the EU has an opportunity to lead not just in regulating the information environment but also in fostering longer-term collaborative research. The DSA could articulate a viable model for how research collaboration would work in practice and encourage industry to support the development of an independent collaborative research center. The EU could match funds with industry to support these initiatives.

Nearly everyone working on this challenge calls for increased data and information sharing, but few say in detail how to make that happen. So far, the DSA seems to be falling into the same trap.

In forthcoming research with Princeton University’s Empirical Study of Conflict Project, we found that, outside of fact-checking, there is scant research on the effectiveness of countermeasures to influence operations. And if social media platforms are measuring the efficacy of their interventions against influence operations, they seldom disclose the findings in subsequent public announcements.

For academics to answer the difficult questions of how to counter influence operations, they need access to what can be sensitive personal data. Currently, such access can be achieved only by working inside social media companies, where publishing in peer-reviewed journals becomes more difficult and where researchers face attacks on their credibility for working with industry.

It is becoming increasingly clear that if democracies are to get a handle on the problem of influence operations, a bridging mechanism to facilitate research is paramount.

One model is a multi-stakeholder research and development center (MRDC). An MRDC would be a new kind of research entity—funded and supported, at least in part, by the social media platforms but allowed to operate independently.

It takes inspiration from U.S. government–sponsored entities like the RAND Corporation, which for decades has produced trusted, high-quality research on public policy issues and handled highly classified information responsibly. With an MRDC, online platforms would take the place of the U.S. government, providing money, data, and some input on research priorities—but not direct control.

An MRDC could provide a venue where industry and academic researchers come together for a sustained period to collaborate within a shared structure. The key is that such an institution must be independent. With sustained funding from industry, governments, and philanthropies, a multi-stakeholder research and development center could serve five key functions:

Facilitate funding for long-term projects.

Provide infrastructure for developing shared research agendas and a mechanism for executing studies.

Create conditions that help build trusted, long-term relationships between sectors.

Offer career opportunities for talented researchers wishing to do basic research with practical application.

Guard against inappropriate disclosures while enabling high-credibility studies with sensitive information that cannot be made public.

Not to be confused with a more operational threat-assessment center, an MRDC would focus on longer-term research, such as understanding the effects of influence operations on democratic decisionmaking.

But if the event-driven timing of interventions by social media platforms is any indication, an MRDC is unlikely to emerge in the absence of leadership and pressure from governments and society.

To go one step beyond calling for an MRDC in the Digital Services Act, the EU could offer to co-host such a center or lead the charge in laying out the details for establishing one. Either initiative could be undertaken in collaboration with the social media platforms.

Courtesy: (carnegieeurope.eu)
