The Role of Social Media in Disseminating Information (and Misinformation) in the US Election

 

Social media is an ever-present element of our lives. It helps maintain relationships, acts as a source of entertainment and provides jobs.

Social media was conceived with the vision of connecting people. Unconstrained by geography, availability or waking hours, people are connected via these channels on a scale that would have been unimaginable only 20 years ago. And as people have become more connected, so too has grown the ability to disseminate knowledge and information to a wider audience.

With over 3.6 billion people using social media in 2020, time is far better spent choosing a cover photo for a profile page than shouting through a megaphone from atop a milk crate. From an idealist's perspective, this is a good thing: people are less lonely when they can connect with like-minded groups, arguments are better considered with a more diverse set of inputs, and citizens are better informed with a vast amount of knowledge literally at their fingertips. However, there is a stark dissonance between the idealist's perspective and the realist's when considering the impact social media has on society, and nowhere has this been clearer than during the 2020 US election.

The fallout from the 2016 Cambridge Analytica debacle taught a valuable lesson and put immense pressure on social media companies to take the steps necessary to limit misinformation spreading on their platforms during the 2020 election. The New York Times reported that in the days surrounding the vote, when the precise results were still up in the air, Facebook actioned a ‘break-glass’ contingency plan. Triggered by the spike in misinformation gaining traction on the platform, the intention was to improve the visibility of approved news sources whilst stifling those flagged as potentially inaccurate. What’s more, Facebook took the measure of deleting personal accounts responsible for peddling these falsehoods.

Twitter faced a similar set of challenges and critics; however, in their case the attention came not from their own failures in 2016 but from the inevitable traffic President Trump would stir with his tweets. As described in a blog post released by Twitter, their policies during the 2020 election were the culmination of months of planning. Contained within the Civic Integrity Policy were the most radical of their changes: labels, warnings and pre-bunks added to tweets to cast doubt on the legitimacy of questionable claims and even limit their spread when deemed incorrect. In all, more than 300,000 tweets received this treatment over the course of the election weekend, representing 0.2% of all election-related posts; the sheer volume of tweets requiring corrective action demonstrates the beast social media organisations face when it comes to preventing the dissemination of misinformation.
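To put those figures in perspective, a quick back-of-envelope calculation (a sketch only, taking the reported 300,000 labelled tweets and the 0.2% share at face value) shows the implied total volume Twitter had to sift through:

```python
# Back-of-envelope estimate of the election-related volume implied by
# Twitter's reported figures (assumed values, not official data).
labelled_tweets = 300_000   # tweets given labels/warnings over the election weekend
labelled_share = 0.002      # reported as 0.2% of all election-related posts

total_election_posts = labelled_tweets / labelled_share
print(f"Implied election-related posts: {total_election_posts:,.0f}")
# -> roughly 150,000,000 posts in a single weekend
```

Even under these rough assumptions, the scale makes clear why screening at this volume is far beyond what any manual moderation effort could handle.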

The issue of misinformation is undeniable, and when the undermining of the democratic process is brought into the discussion of its consequences, the urgency is clear. What is less clear-cut is the responsibility of organisations, politicians and citizens in resolving the issue. As discussed, the big players in the social media space have taken steps to improve the state of information on their sites, and these actions haven’t been without criticism: stifling free speech, showing political preference and favouring majorities. But it’s also important to remember the other side of the coin, and that is the economic incentives of the organisations themselves. Facebook’s mission is to bring the world closer together, and in 2020, to re-affirm the trust of their billions of users, they were required to take a more aggressive stance on misinformation. Yet the policies discussed have not been radical; they have been reactive, a way of weathering the blows that began during the 2016 election.

As an implementation and support partner of Sprinklr, KINSHIP digital understands the importance of monitoring online social conversations related to the election so that organisations can listen, analyse and act. This enables organisations to be proactive rather than reactive during an already volatile time and to stop the spread of misinformation. Using Sprinklr, brands can listen, analyse and report on data across the web, including 23 social platforms, 11 messaging channels, and millions of blogs, reviews, news sites and forums.

Finally, the filtering and monitoring of misinformation begins with the citizens. The spread of misinformation can happen both intentionally and unintentionally, but ultimately it’s the citizens who control what information is shared. One way is by using the SIFT technique (Stop, Investigate the source, Find better coverage, Trace claims to their original context), a fact-checking process developed by digital literacy expert Mike Caulfield of Washington State University Vancouver.

While it’s impossible to stop all disinformation, we can all do our part to help.

 