Should the concept of propaganda directed at the public consciousness be restored? · Global Voices

Photo by Amia Najran

Misinformation has become a household term and a major source of unrest and conflict around the world. Beginning with Donald Trump's presidency – and his accusations during his tenure that the media were spreading “fake news” – election interference and the fabricated debates that flood social media have become commonplace, and they remain a deeply disturbing aspect of the internet. The unintentional spread of misinformation during the COVID-19 pandemic placed a significant burden on health systems, and the conspiracy theories that emerged were clearly a driver of the rise of the anti-vaccine movement. Governments, activists and scientists invest considerable human and financial resources in studying the phenomenon of misinformation and finding ways to counter it.

There are currently several ongoing discussions about the nature of misinformation and how it works. The first set of explanations points to the way social media and other IT platforms operate. It is widely argued that social media algorithms favor controversial and emotionally charged content, amplifying extremism and fueling more fake information. In the same vein, the data-management practices of social media create opportunities for targeted media campaigns: these depend on access to large volumes of users' personal information, which is used to analyze human behavior and reach different religious, ethnic and racial groups. Such data is widely available for sale on both the open market and the black market. Promoted posts and paid advertisements are the targeted communication tools available across various IT platforms.

Another set of discussions is less vociferous, but no less important. The world's fall under the informational influence of the media increases distrust in governments and institutions, as they are now unable to provide complete and unbiased information about how the media work. What Leah A. Lievrouw, professor in the Department of Information Studies at UCLA, said in a 2017 Pew Research Center report on the future of truth and misinformation online is still relevant today:

So many players and interests see online information as a uniquely powerful shaper of individual action and public opinion in ways that serve their economic or political interests (marketing, politics, education, scientific controversies, community identity and solidarity, behavioral ‘nudging’, etc.). These very diverse players are likely to oppose (or attempt to undermine) technological or policy interventions or other efforts to ensure the quality, and especially the disinterestedness, of information.

At the same time, watchdogs such as the press and the non-profit sector may fail to serve the public interest because they are more interested in the commodification of information. The lack of institutional forums for discussion pushes people toward social media, which has become the only place to participate in social and political life. Yet these platforms also foster social polarization, xenophobia and the spread of fake news.

However, it is not only these problems in the civic context that contribute to the rise and spread of misinformation; the role citizens play in consuming and sharing this information matters as well. Research in the psychology of misinformation highlights that people on average are poor at distinguishing fake news from the truth and usually have no deliberate intention to share false information. Scientists have identified “lazy thinking” – an intuitive style of reasoning that requires less cognitive effort – as one of the factors driving the spread of fake news. Such people tend to follow their emotions and intuition when consuming and sharing content, basing their decisions not only on the content itself but also on the metadata of a post, such as the assumed reliability of the source or the amount of interaction and shares the post has received.

Today's misinformation disaster is driven by a combination of these factors: a worsening political situation, insufficient effort from technology platforms, and the realities of human nature. There is great concern that a series of disinformation campaigns exploiting all three factors could push people toward harmful changes in behavior. Yet IT professionals who work on online influence campaigns and marketing report that targeted advertising in political and marketing communication has, on average, little impact on target groups. In other words, aggressive advertising campaigns can reinforce the views of someone with an already established attitude, but they are not enough to produce a drastic change in a person's way of thinking. In their book Network Propaganda, Benkler, Faris and Roberts point out that while monitoring data manipulation and the spread of disinformation online is important, its actual impact must also be kept in mind: although Russian disinformation had little direct effect, it nevertheless exploited existing conflicts within American society.

Why is Russian disinformation working amid the war in Ukraine?

The recent surge of misinformation about the war in Ukraine highlights an interesting pattern. While the Ukrainian information campaign has succeeded in communicating a powerful and persuasive message to Western countries, Russian disinformation targets the rest of the world, including the BRICS countries, Asia and Africa. In its narratives, the Ukrainian side seeks to raise awareness of the war crimes taking place or to showcase the strength of the Ukrainian resistance. By contrast, the Russian disinformation campaign operates in a scattered fashion, spreading messages that implicitly echo the positions of the target audience in order to achieve wide resonance.

As Carl Miller, director of the Centre for the Analysis of Social Media at the London think tank Demos, argued in his article for The Atlantic on who is behind #IStandWithPutin:

Disinformation campaigns are much more effective when they have a powerful truth at their core and use that truth to guide discussion. The stark reality is that antipathy for the West runs deep in many parts of the world and sympathy for Russia is real. It is in these contexts that I would expect influence operations to be targeted—and to work.

Recent research by Miller's team of analysts into a set of posts using the hashtags #IStandWithPutin and #IStandWithRussia sheds light on the spread of narratives of Western hypocrisy, NATO expansion and solidarity with the BRICS in selected regions.

People have been asking for more examples via direct messages. Although I do not like amplification, it is important to show the rhetorical positions being used here.

Although the Russian digital disinformation strategy is driven and delivered by modern IT tools, its content resembles classic propaganda. The goal is to establish the legitimacy of particular narratives by introducing them into the media ecosystem and repeating them until they become common sense. Internet technology enhances this by enabling the production of fake content that simulates reality and appears credible to the inexperienced or unwary.

Back to propaganda

Given the apparent effectiveness of misinformation, the public discourse may need a change of approach. It could build on what Jacques Ellul in the 1960s called pre-propaganda: saturating minds with an abundance of conflicting information, delivered with hidden aims and distributed as “facts” and “education.” Without a direct or conspicuous approach, and seemingly aimless, it merely creates confusion and erodes the prejudices and perceptions that stand against it. Pre-propaganda is a primary tool of the Kremlin's geopolitical information campaigns, and Russian trolls embrace Cold War techniques: from disrupting the 2016 US election to justifying the Russian invasion of Ukraine.

The explanatory power of the term “pre-propaganda” becomes more apparent when it is compared with the more commonly used term disinformation. Many information-manipulation campaigns, which have grown more sophisticated and widespread, aim to have a psychological impact on people by creating an alternative image of reality. Disinformation is only one of the elements producing a media effect on the societal imagination based on dominant narratives and ideas.

Returning to the Russian case: despite the image of a general public that fully supports the Russian invasion of Ukraine, there is a strong negative attitude toward the war that cannot be ignored. This is strong evidence of internal propaganda that has used massive amounts of false information to cultivate public attitudes over time. In this case and those to come, focusing only on disinformation and fake news may not be enough to prevent conflict and political obstruction.

Please visit the project page for more on monitoring the suppression of freedom.
