Don’t be a target: How to identify adversarial propaganda

  • By Capt. Dorothy Sherwood
  • 16th Air Force (AFCYBER)

Strategic competitors, adversaries and proxies use information to gain an advantage over the U.S. joint force.

Adversarial disinformation campaigns and influence operations are “gray zone” activities that use technology and tactics to disguise themselves, making it challenging to identify the source of the content. 

“At the United States Cyber Command, we see the influence piece, which is much more prevalent these days,” said U.S. Army Gen. Paul M. Nakasone, commander of U.S. Cyber Command, during congressional testimony in March 2023.

Nakasone said that CYBERCOM’s operations aim to disrupt adversary campaigns designed to harm America by going “after troll farms and other different actors that are trying to create influence.”

U.S. Cyber Command’s mission spans the globe to disrupt, degrade and destroy the capabilities of malicious cyber actors and foreign state adversaries as directed.

In support of its U.S. election defense operations, U.S. Cyber Command has the tools and expertise to identify Russian troll farms posing as Americans and the ability to block them.

U.S. Cyber Command isn’t the only command with cyber operators; in a sense, every service member is a cyber operator. It is their responsibility to protect themselves from adversarial misinformation, disinformation and propaganda by learning how to identify it and how to find credible sources of information.

Identifying Orderers of Disinformation and Disinformation Actors

Adversarial disinformation campaigns and influence operations can be identified by knowing where they start — with the Orderers of Disinformation.

Orderers of Disinformation are strategic competitors and adversaries who want to generate and spread a false narrative that distorts facts about past and future events.

They hire creators to develop legitimate-looking news agencies and social media accounts to push the false narrative into the Information Environment (IE).

To spread the orderers’ false narrative in the IE, creators use disinformation actors, such as bots, cyborgs, trolls, sockpuppets and amplifiers.

  • Bots use AI to saturate the IE with the false narrative.   
  • Cyborgs are employees who take over a bot account to reply to comments and defend the false narrative.
  • Trolls are also employees who post comments to push the false narrative.
  • Troll farms are groups of employees who work together on large-scale campaigns to push the false narrative internationally.
  • Sockpuppets are fake personas, built to resemble real people’s profiles, that push the false narrative.
  • Amplifiers are ordinary people who believe the false narrative from the bots, cyborgs, trolls and sockpuppets and spread it further.

Spreading these false narratives can change audiences’ behavior and, over time, even reshape cultures.

These disinformation actors can be spotted by telltale signs: recently created accounts that repeat the same posts, pages with no personal updates and fewer than 100 followers, and profiles that mimic real users.
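The red flags above amount to a simple rule-based screen. As an illustration only, here is a minimal Python sketch of such a check; the field names, the 90-day account-age cutoff, and the `Account` structure are hypothetical assumptions, not an official detection tool.

```python
# Hypothetical account-screening sketch based on the red flags above.
# Thresholds and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Account:
    age_days: int                 # days since the account was created
    followers: int
    posts: list[str] = field(default_factory=list)
    has_personal_updates: bool = False
    mimics_real_user: bool = False

def red_flags(acct: Account) -> list[str]:
    """Return the list of disinformation red flags an account trips."""
    flags = []
    # Recently created account that repeats the same posts
    if acct.age_days < 90 and len(set(acct.posts)) < len(acct.posts):
        flags.append("new account with repeated posts")
    # No personal updates and fewer than 100 followers
    if not acct.has_personal_updates and acct.followers < 100:
        flags.append("no personal updates, under 100 followers")
    # Profile mimicking a real user
    if acct.mimics_real_user:
        flags.append("mimics a real user's profile")
    return flags

suspect = Account(age_days=10, followers=12,
                  posts=["Vote no!", "Vote no!", "Vote no!"])
print(red_flags(suspect))
```

A real screen would of course weigh many more signals, but the point stands: each red flag is cheap to check, and accounts tripping several at once deserve skepticism.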

The Defense Information School has created a checklist to help spot these disinformation actors: www.pavilion.dinfos.edu/Checklist/Article/2404709/spotting-disinformation-actors/.

Detecting Propaganda Ecosystems

Orderers of Disinformation are taking a page from marketing by using a method called the “media multiplier effect” for their false narratives.

They use the media multiplier effect to repeat a false narrative simultaneously from different sources to make it appear more credible to the public. This method creates a propaganda ecosystem.
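One consequence of the media multiplier effect is that the "independent" sources echo nearly identical wording. As a rough illustration, the sketch below flags suspiciously similar text across outlets; the outlet names, sample headlines, and the 0.8 similarity threshold are all assumptions for the example.

```python
# Illustrative sketch: flag a possible "media multiplier" pattern when
# near-identical wording appears across supposedly independent outlets.
# Outlet names, texts, and the similarity threshold are assumptions.
from difflib import SequenceMatcher
from itertools import combinations

articles = {
    "outlet-a.example": "Officials secretly admit the exercise failed.",
    "outlet-b.example": "Officials secretly admit the exercise failed!",
    "outlet-c.example": "Local festival draws record crowds this year.",
}

def multiplier_pairs(docs: dict[str, str], threshold: float = 0.8):
    """Return outlet pairs whose texts are suspiciously similar."""
    pairs = []
    for (src1, t1), (src2, t2) in combinations(docs.items(), 2):
        ratio = SequenceMatcher(None, t1.lower(), t2.lower()).ratio()
        if ratio >= threshold:
            pairs.append((src1, src2, round(ratio, 2)))
    return pairs

print(multiplier_pairs(articles))
```

Here outlets A and B would be flagged as echoing each other while outlet C would not; real propaganda-ecosystem analysis also looks at timing, shared infrastructure, and funding ties, not wording alone.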

Detecting propaganda ecosystems is another way to identify false narratives and, in turn, to verify credible sources of information.

The Global Engagement Center’s Disarming Disinformation website, www.state.gov/disarming-disinformation/, has resources depicting already-identified propaganda ecosystems.

As the U.S. Air Force component to U.S. Cyber Command, 16th Air Force conducts defensive and offensive cyberspace operations against adversarial disinformation campaigns and influence operations.

“You, personally, are being targeted by our adversaries,” said U.S. Air Force Lt. Gen. Kevin Kennedy, 16th Air Force (Air Forces Cyber) commander. “Whether it’s through your social media and personal devices or more directly through efforts to collect, disrupt or manipulate the information you are using in your personal or professional life — our adversaries are seeking opportunities to influence your perceptions and ultimately your behavior.”

At 16th Air Force, empowered Airmen and Guardians understand our strategic competitors and adversaries; that understanding is the foundation of Information Warfare (IW).

The U.S. military’s most important asset is its service members: a resilient, ready force prepared to enhance its digital literacy and build a defense ecosystem for a new era of great power competition in the IE.

Be a part of this defense ecosystem: use these techniques to counter strategic competitors’ and adversaries’ misinformation, disinformation and propaganda.

Don’t be a target; help mature IW across the services and the globe by being able to outthink, outmaneuver and outfight any adversary or threat.