“Democracy will not reform itself by some hidden or automatic process. It takes citizens to awaken from their inertia, apathy, or fear. No one else will demand change for us.” — Larry Diamond, Ill Winds: Saving Democracy from Russian Rage, Chinese Ambition, and American Complacency
LAST Sunday, I was browsing through my Netflix catalog of documentaries when I chanced upon “The Great Hack,” a film released in July 2019 about the Facebook–Cambridge Analytica data scandal.
SCL Group was a private behavioral research and strategic communications company with expertise in influencing mass behavior. It had been contracted for military and political operations across the globe since the late 1990s, including electioneering in developing countries throughout the early 2000s.
To move into the US elections, the company formed Cambridge Analytica in 2013.
The scandal stemmed from the illicit harvesting of personal data, which was used to build massive campaigns that nonetheless approached users in a personal, targeted manner. The mined big data ended up powering a huge artificial-intelligence operation that reached the point of disrupting democratic processes in the countries where the company operated.
Facebook entered the picture when Cambridge Analytica obtained the personal data of the social media platform’s users. But as we all witnessed during the televised US Senate inquiry into the matter, Facebook chief executive Mark Zuckerberg denied any complicity. He offered to “look into it.” We have not heard from Zuckerberg since that inquiry.
Allow me to digress, if only to share with you my “eureka” moment in the movie. Again, yes, I may be a voracious reader, but I admit I am kind of slow when it comes to technology. Watching the movie, I found it easy to get lost in all the technical terms.
My father, Emilio, used to work as a civilian consultant for a certain major general who was assigned at Camp Olivas in San Fernando, Pampanga.
It was the time when Emilio, then a director of the Philippine Information Agency for Northern Mindanao, was named one of the “frozen” regional directors under then-President Corazon Aquino through Executive Order 120, series of 1987.
As far as I can recall, he worked as a “communications specialist” for the major general. I do, however, recall some military jargon he taught me. Yes, I was that kind of kid — a kid who asked a lot of questions outside my sphere of formal education.
I learned from him plausible deniability, low-intensity conflict, psychological operations, credible reality, and other fancy-sounding terms. For this column, I’ll focus on two: psychological operations and plausible deniability.
Psychological operations are operations to “convey selected information and indicators to audiences to influence their emotions, motives, and objective reasoning, and ultimately the behavior of governments, organizations, groups, and individuals,” as defined by the US Department of Defense, whose Army psychological operations command is based at Fort Bragg.
Plausible deniability, meanwhile, is the ability to deny responsibility for an action because there is no substantial evidence of direct involvement, which allows one to shift the blame and minimize reputational losses while maximizing the results of such actions. The phrase was coined within the Central Intelligence Agency in the early 1960s, when information was withheld from senior officials to shield them from the possible repercussions of clandestine operations.
From these two definitions, it is easy to see that they form the foundation of effective propaganda work.
Propaganda targets our biases and emotions. With the national elections coming in 2022, you can bet your sweet “A” that these two military concepts will be employed again in what has become “hybrid communications warfare.”
As a public service, I would like to share with you Carl Sagan’s checklist for fighting misinformation, the “baloney detection kit,” from his book “The Demon-Haunted World”:
a.) Wherever possible, there must be independent confirmation of the “facts.”
b.) Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
c.) Arguments from authority carry little weight; “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
d.) Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
e.) Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
f.) Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course, there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
g.) If there’s a chain of argument, every link in the chain must work (including the premise), not just most of them.
h.) Occam’s Razor. This convenient rule of thumb urges us, when faced with two hypotheses that explain the data equally well, to choose the simpler.
i.) Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable and unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle (an electron, say) in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
Let us remember that it was the weaponization of social media platforms that stunned us back in 2016. Misinformation, or its more ubiquitous and infamous label, “fake news,” targets powerful emotions within you: wonder, fear, greed, grief.
Let us all learn our lesson and start the new decade with 20/20 vision.