By Edward Morgan —
In an age where truth is speed, and the fastest to the “send” or “post” button wins, it is no surprise that arcane words, such as those used in the title of this piece, are out of fashion.
As it happens, both these words, epistemology and polyverity, explain something critical about the new Information Age. The first concerns how we know things (epistemology); the second, how we assess things as true or false (polyverity).
“Epistemology”, from the Greek, means “theory of knowledge”. The ancient discipline associated with it concerns how humans come to know or believe anything at all. The test of a belief is sometimes its source – so, knowing where we “got something from” conditions how we think about a fact, or an alleged fact.
“Polyverity” is not a word. It is one made up for this piece. It is simply a compound of two ancient words – the Latin word for “truth” (veritas) and the Greek word for “many”, or “multiple” (polus). I made it up for two reasons – one, to point out how quickly the modern world is changing and how our words and language are changing with it. Words that mean one thing one week can mean something completely different the next. This is not new – words always change with context, and it certainly is not alarming. But the speed at which this is occurring is probably eyebrow-raising to the linguist.
The second reason was to highlight a phenomenon. The U.S. president made news in 2017 when he called out sections of the media for their “false” reporting, which he catchily called “fake news”. All of a sudden, as news outlets were classified as “true” or “false”, their center of gravity shifted from being an outlet for fact and perspective to concerns about truth or falsehood. Philosophers were called in for comment. Credibility became the vogue. But for all the distraction of this phenomenon, it was nothing more than the sharp politicization of news outlets.
U.S. Navy analyst Robert Kozloski recently called for the development of a new discipline of intelligence collection and reporting, which he called “public intelligence”, or PUBINT. Kozloski argued that the proliferation of information systems and the alleged Russian interference in the U.S. electoral process call for “a new intelligence paradigm, one that embraces transparency and spreading non-politicized information to a broad audience.” Kozloski argued this should have four components – to 1) monitor, 2) expose, 3) alert, and 4) detect the sources of information, so that the information-receiving public can make more informed judgements as to what that information means.
Kozloski’s idea was aimed at the U.S. defense and security community. But its potential reach has further implications. If, for example, smartphone reading apps could provide a percentage count of the origin and spread of a message, along with a percentage chance that the information is being automated in its distribution (e.g., Russia; >50 percent bots), the average citizen would be in a better position to judge which information is truly politically relevant and which is not. A message on U.S. gun control with high Russian replication, for example, might be judged to be a non-U.S. actor seeking deliberately to influence a domestic political debate in the United States.
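The kind of mark-up such a reading app might apply can be sketched in a few lines of code. This is purely illustrative – the data structure, field names, and thresholds below are assumptions invented for this piece, not any real app’s API – but it shows how origin shares and an automation estimate could be combined into a simple flag of the sort described above.

```python
from dataclasses import dataclass

@dataclass
class MessageProvenance:
    """Hypothetical provenance metadata a reading app might attach to a message."""
    text: str
    origin_shares: dict   # country code -> fraction of observed replications
    bot_likelihood: float # estimated probability distribution is automated (0-1)

def flag_foreign_automation(msg, home_country, share_threshold=0.5, bot_threshold=0.5):
    """Flag a message whose replication is dominated by foreign origins
    and whose distribution looks automated (thresholds are illustrative)."""
    foreign_share = sum(v for k, v in msg.origin_shares.items() if k != home_country)
    return foreign_share >= share_threshold and msg.bot_likelihood >= bot_threshold

# Example: a gun-control message with high Russian replication, mostly automated
msg = MessageProvenance(
    text="U.S. gun-control message",
    origin_shares={"RU": 0.6, "US": 0.4},
    bot_likelihood=0.7,
)
print(flag_foreign_automation(msg, home_country="US"))  # True
```

A real system would, of course, face the hard problem of estimating origin and automation in the first place; the point here is only that, once estimated, the presentation to the citizen can be very simple.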
The obvious point remains that anyone could get hold of such mark-up software and exploit it for their own ends. Nation-states, hackers, or other interest groups could “hijack the cyber bus” to mislead the public on the source and spread of internet messages.
But, if enough companies were to create such software and use it – or a variation on it – for their own websites and social media, a critical mass of such technologies might emerge to give the average internet user at least a general sense of whether they were being played for geopolitical purposes.
It is in companies’ as well as governments’ interest to prevent the internet from becoming a naked geopolitical battleground. Companies’ investment in establishing good cyber norms will facilitate the internet’s growth as a ground for global trade, rather than all-out psychological warfare between nation-state rivals.
So, Kozloski’s proposal is interesting but requires caution. It rests on a premise of “accepted transparency” when, self-evidently, companies, people, and entities seek to promote and protect their own interests. However, its strength comes in recognizing that in an internet age, the question of “who” says what – i.e., where a message has come from – is at least as important as what has been said. We are now in a period where the truth of what has been said can, and should, be weighed against the context in which it is said.
Take the South China Sea, for example. It is a central trading route for multiple countries at the sea’s periphery. China is pressed up against other countries as it nationalistically seeks to claim sovereignty over South China Sea areas which have been firmly denied to it by international law. The Chinese Communist Party’s nationalist narrative will inform its South China Sea messaging on the internet, both in what Beijing permits and what it protests.
Commercial investment in technologies to expose the geopolitical contours of the narratives in play over the South China Sea would surely help identify friction points and off-ramps for trade and commercial interests operating in this region. Certainly, it would help prevent the inadvertent co-opting of companies and citizens for nationalist purposes. Similar arguments apply to the East China Sea and the Taiwan Straits, areas where geopolitical interests directly traverse the trading zones of multiple nations.
The theatrics of expression are part of what it is to speak as a human being, rather than a machine. Literature and history testify to this. Neil Armstrong’s first words on the moon would have lost their significance were he not on the moon when he said them. Macbeth would have been a mere brigand were he not a trusted ally of the king he killed. The Information Age – with its slim smartphones and trillions of bytes of daily data – tells us we need context and depth to inform these politically capable machines. This will ensure, no less, that the theatrics of the most political animal of all – the human person as an evolved social citizen – can be measured against true context, not just the digital events whose veracity may leave one wanting.
Dr. Edward Morgan is a Military Fellow at the Center for Strategic and International Studies. These views are his own.