It is no secret that the Washington establishment is making every possible effort to boost mainstream propaganda around the world. Now, however, these efforts are being openly militarized.
The US Defense Advanced Research Projects Agency (DARPA), subordinate to the Pentagon, is preparing to combat ‘fake news and disinformation’ through a program called Semantic Forensics (SemaFor).
Currently, it appears to be only at the planning stages, but in its completed form it will “automatically detect, attribute, and characterize falsified multi-modal media assets (text, audio, image, video) to defend against large-scale, automated disinformation attacks.”
This, in DARPA’s view, is needed because of the small but common errors produced by automated systems that manipulate media content.
For example, images of a woman’s face created with generative adversarial networks, which use a database of real photographs to produce a synthetic face, might include mismatched earrings – a semantic error easier to spot than to avoid making.
According to DARPA, current media manipulation tools rely heavily on ingesting and processing massive amounts of data, which makes them more prone to mistakes that can be spotted with the right algorithm.
“These semantic failures provide an opportunity for defenders to gain an asymmetric advantage,” DARPA wrote. “A comprehensive suite of semantic inconsistency detectors would dramatically increase the burden on media falsifiers, requiring the creators of falsified media to get every semantic detail correct, while defenders only need to find one, or a very few, inconsistencies.”
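The asymmetry DARPA describes can be illustrated with a toy sketch (this is not DARPA's actual system; the check names and asset fields below are invented for illustration). The falsifier must pass every semantic check, while the defender only needs a single check to fail:

```python
# Illustrative sketch, not DARPA's SemaFor: model the "asymmetric advantage"
# as a suite of independent semantic-consistency checks. A falsified asset
# must pass ALL checks; a defender wins if ANY one of them fails.
from typing import Callable, Dict, List

# Hypothetical semantic checks; field names are assumptions for this example.
def earrings_match(asset: Dict) -> bool:
    """Both earrings in a synthetic face image should match."""
    return asset.get("left_earring") == asset.get("right_earring")

def lighting_consistent(asset: Dict) -> bool:
    """Shadow direction should agree with the apparent light source."""
    return asset.get("shadow_direction") == asset.get("light_direction")

def metadata_plausible(asset: Dict) -> bool:
    """Claimed capture time should fit the scene (no 'noon' night shots)."""
    return not (asset.get("is_night_scene") and asset.get("claimed_time") == "noon")

CHECKS: List[Callable[[Dict], bool]] = [
    earrings_match,
    lighting_consistent,
    metadata_plausible,
]

def detect_inconsistencies(asset: Dict) -> List[str]:
    """Return the names of all semantic checks the asset fails."""
    return [check.__name__ for check in CHECKS if not check(asset)]

# A synthetic face with mismatched earrings fails one check -- and one is enough.
fake = {"left_earring": "gold hoop", "right_earring": "silver stud",
        "shadow_direction": "left", "light_direction": "left",
        "is_night_scene": False, "claimed_time": "noon"}
print(detect_inconsistencies(fake))  # ['earrings_match']
```

The point of the design is that each new check multiplies the falsifier's burden, while the defender's cost of adding a check stays roughly constant.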
According to the program’s timeline, it is to be ready in approximately four years, if everything goes according to plan.
Notably, in TA4, DARPA plans to start anticipating the “threats” and start proactively fighting them.
“TA4 will curate state-of-the-art (SOTA) challenges drawn from the public domain to ensure that the SemaFor program addresses relevant threat scenarios. TA4 will also develop threat models, based on current and anticipated technology, to help ensure that SemaFor defenses will be highly relevant for the foreseeable future. TA4 will include multiple challenge problem curation teams who will collaborate to maximize coverage of the challenge space and threat models. TA4 will regularly deliver challenges and updated threat models to the TA3 evaluation team and DARPA.”
These are the promised deliverables expected to result from the phases of development and work:
This is simply the initial stage, in which work on the tools begins and information is aggregated: “TA1 performers will deliver algorithms that detect, characterize, and attribute falsified multi-modal media”;
What has been developed is then applied in practice, at least in part – “The TA2 performer will work with the TA1 performers to integrate algorithms, knowledge, and resources from TA1 performers into the TA2 system. The gathered information will then be reviewed by an analyst; it will not simply be automated. TA2 will also deliver periodic proof-of-concept systems that integrate multiple TA1 components into a SemaFor system targeting scalable cloud deployment. TA2 will be expected to provide a demonstration to the government in each program phase of the progressing capabilities of the SemaFor system.”
This is where it gets interesting. Initially, the description of the program suggests that this will curate and look for fake news and disinformation. This is where it is actually admitted that media content will also be created – “TA3 will design, organize, plan, and conduct the SemaFor evaluations and results analysis. While the evaluations will be conducted at the TA2 performer’s location, they will be under the control and supervision of the TA3 performer. TA3 deliverables include program metrics, evaluation protocols, and a library of multi-modal media assets for development and test purposes. Media may be collected or created.”
And as previously mentioned, TA4 then focuses on threat anticipation and on supporting “hackathons” – “TA4 will deliver challenge problems and threat models in support of hackathons and program evaluations.”
DARPA also said it wants to keep a tight lid on some of the technical details of the project, saying it will treat program activities as controlled technical information (CTI). These details would not be classified, but contractors would be barred from sharing or releasing information to other parties since it could “reveal sensitive or even classified capabilities and/or vulnerabilities of operating systems.”
“A key goal of the program is to establish an open, standards-based, multisource, plug-and-play architecture that allows for interoperability and integration,” the announcement stated. “This goal includes the ability to easily add, remove, substitute, and modify software and hardware components in order to facilitate rapid innovation by future developers and users.”
The SemaFor program is most likely a follow-up to an already running DARPA program – MediFor.
MediFor attempts to plug a technological gap in image authentication: no end-to-end system can currently verify manipulation of images taken by digital cameras and smartphones.
“Mirroring this rise in digital imagery is the associated ability for even relatively unskilled users to manipulate and distort the message of the visual media,” according to DARPA. “While many manipulations are benign, performed for fun or for artistic value, others are for adversarial purposes, such as propaganda or misinformation campaigns.”
Most likely to the discontent of many critics of US President Donald Trump, the Republicans, and the current state of elections in the US, the program will only be ready in four years.
“This timeline is too slow and I wonder if it is a bit of PR,” Syracuse University assistant professor of communications Jennifer Grygiel said. “Educating the public on media literacy, along with legislation, is what is important. But elected officials lack motivation themselves for change, and there is a conflict of interest as they are using these very platforms to get elected.”
The current trend of shaping the narrative is becoming more and more evident, with constant claims of “educating the public in media literacy” and other dubious statements of the sort. At least this time it is not specifically aimed at children.