
Deepfakes and Disinformation

How the news-polygraph research project is using an AI platform to counter media manipulation

The Pope as a hip-hop mogul in a white down jacket, the violent arrest of Donald Trump or a supposed video call between Berlin's former Governing Mayor Franziska Giffey and Kyiv's mayor Vitali Klitschko: artificial intelligence can now present us with a false but perfectly staged reality in seemingly real photos and videos. That may have entertainment value, but for many articles and social media posts we have to ask ourselves: what is true and what is not? Today anyone can publish; social media demands neither sources nor multi-level verification. This is a challenge for journalists and media companies, who have to decide within hours whether material they have received is newsworthy and should be included in their coverage.

How does one verify these sources? What standards apply when classifying them? Good tools and awareness-raising are needed at all levels to counter propaganda and fake news. Such a toolkit is currently being developed by the news-polygraph research project, which is coordinated by transfermedia within the MediaTech Hub. The platform, in effect a digital lie detector, is intended to make journalists' work easier in the future.

Checking facts is and always has been one of journalism's central tasks. What is new is the sheer volume of material: reliable tools and greater awareness are needed to make valid assessments. Many media companies have fact-checking desks or offer further training, and some have assembled their own toolbox, for instance a folder of assorted programmes from which the user must then pick, somewhat awkwardly, case by case. news-polygraph builds on existing processes, optimises them and brings them together in a single interface. The research alliance has already entered into dialogue with media partners and surveyed their needs. The project's goal is a platform that offers fast, intuitive support.

Which content is being manipulated – and which isn’t?

transfermedia COO Claudia Wolf, who is responsible for the alliance's project coordination, explains: "From a scientific point of view, we would like to dispense with the popular term 'fake news' in our research project. For us, it's about verifying information. At the end of the day, people decide whether or not something is 'fake'. We have designed a platform that gives media professionals and journalists tools to evaluate texts, videos, audio and photos quickly. That is particularly important for newsrooms."

Via an intuitive user interface, questionable media content can be dragged and dropped into the platform and checked by artificial intelligence. The models and services run in the background and remain invisible to the user. Journalists simply decide in advance what they want to have checked, set the running order of the queries, and choose whether to involve so-called crowd panels.
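As a purely illustrative sketch of such a workflow (the class, field names and check identifiers below are hypothetical and not taken from the project), a check request might be configured roughly like this:

```python
from dataclasses import dataclass, field

@dataclass
class CheckRequest:
    """One verification job as a journalist might configure it via the interface."""
    media_path: str                                # the file dropped onto the platform
    checks: list = field(default_factory=list)     # e.g. ["image_forensics", "audio_analysis"]
    use_crowd_panel: bool = False                  # optionally add a human crowd query

def run_checks(request: CheckRequest) -> dict:
    """Queue the selected checks in the order the journalist chose."""
    results = {}
    for check in request.checks:
        # A real platform would call the corresponding model or service in the background;
        # this sketch only records that the check would be queued.
        results[check] = "queued"
    if request.use_crowd_panel:
        results["crowd_panel"] = "queued"
    return results

job = CheckRequest("suspicious_clip.mp4",
                   checks=["image_forensics", "audio_analysis", "text_verification"],
                   use_crowd_panel=True)
print(run_checks(job))
```

The point of the sketch is only that the journalist, not the system, selects which checks run and in what order; the models behind each check stay out of sight.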

While text and image verification in editorial offices is already well advanced, verifying video sequences is far more complex. This is where the Fraunhofer Institute comes in as a partner: it specialises in audio sequences, including forensic audio analyses for law enforcement. Other alliance partners such as rbb and Deutsche Welle are making equally important contributions.

The human crowd as verifier

Fact-checking with the support of a human crowd is a special feature of news-polygraph. The integrated crowd panels come into play when, for example, whistleblowers pass material to newsrooms that have no contacts in the country concerned who could confirm its authenticity. A mountain range, trees, the vegetation at a certain time of year: people who live on site can clearly identify pictures and videos without needing any technical aids. Such a query works if volunteers are grouped into crowds in advance and, after a registration process, qualify to participate according to their specific knowledge. The crowd input has its roots in academia, including involvement by TU Berlin, which had already run a test phase with native Ukrainian speakers during the early stages of the war in Ukraine. After a crowd query, the programme reports back on whether a predetermined, high percentage of participants made the same statement.
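That final step, checking whether a predefined share of participants agree, can be illustrated with a small sketch; the 80 per cent threshold and the answer labels below are assumptions for illustration, not figures from the project:

```python
from collections import Counter

def crowd_consensus(answers, threshold=0.8):
    """Return the most common answer, its share, and whether it clears the threshold.

    `answers` is a list of responses from registered panel members; the 0.8 threshold
    is an illustrative assumption, not a value taken from news-polygraph.
    """
    if not answers:
        return None, 0.0, False
    top_answer, count = Counter(answers).most_common(1)[0]
    share = count / len(answers)
    return top_answer, share, share >= threshold

# Example: 9 of 10 local participants recognise the landscape in the footage as genuine
verdict, share, threshold_met = crowd_consensus(["authentic"] * 9 + ["unsure"])
print(verdict, f"{share:.0%}", threshold_met)   # authentic 90% True
```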

Here again, it is the journalists who ultimately decide whether they agree with the assessment. As Claudia Wolf points out, "people are the ones who have to make the decision." That is also what the project name "news-polygraph" refers to: like a lie detector, it is only a means to an end. It is not meant to diminish the journalists' own work, because that would be neither possible nor desirable: "It is in the DNA of journalists to classify and evaluate content. We are just providing a simplified, fast, well-functioning and trustworthy tool." The crowd-supported approach not only makes it possible to respond quickly, but also to pursue an additional line of inquiry in much greater detail.

An alliance of specialists

The semantic linking is handled by the transfermedia team, which had already built a metadata ontology for the dwerft research project and is also responsible for project management and for coordinating the ten partners. The technological part is based at Fraunhofer IDMT. Other partner companies are Ubermetrics, which specialises in social media posts; neurocat, which tests AI models for safety compliance; and the German Research Centre for Artificial Intelligence and Delphai, which provide a Google alternative for B2B purposes.

Looking to the future, the MediaTech Hub will work with news-polygraph on the technical and media level, while the Medienanstalt Berlin-Brandenburg (mabb) will handle legal matters. The successful completion of the research project should result in marketable software that media companies can integrate into their own systems, with regular updates made available.

The project began on 3 May, moving from the preparatory phase to implementation. Media professionals can look forward to an event in November at which the project's partners will show what to expect from news-polygraph in the coming years.

Photo: DeepMind on Unsplash
