Yesterday’s BBC News at Ten had a fun little segment on “fake news”. No, it wasn’t a crisis of contrite self-awareness, rather a piece on the dark potential of facial manipulation in post-production editing suites, which can overdub footage of people talking with different audio and make them appear to say something else entirely.

The example they used, which is particularly sh*tty for a news programme, was the possibility of the hit BBC drama, Luther, being overdubbed for foreign viewers, with Sexiest Man Alive™ Idris Elba’s mouth contorting to form words of a language he doesn’t speak.

It doesn’t take a genius to extrapolate that the BBC’s possible gain in selling their enhanced programmes abroad could be the truth’s loss in other areas of life. The hipster video editor questioned on the subject paid the dangers very casual lip service, appropriately, by suggesting safeguards to ensure the technology is not used for nefarious ends. Because if you’re willing to edit a video to make it look like someone said something they didn’t, you’d surely sign up to a code of conduct first, and be sure to stick to it.

We’ve seen, in recent days, the White House - they be the ones with all the power amongst global civilisation, if you weren’t aware - chucking out a video on Twitter, seemingly edited to make a guy giving a physical “F*ck no!” to someone trying to take a microphone off him look like he’s about to tell something a third time to a woman with the proverbial two black eyes. To be clear, the manufactured certainty of an assault in the edited video doesn’t remove the possibility that the raw footage shows no such thing, but that’s splitting hairs. This shoddy, technology-assisted lie - which crucially became the truth for millions when President Donald Trump (just checked, and yep) denied any alteration, rather than apologising and pinning it on InfoWars - is a sign of things to come. Terrible things to come.

Video can be sped up or slowed down, and frames can be added or taken away. We don’t know what’s real and what’s not, but I’m inclined to think Jim Acosta was made to look worse than he is by the Right, rather than better than he is by the Left or Centre. Regardless, this is the news warfare of the future. One person’s word against another’s, and the facial and movement manipulation techniques will only get better and faster.

Take this video of Barack Obama. As explained, he never said those things, as much as it looks like he’s saying them. It’s scary, and it was done by a comedian with a relatively modest budget. Imagine what the CIA could do. In fairness, I’m also imagining how Jordan Peele’s other hugely famous Obama video could possibly be any more entertaining than it already is, but it could happen. The bad stuff too. To think I used to be impressed by a photograph of someone wearing something they’d never worn, occasionally with intent to mislead.

But what’s the issue with the Obama trick vid? Yeah, it doesn’t really sound like Obama. Enter horseman number two.

If you click through here, you’ll find one of the most terrifying videos on YouTube. Yep, it’s pretty popular too. Adobe have created software which, to sum up, means that by recording twenty minutes of speech from any person, you can then type out as much or as little as you like for “them to say” (in their own voice). That is to say, we can now literally (figuratively) put words into people’s mouths. Remember the vocal technology used in the Be Right Back episode of Black Mirror five years ago? Fun when speculative fiction becomes fact, right?

Having painstakingly spliced syllables together when editing distorted tracks for podcasts - never to misrepresent - and having been told by a music producer friend that he routinely does similar to capture the “best” vocal, I find this software hugely attractive. But when you consider the implications of the horseman-on-horseman action, oh, the abyss is f*cking calling.

If you watch that whole video, you’ll perhaps be reminded of the hipster from earlier. This technology is so powerful, Adobe recognise, that we need to ensure it is not abused. So we put, what, signatures denoting an edit into the background of saved productions? Every technology is being appropriated by villains. Chances are it got a*seholes Brexit and King A*sehole into the Oval Office. Villains’ll make use of this too.

All that is necessary for the triumph of evil is that good men do nothing. The final horseman, number three (I know there are four! I’m an actual Catholic and everything!), is the combination of corruption, ineptitude and unconscientiousness. That is what will see the technology abused. Not artificial intelligence, which we assume will develop evil before compassion, but the human beings who stick the good person inside them into an internal jail, and allow the social construct monsters to take over their decision-making.

When a police force, and the Crown Prosecution Service, and a prosecution legal team, submit a conveniently incomplete set of electronic messages as evidence to secure a rape conviction, rather than presenting a comprehensively accurate picture of the dialogue between accused and accuser, heaven knows what the corrupt, inept and/or unconscientious will do with technology which enables them to create audiovisual evidence of someone saying something they never said.

And who will suffer? The bad? Or those with the least resources and inclination to fight this fire with fire of their own?

So what do we do? I’m afraid you’ve stumbled upon the wrong blog site if you think I have an answer. Clearly, we’re f*cked. Or, at least, the truth is. Maybe we’ll find out the real truth in the afterlife. Or when we take off our virtual reality headsets. Perhaps, and that’s our last chance, technology isn’t all bad.

Follow Jay29ers on Twitter

 
