Deepfake used to attack activist couple shows new disinformation frontier

A combination photograph showing an image purporting to be of British student and freelance writer Oliver Taylor (L) and a heat map of the same photograph produced by Tel Aviv-based deepfake detection company Cyabra is seen in this undated handout photo obtained by Reuters. The heat map, which was produced using one of Cyabra's algorithms, highlights areas of suspected computer manipulation. The digital inconsistencies were one of several indicators used by experts to determine that Taylor was an online mirage. Cyabra/Handout via REUTERS

By Raphael Satter

WASHINGTON (Reuters) – Oliver Taylor, a student at England’s University of Birmingham, is a twenty-something with brown eyes, light stubble, and a slightly stiff smile.

Online profiles describe him as a coffee lover and politics junkie who was raised in a traditional Jewish home. His half-dozen freelance editorials and blog posts reveal an active interest in anti-Semitism and Jewish affairs, with bylines in the Jerusalem Post and the Times of Israel.

The catch? Oliver Taylor seems to be an elaborate fiction.

His university says it has no record of him. He has no obvious online footprint beyond an account on the question-and-answer site Quora, where he was active for two days in March. Two newspapers that published his work say they have tried and failed to confirm his identity. And experts in deceptive imagery used state-of-the-art forensic analysis programs to determine that Taylor’s profile photo is a hyper-realistic forgery – a “deepfake.”

Who is behind Taylor isn’t known to Reuters. Calls to the U.K. phone number he supplied to editors drew an automated error message and he didn’t respond to messages left at the Gmail address he used for correspondence.

Reuters was alerted to Taylor by London academic Mazen Masri, who drew international attention in late 2018 when he helped launch an Israeli lawsuit against the surveillance company NSO on behalf of alleged Mexican victims of the company’s phone hacking technology.

In an article in U.S. Jewish newspaper The Algemeiner, Taylor had accused Masri and his wife, Palestinian rights campaigner Ryvka Barnard, of being “known terrorist sympathizers.”

Masri and Barnard were taken aback by the allegation, which they deny. But they were also baffled as to why a university student would single them out. Masri said he pulled up Taylor’s profile photo. He couldn’t put his finger on it, he said, but something about the young man’s face “seemed off.”

Six experts interviewed by Reuters say the image has the characteristics of a deepfake.

“The distortion and inconsistencies in the background are a tell-tale sign of a synthesized image, as are a few glitches around his neck and collar,” said digital image forensics pioneer Hany Farid, who teaches at the University of California, Berkeley.

Artist Mario Klingemann, who regularly uses deepfakes in his work, said the photo “has all the hallmarks.”

“I’m 100 percent sure,” he said.

‘A VENTRILOQUIST’S DUMMY’

The Taylor persona is a rare in-the-wild example of a phenomenon that has emerged as a key anxiety of the digital age: the marriage of deepfakes and disinformation.

The threat is drawing increasing concern in Washington and Silicon Valley. Last year House Intelligence Committee chairman Adam Schiff warned that computer-generated video could “turn a world leader into a ventriloquist’s dummy.” Last month Facebook announced the conclusion of its Deepfake Detection Challenge – a competition intended to help researchers automatically identify falsified footage. Last week online publication The Daily Beast revealed a network of deepfake journalists – part of a larger group of bogus personas seeding propaganda online.

Deepfakes like Taylor are dangerous because they can help build “a totally untraceable identity,” said Dan Brahmy, whose Israel-based startup Cyabra specializes in detecting such images.

Brahmy said investigators chasing the origin of such photos are left “searching for a needle in a haystack – except the needle doesn’t exist.”

Taylor appears to have had no online presence until he started writing articles in late December. The University of Birmingham said in a statement it could not find “any record of this individual using these details.” Editors at the Jerusalem Post and The Algemeiner say they published Taylor after he pitched them stories cold over email. He didn’t ask for payment, they said, and they didn’t take aggressive steps to vet his identity.

“We’re not a counterintelligence operation,” Algemeiner Editor-in-chief Dovid Efune said, although he noted that the paper had introduced new safeguards since.

After Reuters began asking about Taylor, The Algemeiner and the Times of Israel deleted his work. The Jerusalem Post removed Taylor’s article after Reuters published this story. Taylor emailed the Times of Israel and Algemeiner protesting the deletions, but Times of Israel Opinion Editor Miriam Herschlag said she rebuffed him after he failed to prove his identity. Efune said he didn’t respond to Taylor’s messages.

The Israeli news site Arutz Sheva, which also published Taylor, has kept his articles online, although it removed the “terrorist sympathizers” reference following a complaint from Masri and Barnard. Editor Yoni Kempinski said only that “in many cases” news outlets “use pseudonyms to byline opinion articles.” Kempinski declined to elaborate or say whether he considered Taylor a pseudonym.

Oliver Taylor’s articles drew minimal engagement on social media, but the Times of Israel’s Herschlag said they were still dangerous – not only because they could distort the public discourse but also because they risked making people in her position less willing to take chances on unknown writers.

“Absolutely we need to screen out impostors and up our defenses,” she said. “But I don’t want to set up these barriers that prevent new voices from being heard.”

(Reporting by Raphael Satter; editing by Chris Sanders and Edward Tobin)