“Deep fake” tech brings Mona Lisa to life

Remember Max Headroom? If you’re a child of the ’80s, you’ll know him: the slightly creepy AI talking head who starred in a satirical British science fiction series about 30 years ago.

He was stilted, digitised, distorted, and had an eerie electronic voice.

In other words: he was the complete opposite of the deep fakes we’re starting to encounter now.

Deep fakes are an emerging, entirely terrifying advancement in technology that allows us to make anyone appear to say, well, anything at all.

For instance, Samsung’s new deep fake tech has animated the Mona Lisa.

The new algorithms, developed by a team from the Samsung AI Center and the Skolkovo Institute of Science and Technology, work most effectively with a variety of sample images taken at different angles. But they can work from as little as a single picture, such as a photo or a painting.

Engineer Egor Zakharov says these advancements will lead to a number of benefits, including “a reduction in long-distance travel and short-distance commute; it will democratise education, and improve the quality of life for people with disabilities. It will distribute jobs more fairly and uniformly around the world, and better connect relatives and friends separated by distance.”

“To achieve all these effects, we need to make human communication in AR and VR as realistic and compelling as possible, and the creation of photorealistic avatars is one (small) step towards this future,” Egor says.

With further research and development, he hopes that in the future, “telepresence systems” will allow us to easily create realistic semblances of ourselves as avatars.

The applications of this tech sound incredible – and incredibly terrifying. After all, as Danielle Citron, author of Hate Crimes in Cyberspace, asks, “What about the average person… where a deep fake sex video emerges with a Google search of your name, and becomes almost impossible to debunk?”

Strategically placed deep fakes could also impact the results of political elections.

“While I don’t think it’s likely,” says digital forensics expert Hany Farid, “I also don’t think it’s out of the question, and that’s enough to keep me up at night.”

Me too, Hany.
