His animated image and others like it -- at once unsettling, emotional, and a bit fantastical -- are made possible by Deep Nostalgia, an artificial intelligence program from the genealogy platform MyHeritage.
As far as AI-animated images go, the technology behind these Harry Potter-esque photos isn't particularly complex.
Users are invited to supply old photos of their loved ones, and the program uses deep learning to apply predetermined movements to their facial features. It also fills in small details that aren't in the original photo, like the reveal of teeth or the side of a head. The result is, if not an entirely natural effect, then a deeply arresting one.
Responses to the Deep Nostalgia images -- tears at seeing a grandmother's smile, an eerie feeling of connection to a long-dead historical icon -- knock on a mysterious emotional wall between us and this type of rapidly evolving technology.
We rely on perception and emotion
"The draw here is that visual imagery is visceral and compelling and we respond to it," says Hany Farid, associate dean and head of the School of Information at UC Berkeley. "We are visual beings. When you see your grandmother or Mark Twain come alive, there's something fascinating about it."
Fascinating -- and yes, a little frightening.
Our brains, as sophisticated as they are, have a prehistoric response to things that are almost human, but not quite. This is commonly called the uncanny valley, and a lot of deepfakes and AI-driven image manipulations set off this ancient alarm bell. Even MyHeritage addresses this reaction in its explanation of the program.
"Indeed, the results can be controversial and it's hard to stay indifferent to this technology," its FAQ page reads.
When it's a beloved relative occupying that almost-but-not-quite space in reality, the parts of our brain that love and fear are pitted against each other, even if we know full well that what we're looking at isn't real.
"The way our brain processes images of people is different than inanimate objects. It taps into neural circuitry," Farid says. "For years we have been able to synthesize inanimate objects, and that completely fools the visual system because we don't have preconceived notions of how they move. But when it comes to humans, it is lagging. Part of that is the subtle way we move and recognize these movements."
"My sense of wonder might be tinged with a sense of terror," said La Marr Jurelle Bruce, a professor at the University of Maryland who shared the animated image of Frederick Douglass, capturing the attention of hundreds of thousands of people online.
AI relies on data and rules
Deepfakes, which are a sophisticated melding of synthetic audio and images, have been a point of contention for digital ethicists for years, especially when it comes to issues like altered pornography and fake videos that could threaten national and financial institutions.
For a more positive application, companies have been turning to the technology more and more to create widely customizable ad campaigns. The payoff, experts say, is that the consumer can feel more attached to a brand or a product by seeing the specific way it would fit into their life -- on a model with similar proportions, in their own language, or in a micro-targeted ad that speaks to their interests.
Those kinds of applications are courting a similar kind of human connection as Deep Nostalgia. But the fact is, there's nothing human about artificial intelligence.
Farid is careful to point out that machine learning, which is what drives more widely available animation technologies like Deep Nostalgia, is a field within the greater world of artificial intelligence. Machine learning pores over data and finds patterns. While a program may get better with more input, there isn't any intelligence or analysis involved in the way it applies these patterns.
There are many applications that benefit greatly from this type of data.
"When you're predicting the stock market, you want patterns," Farid offers as an example. "Or doing cancer diagnoses. I don't need to understand at the moment why cancer shows up, I just want to know if it does."
When applied to more human pursuits, the lack of, well, intelligence shows through.
The smiling faces of our ancestors, though touching, naturally don't hold up once we drop our suspension of disbelief.
AI-generated renderings of human faces, another threat to the security of our online environs, often contain hilarious glitches where a program, not quite sure what to do with irregular things like ears or glasses, spits out little monstrosities hidden in otherwise convincing visages.
Even extremely sophisticated, true deepfakes, like the elaborate ones of Tom Cruise currently being passed around, often have small inconsistencies that make us second-guess our own sense of reality.
Those inconsistencies, however, will lessen as the technology develops, and Farid says the time has come for companies to look critically at the ethical implications of using them.
"The tech sector has done things because they can and not because they should," he says. "We need to stop building things because they're cool and start asking these hard questions before it's too late."
Before, say, the technology gets so good that our emotions are able to override our keen senses of perception.
When using Deep Nostalgia, MyHeritage cautions users against uploading photos of living people without their consent and says the company did not add audio options in the interest of user safety.
In the future, perhaps another program will be able to fill in those gaps, and we'll be able to see, hear and converse with those long lost to us. Such technology will present stunning challenges to our security and our sense of reality as we know it.
But when it's smiling at us through the comforting faces of our dearest loved ones, it will be that much harder to resist.
(The-CNN-Wire & © 2021 Cable News Network, Inc., a Time Warner Company. All rights reserved.)