Want to spot a deepfake? Look for the stars in their eyes

In an era when the creation of artificial intelligence (AI) images is at the fingertips of the masses, the ability to detect fake pictures — particularly deepfakes of people — is becoming increasingly important.

So what if you could tell just by looking into someone’s eyes?

That’s the compelling finding of new research shared at the Royal Astronomical Society’s National Astronomy Meeting in Hull, which suggests that AI-generated fakes can be spotted by analysing human eyes in the same way that astronomers study pictures of galaxies.

The crux of the work, by University of Hull MSc student Adejumoke Owolabi, centres on the reflections in a person’s eyeballs.

If the reflections match, the image is likely to be of a real human. If they don’t, it is probably a deepfake.

“The reflections in the eyeballs are consistent for the real person, but incorrect (from a physics point of view) for the fake person,” said Kevin Pimbblet, professor of astrophysics and director of the Centre of Excellence for Data Science, Artificial Intelligence and Modelling at the University of Hull.

Researchers analysed reflections of light on the eyeballs of people in real and AI-generated images. They then employed methods typically used in astronomy to quantify the reflections and checked for consistency between left and right eyeball reflections.

Fake images often lack consistency in the reflections between each eye, whereas real images generally show the same reflections in both eyes.

“To measure the shapes of galaxies, we analyse whether they’re centrally compact, whether they’re symmetric, and how smooth they are. We analyse the light distribution,” said Professor Pimbblet.

“We detect the reflections in an automated way and run their morphological features through the CAS [concentration, asymmetry, smoothness] and Gini indices to compare similarity between left and right eyeballs.

“The findings show that deepfakes have some differences between the pair.”
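The article doesn’t spell out the team’s pipeline, but the comparison step it describes can be illustrated with a minimal Python sketch: crop the two eye regions, compute the same morphological index on each, and flag the face when the values disagree. The `left_box`/`right_box` coordinates, the `morph_index` argument and the `threshold` value are all hypothetical placeholders, standing in for an upstream eye detector and whichever statistic (such as the Gini index described below) is used.

```python
import numpy as np

def reflections_consistent(image, left_box, right_box, morph_index, threshold=0.1):
    """Crop the two eye regions of a face image and compare the same
    morphological index measured on each; a small difference is taken
    as evidence of a real face under this simple heuristic."""
    img = np.asarray(image, dtype=float)
    r0, r1, c0, c1 = left_box
    left_patch = img[r0:r1, c0:c1]
    r0, r1, c0, c1 = right_box
    right_patch = img[r0:r1, c0:c1]
    return abs(morph_index(left_patch) - morph_index(right_patch)) < threshold
```

Here `morph_index` could be, for example, a Gini-style statistic like the one sketched after the next paragraphs, and the eye boxes would come from whatever face and eye detector sits upstream.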

The Gini coefficient is normally used to measure how the light in an image of a galaxy is distributed among its pixels. This measurement is made by ordering the pixels that make up the image of a galaxy in ascending order by flux and then comparing the result to what would be expected from a perfectly even flux distribution.

A Gini value of 0 corresponds to a galaxy whose light is evenly distributed across all of the image’s pixels, while a value of 1 corresponds to a galaxy with all of its light concentrated in a single pixel.
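For illustration, a common formulation of this statistic from the galaxy-morphology literature can be written in a few lines of Python/NumPy; this is a generic sketch rather than the study’s own code.

```python
import numpy as np

def gini(pixel_fluxes):
    """Gini coefficient of a patch of pixel fluxes: 0 when the light is
    spread evenly over every pixel, 1 when it all sits in one pixel."""
    flux = np.sort(np.abs(np.ravel(pixel_fluxes)))   # ascending order by flux
    n = flux.size
    if n < 2 or flux.sum() == 0:
        return 0.0
    ranks = np.arange(1, n + 1)                      # pixel ranks 1..n after sorting
    return np.sum((2 * ranks - n - 1) * flux) / (flux.mean() * n * (n - 1))

# A uniform patch gives 0; a single bright pixel on a dark patch gives 1.
print(gini(np.ones((10, 10))))                       # 0.0
spike = np.zeros((10, 10)); spike[5, 5] = 1.0
print(gini(spike))                                   # 1.0
```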

The team also tested CAS parameters, a set of measures originally developed by astronomers to characterise the light distribution of galaxies and determine their morphology, but found they were not a successful predictor of fake eyes.
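For reference, the asymmetry and smoothness (“clumpiness”) terms of CAS can be sketched as below, in a simplified form that skips the centring and background corrections astronomers normally apply; again, this is illustrative rather than the researchers’ code.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def asymmetry(image):
    """A (asymmetry): difference between the patch and a 180-degree
    rotation of itself, normalised by the total flux."""
    img = np.asarray(image, dtype=float)
    rotated = np.rot90(img, 2)
    total = np.abs(img).sum()
    return np.abs(img - rotated).sum() / total if total else 0.0

def smoothness(image, box=3):
    """S (smoothness/clumpiness): difference between the patch and a
    boxcar-smoothed copy of itself, normalised by the total flux."""
    img = np.asarray(image, dtype=float)
    smoothed = uniform_filter(img, size=box)
    total = np.abs(img).sum()
    return np.abs(img - smoothed).sum() / total if total else 0.0
```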

“It’s important to note that this is not a silver bullet for detecting fake images,” Professor Pimbblet added.

“There are false positives and false negatives; it’s not going to get everything. But this method provides us with a basis, a plan of attack, in the arms race to detect deepfakes.”
