Research & Technology April 13, 2026 · 7 min read

Why Your Camera Might Be the Next Deepfake Detector, and Why That's Not Enough Yet

ETH Zurich published a chip that cryptographically signs images the moment they're captured. Google's Pixel 10 already ships with hardware-backed photo signing by default. The technology is real, it's here, and it still doesn't solve the problem for most of the images you'll encounter today.


On March 24, 2026, researchers at ETH Zurich published a paper in Nature Electronics describing a sensor chip that embeds a cryptographic signature into images, video, and audio at the moment of capture. This happens at the hardware level, before the data ever touches software.

Fernando Cardes, a research associate in Andreas Hierlemann's Biosystems Engineering lab, leads the project. The idea started in 2017 as a side project at ETH's Bio Engineering Lab in Basel, before large language models, before Midjourney, before deepfake fraud was a line item on any security budget. The chip is only now approaching commercial readiness, nine years on.

How the Chip Works

The chip records three things at capture: device origin, timestamp, and tamper state, and binds them to the captured data in a cryptographic signature. That signature travels with the media file. Any post-capture manipulation, whether cropping, color adjustment, or a face swap, breaks the match between content and signature and leaves detectable traces. To forge a clean signature, you'd need physical access to the chip itself; you can't do it in software. The team proposes uploading signatures to a public immutable ledger, such as a blockchain, so anyone can verify authenticity without specialised tools.
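The paper's exact scheme isn't spelled out here, but the shape of the idea is simple to sketch. The snippet below is only an illustration in software, assuming an Ed25519 key pair provisioned on the device; the real chip does the equivalent in hardware, before any software touches the data, and its actual signature format may differ.

```python
# Illustrative sketch only; not the ETH Zurich chip's actual scheme.
# Assumes an Ed25519 device key pair provisioned at manufacture.
from dataclasses import dataclass, asdict
from hashlib import sha256
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


@dataclass
class CaptureRecord:
    device_id: str      # device origin
    timestamp: float    # moment of capture
    tampered: bool      # tamper state reported by the sensor
    content_hash: str   # SHA-256 of the raw sensor data


def sign_capture(raw_pixels: bytes, device_id: str,
                 device_key: Ed25519PrivateKey) -> tuple[CaptureRecord, bytes]:
    """Bind origin, time, and tamper state to a hash of the raw capture."""
    record = CaptureRecord(
        device_id=device_id,
        timestamp=time.time(),
        tampered=False,
        content_hash=sha256(raw_pixels).hexdigest(),
    )
    payload = json.dumps(asdict(record), sort_keys=True).encode()
    return record, device_key.sign(payload)


def verify_capture(raw_pixels: bytes, record: CaptureRecord, signature: bytes,
                   device_pubkey: Ed25519PublicKey) -> bool:
    """Any edit to the pixels changes the hash, so the signature no longer verifies."""
    if sha256(raw_pixels).hexdigest() != record.content_hash:
        return False
    payload = json.dumps(asdict(record), sort_keys=True).encode()
    try:
        device_pubkey.verify(signature, payload)
        return True
    except InvalidSignature:
        return False
```

Verification needs the device's public key, which is one reason the team proposes publishing signatures to a public ledger: it gives anyone a fixed point to check against without trusting the person who handed them the file.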

A working prototype exists. A patent has been filed. Commercial implementation is still a few steps away.

The Cameras That Are Already Doing This

Hardware signing isn't just a lab experiment. The Coalition for Content Provenance and Authenticity has been pushing its C2PA standard into real devices for two years.

  • Leica shipped the SL3-S with native C2PA signing in January 2025, one of the first cameras to do so commercially.
  • Nikon, Canon, Fujifilm, and Panasonic all joined the Content Authenticity Initiative in April 2025.
  • Sony announced the PXW-Z300 at IBC 2025, the first camcorder with C2PA signing built in, creating a chain of custody from camera to audience.
  • Google Pixel 10 signs every photo by default using hardware-backed keys in the Titan M2 chip and on-device timestamping via Tensor G5.

C2PA 2.3, released in December 2025, extended provenance coverage to live streaming via CMAF segment signing, a first for the standard. More than 6,000 organisations now back C2PA, and Samsung devices have integrated it natively alongside Google's. The conformance program, launched in mid-2025, means you can actually check whether a product's signing claims have been verified, not just marketed.

The scale of the problem this is responding to: Identity fraud linked to deepfakes hit 4.2 million cases in Q1 2026 alone, a 217% increase from Q1 2024. Deepfakes now account for 6.5% of all fraud attacks globally, a 2,137% increase since 2022. And according to a Forbes report from April 9, "Deepfake-as-a-Service" platforms, complete with plug-and-play toolkits available on dark web marketplaces, are emerging as the new Ransomware-as-a-Service. People correctly identify high-quality deepfakes only 24.5% of the time.

The Problem Hardware Signing Doesn't Solve

The field has placed two parallel bets: hardware signing at capture, and detection after the fact.

Hardware signing is a strong guarantee for new content captured on new devices. It does nothing for content that already exists. Billions of images circulating online right now carry no provenance metadata. Old content doesn't get a retroactive signature. A deepfake created from a 2019 photo has no signing chain to break.

The second problem is metadata stripping. C2PA provenance data is frequently removed when images and videos pass through social media upload and transcoding pipelines. The verification chain breaks at the exact point where most people consume content. The standard acknowledges this as a critical gap. There's no clean fix yet, because platforms have their own compression and format pipelines that predate provenance requirements.
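A related practical check: you can often tell whether provenance data survived a platform's pipeline by looking for the container it ships in. In JPEG files, C2PA manifests are carried in APP11 marker segments as JUMBF boxes. The sketch below only detects whether such segments are present; it does not parse or verify a manifest, and the function name is mine, not part of any C2PA tooling.

```python
# Minimal presence check, not a C2PA parser or verifier.
import struct


def has_app11_segments(path: str) -> bool:
    """Return True if a JPEG contains APP11 segments, where C2PA manifests live."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:2] != b"\xff\xd8":        # not a JPEG
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:            # lost sync with the marker structure
            break
        marker = data[i + 1]
        if marker == 0xDA:             # start of scan: no more metadata segments
            break
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        if marker == 0xEB:             # APP11
            return True
        i += 2 + length                # length field includes its own two bytes
    return False
```

Run this on a photo straight off a signing camera and again after it has been re-uploaded through a social platform, and you will often see exactly the stripping the standard is worried about.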

The third problem is adoption lag. Most cameras in use globally today, and most phones, have no signing capability. The Pixel 10 is a real step forward, but the vast majority of smartphones on the market don't sign photos at the hardware level. Even by optimistic projections, mainstream hardware signing is a multi-year rollout.

What Happens When There's No Provenance Signal

When you encounter a photo or video with no C2PA metadata, no digital signature, no provenance trail, you can't simply conclude it's fake. The absence of a signature is not evidence of manipulation. It might be an older device. It might have passed through a platform that stripped the data. It might be perfectly authentic.

This is exactly where detection-after-the-fact tools still matter. Visual and audio analysis can surface compression artifacts, lighting inconsistencies, physiological implausibilities, and generative model fingerprints that reveal manipulation even when no chain-of-custody data exists.
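FakeOut's own models aren't the subject here, so take the following as a toy illustration of just one such signal: many generative models leave characteristic patterns in an image's frequency spectrum, and a radially averaged power spectrum is a common way forensics work summarises them for comparison against natural photos. The function and bin count below are illustrative choices, not a production detector.

```python
# Toy example of a spectral-fingerprint feature, not a complete detector.
import numpy as np


def radial_power_spectrum(gray: np.ndarray, bins: int = 64) -> np.ndarray:
    """Radially averaged log power spectrum of a grayscale image (2D float array)."""
    f = np.fft.fftshift(np.fft.fft2(gray))
    power = np.log1p(np.abs(f) ** 2)

    h, w = gray.shape
    cy, cx = h // 2, w // 2
    y, x = np.indices((h, w))
    r = np.sqrt((y - cy) ** 2 + (x - cx) ** 2)
    r_norm = r / r.max()

    profile = np.empty(bins)
    edges = np.linspace(0.0, 1.0, bins + 1)
    for i in range(bins):
        mask = (r_norm >= edges[i]) & (r_norm < edges[i + 1])
        profile[i] = power[mask].mean() if mask.any() else 0.0
    return profile
```

A classifier trained on profiles like these (or on richer features in the same spirit) can flag images whose high-frequency behaviour doesn't look like a camera's, which is the kind of evidence that survives even when every byte of metadata has been stripped.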

The two approaches aren't competing. Hardware provenance tells you a file is authentic when a signature exists and validates. Detection tools cover the gap when it doesn't, which for the foreseeable future is most of what you'll see.
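In practice that combination is a triage, and it's short enough to write down. The helpers referenced below are the illustrative sketches from earlier in this piece, not any standard's or product's real API.

```python
# Rough triage combining the hypothetical helpers sketched above.
import numpy as np


def assess(path: str, gray_pixels: np.ndarray) -> str:
    """Provenance first; content-based detection as the fallback."""
    if has_app11_segments(path):
        # A manifest is present: a real pipeline would now validate its
        # certificate chain and hashes before trusting the file.
        return "provenance present: validate the manifest"
    # No provenance trail. Absence is not evidence of manipulation,
    # so fall back to content-based signals such as spectral features.
    profile = radial_power_spectrum(gray_pixels)
    return f"no provenance: run detection ({profile.size}-bin spectral profile computed)"
```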

FakeOut runs detection analysis on images and video without requiring provenance metadata. It works on unsigned content from any source or device. It's free on Android, with an iOS beta currently in development. The analysis is based on the content itself: visual artifacts, lighting inconsistencies, physiological patterns, and generative model fingerprints that exist in the file regardless of what metadata is or isn't attached.