
SynthID-Image: Image watermarking at internet scale

October 10, 2025
作者: Sven Gowal, Rudy Bunel, Florian Stimberg, David Stutz, Guillermo Ortiz-Jimenez, Christina Kouridi, Mel Vecerik, Jamie Hayes, Sylvestre-Alvise Rebuffi, Paul Bernard, Chris Gamble, Miklós Z. Horváth, Fabian Kaczmarczyck, Alex Kaskasoli, Aleksandar Petrov, Ilia Shumailov, Meghana Thotakuri, Olivia Wiles, Jessica Yung, Zahra Ahmed, Victor Martin, Simon Rosen, Christopher Savčak, Armin Senoner, Nidhi Vyas, Pushmeet Kohli
cs.AI

Abstract

We introduce SynthID-Image, a deep learning-based system for invisibly watermarking AI-generated imagery. This paper documents the technical desiderata, threat models, and practical challenges of deploying such a system at internet scale, addressing key requirements of effectiveness, fidelity, robustness, and security. SynthID-Image has been used to watermark over ten billion images and video frames across Google's services, and its corresponding verification service is available to trusted testers. For completeness, we present an experimental evaluation of an external model variant, SynthID-O, which is available through partnerships. We benchmark SynthID-O against other post-hoc watermarking methods from the literature, demonstrating state-of-the-art performance in both visual quality and robustness to common image perturbations. While this work centers on visual media, the conclusions on deployment, constraints, and threat modeling generalize to other modalities, including audio. This paper provides comprehensive documentation for the large-scale deployment of deep learning-based media provenance systems.
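To make the embed-then-verify workflow and the robustness evaluation mentioned above concrete, the following is a minimal, hypothetical sketch in Python (using numpy and Pillow), not the SynthID-Image model: it embeds a payload as a faint spread-spectrum residual in a synthetic cover image and reports bit accuracy after JPEG re-compression, one of the common perturbations such benchmarks consider. All names and parameter values here (PAYLOAD_BITS, ALPHA, the quality setting) are illustrative choices, not values from the paper.

# Illustrative sketch only (Python, numpy, Pillow). This is NOT the SynthID-Image
# method; it is a toy spread-spectrum watermark used to show how robustness to a
# common perturbation (JPEG re-compression) can be measured as bit accuracy.
import io
import numpy as np
from PIL import Image

RNG_SEED = 0        # hypothetical fixed key shared by embedder and detector
PAYLOAD_BITS = 64   # hypothetical payload length
ALPHA = 2.0         # hypothetical embedding strength (pixel-intensity units)

def _carriers(shape, n_bits, seed=RNG_SEED):
    # One pseudo-random +/-1 carrier pattern per payload bit, derived from the key.
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=(n_bits,) + shape)

def embed(image, bits, alpha=ALPHA):
    # Add a faint residual encoding `bits` into a grayscale image.
    img = np.asarray(image.convert("L"), dtype=np.float64)
    carriers = _carriers(img.shape, len(bits))
    signs = np.where(np.asarray(bits) > 0, 1.0, -1.0)
    residual = alpha * np.tensordot(signs, carriers, axes=1) / np.sqrt(len(bits))
    return Image.fromarray(np.clip(np.rint(img + residual), 0, 255).astype(np.uint8))

def detect(image, n_bits=PAYLOAD_BITS):
    # Blind detection: correlate the (mean-centred) image with each carrier.
    img = np.asarray(image.convert("L"), dtype=np.float64)
    carriers = _carriers(img.shape, n_bits)
    scores = np.tensordot(carriers, img - img.mean(), axes=2)
    return (scores > 0).astype(int)

def jpeg(image, quality=90):
    # Round-trip through JPEG compression, a typical benign perturbation.
    buf = io.BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    payload = rng.integers(0, 2, PAYLOAD_BITS)
    # A flat mid-grey cover keeps this toy blind detector reliable; real covers
    # require learned embedders and detectors such as those described in the paper.
    cover = Image.fromarray(np.full((256, 256), 128, dtype=np.uint8))
    marked = embed(cover, payload)
    print("bit accuracy, no perturbation:", (detect(marked) == payload).mean())
    print("bit accuracy, after JPEG-90:  ", (detect(jpeg(marked)) == payload).mean())

The same embed, perturb, detect protocol underlies the robustness benchmarks referred to in the abstract; production systems typically replace the hand-crafted carrier patterns with learned encoder and decoder networks and evaluate across many perturbations (compression, cropping, noise, rescaling) and far larger image sets.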