Deepfakes and Authenticity
Generative AI tools are improving rapidly, producing more convincing synthetic images, audio, and video every year.
Worryingly, AI‑generated images purporting to depict Auschwitz already circulate online. Such fabrications blur the line between documentation and invention, retraumatize survivors and their families, mislead learners encountering history for the first time, and erode trust in authentic archival records.
Authenticity infrastructure does not stop deepfakes from being created. What it can do is protect authentic material from being wrongly dismissed as inauthentic, highlight when content has no recorded, verifiable history, and support journalism, education and research in mixed synthetic/real environments.
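The idea of a "recorded, verifiable history" can be illustrated with a deliberately simplified sketch. Real provenance standards such as C2PA bind a content hash into a manifest signed with public-key cryptography and certificate chains; the code below is only a minimal, hypothetical analogue using a shared secret, and every name in it (make_manifest, verify_provenance, SIGNING_KEY) is illustrative rather than part of any real standard.

```python
import hashlib
import hmac

# Illustrative stand-in for a publisher's signing key. Real provenance
# systems (e.g. C2PA) use asymmetric signatures, not a shared secret.
SIGNING_KEY = b"demo-secret"

def make_manifest(content: bytes) -> dict:
    """Record a hash of the content and a MAC binding it to the signer."""
    digest = hashlib.sha256(content).hexdigest()
    tag = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "tag": tag}

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """True only if the content matches its recorded, verifiable history."""
    digest = hashlib.sha256(content).hexdigest()
    if digest != manifest["sha256"]:
        return False  # content was altered after the manifest was made
    expected = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["tag"])

original = b"archival photograph bytes"
manifest = make_manifest(original)
print(verify_provenance(original, manifest))           # True
print(verify_provenance(b"tampered bytes", manifest))  # False
```

The point the sketch makes is the asymmetric one from the paragraph above: a failed check does not prove content is synthetic, but a passing check protects authentic material from being wrongly dismissed, and content with no manifest at all is visibly lacking any verifiable history.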