Zeynep Tufekci writes, “Last week the television host Chris Cuomo took to social media to share a video…” Prof. Tufekci is one of my most highly regarded thinkers and most valuable sources, but I almost stopped dead in my tracks.
“Television host”? “social media”? My television viewing consists of ’30s and ’40s movies on TCM, Grand Sumo Highlights on NHK Japan, and Formula One Racing.
I have eschewed social media (other than my own SimanaitisSays) ever since “On Surveillance Capitalism—And My Facebook Identity,” when Facebook, contrary to its own Terms of Service, refused to recognize me as Dennis Simanaitis.
Fortunately, Prof. T.’s opening line continues that the video “purported to show Representative Alexandria Ocasio-Cortez on the House floor denouncing….”
Gee, I wonder what AOC is up to these days?
What I learned is that she has been the subject of A.I.-fabricated footage passed off as real, what’s known as a deepfake.

Portraits (first column) and stills from GAN-generated videos of (from top) Marilyn Monroe, Albert Einstein, and the Mona Lisa. Image by Egor Zakharov/YouTube from “What Do We Do About Deepfake Video?,” The Guardian, June 23, 2019.
AOC’s Deepfake. Tufekci explains in “The A.O.C. Deepfake Was Terrible. The Proposed Solution is Delusional,” The New York Times, August 11, 2025: “Except Ocasio-Cortez had never made that speech. The video was a deepfake, generated by artificial intelligence. Cuomo deleted the post, but not before she chided him for ‘reposting Facebook memes and calling it journalism.’ ”
Well, it’s comforting to me that AOC and I seem to agree about Facebook. And I was interested to read what Prof. Tufekci had to say about deepfakes. Here are tidbits gleaned from her article.

Seeking Definite Proof. Tufekci observes, “We had long ago lost photos as definite proof, given how easily they could be manipulated. [I surely agree.] Audio, too, is increasingly easy to fake. Video was among the last bastions of verification, exactly because it was difficult to fake. Now that that’s gone, the real, and increasingly the only, way to be confident of something that one did not witness is to find a reputable source and verify. Ah, what’s a reputable source, you ask? And therein lies what’s left of our society.”
Whom to Trust? “Trust the authorities?” Tufekci posits. “Well, good luck with that. Authorities aren’t always truthful or correct; making them the final and sole arbiter isn’t going to end well.”
A Fake Explosion. “The cases that make a splash,” Tufekci recounts, “are usually about high-profile people or topics. In May 2023, an A.I.-generated image that was said to show a large explosion near the Pentagon spread on Twitter as breaking news. It was then amplified by many high-profile ‘verified’ accounts. The fire department in Arlington, Va., where the Pentagon is, quickly posted a notice stating that there was no fire. The stock market recovered from the loss it suffered during those few elapsed minutes. Whew, right? Just an estimated $500 billion drop in the value of the S & P before bouncing back — some people’s loss and other people’s gain.”
TMI Can Lead To Loss of Attention. Tufekci writes, “In 1971, Herbert Simon—recipient of both the Nobel in economic science and its computer science equivalent, the Turing Award— provided one of the greatest insights about what happens when technology switches us from a regime of scarcity to one of glut, as it has so many times throughout history. Discussing the new abundance of information that printing, mass media and computers begat, he noted that ‘wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes.’ He was referring to attention, now the most precious commodity of all.”

Professor Zeynep Tufekci: Istanbul-born sociologist, op-ed writer, and TED speaker. Image from UNC Center for Media Law & Policy.
Credibility. Tufekci adds, “The other crucial thing that the abundance of such easily generated information makes scarce is credibility. And that is nowhere more stark than in the case of photos, audio and video, because they are among the key mechanisms with which we judge claims about reality. Lose that, lose reality. Scientists and parts of the tech industry have come up with a few very promising frameworks — known as zero-knowledge proofs, secure enclaves, hardware authentication tokens using public key cryptography, distributed ledgers, for example — about which there is much more to say at another moment. Many other tools may yet arise.”
“But,” she concludes, “unless we start taking the need seriously now before we lose what’s left of proof of authenticity and verification, governments will step right into the void. If the governments are not run by authoritarians already, it probably won’t take long till they are.”
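For readers wondering what the “public key cryptography” piece of those frameworks might look like in practice, here is a minimal sketch in Python. It’s my own illustration under stated assumptions, not any specific system Tufekci names, and it uses the third-party cryptography package. The idea: a camera (or its secure enclave) signs footage at the moment of capture with a private key, and anyone can later check that signature against the published public key; change even one byte of the video and the check fails.

# Minimal illustration only, not any framework from the article.
# Assumes the third-party "cryptography" package: pip install cryptography
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# In real hardware, this key pair would live inside the camera's secure enclave.
camera_private_key = ed25519.Ed25519PrivateKey.generate()
camera_public_key = camera_private_key.public_key()

def sign_footage(video_bytes: bytes) -> bytes:
    # The camera signs the raw bytes at the moment of capture.
    return camera_private_key.sign(video_bytes)

def verify_footage(video_bytes: bytes, signature: bytes) -> bool:
    # Anyone holding the public key can check the footage later.
    try:
        camera_public_key.verify(signature, video_bytes)
        return True
    except InvalidSignature:
        return False

footage = b"...raw video bytes from the camera..."
signature = sign_footage(footage)
print(verify_footage(footage, signature))               # True: footage untouched
print(verify_footage(footage + b" edited", signature))  # False: any alteration breaks the proof

That’s the kernel of hardware authentication: the signature says the footage is what the camera saw. The hard parts, of course, are trusting the device and distributing the keys, which is where the secure enclaves and distributed ledgers she mentions come in.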
Thanks, Prof. T. I’m glad I got over my TV/social media hangups. ds
© Dennis Simanaitis, SimanaitisSays.com, 2025
I’m glad you follow TED Talks, which I find thought provoking. I learned a lot from this YouTube vid. TED talk on detecting AI images – https://www.youtube.com/watch?v=q5_PrTvNypY
Cheers, Bob