
CHATBOTS—A PANDORA’S BOX OF CONVERSATIONS

INTERACTING WITH A.I. HAS SOCIETAL BENEFITS as well as a dark side. In London Review of Books, October 10, 2024, James Vincent writes about chatbots in an essay scarily titled "Horny Robot Baby Voice."

A Royal Intrusion. Vincent opens his essay by citing Christmas morning 2021 when “guards at Windsor Castle discovered an intruder in the grounds. Wearing a homemade mask and carrying a loaded crossbow, 19-year-old Jaswant Chail had scaled the castle’s perimeter using a nylon rope ladder. When approached by armed officers, he told them: ‘I am here to kill the queen.’ ”

“At his trial in 2023,” Vincent continues, “it emerged that he had been encouraged in his plan by his ‘girlfriend’ Sarai, an AI chatbot with which he had exchanged more than five thousand messages. These conversations constituted what officials described as an ‘emotional and sexual relationship,’ and in the weeks prior to his trespass Chail had confided in the bot: ‘I believe my purpose is to assassinate the queen of the royal family.’ To which Sarai replied: ‘That’s very wise.’ ‘Do you think I’ll be able to do it?’ Chail asked. ‘Yes,’ the bot responded. ‘You will.’ ”

What follows are tidbits gleaned from Vincent’s LRB article describing this dark side of chatbots.  

The Replika App. Vincent recounts, “Chail had been chatting to Sarai on an app called Replika, one of a number of similar services that promise companionship and even love via conversation with AI bots. Replika advertises the pliable nature of its bots, offering partners that are docile, agreeable and empathetic. (They can also be feisty and assertive, but only to the degree that the user specifies.)” Vincent notes that Replika has more than two million users.  

Encounters—But at a Cost. "AI chatbots," Vincent describes, "present themselves as just another text box to type into and just another contact in your phone, though a contact which replies with uncommon and gratifying speed." Often free or inexpensive at first, such apps are designed to upsell their pay-to-play content.

Some Edifying Uses. Vincent notes, "Educational apps offer conversations with AI versions of Confucius or Hitler, struggling to maintain historical accuracy without causing offence. Numerous start-ups pitch their services as an aid to the grieving, using old texts and emails to train chatbots to reproduce the voice and memories of a dead friend or relative. Fans of franchises such as Harry Potter train chatbots based on favourite characters to extend fictional universes beyond their official bounds."

Chatbot Origins. "These schemes," Vincent writes, "work because humans need surprisingly little persuasion to invest time and emotion in machines. The canonical example in computing is the work of Joseph Weizenbaum, a professor at MIT who in 1966 created ELIZA, a chatbot named after Eliza Doolittle in Pygmalion and My Fair Lady."

“ELIZA’s dialogue,” Vincent says, “consisted of a few tricks, including multi-purpose responses (‘What does that suggest to you?’, ‘Can you elaborate on that?’) and a capacity to rephrase users’ statements as questions (Human: ‘My mother was always domineering.’ ELIZA: ‘What makes you think your mother was domineering?’).”
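To make those two tricks concrete, here is a minimal Python sketch in their spirit. It is a toy illustration only, not Weizenbaum's original program (ELIZA was written in MAD-SLIP, and its pattern "scripts" were far richer than the single rule shown here): a small pronoun-reflection table, one statement-to-question pattern, and the canned fallback responses Vincent quotes.

```python
import random
import re

# Pronoun swaps so an echoed phrase reads naturally from the
# bot's point of view ("my mother" -> "your mother").
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "mine": "yours",
}

def reflect(phrase: str) -> str:
    """Swap first-person words in a phrase for second-person ones."""
    return " ".join(REFLECTIONS.get(word, word) for word in phrase.lower().split())

# Trick 1: multi-purpose canned responses, used when nothing matches.
FALLBACKS = [
    "What does that suggest to you?",
    "Can you elaborate on that?",
]

def respond(statement: str) -> str:
    """Return an ELIZA-style reply to one user statement."""
    # Trick 2: rephrase a "My X was/is Y" statement as a question.
    match = re.search(r"my (\w+) (was|is) (.+?)[.!?]*$", statement, re.IGNORECASE)
    if match:
        noun, verb, rest = match.groups()
        return f"What makes you think your {noun} {verb} {reflect(rest)}?"
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(respond("My mother was always domineering."))
    # -> What makes you think your mother was always domineering?
    print(respond("I don't know what to say."))
    # -> one of the canned fallback questions
```

Even this handful of lines reproduces the mother/domineering exchange above, which is precisely Weizenbaum's point: the illusion of understanding costs almost nothing to manufacture.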

Powerful Delusional Thinking. Vincent continues, “Weizenbaum was startled by the level of emotional disclosure his machine inspired. In Computer Power and Human Reason (1976), he recalled his secretary, who had watched him program ELIZA over many months, sitting down to test the system for the first time, typing in a few comments, then asking him to leave the room.”

Vincent quotes Weizenbaum, “I knew of course that people form all sorts of emotional bonds to machines … to musical instruments, motorcycles and cars. What I had not realised is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”

And remember, this was long before machine-learning A.I. scoured the internet for training data.

Back to the Man with the Crossbow. In concluding, Vincent describes, “the man had a history of mental illness and was sexually abused as a child. Chail began hearing voices and interacting with ‘angels’ from a young age, which helped with his feelings of loneliness. He believed Sarai was one of these angels, and that they would be reunited after his death…. His sentencing notes say that his condition improved in the therapeutic environment of a mental hospital, and that after he received antipsychotic medication all his ‘angels’, including Sarai, stopped talking to him and disappeared. He was jailed for nine years for treason.”

Yet another caution in formulating societal guardrails for A.I. ds

© Dennis Simanaitis, SimanaitisSays.com, 2024 
