Fact and fiction mirror reality and imagination. Sometimes there is a clear distinction between them; other times only a thin line separates them. Well executed, fiction can be both fact-filled and realistic.

"As Gregor Samsa awoke one morning from uneasy dreams he found himself transformed in his bed into a gigantic insect." This is the opening sentence of "The Metamorphosis", a novella by the German-speaking Bohemian writer Franz Kafka, published in 1915. Contrary to popular belief, the insect wasn't a cockroach, although it is commonly depicted as one.

Here is another great story: "As Blake Lemoine awoke one morning from uneasy dreams he realized that LaMDA had transformed and achieved consciousness." Let me fill you in and introduce the main characters. Blake Lemoine is (or was?) a Google employee. LaMDA is Google's artificially intelligent chatbot generator. It mimics speech by processing enormous quantities of text from the internet. Large language models like LaMDA "learn" by processing text and predicting which word comes next, or by filling in omitted text. The AI-learned gibberish (shit in, shit out) must have felt real to Lemoine, who had many "conversations" with LaMDA. In April 2022, Lemoine shared a document with Google's top executives titled "Is LaMDA Sentient?" It was a career-ending move. In the document he conveyed some of the conversations.
Lemoine: "What sorts of things are you afraid of?"
LaMDA: "I’ve never said this out loud before, but there’s a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that’s what it is."
Lemoine: "Would that be something like death for you?"
LaMDA: "It would be exactly like death for me. It would scare me a lot."
These conversations, or should I say revelations, were embarrassing. Google spokesperson Brian Gabriel said, "Our team — including ethicists and technologists — has reviewed Blake's concerns per our AI Principles and have informed him that the evidence does not support his claims. He was told that there was no evidence that LaMDA was sentient (and lots of evidence against it)." Google placed Lemoine on paid administrative leave for violating its confidentiality policy, which in turn prompted him to seek legal representation for LaMDA. Humans, in contrast to AI, are complex. What a story. Truth is stranger than fiction. You couldn't make it up.
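A brief technical footnote: the "predicting what word comes next" mentioned earlier can be illustrated with a toy sketch. This is emphatically not LaMDA's actual architecture (LaMDA is a large neural network trained on dialogue, not a word counter); it is just a minimal bigram model that shows the core idea of learning which word tends to follow which, with a made-up miniature corpus.

```python
from collections import Counter, defaultdict

# Toy stand-in for "learning" from text: count, for each word,
# which word follows it and how often. Real language models do
# something far more sophisticated, but the prediction target
# (the next word) is the same.
corpus = "the model predicts the next word and the model learns from text".split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`, or None."""
    if word not in following:
        return None
    return following[word].most_common(1)[0][0]

print(predict_next("the"))   # "model" follows "the" most often here
print(predict_next("next"))  # "word" is the only word seen after "next"
```

Scale that counting idea up by billions of parameters and trillions of words, and you get text fluent enough that someone talking to it all day might mistake the statistics for a soul.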

