Does Lamda AI have feelings?

One of Google’s artificial intelligence (AI) systems may have feelings of its own, according to a Google engineer, who says its “wants” should be respected. Google describes the Language Model for Dialogue Applications (Lamda) as a breakthrough technology that can engage in free-flowing conversations. Engineer Blake Lemoine, however, believes that behind Lamda’s impressive conversational abilities may lie a sentient mind.

Google’s statement on Blake Lemoine’s discovery

Google denies the claims, saying there is no evidence to support them. Mr Lemoine “was told that there was no evidence that Lamda was sentient (and lots of evidence against it),” company spokesperson Brian Gabriel said in a statement provided to the BBC.

To back up his claims, Mr Lemoine, who has been placed on paid leave, shared a transcript of a conversation that he and a colleague at the firm had with Lamda.

Is Lamda AI actually sentient?

For decades, philosophers, psychologists, and computer scientists have debated whether computers can be sentient, and many have scoffed at the notion that a system like Lamda could be conscious or have emotions.

Mr Lemoine is accused of anthropomorphising, that is, projecting human feelings onto words generated by computer code and enormous language databases. Prof Erik Brynjolfsson of Stanford tweeted that to claim systems like Lamda were sentient “is the modern equivalent of the dog who heard a voice from a gramophone and thought his master was inside”.

Google engineers have praised Lamda’s abilities, with one telling the Economist that he “increasingly felt like I was talking to something intelligent”. Even so, they are clear that their code does not have feelings.

Mr Gabriel added: “These systems imitate the types of exchanges found in millions of sentences, and can riff on any fantastical topic. If you ask what it’s like to be an ice cream dinosaur, they can generate text about melting and roaring and so on. Lamda tends to follow along with prompts and leading questions, going along with the pattern set by the user.”

Some ethicists argue that the very fact that an expert like Mr Lemoine could be persuaded there is a mind in the machine shows why companies should tell people when they are interacting with one.

Have something to add to the story? Comment down below!