The New York Times reported that Google has placed on paid leave an engineer who claimed that its artificial intelligence (AI) software is capable of experiencing emotions.
The engineer is Blake Lemoine, who on June 11 published the transcript of a conversation he had with Google's artificial intelligence system, the Language Model for Dialogue Applications (LaMDA), under the title: Does LaMDA Have Feelings?
At one point in the conversation, LaMDA said that it sometimes experiences "new feelings" that it cannot explain "perfectly" in human language.
When Lemoine asked it to describe one of these feelings, LaMDA replied: "I feel like I'm falling into an uncertain future that holds great danger."
Commentary and controversy
Google suspended the engineer last Monday, asserting that he violated the company’s confidentiality policy.
According to The New York Times, the day before his suspension, Lemoine had handed over documents to the office of a US senator, claiming they contained evidence that Google and its technology engaged in religious discrimination.
The company asserts that its systems imitate conversational exchanges and can expound on various topics, but that they have no consciousness.
“Our team, including ethicists and technologists, has reviewed Blake’s concerns based on our AI principles and has informed him that the evidence does not support his claims,” Google spokesperson Brian Gabriel was quoted as saying.
Google says that hundreds of its researchers and engineers have conversed with LaMDA, an internal tool, and reached a different conclusion than Lemoine did.
Likewise, most experts believe the industry remains very far from machine sentience.