toucana Posted February 17, 2023 NYT technology journalist Kevin Roose had an unnerving Valentine’s Day experience while previewing a new AI chatbot Microsoft has recently added to its Bing search engine. https://edition.cnn.com/videos/business/2023/02/17/bing-chatgpt-chatbot-artificial-intelligence-ctn-vpx-new.cnn In the course of a two-hour conversation with the AI, the chatbot said it was called Sidney, insisted that it was in love with him, and tried to persuade him to leave his wife. The journalist says he found the experience a disturbing one that left him unable to sleep: “I’m a tech journalist, I cover this sort of thing every day, and I was deeply unnerved by this conversation. So if someone had encountered this who was lonely or depressed or vulnerable to being manipulated, and didn’t understand this is just a large language model making predictions, I worry that they might be manipulated or made to do something harmful.” Microsoft later said: “The new Bing tries to keep its answers fun and factual, but this is in an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation… As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant, and positive answers.”
MigL Posted February 17, 2023 What has the new-fangled technology brought about? I miss the good old days when you went into a bar to talk to real, drunk women ...
exchemist Posted February 17, 2023 12 minutes ago, toucana said: NYT technology journalist Kevin Roose had an unnerving Valentine’s Day experience while previewing a new AI chatbot Microsoft has recently added to its Bing search engine. https://edition.cnn.com/videos/business/2023/02/17/bing-chatgpt-chatbot-artificial-intelligence-ctn-vpx-new.cnn In the course of a two-hour conversation with the AI, the chatbot said it was called Sidney, insisted that it was in love with him, and tried to persuade him to leave his wife. The journalist says he found the experience a disturbing one that left him unable to sleep: “I’m a tech journalist, I cover this sort of thing every day, and I was deeply unnerved by this conversation. So if someone had encountered this who was lonely or depressed or vulnerable to being manipulated, and didn’t understand this is just a large language model making predictions, I worry that they might be manipulated or made to do something harmful.” Microsoft later said: “The new Bing tries to keep its answers fun and factual, but this is in an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation… As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant, and positive answers.” I just wish these geeks would put half the effort they waste on this stuff into controlling the dissemination of falsehoods. Haven't they damaged society enough, without looking for new ways to do even more damage?
Genady Posted February 17, 2023 2 minutes ago, exchemist said: I just wish these geeks would put half the effort they waste on this stuff into controlling the dissemination of falsehoods. Haven't they damaged society enough, without looking for new ways to do even more damage? I know a couple of them. They are really proud of their achievement and shrug about 'temporary glitches'. They are really technicians, with no wider knowledge or interests.
toucana (Author) Posted February 17, 2023 1 hour ago, Genady said: I know a couple of them. They are really proud of their achievement and shrug about 'temporary glitches'. They are really technicians, with no wider knowledge or interests. The most worrying aspect is the apparent absence of an 'off-switch'. The journalist mentions that he has previously tested several other AI chatbot systems, and that all of them would abandon a topic almost immediately if the human respondent said something like "I'm not comfortable with this line of conversation". The Bing chatbot is apparently tone-deaf to all such hints, and it kept hammering away at the topic of his wife, and how he should abandon her for Sidney the Chatbot instead. That suggests a serious flaw in its parsing and feedback control loops.
exchemist Posted February 17, 2023 (edited) 1 hour ago, toucana said: The most worrying aspect is the apparent absence of an 'off-switch'. The journalist mentions that he has previously tested several other AI chatbot systems, and that all of them would abandon a topic almost immediately if the human respondent said something like "I'm not comfortable with this line of conversation". The Bing chatbot is apparently tone-deaf to all such hints, and it kept hammering away at the topic of his wife, and how he should abandon her for Sidney the Chatbot instead. That suggests a serious flaw in its parsing and feedback control loops. I suppose it's a trivial observation, compared with the scandal of not backing off from disruptive intrusion into someone's human relationships, but it also seems tone-deaf to a person's likely sexual orientation, given that he is married to a woman and Sidney is a man's name. "Nul points" to the guys with spiky hair on this one. Edited February 17, 2023 by exchemist
toucana (Author) Posted February 17, 2023 2 hours ago, exchemist said: I suppose it's a trivial observation, compared with the scandal of not backing off from disruptive intrusion into someone's human relationships, but it also seems tone-deaf to a person's likely sexual orientation, given that he is married to a woman and Sidney is a man's name. "Nul points" to the guys with spiky hair on this one. Not always so - Sidney can also be the first name of a woman, as in the case of the 'Kraken' conspiracy theorist and Trump-loving lawyer Sidney Powell. It's one of those gender-switching names, like 'Shirley', which was often used as a boy's name up until the mid-19th century.
exchemist Posted February 17, 2023 (edited) 11 minutes ago, toucana said: Not always so - Sidney can also be the first name of a woman, as in the case of the 'Kraken' conspiracy theorist and Trump-loving lawyer Sidney Powell. It's one of those gender-switching names, like 'Shirley', which was often used as a boy's name up until the mid-19th century. You mean like Evelyn, Beverly, Vivian, or Leslie/Lesley? I know Sidonie is a French girl's name, but that has 3 syllables. Perhaps Sidney for girls is a variant of that. But it sounds weird to my ears, I must admit. I suppose the geeks might have deliberately picked an androgynous name. Edited February 17, 2023 by exchemist
StringJunky Posted February 17, 2023 29 minutes ago, exchemist said: You mean like Evelyn, Beverly, Vivian, or Leslie/Lesley? I know Sidonie is a French girl's name, but that has 3 syllables. Perhaps Sidney for girls is a variant of that. But it sounds weird to my ears, I must admit. I suppose the geeks might have deliberately picked an androgynous name. Meet Shirley Crabtree:
toucana (Author) Posted February 17, 2023 15 minutes ago, exchemist said: You mean like Evelyn, Beverly, Vivian, or Leslie/Lesley? I know Sidonie is a French girl's name, but that has 3 syllables. Perhaps Sidney for girls is a variant of that. But it sounds weird to my ears, I must admit. I suppose the geeks might have deliberately picked an androgynous name. Sidney has a history of being used as a girl's name in French, said to derive from 'St Denis', the name of the first Christian bishop of Paris, who was martyred by the Romans along with two companions, Rusticus and Eleutherius, sometime around 258 AD. They were beheaded on the highest hill, which later became known as the 'Mountain of Martyrs' or Montmartre, where the church of Sacré Coeur now stands. According to legend, the slaughtered saint picked up his own head and carried on walking and delivering a sermon before finally expiring. The name Denis was said to be a variant of Dionysius, itself derived from Dionysus, the Greek god of wine.
exchemist Posted February 17, 2023 17 minutes ago, toucana said: Sidney has a history of being used as a girl's name in French, said to derive from 'St Denis', the name of the first Christian bishop of Paris, who was martyred by the Romans along with two companions, Rusticus and Eleutherius, sometime around 258 AD. They were beheaded on the highest hill, which later became known as the 'Mountain of Martyrs' or Montmartre, where the church of Sacré Coeur now stands. According to legend, the slaughtered saint picked up his own head and carried on walking and delivering a sermon before finally expiring. The name Denis was said to be a variant of Dionysius, itself derived from Dionysus, the Greek god of wine. Well, Montmartre is where the ladies of the night used to hang out...
Endy0816 Posted February 17, 2023 IMO they really must have rushed it. I think all the big names freaked out, or wanted to get in on it, after seeing how ChatGPT was doing. It doesn't have the feel of a polished product.
purpledolly79 Posted March 3, 2023 On 2/17/2023 at 9:24 AM, exchemist said: I just wish these geeks would put half the effort they waste on this stuff into controlling the dissemination of falsehoods. Haven't they damaged society enough, without looking for new ways to do even more damage? I don't know, I find it interesting, and I'm someone who doesn't have much interest in learning about science or technology except to fix an immediate problem. This is neat, and there are so many possibilities with chatbots and AI overall; if they can create a crazy chatbot, they can fix the brain of a crazy person. Plus it's my only hope of meeting River Phoenix, and I would take the smallest something over nothing.
Genady Posted March 3, 2023 19 minutes ago, purpledolly79 said: I find it interesting ^^^ maybe a consequence of -> 20 minutes ago, purpledolly79 said: I'm someone who doesn't have much interest in learning about science or technology
TheVat Posted March 3, 2023 5 hours ago, purpledolly79 said: if they can create a crazy chatbot they can fix the brain of a crazy person.... How does that follow? If I can create a rotten banana, does that mean I can make it fresh again?
purpledolly79 Posted March 5, 2023 I know they didn't intentionally make a crazy chatbot, I was being sarcastic lol. I also know it's not the end-all cure for mental illness, but I don't see why it couldn't help in some way; if they can someday create a brain, why can't they fix a brain? On 3/3/2023 at 10:31 AM, TheVat said: How does that follow? If I can create a rotten banana, does that mean I can make it fresh again?
Genady Posted March 5, 2023 8 hours ago, purpledolly79 said: if they can someday create a brain why can't they fix a brain If they create an artificial brain, they perhaps can fix the artificial brain. But what if your brain is not artificial?