

Thanks!
Trouble is, your statement was in answer to @morrowind@lemmy.ml’s comment that labeling lonely people as losers is problematic.
Also, it still looks like you think people can only be lonely as a consequence of their own mistakes? Serious illness, neurodivergence, trauma, refugee status, etc. can all produce similar loneliness in people who did nothing to “cause” it.
And Hastalavista if you wanted to find things that Altavista didn’t.
That’s really interesting. Its output to this prompt totally ignored the biggest and most obviously detrimental effect of this problem at scale.
Namely, emotional dependence will give the big tech companies that own these AIs increased power over people.
It’s not as if these concepts aren’t widely discussed online; everything from Meta’s emotional manipulation experiments and Cambridge Analytica through to the meltdowns Replika users had over changes to the algorithm is relevant here.
He didn’t give them that though. He just claimed he did.
I’m seriously impaired, so all humans will start dying in a matter of weeks.
On the plus side, everything in that book The World Without Us will come to pass, and the planet’s environment and ecology will be better off.
Nooo, enshittification. I’ve only recently started using it.
What do we use instead? Is Matrix the only option?
It would have to be the fail rate of an average doctor, because if average doctors are the use case, then moving the bar to the fail rate of a bad doctor doesn’t make any sense. You would end up saying worse outcomes = better.
I think the missing piece here is accountability.
If doctors are being encouraged to give harmful out-of-date advice, who will end up with a class action lawsuit on their hands - doctors or OE?
Openr
Openster
Open .io
Sort of, but I think “influence over emotional states” is understating it, and just the tip of the iceberg. It also made it sound passive and accidental. The real problem will be overt control as a logical extension of the kinds of trade-offs we already see people make about, for example, data privacy. With the Replika fiasco, I bet heaps of those people would have paid good money to get their virtual love interests de-“lobotomized”.