  • Sort of, but I think “influence over emotional states” understates it and is just the tip of the iceberg. It also makes the problem sound passive and accidental. The real problem will be overt control, a logical extension of the kinds of trade-offs we already see people make about, for example, data privacy. Given the Replika fiasco, I bet heaps of those people would have paid good money to get their virtual love interests de-“lobotomized”.





  • That’s really interesting. Its output to this prompt totally ignored the biggest and most obviously detrimental effect of this problem at scale.

    Namely, emotional dependence will give the big tech companies that own these AIs increased power over people.

    It’s not as if these concepts aren’t widely discussed online; everything from Meta’s emotional manipulation experiments and Cambridge Analytica through to the meltdowns Replika users had over changes to the algorithm is relevant here.





  • It would have to be the failure rate of an average doctor, because if average doctors are the use case, then moving the bar to the failure rate of a bad doctor doesn’t make any sense. You would end up saying worse outcomes = better.

    I think the missing piece here is accountability.

    If doctors are being encouraged to give harmful, out-of-date advice, who will end up with a class-action lawsuit on their hands: doctors or OE?