• 0 Posts
  • 130 Comments
Joined 8 months ago
Cake day: August 23, 2025




  • Scott Alexander published a blog post about how it’s unfair to call Victor Orban an autocrat, but:

    I spent the first half of my writing career calling out biased left-wing experts, the flood swept all those people away, and now we’re ruled by germ-theory-denialists and Waffle-House-teleporters. Not a day goes by that I don’t want the old biased experts back. To paraphrase Cormac McCarthy, you never know what worse institutions your bad institutions have saved you from.

    Dsquareddigest responds:

    I believe the full quote is “to paraphrase Cormac McCarthy, you never know what worse institutions your bad institutions have saved you from, if you are being dumb on purpose”

    It’s in the dictionary next to Upton Sinclair’s famous line that “it is hard to get a man to understand something when he is a massive dumbass”


  • Yud says so much, and it’s often so confusing, that I think a lot of his followers don’t know his main messages. It used to be orthodox that you cannot have a two-faced message any more without each audience learning what you say to the others, but that assumed you were a good communicator aiming at a mass audience.

    Yud has strange views about legal responsibility:

    Anthropic Claude Mythos is already a state-level actor in terms of how much harm it could theoretically have done – given its demonstrated and verified ability to find critical security vulnerabilities in every operating system and browser; and how fast Mythos could’ve exploited those vulnerabilities, with ten thousand parallel threads of intelligent attack. Mythos hypothetically rampant or misused could have taken down the US power grid, say… at the end of its work, after introducing hard-to-find errors into all the bureaucracies and paperwork and doctors’ notes connected to the Internet.

    But if you release a virus and it infects people, we don’t hold the virus responsible, we hold you. If you build a car and it explodes when it gets rear-ended, we don’t blame the car, we blame you.




  • The places to find Yudkowsky stans are Substack, Twitter, and the meetups and foundations in the Bay Area. Some of them accept his teachings but reject him because he became a doomer and they want to build God and conquer Death tomorrow.

    Around 2012 or 2013, Yudkowsky passed the forum off to CFAR and stopped posting much there. For a while he used it as one mouth of his recruitment funnel (the other mouth being Effective Altruism). That is a really common fate for mailing lists, forums, and comments sections, just like it’s really common that people take what they want from the Sacred Texts. People who post and post are not always helpful for raising money or creating offline events.

    Edit: The people who respond to Yudkowsky’s tweets and whom he retweets are TESCREAL figures.





  • Bonus race pseudoscience quoted by No77e!

    There is a phenomenon in which rationalists sometimes make predictions about the future, and they seem to completely forget their other belief that we’re heading toward a singularity (good or bad) relatively soon. It’s ubiquitous, and it kind of drives me insane. Consider these two tweets:

    Richard Ngo @RichardMCNgo: Hypothesis: We’ll look back on mass migration as being worse for Europe than WW 2 was. … high-trust and homogeneous … internal ethno-religious fractures.

    Liv Boeree @Liv_Boeree: Would not be surprised if it turns out that everyone outsourcing their writing to LLMs will have a similar or worse effect on IQ as lead piping in the long run

    (he shares these tweets as photos; I ain’t working harder to transcribe them or using a chatbot)





  • In 2024 Ozy Brennan was indignant about Nonlinear Fund, the “incubator of AI-safety meta-charities” which lived as global nomads, hired a live-in personal assistant, asked her to smuggle drugs across borders for them, let a kind-of-colleague take her to bed, then did not pay her regularly and in full.

    The correct number of times for the word “yachting” to occur in a description of an effective altruist job is zero. I might make an exception if it’s prefaced with “convincing people to donate to effective charities instead of spending money on.”

    Trace popped up in the comments:

    Inasmuch as EA follows your preferences, I suspect it will either fail as a subculture or deserve to fail. You present a vision of a subculture with little room for grace or goodwill, a space where everyone is constantly evaluating each other and trying to decide: are you worthy to stand in our presence? Do you belong in our hallowed, select group? Which skeletons are in your closet? Where are your character flaws? What should we know, what should we see, that allows us to exclude you?

    Ozy stands with us on this one, buddy.




  • An early hint of Yudkowsky’s rejection of chaos theory in the sequences from 2008 (the “build God to conquer Death” essay):

    And the adults wouldn’t be in so much danger. A superintelligence—a mind that could think a trillion thoughts without a misstep—would not be intimidated by a challenge where death is the price of a single failure. The raw universe wouldn’t seem so harsh, would be only another problem to be solved.

    Someone who got as far as high-school math or coded a working system would probably have encountered the combinatorial explosion, the impossibility of representing 0.1 exactly in binary floating point, chaos theory, and so on. Even game theory has results like “in some games, optimal play guarantees a tie but not a win.” But Yud was much too special for any of those and refused offers to learn.
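    The floating-point point above is easy to demonstrate in a couple of lines; here's a minimal sketch (my own illustration, not from the quoted post) showing that 0.1 has no exact binary representation:

    ```python
    from decimal import Decimal

    # Decimal(0.1) shows the exact binary value that the literal 0.1 actually
    # stores, which is not one tenth:
    print(Decimal(0.1))  # 0.1000000000000000055511151231257827021181583404541015625

    # The rounding error surfaces in ordinary arithmetic:
    print(0.1 + 0.2 == 0.3)  # False
    ```

    This is a property of IEEE 754 binary floating point in every mainstream language, not a Python quirk.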