• @[email protected]
    10
    2 days ago

    This is an interesting companion to that other essay castigating Rationalist prose, Elizabeth Sandifer’s The Beigeness. The current LW style indulges in straight-up obscurantism and technobabble, which is probably better at hiding how dumb the underlying argument is and cloaking unsupported assertions as meaningful arguments. It also doesn’t require you to be as widely-read as our favorite philosophy major turned psychiatrist turned cryptoreactionary, since you’re not switching contexts every time it starts becoming apparent that you’re arguing for something dumb and/or racist.

    • @[email protected]
      7
      1 day ago

      This has always been the case. I think I first stumbled across LessWrong in the early 2000s, when I was a maths undergrad.

      At that point it was mostly Eliezer writing extremely long blog posts about Bayesian thinking, and my take-home was just: wow, these guys are really bad at maths.

      A good mathematician will carefully select the right level of abstraction to make what they’re saying as simple as possible. LessWrong has always done the complete opposite: everything is full of junk details and needless complexity, in order to make it feel harder than it really is.

      Basically, Eliezer needs an editor, and everyone who copies his style needs one too.

    • @[email protected]
      5
      2 days ago

      Oh, nice! I stumbled across this essay ages ago and misplaced it due to forgetting to bookmark it. Thanks for bringing it back to my attention.

      It is quite a beautiful thing to see Scott Alexander’s beige technobabble eviscerated by such vibrant and incisive prose.

  • @[email protected]M
    14
    3 days ago

    This is obviously insane, the correct conclusion is that learning models cannot in fact be trained so hard that they will always get the next token correct. This is provable, and it’s not even hard to prove. It’s intuitively obvious, and a burly argument that backs the intuition is easy to build.

    You do, however, have to approach it through analogies, through toy models. When you insist on thinking about the whole thing at once, you wind up essentially just saying things that feel right, things that are appealing. You can’t actually reason about the damned thing at all.

    This goes a long way towards explaining why computer pseudoscience — like a fundamental ignorance of algorithmic efficiency and the implications of the halting problem — is so common and even celebrated among LessWrongers and other TESCREALs who should theoretically know better.
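    The toy-model point above can be made concrete in a few lines. This is a minimal sketch, not anyone's actual proof: the 70/30 split and the tiny alphabet are arbitrary illustrative choices, but the idea — that whenever the data source itself is stochastic, no predictor can exceed the probability of the most likely continuation — carries over directly.

```python
import random

random.seed(0)

# Toy data source: after some fixed context, the next token is "C" with
# probability 0.7 and "D" with probability 0.3. The randomness lives in
# the source itself, so no amount of training can learn it away.
continuations = ["C" if random.random() < 0.7 else "D" for _ in range(100_000)]

# The best any predictor can do on this context is to always guess the
# most likely continuation, "C"; its accuracy is then P("C"), about 0.7,
# which is strictly less than 1.
best_accuracy = sum(tok == "C" for tok in continuations) / len(continuations)

print(f"accuracy ceiling: {best_accuracy:.3f}")
```

    Training harder only moves a model’s guess toward the mode; it cannot push accuracy past the mode’s probability, which is the whole (easy) proof sketch.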

      • @[email protected]
        13
        2 days ago

        It’s complicated.

        It’s basically a forum created to venerate the works and ideas of that guy who, in the first wave of LLM hype, had an editorial published in TIME where he called for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes, and whose core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.

        Their function these days tends to be to provide an ideological backbone of bad sci-fi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.

        They are also communicating vessels with Effective Altruism.

        If this piques your interest, check the links in the sidebar.

        • @[email protected]
          3
          2 days ago

          They are also communicating vessels with Effective Altruism.

          I have a basic understanding of what EA is but what do you mean by communicating vessels?

          • @[email protected]
            10
            2 days ago

            EA started as an offshoot of LessWrong, and LW-style rationalism is still the main gateway into EA as it’s pushed relentlessly in those circles, and EA contributes vast amounts of money back into LW goals. Air strikes against datacenters guy is basically bankrolled by Effective Altruism and is also the reason EA considers magic AIs (so called Artificial Super Intelligences) by far the most important risk to humanity’s existence; they consider climate change mostly survivable and thus of far less importance, for instance.

            Needless to say, LLM peddlers loved that (when they aren’t already LW/EAs or adjacent themselves, like the previous OpenAI administrative board before Altman and Microsoft took over). edit: also the founders of Anthropic.

            Basically you can’t discuss one without referencing the other.

        • @[email protected]
          5
          2 days ago

          Ok, RationalWiki actually seems like a really useful resource for reading up on which sexy new movements are bullshit and which aren’t.

          • @[email protected]
            4
            1 day ago

            It is, but I would say that, as it is aligned with what I think about these folks. It is also a funny site, in a way: a lot of these weirdos go “RationalWiki sucks, is not rational, and lies!” before reading the pages they are mad about, and afterwards go “yeah, no, that is fair” after reading them. Happened quite a few times with the “skeptic” YT people in the YouTuber-to-alt-right funnel/pipeline from a decade ago. (A few of these people have really lost the plot now; Armoured Skeptic is now some believer in aliens, for example. I don’t think anyone has cared enough about him to update his page, however.)

            • @[email protected]
              3
              7 hours ago

              Happened quite a few times with the “skeptic” YT people in the YouTuber-to-alt-right funnel/pipeline from a decade ago. (A few of these people have really lost the plot now

              I would love a separate thread on this; more generally, a “late 2000s/early 2010s skeptic YouTubers, where are they now?”. The only example (sorta*) I have is Thunderf00t, whose YT career track is: anti-Christianity, anti-Anita Sarkeesian, and now anti-Musk.

              *he is not alt-right, at least by any mainstream definition of alt-right, afaict.

              • @[email protected]
                1
                2 hours ago

                I would love a separate thread on this; more generally, a “late 2000s/early 2010s skeptic YouTubers, where are they now?”

                Personally I’m not going to waste much time on it; every time I see somebody post/make a vid about one of the older people, it gets really sad and weird. Shadiversity (while not a skeptic) turned into a big weirdo (or well, went mask off), stuff like that.

                E: I’m also not sure if YT is even still big, income-wise, for people, or if people go more to Twitch for livestreaming shit and then double-dip by uploading edited streams to YT.

            • @[email protected]
              7
              1 day ago

              RationalWiki really hits that sweetspot where everybody hates it and you know that means it’s doing something right:

              From Prolewiki:

              RationalWiki is an online encyclopedia created in 2007. Although it was created to debunk Conservapedia and Christian fundamentalism,[1] it is also very liberal and promotes anti-communist propaganda. It spreads imperialist lies and about socialist states including the USSR[2] and Korea[3] while uncritically promoting narratives from the CIA and U.S. State Department.

              From Conservapedia:

              RationalWiki.org is largely a pro-SJW atheists website.

              [ . . . ]

              RationalWikians have become very angry and have displayed such behavior as using profanity and angrily typing in all cap letters when their ideas are questioned by others and/or concern trolls (see: Atheism and intolerance and Atheism and anger and Atheism and dogmatism and Atheism and profanity).[33]

              From WikiSpooks (with RationalWiki’s invitation for anyone to collaborate highlighted with an emotionally vulnerable red box for emphasis):

              Although inviting readers to “register and engage in constructive dialogue”, RationalWiki appears not to welcome essays critical of RationalWiki[3] or of certain official narratives. For example, it is dismissive of the Journal of 9/11 Studies, terming it, as of 2017, a “peer- crank-reviewed, online, open source pseudojournal”.[4]

              And a little bonus:

              “Can I have Google discount my rationalwiki entry, has errors posted out of spite 10 years ago”

              https://support.google.com/websearch/thread/106033064/can-i-have-google-discount-my-rationalwiki-entry-has-errors-posted-out-of-spite-10-years-ago?hl=en

              My site questions Darwinism but that’s become quite mainstream. But my rationalwiki page has over 20 references to me being a creationist, and is tagged “pseudoscience.” Untrue

              • @[email protected]
                3
                1 day ago

                Perfect.

                Damn librals!

                E: Saying Darwinism when you mean evolution is quite something, btw. Oh god, he also is ancient, from 1939.

        • @captainlezbian
          6
          2 days ago

          That sounds like a religion insisting it isn’t one

          • @[email protected]
            5
            2 days ago

            They do seem to worship Bayes

            Edit: I want to qualify that I’m a big fan of Bayes’ theorem — in my field, there’s some awesome stuff being done with Bayesian models that would be impossible with frequentist statistics. Any scorn in my comment is directed at the religious fervour that LW directs at Bayesian statistics, not at the stats themselves.

            I say this to emphasise that LWers aren’t cringe for being super enthusiastic about maths. It’s the everything else that makes them cringe.

            • @[email protected]
              8
              1 day ago

              The particular way they invoke Bayes’ theorem is fascinating. They don’t seem to ever actually use it in any sort of rigorous way, it’s merely used as a way to codify their own biases. It’s an alibi for putting a precise percentage point on your vibes. It’s kind of beautiful in a really stupid sort of way.
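              For contrast, actually using the theorem means committing to an explicit likelihood model before you see the posterior. A minimal sketch, with numbers that are made up purely for illustration:

```python
# A tiny but rigorous Bayesian update: every input is stated up front,
# so the posterior is forced by the numbers, not by vibes.
prior = 0.01            # P(H): base rate of the hypothesis
p_e_given_h = 0.9       # P(E|H): probability of the evidence if H is true
p_e_given_not_h = 0.2   # P(E|~H): probability of the evidence otherwise

# Bayes' theorem: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
posterior = (p_e_given_h * prior) / (
    p_e_given_h * prior + p_e_given_not_h * (1 - prior)
)

print(f"P(H|E) = {posterior:.3f}")  # 0.043: strong evidence, still unlikely
```

              The tell for the vibes version is that none of these three inputs are ever written down; a percentage simply appears, pre-laundered.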

              • @[email protected]
                5
                19 hours ago

                They seem to believe that stereotypes often have a grain of truth to them, and it’s thus ok to believe stereotypes.

                • @[email protected]
                  3
                  10 hours ago

                  I would say it goes further: they have a (pseudo?)magical trust in their own intuitions, as if they were crystal-clear revelations from the platonic realms.

              • @[email protected]M
                7
                1 day ago

                They take a theory that is supposed to be about updating one’s beliefs in the face of new evidence, and they use it as an excuse to never change what they think.

          • @[email protected]
            8
            2 days ago

            I think it is a little bit more complicated. I’m one of the few mentioning this, however, so it isn’t a common idea, I think. It isn’t directly a cult/religion but, stealing the language of Silicon Valley, a cult incubator: reading these things and holding these beliefs about AGI and rationality makes you more susceptible to joining or starting cult-like groups. The LessWrong article “Every Cause Wants To Be A Cult” doesn’t help, for example, and neither does it help when they speak highly of the methods of Scientology. The various spinoffs, and how many of these groups act cult-like and use cult-like shit, make me think this.

            So it is worse in a way.

            • @[email protected]
              10
              2 days ago

              There’s also the communal living, the workplace polyamory along with the prominence of the consensual non-consensual kink, the tithing of the bulk of your earnings and the extreme goals-justify-the-means moralising, the emphasis on psychedelics and prescription amphetamines, and so on and so forth.

              Meaning, while calling them a cult incubator is actually really insightful and well put, I have a feeling that the closer you get to TESCREAL epicenters like the SFB the more explicitly culty things start to get.

              • @[email protected]
                8
                2 days ago

                Yeah, but TESCREAL is a name we give them; they themselves organise in different groups (which fit into the term, yes). They make up different parts of the TESCREAL umbrella, but it all ends up in culty behaviour, just a different cult.

                Btw, see also the love bombing with Quantum Scott. There were also the weird LW people who ended up protesting other LW people in a crazy way (didn’t it include robes or something? I don’t recall much). Or calling Scottstar the rightful caliph when Yud was posting less.

                So my point is more that they morph into different cults, and I wonder how much they use this lack of a singular cult as a way to claim they are not a cult. Or whatever rot13ed word they used for cult.

                E: not that all this really matters in the grand scheme of things. just a personal hangup.

                • Sailor Sega Saturn
                  11
                  2 days ago

                  whatever rot13ed word they used for cult.

                  It’s impossible to read a post here without going down some weird internet rabbit hole, isn’t it? This is totally off topic, but I was reading the comments on this old phyg post, and one of the comments said (seemingly seriously):

                  It’s true that lots of Utilitarianisms have corner cases where they support action that would normally considered awful. But most of them involve highly hypothetical scenarios that seldom happen, such as convicting an innocent man to please a mob.

                  And I’m just thinking, riight highly hypothetical.

          • @[email protected]
            6
            2 days ago

            It is a peculiar sort of faith movement, where the central devotional practice is wandering around pulling made-up probability estimates out of one’s ass

            • @[email protected]
              3
              2 days ago

              and then posting walls of text about them not merely burying the lede but quite fully conspiring to eliminate the evidence and all witnesses in the same go, as a starting condition

  • @[email protected]
    10
    3 days ago

    Such a good post. LWers are either incapable of critical thought or self-scrutiny, or are unwilling and think verbal diarrhea is a better choice.

    • @[email protected]
      10
      2 days ago

      It’s an ironic tragedy that the average LWer claims to value critical thought far more than most people do, and this causes them to do themselves a disservice by sheltering in an echo chamber. Thinking of themselves as both smart and special helps them make sense of the world and their relative powerlessness as individuals (“no, it’s the children who are wrong” meme.jpeg). Their bloviating is how they maintain the illusion.

      I feel comfortable speculating because in another world, I’d be one of them. I was a smart kid, and building my entire identity around that meant I grew into a cripplingly insecure adult. When I wrote, I would meander and over-hedge my position because I didn’t feel confident in what I had to say; post-graduate study was especially hard for me because it required finding what I had to say on a matter and backing myself on it. I’m still prone to waffling, but I’m working on it.

      The LW excerpts that are critiqued in the OP are so sad to me because I can feel the potential of some interesting ideas beneath all the unnecessary technobabble. Unfortunately, we don’t get to see that potential, because dressing up crude ideas for a performance isn’t conducive to the kinds of discussions that help ideas grow.

      • @[email protected]
        4
        19 hours ago

        In the Going Clear documentary an author says that because Scientology was built by and for L. Ron Hubbard, people who follow Scientology are gradually moulded in his image and pick up his worst traits and neuroses. LessWrong was founded by a former child prodigy…

        • @[email protected]
          5
          15 hours ago

          …with a huge chip on his shoulder about how the system caters primarily to normies instead of specifically to him, thinks he has fat-no-matter-what genes and is really into rape play.