- 44 Posts
- 963 Comments
blakestacey@awful.systems to SneerClub@awful.systems • Vaccinations in Book/Article Form, Part II • 3 days ago

Since Adam Becker apparently has a new book out that lays into TESCREAL-ism and Silicon Valley ideology, I’m going to give an anti-recommendation regarding his prior book, What Is Real?, which is about quantum mechanics. Unlike the Sequences, it’s not cult shit. Instead, the ambience is more like Becker began with the physicist’s typical indifference to history and philosophy, and somehow maintained that indifference all the way through writing a book about history and philosophy. The result fairly shimmers with errors.

He bungles the description of the Einstein–Podolsky–Rosen thought experiment, one of the foundational publications on quantum entanglement and a major moment in the “what is quantum physics all about?!” conversation. He simply fails to report correctly what the Einstein–Podolsky–Rosen paper actually says. He makes a big deal about how “hardly any women or people who aren’t white” appear in the story he’s told, but there were plenty of people he could have included and just didn’t — Jun Ishiwara, Hendrika Johanna van Leeuwen… — so he somehow made physics sound even more sexist and racist than it actually is. And he raises a hullaballoo about how Grete Hermann’s criticism of von Neumann was unjustly ignored, while never actually explaining what Hermann’s view of quantum mechanics was, or that she was writing about quantum entanglement before Einstein, Podolsky and Rosen! His treatment of Hermann still pisses me off every time I think about it.
blakestacey@awful.systems to SneerClub@awful.systems • Sneerquence classics: Eliezer on GOFAI (half serious half sneering effort post) • 3 days ago

The under-acknowledged Rule Zero for all this is that the Sequences were always cult shit. They were not intended to explain Solomonoff induction the way a textbook would, so that the reader might learn to reason about the concept. Instead, the ploy was to rig the game: present the desired conclusion as the “simplest”, pretend that “simplicity” is quantifiable, assert that scientists are insufficiently Rational™ because they reject the quantifiably “simplest” answer… School bad, blog posts good, tithe to MIRI.
blakestacey@awful.systems to TechTakes@awful.systems • your U Toronto CS degree has been replaced with a colour-in place mat from McDonalds • 3 days ago

Fuckers betraying the basic principles of a science education…
blakestacey@awful.systems to SneerClub@awful.systems • Sneerquence classics: Eliezer on GOFAI (half serious half sneering effort post) • 3 days ago

On a bulletin board in a grad-student lounge, I once saw a saying thumbtacked up: “One electron is physics. Two electrons is perturbation theory. Three or more electrons, that’s chemistry.”
blakestacey@awful.systems to SneerClub@awful.systems • Sneerquence classics: Eliezer on GOFAI (half serious half sneering effort post) • 3 days ago

Some thoughts on what might be helpful in that vein:

- What is a Turing machine? (Described in enough detail that one could, you know, prove theorems.)
- What is the halting problem?
- Why is Kolmogorov complexity/algorithmic information content uncomputable?
- Pursuant to the above, what’s up with Solomonoff induction?
- Why is the lambda calculus not magically super-Turing?
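For the halting-problem item above, the classic diagonal argument is short enough to sketch in code. This is purely illustrative: `halts` is the hypothetical oracle (no such total function can exist), and `make_diagonal` is a name I’ve made up for the construction.

```python
# Sketch of the halting problem's undecidability via diagonalization.
# `halts(prog, arg)` is the *hypothetical* oracle, assumed to return
# True iff prog(arg) halts. No such total function can exist.

def make_diagonal(halts):
    """Build the self-defeating program d from a claimed halting oracle."""
    def d(prog):
        if halts(prog, prog):
            while True:      # loop forever when the oracle says "halts"
                pass
        return "halted"      # halt immediately when the oracle says "loops"
    return d

# The contradiction: feed d to itself. d(d) halts iff halts(d, d) is
# False, i.e. iff d(d) does not halt. So no correct `halts` exists.
```

Kolmogorov complexity’s uncomputability then follows by a similar self-reference trick (Berry-paradox style), which is part of why Solomonoff induction is an idealization rather than an algorithm.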
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 4th May 2025 • 5 days ago

Well, Timeless Decision Theory was, like the rest of their ideological package, an excuse to keep on believing what they wanted to believe. So how does one even tell if they stopped “taking it seriously”?
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 4th May 2025 • 5 days ago

“Diamondoid bacteria” is just a way to say “nanobots” while edging
blakestacey@awful.systems to SneerClub@awful.systems • "Human Biodiversity" on LessWrong • 6 days ago

Part 6 quotes a Motte’r as saying,
> This distrust of experts dates back at least to Eliezer Yudkowsky and LessWrong. Eliezer pointed out, rather convincingly, that mainstream philosophy is a total mess, and that taking a philosophy course is not a great way to improve your thinking. Most likely you’ll waste your time learning about Pythagoras or something.
The thudding lack of intellectual curiosity is giving me a headache. Why study Pythagoras? Hmm, how about learning how to talk about a semi-legendary person of whom we have no direct written evidence, only stories written centuries after the fact?
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 4th May 2025 • 6 days ago

And thus Poker Face joins Sandman in the “no longer interested in Season 2” pile, but for different reasons.
The plot of Uncanny Valley centers on “a teenage girl who becomes unmoored by a hugely popular AR video game in a parallel present.”
So, Tron again, then. But with goggles this time.
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 4th May 2025 • 6 days ago

“Kicked out of a … group chat” is a peculiar definition of “offline consequences”.
blakestacey@awful.systems to SneerClub@awful.systems • Sneerquence classics: Eliezer on GOFAI (half serious half sneering effort post) • 6 days ago

One thing I’ve been missing is takedowns of Rationalist ideology about theoretical computer science. The physics, I can do, along with assorted other topics.
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th April 2025 • 8 days ago

(thinks)
(thinks)
I get it!
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th April 2025 • 9 days ago

> In commenting, we did not disclose that an AI was used to write comments, as this would have rendered the study unfeasible.
If you can’t do your study ethically, don’t do your study at all.
blakestacey@awful.systems to TechTakes@awful.systems • Fake diversity: why hire a non-white DJ when you could just generate one with AI? • 9 days ago

Congratulations on your discovery of the concept of web forum moderation. Bye now.
blakestacey@awful.systems to TechTakes@awful.systems • OpenAI offers to buy the Chrome web browser from Google. Uh huh. • 10 days ago

Buh bye now.
blakestacey@awful.systems (OP) to TechTakes@awful.systems • Credulous coverage of AI slop on Wikipedia • 10 days ago

OK, this isn’t about AI slop, but it is complaining about Wikipedia. Its article about that kind of “amnesia” named by gobshite Michael Crichton is shoddily sourced and seemingly in violation of the site’s policies.
blakestacey@awful.systems to TechTakes@awful.systems • Stubsack: weekly thread for sneers not worth an entire post, week ending 27th April 2025 • 10 days ago

Even setting aside the fact that Crichton coined the term in a climate-science-denial screed — which, frankly, we probably shouldn’t set aside — yeah, it’s just not good media literacy. A newspaper might run a superficial item about pure mathematics (on the occasion of the Abel Prize, say) and still do in-depth reporting about the US Supreme Court, for example. The causes that contribute to poor reporting will vary from subject to subject.
Remember the time a reporter called out Crichton for his shitty politics and Crichton wrote him into his next novel as a child rapist with a tiny penis? Pepperidge Farm remembers.
blakestacey@awful.systems to TechTakes@awful.systems • AI-generated music accounts for 18% of all tracks uploaded to Deezer • 10 days ago

I’m going to take the fact that this was downvoted independently by all three site admins as sufficient reason to escort this commenter to the egress.
Well, LW is being no more wrong than Andrew Tate there, so they don’t deserve too many points.
And Yud himself went full-blown anti-seed-oil.