

I think the question of “general intelligence” is something of a red herring. Evolution, for example, produces extremely complex organisms and behaviors without any “general intelligence” working toward an overarching goal.
The other issue with Yudkowsky is that he’s an unimaginative fool whose only source of insights on the topic is science fiction, which he doesn’t even understand. There’s no fun in having Skynet start a nuclear war and then perish in the aftermath itself, as the power plants it depends on cease working.
Humanity itself doesn’t possess the kind of intelligence envisioned for “AGI”. When it comes to science and technology, we are an all-powerful hivemind. When it comes to deciding what to do with that science and technology, we are no more intelligent than an amoeba crawling along a gradient.
Ironically, in a videogame someone like Musk would at most be an NPC, and possibly not even that (just a set of old newspaper clippings / terminal entries in Fallout / etc). Yudkowsky would be just a bit of backstory explaining some fucked-up cult.
This is because they are, ultimately, uninteresting to simulate: their lives are well documented and devoid of any genuine challenge (they get things through selection bias rather than effort, so simulating them is like simulating a lottery winner rather than the lottery). They exist to set the scene for something interesting.