The original is here, but you aren't missing any context; that's the tweet.
I could go on and on about the failings of Shakespeare… but really I shouldn't need to: the Bayesian priors are pretty damning. About half the people born since 1600 have been born in the past 100 years, but it gets much worse than that. When Shakespeare wrote, almost all Europeans were busy farming, and very few people attended university; few people were even literate – probably as few as ten million. By contrast, there are now upwards of a billion literate people in the Western sphere. What are the odds that the greatest writer would have been born in 1564? The Bayesian priors aren't very favorable.
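For what it's worth, here's the back-of-envelope version of that prior as a minimal Python sketch. It uses the rough figures from the paragraph above; the uniform-prior assumption (every literate person equally likely a priori to be the greatest writer) is mine, added just to make the arithmetic explicit.

```python
# Rough figures taken from the comment above (estimates, not data).
literate_in_shakespeares_day = 10e6   # "probably as few as ten million"
literate_western_sphere_now = 1e9     # "upwards of a billion"

# Crude uniform prior: each literate person is equally likely, a priori,
# to turn out to be the greatest writer.
total = literate_in_shakespeares_day + literate_western_sphere_now
prior = literate_in_shakespeares_day / total

print(f"Prior that the greatest writer comes from Shakespeare's cohort: {prior:.1%}")
# -> roughly 1%
```

Even this generously ignores everyone literate between 1600 and today, so the real prior would be smaller still.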
Edited to add: this seems to be an excerpt from the fawning book the Big Short/Moneyball guy wrote about him, which was recently released.
It is amazing how bad at reasoning these people are. I also saw a tweet by the director of operations and special projects (the one who has Musk's kids) who basically said "I have been thinking about this for about 7 years, and I think we could solve the malicious AGI problem by designing virtual worlds which are more interesting to it than the real world" (somebody doesn't think much about set theory, anthropomorphizes AI, and assumes its driving force will be curiosity). That is such a nuts statement for a smart person who spent 7 years musing about it (she is also listed on Wikipedia as somebody who works in AI) that it all baffles me. Also, these people are worth millions while I'm poor. It is nuts.
What do you mean by the set theory comment?
If you create interesting simulations in our universe, they are still contained inside our universe (and there would probably be multiple variants of them), so the baseline real world contains everything in the simulations plus everything else, and is therefore still more interesting than any of them. If the AI is driven by curiosity, it cannot be kept inside a simulation like that.