80K interview. X-risk work passes US government cost-benefit analysis (CBA) with flying colours. Why isn't it already being worked on? Because it's not salient enough. PH suggests the main thing we need is comms, marketing, and persuasion. Book: Democracy for Realists.

Catastrophic bioerror from secret weapons programs seems very likely over the next few decades, just looking at base rates from the 20th century.

Defensive epistemology: people are speaking in public, but they know they face hostile actors who want to make them look bad for political advantage, lawsuits, or other reasons. So they restrict what they say and do quite a lot, in a risk-averse way. E.g. governments have a hard time doing VC-style funding, because people can so easily highlight the misses and attack them. This leads to a lot of defensive engagement: only engage on ironclad arguments, or at least very defensible ones, e.g. defer to a high-status institution. No one ever got fired for hiring IBM. Not just a comms thing: actual institutional ability to understand is also impaired, due to incentive and selection effects.

Social epistemology: when most activity is conformist, the small set of people who are developing and testing new ideas can end up having a lot of influence.

History: more than half the variance in human conditions is driven by changes in technology.

Best arguments against the Most Important Century thesis:
- Simulation argument: there might be trillions of minds experiencing themselves as living in the most important century, when in fact only the ancestor civilisation was actually living it out.
- Catastrophic risk: a big disaster retards civilisation and tech progress for many centuries or millennia.
- Stagnation: if AI is much harder than it seems, and fertility stays low, we might run out of inputs. Seems unlikely: some high-fertility subgroups should become dominant within a century or two, so things should get unstuck. Cf. Chad Jones in the show notes.

Carl likes hot springs. Also Rick and Morty.