Confusing Plausibility with Probability in AI Safety

Philip Dhingra · Published in Philosophistry
Aug 14, 2023 · 2 min read


[Image: Russell’s teapot, generated by DALL·E]

Is there an analog of the Lizardman’s Constant (the roughly 4% of survey respondents who will affirm even absurd claims), but for scenario estimation? If someone tells you a plausible scenario that has no outside view, what is the immediate probability that your heart assigns? Is it 5%?
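
Here is a toy sketch of that reflex (the function and the anchor value are illustrative, not a real model): when there is no outside view to consult, the gut falls back on a flat plausibility constant.

```python
# Toy sketch: a first-pass "gut" probability for a scenario.
# The 5% default echoes the question above; it is an assumption,
# not an empirically measured constant.

def gut_probability(outside_view=None, plausibility_anchor=0.05):
    """Return an immediate probability estimate for a scenario.

    If an outside view (a base rate) exists, use it; otherwise
    fall back on the constant that mere plausibility seems to earn.
    """
    return outside_view if outside_view is not None else plausibility_anchor

print(gut_probability())                    # 0.05 — no base rate, anchor wins
print(gut_probability(outside_view=0.001))  # 0.001 — a base rate overrides it
```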

The problem with the standard AI-extinction line is that there is no good outside view and no solid basis for inside views. Plausible stories then rush to fill in…

