Doesn’t this only put a (statistical) limit on how cheaply a civilization can launch planet-ending attacks? It may well be feasible for a civilization to aim and accelerate a mass to nearly the speed of light in order to protect itself from a known future threat. It doesn’t necessarily follow that it would be feasible, or desirable, to spend the presumably nontrivial resources needed to do so on every planet where simple life is detected.
Add to this the fact that, at least as I understand it, evidence of our current level of technological sophistication (e.g. errant radio waves) attenuates to the point of being undetectable at sufficient distance, and the dark forest becomes a bit more viable again.
Personally, I don’t like it as an answer to the Fermi paradox, but I think it fails for social rather than technological/logical reasons. The hypothesis assumes a sort of hyper-logical, game-theory-optimized civilization that is (a) nothing whatsoever like our own and (b) unlikely to emerge, since any civilization that achieves sufficient technological sophistication to obliterate another will have gotten there via cooperation.
Even the game-theoretic analysis fails, as it considers neither a sufficient range of outcomes nor how those outcomes branch over time.