Humankind faces many existential threats, but has limited resources to mitigate them. Choosing how and when to deploy those resources is, therefore, a fateful decision. Here, I analyze the priority for allocating resources to mitigate the risk of superintelligences. Part I observes that a superintelligence unconnected to the outside world (de-efferented) carries no threat, and that any threat from a harmful superintelligence derives from the peripheral systems to which it is connected, e.g., nuclear weapons, biotechnology, etc. Because existentially-threatening peripheral systems already exist and are controlled by humans, the initial effects of a superintelligence would merely add to the existing human-derived risk. This additive risk can be quantified and, with specific assumptions, is shown to decrease with the square of the number of humans having the capability to collapse civilization. Part II proposes that biotechnology ranks high in risk among peripheral systems because, according to all indications, many humans already have the technological capability to engineer harmful microbes having pandemic spread. Progress in biomedicine and computing will proliferate this threat. ``Savant'' software that is not generally superintelligent will underpin much of this progress, thereby becoming the software responsible for the highest and most imminent existential risk -- ahead of hypothetical risk from superintelligences. The analysis concludes that resources should be preferentially applied to mitigating the risk of peripheral systems and savant software. Concerns about superintelligence are at most secondary, and possibly superfluous.
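The abstract's claim that the superintelligence's additive risk is quantifiable and falls with the number of capable humans can be illustrated with a toy model. This is my own sketch, not the paper's derivation: assume civilization collapses if any one of N independent actors triggers it, hold the aggregate human-derived risk R fixed (so each actor's per-period probability p shrinks as N grows), and ask how much risk one extra actor with the same capability adds. Under these particular assumptions the additive risk declines roughly as 1/N; the paper's own, more specific assumptions yield the steeper quadratic falloff, which this sketch does not attempt to reproduce.

```python
import math

def additive_risk(N: int, R: float = 0.1) -> float:
    """Extra collapse probability contributed by one additional actor,
    given N existing actors whose combined collapse risk is R.

    Toy model: actors act independently; R = 1 - (1 - p)**N fixes the
    per-actor probability p, and the newcomer changes the outcome only
    when none of the N incumbents acts.
    """
    p = 1.0 - (1.0 - R) ** (1.0 / N)  # per-actor probability implied by R
    return p * (1.0 - R)              # newcomer matters only if no incumbent acts

# The added risk shrinks as the capability proliferates among humans.
for N in (10, 100, 1000):
    print(f"N={N:5d}  additive risk = {additive_risk(N):.2e}")
```

The qualitative point survives the toy assumptions: the more humans who already hold civilization-collapsing capability, the smaller the marginal risk any single new agent, artificial or otherwise, contributes.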