I think the issue here is you’re conflating a few different concepts:
1. Iterated technological self-improvement resulting in exponential growth
2. Artificial General Intelligence
3. The threat to humanity from advanced AI
1 is the singularity, 2 and 3 are frequently hypothesized consequences of 1. Kinda like extensive use of fossil fuels is one concept, the greenhouse effect is another, and rising sea levels a third. They are related, but distinct, even though one contributes to another.
Combining related concepts under one term dilutes it and makes effective communication harder. Of course, the moral quandaries are valuable topics of discussion, but the mathematical phenomenon is a separate topic, and valuable in and of itself.