The ground for preferring superintelligence to come before other potentially dangerous technologies, such as nanotechnology, is that superintelligence would reduce the existential risks from nanotechnology but not vice versa. Hence, if we create superintelligence first, we will face only those existential risks that are associated with superintelligence; whereas if we create nanotechnology first, we will face the risks of nanotechnology and then, additionally, the risks of superintelligence.

- Nick Bostrom
