70 Quotes by Nick Bostrom

    Though this might cause the AI to be terminated, it might also encourage the engineers who perform the postmortem to believe that they have gleaned a valuable new insight into AI dynamics – leading them to place more trust in the next system they design, and thus increasing the chance that the now-defunct original AI’s goals will be achieved.

    The gap between a dumb and a clever person may appear large from an anthropocentric perspective, yet in a less parochial view the two have nearly indistinguishable minds.

    This chapter analyzes the kinetics of the transition to superintelligence as a function of optimization power and system recalcitrance.

    Before the prospect of an intelligence explosion, we humans are like small children playing with a bomb. Such is the mismatch between the power of our plaything and the immaturity of our conduct. Superintelligence is a challenge for which we are not ready now and will not be ready for a long time. We have little idea when the detonation will occur, though if we hold the device to our ear we can hear a faint ticking sound.

    Our demise may instead result from the habitat destruction that ensues when the AI begins massive global construction projects using nanotech factories and assemblers.

    The ground for preferring superintelligence to come before other potentially dangerous technologies, such as nanotechnology, is that superintelligence would reduce the existential risks from nanotechnology but not vice versa. Hence, if we create superintelligence first, we will face only those existential risks that are associated with superintelligence; whereas if we create nanotechnology first, we will face the risks of nanotechnology and then, additionally, the risks of superintelligence.

    One can speculate that the tardiness and wobbliness of humanity’s progress on many of the “eternal problems” of philosophy are due to the unsuitability of the human cortex for philosophical work. On this view, our most celebrated philosophers are like dogs walking on their hind legs – just barely attaining the threshold level of performance required for engaging in the activity at all.

    Granted, there is still that picture of the Terminator jeering over practically every journalistic attempt to engage with the subject.

    Newer systems use statistical machine learning techniques that automatically build statistical models from observed usage patterns.
