
Tony Czarnecki


Human extinction or Posthuman evolution?

We are the only species that is consciously capable of minimizing the risk of its own extinction and of steering its evolution in a desired direction.

Will Superintelligence cause our extinction or pave the way for the evolution of the human species?

Several top AI scientists point to 2030 as the time by which we may lose control over the (self-)development of AI.

Managing Humanity’s Evolution

In practice, we have about one decade to put in place at least the main safeguards to control Superintelligence’s capabilities.

Will humans gradually become part of Superintelligence?

If we do nothing, our species may simply become extinct within this century, or the next, as a consequence of any of a dozen existential risks.

What is the risk of Superintelligence?

The risk posed by Superintelligence is more likely to materialize within the next 50 years than in the following century.

How to Control the Capabilities of Superintelligence?

Experts such as Nick Bostrom, one of the leading researchers on Superintelligence, argue that we need to invent control methods to minimize the risk posed by AGI.
