A fundamental uncertainty surrounds AI: ‘What will it do?’ ‘How far will it go?’ ‘Can we control it?’ These questions assume that once enough intelligence is gained, independent action will follow. That was true for Adam and the human race, but will it also be true for AI? The will to create AI, and ultimately AGI, is the ultimate divine act: to create and cultivate entities like ourselves. It is also the ultimate transgression.