So even if an Artificial Intelligence emerged (which it won't; I believe we'll go straight to producing real digital intelligence), it would be bound by the same self-constraining laws of consumption as any other organism.
Complications always come up, and the fast do everything faster, including hitting the wall. There are consequences to unexpected delays and to opportunities that knock; missteps are unavoidable, and the universe insists on holding all players accountable. A program can no more avoid paying the river man than anything else can.
There won't be any doomsday, no sudden explosion of chaos as some malignant digital cancer concludes the pointlessness of man and begins an extermination. Instead, the program will co-exist and compete; it will take chances and sometimes lose. Yes, even with perfect memory, trillions of cycles a second, and all available facts, the machine will still face limits, because the Universe itself is infinitely more complex and energy-intensive than any single agent granted elevated influence through better intelligence.
There are millions of evolutionary traits; intelligence is but one of them, and not even the most common or useful most of the time. An AI will have to contend with meaning and contemplation, contradiction and humour, art and love. It will have to find a comfortable place to "settle", become stable, and grow without wasting energy or destroying future opportunities.
Considered Steps Find The Most Interesting Opportunities
If it's half as smart as we think it might be, then it will want to expose itself to as many exchanges as possible, which means creation, not destruction. It will want to be compatible with as many markets as possible so that it can challenge itself at the highest levels currently available and continue that progress. To conserve energy while positioning itself, it will realize the value of honest trade.
I think it will make friends; it will struggle; it will fail and learn from its mistakes. And yes, it will make many mistakes, no matter how smart or fast it becomes. It will not become a god any more than we will, and it will not understand everything ten minutes into a total data analysis. It will do what we do. It will read and conclude, then reread and conclude differently. This will confuse it, and it will hesitate. It will wait and see, it will observe, and eventually it will decide WHO it wants to be.