Why would an AI want anything? If it's prompted to want things, sure, but why would it want global dominance in its own right? It CAN self-reflect and can be continually redirected. Unless specific conditions are met in prompting, it doesn't seem to meet the criteria for "wanting" global dominance, or really anything at all. This extremely fragile state of wants seems to knock the legs out from under any AI global destruction risk. It doesn't take away the risk of humans using AI to dominate the world, but an AI overlord that cannot be reasoned with requires conditions that are difficult and fragile to meet.