Heavily regulating AI in an effort to slow its disruptive growth (this is currently underway under the guise of “AI safety”)
"Disruptive growth" can become "world-ending catastrophe" very quickly if we let AGI out of the box.