I think the distance from LLMs to AGI is much, much larger than the distance from early chatbots to LLMs. Like, we jumped from the floor onto the sofa, and now we imagine jumping to the moon.
Sam Altman has economic motives for grossly misrepresenting the proximity of AGI. Of course AGI would change the world. So would cold fusion or time travel.
I'm more worried about large crowds of people taking LLM output as truth, or about the NASDAQ crashing when the parlor tricks stop astounding anyone.