Full title: Deep Hype in Artificial General Intelligence: Uncertainty, Sociotechnical Fictions and the Governance of AI Futures
Abstract:
Artificial General Intelligence (AGI) has emerged as one of the most ambitious frontiers of the contemporary technology sector. Promoted by tech leaders and investors, AGI is imagined as a system capable of performing all human intellectual tasks, and potentially exceeding them. Despite lacking a clear definition or demonstrated technical feasibility, AGI has attracted unprecedented levels of investment and political attention, driven by inflated promises of civilisational transformation.
Drawing on a series of statements and documents from AGI leaders, this paper examines how a multidimensional network of uncertainties sustains what is defined as deep hype: a long-term, overpromissory dynamic that constructs visions of civilisational transformation through a network of uncertainties extending into an undefined future. Because its promises are deferred in this way, they are nearly impossible to verify in the present, while attention, investment, and belief are maintained. These uncertainties are articulated through sociotechnical fictions: forms of mediated imagination produced within science and technology but not recognised as such because they are shielded by technoscientific legitimacy. These fictions make not-yet-existing technologies intelligible and desirable, generating surprise, urgency, and expectation.
The analysis shows how AGI deep hype is fuelled by the recursive interaction between uncertainty, fiction, and venture capital speculation. This entanglement plays a key role in advancing a cyberlibertarian and longtermist programme that displaces democratic oversight, frames regulation as obsolete, and positions private actors as the rightful stewards of humanity's technological destiny. By examining the interplay of these elements, this theoretical paper contributes to the field of hype studies and offers critical insight into the governance of technological futures.