Strong AI and the Future Disruption of War (Part 1 of 3)
How the Race to AGI Might Lead to Paradise or Create Paradise Lost
This three-part blog explores AI's transformative impact, focusing on its evolution and implications, delivered in my preferred style for intellectual exploration of wicked challenges: a blend of academia, pop culture, and a bit of snark. Part 1 examines AI's rapid progression over the next 12-18 months, introducing agentic AI and its potential to disrupt professionalization, paving the way for artificial general intelligence (AGI). Part 2 delves into AGI’s security challenges across 2030-2050, questioning global governance, non-proliferation policies, and the risks of misuse by authoritarian regimes. Part 3 envisions AGI reshaping conflict and war, exploring utopian promises, societal disparities, and existential threats. In each part of this series for subscribers, I offer plenty of new perspectives, links to other content and readings, and try to have fun along the way as we consider technological elimination, replacement, or societal transformation. Each part builds on AI’s paradigm shift, urging readers to consider its ethical, strategic, and societal ramifications. Okay, strap in for the first part.
Part 1: The Next 12-18 Months of AI Progression
Artificial intelligence is part of the daily news cycle for good reason: it is probably the most significant scientific paradigm shift humanity has undergone since the development of movable type. I say that without exaggeration. What complexity science terms ‘strong emergence’ in complex systems occurs when an incoming paradigm radically transforms the existing system in ways that significantly alter what reality is and how we experience it. If you remember ‘The Muppet Show’, this is where you should be flailing your arms overhead and screaming like Kermit the Frog backstage as chaos unfolds. Just kidding, keep your frog arms at your side for the moment.
In the pre-Gutenberg period, spanning history up through the 15th century, societies controlled nearly all knowledge construction through steeply hierarchical modes that allowed a minority (those who were literate or controlled literate members) to dominate the majority (the illiterate, with limited access to those holding knowledge).