I believe in the primacy of systems. Functional systems should be defended because once they collapse, they cannot be rebuilt. That’s what today’s newsletter is about.

Our system of liberal democracy has not collapsed yet, but it’s close. I’ve written a bunch about how certain things—the rule of law, the FBI, the Department of Justice—are probably beyond any near-term repair. But other systems have not yet fallen. That’s why we built The Bulwark. Because we saw the media ecosystem undergoing a shock and wanted to save it. If you want to be part of that, I hope you’ll join us.

Going to be a short newsletter today because Sarah and I did a super-sized Secret Podcast this morning. You’re going to love it. Instant classic this week.

Also: Next week is going to be a weird Triad schedule because I’m not sure when I’ll find windows to write while heading to Minneapolis for the live shows. (It looks like just a few tickets are still available for the February 18 event.)

1. Systems

This week a tech guy named Matt Shumer wrote a big, apocalyptic warning about the chaos AI is about to unleash. His essay is worth your time, so I’d encourage you to read it in full. But the basic summary is:
There’s a lot more in it. Again, read the whole thing. What worries me—what I want to talk about today—is the problem of speed.

If I could do one thing to change American education it would be to focus on ecology early and often. That’s because humans don’t think enough about systems, and the easiest way to introduce the concept of a system is to talk about local environments. Get kids thinking about how an ecosystem works and they can learn how a financial market, or an industry, or a network functions. It helps them understand stable states, and systemic shocks, and evolutionary change. There’s a lot to learn.

One of the big lessons of ecology is that complex systems are tremendously resilient and adaptable if the change comes slowly enough. Complex systems are not vulnerable to change so much as they are vulnerable to shocks—sudden, rapid change.

That’s what worries me most about AI. In the early days of ChatGPT, people were worried about the robot apocalypse. Now the big fear is white-collar job displacement, especially at the entry level. What happens when AI can do everything a paralegal, or a research assistant, or a data analyst does, and cheaper? What happens when AI can do journalism, coding, graphic design, and anything you might have hired McKinsey to do? A lot of white-collar workers may be out of a job.

That wouldn’t worry me if it happened over the course of twenty years. Because the market would adapt. New industries would emerge; new pathways would be established. The system would find a