The Adolescence of Technology

Confronting and overcoming the risks of powerful AI.

January 2026 · Essay

There is a scene in the movie version of Carl Sagan's book Contact where the main character, an astronomer who has detected the first radio signal from an alien civilization, is being considered for the role of humanity's representative to meet the aliens. The international panel interviewing her asks, "If you could ask them just one question, what would it be?"

Her reply is: "I'd ask them, how did you do it? How did you evolve, how did you survive this technological adolescence without destroying yourself?"

1. I'm sorry, Dave

When I think about where humanity is now with AI, and about what we are on the cusp of, my mind keeps going back to that scene because the question is so apt for our current situation. We are acquiring tools that scale judgment, persuasion, automation, and power faster than our institutions evolve.

I believe we are entering a rite of passage, both turbulent and inevitable, which will test who we are as a species. Humanity is about to be handed almost unimaginable leverage, and it is not yet clear whether our social and political systems are prepared to handle it well.

2. A surprising and terrible empowerment

The challenge is not merely technical. It is also moral and institutional. We must ensure that the systems we build are aligned with human flourishing rather than narrow proxy metrics that gradually displace judgment, agency, and accountability.

As we hand over more cognitive labor to machines, the central question becomes sharper: what remains uniquely human? The answer is not a competition with algorithms. It is the responsibility of choosing ends, constraints, and values.

3. The odious apparatus

Every transformative technology builds an apparatus around itself: incentives, interfaces, business models, defaults, and new habits of attention. AI will be no different. The risk is not only that the model fails. The risk is that the surrounding system quietly teaches people to stop noticing what matters.

Bad defaults scale. So do shallow goals. If the surrounding apparatus rewards speed over reflection, extraction over understanding, and convenience over truth, then even a competent model can become part of a deeply incompetent society.

4. Player piano

Automation has always carried a psychological edge. It does not merely remove tasks; it redefines what kinds of effort feel necessary. People adapt to the machine's rhythm. Over time, they internalize its assumptions. That is why interface design and workflow design matter as much as raw model capability.

The goal should not be to make humans passive supervisors of opaque systems. The goal should be to create tools that preserve human comprehension while increasing human range.

5. Black seas of infinity

Powerful AI also changes the horizon of uncertainty. It expands what small groups can do, what institutions can automate, and what failures can propagate. That makes humility a design requirement. We need evaluation, monitoring, staged deployment, and the discipline to distinguish impressive demos from dependable systems.

If we treat capability as wisdom, we will build brittle futures. If we treat speed as legitimacy, we will create systems that are admired before they are understood.

6. Humanity's test

Technological adolescence is not only about danger. It is also about maturation. The same tools that amplify manipulation can also expand insight, scientific progress, accessibility, and coordination. The difference will come from governance, culture, and design discipline.

The test before us is whether we can become the kind of civilization that deserves the tools it is building. That answer will not come from models alone. It will come from the standards we hold while building and deploying them.