Humans Need Entropy

I’ve had several thoughts on the Karpathy and Dwarkesh conversation that took place in late October 2025.

But the one that keeps haunting me is something Karpathy just kind of casually mentioned before moving on to another topic. I think it might be the biggest idea in the whole conversation.

Talking about human/model similarities, he says:

Humans collapse during the course of their lives. Children haven’t overfit yet.

They will say stuff that will shock you because they’re not yet collapsed.

But we [adults] are collapsed. We end up revisiting the same thoughts, we end up saying more and more of the same stuff, the learning rates go down, the collapse continues to get worse, and then everything deteriorates.

Karpathy, on the Dwarkesh Podcast

Since my 20s I’ve been terrified of this happening to me. It pierces my soul whenever my partner says:

I knew you were going to say that.

Ugh. Nobody wants their wit or humor to be predictable.

How many older people do you know who tell the same stories and jokes over and over? Who watch the same shows. Who listen to the same five bands, and then eventually two. Their aperture slowly shrinks until they die.

Luckily, Karpathy gives the solution right after.

We have to find sources of entropy.

When we were kids, everything was entropy because everything was new. So we were constantly changing our preferences, our behaviors, our language, and everything.

It made us fresh. Unpredictable. Which is closely related to a concept I’m obsessed with from Shannon’s Theory of Information: in his model, Information is the part of a transmission that is neither redundant nor noise. In other words, the surprising part.
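To make that concrete, here’s a minimal Python sketch of Shannon’s measure (my own illustration; the function names and probabilities are hypothetical, not anything from the talk or the post): low-probability events carry more bits, and a collapsed distribution over responses carries almost none.

```python
import math

def surprisal_bits(p: float) -> float:
    """Self-information of an event with probability p, in bits.
    Rarer (more surprising) events carry more information."""
    return -math.log2(p)

def entropy_bits(probs) -> float:
    """Shannon entropy: the average surprisal across a distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A line your partner saw coming carries almost no information:
print(surprisal_bits(0.9))   # ~0.15 bits
# A genuinely unexpected remark carries a lot:
print(surprisal_bits(0.01))  # ~6.64 bits

# Four equally likely responses: maximum entropy for four options.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# A "collapsed" speaker who almost always says the same thing:
print(entropy_bits([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
```

The collapse Karpathy describes is, in these terms, a probability distribution over outputs that keeps getting peakier.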

I’m doing a bunch of stuff—in addition to pathological reading—to maximize entropy in my life.

  • I’m reading a lot of old books on writing, such as ones on rhetorical figures, to get fresh phrases into my mind.
  • I regularly re-read, and listen to, Christopher Hitchens’ books and debates, just to stay exposed to that level of non-cliché language.
  • And I’m currently building an AI (a Claude Code-based Skill) called increase-entropy that incorporates all this old and fresh language, like a particle accelerator I can point at a thought or piece of content. Not sure how effective it’ll be yet.

I even went so far in 2024 as to create an AI prompt in Fabric that would rate talks, blogs, panels, or whatever for Wows Per Minute, meaning how often a given piece of content surprises its audience.
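The metric itself is trivial to state in code. The sketch below is a hypothetical illustration of what the rating computes (the function and the timestamps are mine; in Fabric the actual judging is done by an LLM prompt, not by code like this):

```python
def wows_per_minute(wow_timestamps_sec: list[float], duration_sec: float) -> float:
    """Hypothetical sketch: the count of audience-surprising moments
    divided by runtime in minutes. In practice the 'wow' judgments
    would come from an LLM rating pass, not from hand annotation."""
    return len(wow_timestamps_sec) / (duration_sec / 60)

# A 30-minute talk that genuinely surprised the audience four times:
print(wows_per_minute([95, 480, 1110, 1600], 1800))  # ~0.13 wows per minute
```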

I mean, this was a problem before AI. And now many are delegating even more of their thinking to a system that learns by crunching mediocrity from the internet. I can see things getting significantly worse.

I guess it’s somehow comforting that this happens to both AI models and humans. It makes the whole thing feel more human. And hearing Karpathy say it so plainly was jarring to me, in a pleasant way.

At least for us humans, the solution seems to be something like:

  1. Recognize that it’s a problem that starts for everyone, probably in their mid-to-late 20s.
  2. Constantly seek out and consume sources of novelty and freshness to maintain young-mind.

Would love to hear your sources.


