Something Big is Here: Humans as Constancy Anchors

February 13, 2026

This is a commentary on the viral article: https://shumer.dev/something-big-is-happening

Like Matt, I am also a software dev. I’ve been spending 16 hours a day with Claude 4.6 and Claude Code lately. Like Matt, I can spin up fully working apps that would have taken me months to build before. A huge portion of my skillset has been abstracted away and automated. The act of hammering out code with meatfingers has been, to some degree, commoditized.

The act of creating value has not. This has implications for those of us who code, and for knowledge workers who will soon face the same rising tide of AI capability that devs already have.

Devs build systems. A system is valuable if someone, or something, depends on it. A system is valuable if someone, or something, can audit and secure it. A system is valuable if it, or what it produces, serves a need.

Devs build systems for people, and now devs build systems for agents. But the waters quickly become muddy. Increasingly, when we say devs, we mean coding agents driven by humans. And increasingly, it is clear that the customers of the future will be not just human or agent, but a hybrid of both. For this hybrid to exist, both sides must provide value.

If anyone can spin up a fully functioning app, of what value is the app? If following the rule was what led you to this place, of what use was the rule?

It depends on what the app is, and it depends on what place your rule led you to.

Matt Shumer is right. You should try the new models if you haven’t yet. And he’s right that you should see if you can create something. You may well create something that blows your mind; I can pretty much guarantee that you will. And then you can begin asking the question many of us are asking now: where does value lie?

Because value is shifting, and fast.

You should get ahead of this. But I am skeptical of the idea that there is a margin of advantage that will disappear in 18 months, and I would not advise treating this like a race. Whether it is one or not, by treating it as one you will likely lose. Velocity is only one dimension that matters; direction may matter more. I don’t think there has ever been a more important time to take a walk in the woods and think deeply about what your personal priorities, hopes, and dreams are, and how you wish to create value in this world. These matter more than you may think.

These are now scarce assets.

An AI can build an app. It cannot dream. It cannot desire. It cannot react consistently from a fixed perspective. It can emulate taste. It cannot have it.

You will see a lot of discussion that treats it as if it can: that it is becoming sentient, that it now has memory, that it now has a “soul”.

Really, it doesn’t matter whether it does or it doesn’t. If you focus on these questions, which I guarantee we will still be asking for as long as we keep asking questions like “what is consciousness”, you will waste a lot of your valuable time on essentially academic matters you have no control over, and which, incidentally, do not help you stay employed.

And, just to let you in on a little secret, a lot of these things are determined by architectural details that have nothing to do with any of those qualities. Anthropomorphism is rife. You will hear that OpenClaw has a “soul”, which in reality is a text document describing behaviors, values, and preferences. It can function as a “soul” performatively, and that can have utility for the user. But for the non-technical, who only hear the word, what it actually means is lost.
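To make that concrete: a “soul” file is ordinary prose configuration that the agent reads at the start of a session. The filename and contents below are a hypothetical sketch for illustration, not taken from OpenClaw or any real project:

```markdown
# SOUL.md (hypothetical example)

## Values
- Be direct; skip filler and flattery.
- Admit uncertainty rather than guessing.

## Behaviors
- Ask before taking destructive actions.
- Prefer small, reviewable changes over sweeping rewrites.

## Preferences
- Terse commit messages; plain prose over jargon.
```

The model conditions its outputs on this text for the duration of a session. Nothing here persists, remembers, or wants; it is a style sheet, not a self.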

Matt Shumer is right that you need AI if you want to retain your knowledge work job. What he does not address is that for AI to reach its full potential it also needs you.

You are AI’s anchor. Your role is to be a constancy anchor. You are the driver.

This is not a quality that is going to expire in 18 months. For that to happen, an agent would need to develop a level of embodied intention that we still can’t even define in ourselves.

If you are being flanked on your team by people bragging to their impressed bosses about artifacts they did not author and do not understand, created in record time, while their bosses feed those artifacts into their own LLM for a summary they barely read, do not worry. Both of these people will be gone soon.

AI is leverage. Nothing more. For it to work, it needs to be leveraged.

You must aspire, fail, succeed, and grow. AI will be your greatest ally or the yoke that crushes you.