We're measuring productivity wrong
And it's about to cost us again
What Red Hat Got Right
I work at Red Hat. I'm biased, and I'll say that upfront. But I also have a front-row seat to something most of the industry still hasn't figured out: Red Hat has been selling capacity, not throughput, for twenty years.
Look at how Red Hat describes what a TAM does: "A direct relationship with a senior technical resource. They can prevent issues before they arise." Relationship. Knowledge. Prevention. Not "cases closed per hour."
Forrester found a 379% ROI from TAM services. IDC found a 93% reduction in unplanned downtime. One customer described their TAM as "a very smart and valuable member of my team." That's not throughput language. That's capacity language. The customer is recognizing that what they're paying for isn't output. It's expertise, availability, and the judgment to prevent problems before they become crises.
Jim Whitehurst wrote The Open Organization about why engagement and autonomy outperform command-and-control productivity management. The Open Management Practices say it plainly: "Individuals are inspired to do their best work when they are connected to something bigger than themselves."
This philosophy works. The business results validate it. And the rest of the industry is about to learn why it matters more than ever.
What the Rest of the Industry Measures Instead
Here's a question most employers get wrong: What are you actually paying for when you hire someone?
Most would answer without hesitating. Productivity. Output. Widgets per hour, cases closed per quarter, tickets resolved. Simple math.
Except that when you hire a domain expert, what you're actually buying is capacity: their expertise, their judgment, and their availability. The output they produce is a byproduct of well-deployed capacity. It's evidence that the engine is running. It's not the engine.
The Slack Problem
I'm a Technical Account Manager. My job is to be the expert in the room for my customers. I learn their environments, I anticipate their problems, and I'm there when something goes sideways. Something always goes sideways.
Here's what happens when my employer measures my value purely by throughput: I fill every minute with tasks. Cases, documentation, reports, meetings. My utilization looks excellent.
Then a customer calls. Something is broken.
But I'm full. Every minute is spoken for. I either drop something in progress or make the customer wait. Neither option is good. Both are inevitable when you optimize for 100% utilization.
You don't run your servers at 100% CPU and then act surprised when they can't handle a spike. But somehow we look at people and ask: "Why aren't you at 100%?" Because slack isn't waste. Slack is responsiveness capacity.
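The queueing math makes the point brutally concrete. Here's a minimal sketch using the textbook M/M/1 formula; the one-hour service time and the utilization levels are illustrative assumptions, not Red Hat data.

```python
# A minimal sketch of why 100% utilization destroys responsiveness,
# using the textbook M/M/1 queueing formula. All numbers here are
# illustrative assumptions, not Red Hat data.

def expected_wait(utilization: float, service_time_hours: float = 1.0) -> float:
    """Average queueing delay before work even starts, in hours.

    M/M/1 result: W_q = rho / (1 - rho) * mean service time,
    where rho is utilization. Valid only for rho < 1; at rho = 1
    the queue grows without bound.
    """
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return service_time_hours * utilization / (1.0 - utilization)

for rho in (0.50, 0.80, 0.90, 0.95, 0.99):
    print(f"utilization {rho:>4.0%} -> average wait {expected_wait(rho):5.1f} hours")

# utilization  50% -> average wait   1.0 hours
# utilization  80% -> average wait   4.0 hours
# utilization  90% -> average wait   9.0 hours
# utilization  95% -> average wait  19.0 hours
# utilization  99% -> average wait  99.0 hours
```

The last few points of utilization buy almost nothing in output and cost almost everything in responsiveness. That's the spike your servers can't absorb, and it's the customer call I can't take.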
Goodhart's Ghost
There's a principle in measurement theory called Goodhart's Law: When a measure becomes a target, it ceases to be a good measure.
Tell a TAM "close more cases" and they cherry-pick the easy ones. Tell a developer "commit more code" and they inflate the count with trivial changes. Tell a support rep "reduce handle time" and they rush past the real problem. The metric goes up. The value goes down. Nobody notices until the customer leaves.
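You can watch this play out in a toy model. The sketch below uses made-up numbers: a hypothetical eight-hour day and a case queue where the slow cases carry most of the customer value. The rep who chases the closed-case count posts better dashboard numbers than the rep who chases value, while delivering less of what the customer actually pays for.

```python
# A toy Goodhart's Law simulation with made-up numbers. Each case has a
# time cost and a customer value; the value is what retention actually
# depends on, but only the closed-case count shows up on the dashboard.

cases = [              # (hours_to_close, customer_value) -- hypothetical
    (1, 1), (1, 1), (1, 1), (1, 1),  # quick, low-stakes cases
    (4, 10), (6, 20),                # slow cases that keep customers around
]
BUDGET = 8  # hours in the working day

def work(queue, budget=BUDGET):
    """Greedily take cases in the given order until the day runs out."""
    closed = value = hours = 0
    for cost, worth in queue:
        if hours + cost > budget:
            continue  # doesn't fit today; keep scanning for ones that do
        hours += cost
        closed += 1
        value += worth
    return closed, value

# Rep A targets the metric: cherry-pick the fastest cases first.
metric_first = sorted(cases, key=lambda c: c[0])
# Rep B targets the customer: take the highest-value cases first.
value_first = sorted(cases, key=lambda c: -c[1])

print("metric-first: closed=%d, value=%d" % work(metric_first))
print("value-first:  closed=%d, value=%d" % work(value_first))
# metric-first: closed=5, value=14
# value-first:  closed=3, value=22
```

The dashboard rewards the first rep. The customer renews because of the second. That gap is Goodhart's Law in miniature.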
This isn't theoretical. This is how most organizations operate. They measure what's easy to count, optimize for the number, and then wonder why the things that are hard to count (customer loyalty, institutional knowledge, the crisis that was prevented because someone was paying attention) seem to evaporate.
The Invisible Value
The value that matters most is invisible by its nature. Nobody tracks "customers who didn't churn because someone was available at the right moment." Nobody measures "escalations that didn't happen." This is the dark matter of customer success: it shapes outcomes and retains revenue, but it never shows up on a throughput dashboard.
The absence of catastrophe doesn't show up on a report. The steady state of "customers were well taken care of" looks, to the throughput-obsessed organization, like mediocrity. I'd argue it should be the target.
The Industry Has Made This Mistake Before
In the 1990s, companies looked at their experienced domain experts and made a calculation. "This person costs $80,000 a year. This person overseas costs $15,000. They both produce code. They both handle tickets. Easy math."
They measured throughput, found a cheaper source, and pulled the trigger.
In many cases, the throughput was maintained. The same tickets got closed. The same lines got committed. But the capacity evaporated. The institutional knowledge. The judgment calls. The architect who could smell a bad design before it was built. The support engineer who knew that this specific customer's environment was weird in a way that made the standard fix dangerous.
That knowledge walked out the door, and it took years, and far more money, to rebuild.
And They're About to Make It Again
Agentic AI tools are here, they work, and they're getting into the hands of everyone. Not just knowledge workers, but electricians, plumbers, field techs, small business owners. Anyone with a phone.
The question organizations are asking right now is exactly the same question they asked in the 1990s: Can this thing do what my expensive expert does, but cheaper?
If you deploy AI as a cheaper replacement for the domain expert, you will get exactly the same result you got in the 90s. The throughput will be maintained. The capacity will evaporate.
An agentic tool optimized for throughput replaces the worker. An agentic tool optimized for capacity amplifies the worker. The difference isn't the technology. It's whether you understood what the worker was doing for you in the first place.
The electrician uses an AI assistant to look up code requirements and plan the sequence of a job. But the electrician is still the one who walks into a house with 1940s wiring and knows, from experience, that the code-compliant answer is actually the wrong answer here. The AI makes the electrician more effective. It doesn't make the electrician unnecessary.
The Prediction
The organizations that get AI right will be the ones that already understood what their people were actually for.
If you've been measuring capacity, you already know where AI fits. It's a force multiplier for your best people.
If you've been measuring throughput, AI looks like a replacement. And you'll deploy it that way. And you'll get the same result the throughput-obsessed companies got when they offshored their expertise in the 90s.
The technology isn't the variable. Your management philosophy is.
The target isn't productivity maxing. It never was. The target is customer experience maxing, and productivity is what happens along the way when you deploy expertise and availability correctly.
The companies that figure this out will thrive. The ones that don't will learn the same lesson twice.
---
Grimm Greysson is a Technical Account Manager, builder of AI tools, and the kind of person who thinks systems theory belongs in management conversations.
— grimm