In 1892, somewhere in New England, a factory manager made a decision that looked very smart. He replaced the old steam engine powering his factory floor with a brand new electric motor. The future had arrived. The motor hummed. The belts ran. The pulleys turned.
Nothing changed.
He wasn't incompetent. The motor worked exactly as promised. What he didn't change — couldn't change, because the whole building was organized around it — was the shaft. The long central rotating shaft that ran the length of his factory, driving dozens of machines through a labyrinth of belts and pulleys. The shaft was designed for steam. The new motor drove the same shaft. And the shaft kept determining everything: where machines were placed, how workers moved, how fast each process ran, where bottlenecks formed.
He had the right technology in the wrong system.
The Thirty-Year Gap
Stanford economist Paul David documented this in his 1990 paper "The Dynamo and the Computer." Electric motors accounted for less than 5 percent of factory mechanical drive in 1900. The technology was real. The benefits were not.
Factories began capturing electricity's potential only in the 1920s, when a new generation of managers did something the previous one hadn't. They stopped trying to improve the old factory with new technology, and started designing factories around the technology. Individual motors for individual machines. Buildings rearranged along the flow of materials, not around a central shaft. Single-story layouts. Flexible production lines.
The productivity gains came. They were enormous. But they required thirty years and an almost complete rethinking of how a factory was supposed to work.
The technology was ready in 1880. The institutions needed until 1920.
The Shaft Drive You're Still Running
PwC's 2026 survey of 4,454 CEOs contains what is now the most cited uncomfortable statistic in enterprise tech: 56% report zero financial return from their AI investments. A separate dataset is starker still: 80% of companies report no measurable productivity impact, despite 69% adoption.
The response from consultants and vendors is predictable. Better change management. More training. Cleaner data. Different models.
None of them are pointing at the shaft.
Most companies deploying AI today are doing exactly what that New England factory manager did. They're overlaying new technology on existing systems. An AI chatbot on top of the old support ticket workflow. An LLM summarizer on top of the quarterly reporting process that should have been killed five years ago. A generative AI tool handed to workers whose job descriptions, incentive structures, and management layers were designed for a different technological era.
The motor hums. The shaft still runs the show.
What Carlota Perez Actually Said
Economist Carlota Perez spent decades studying how technological revolutions unfold. Her framework distinguishes between two phases: Installation (when new technology arrives, attracts capital, and gets absorbed into existing infrastructure) and Deployment (when institutions, business models, and organizations restructure around the technology and the gains materialize).
We are, by any honest reading of the data, deep in Installation. AI datacenter investment is approaching 1.2% of GDP. The technology is improving fast. The 56% zero-ROI figure is not an anomaly. It is a timestamp.
The factory doesn't reorganize itself.
The Question Nobody Wants to Ask
The question isn't "which AI model should we use?" It's "what would our organization look like if it were designed from scratch, today, knowing what AI can actually do?"
That question is uncomfortable because the honest answer usually involves fewer middle layers, different hiring criteria, a different relationship between human judgment and automated process. It involves changing the building, not just the motor.
The companies that saw nothing from electricity in 1892 weren't wrong about electricity. They were the last factories before the assembly line.
Some of them made it to 1920. Many didn't.