Innovation Didn’t Just Happen. It Diffused.
- Feb 25
Every morning, millions of North Americans wake up, check a smartphone for the weather, follow GPS directions to work, send messages over cellular networks, and sit in offices illuminated by electric light. The experience feels seamless. Inevitable.
But none of it was inevitable.
The last century of technological progress did not unfold as a series of isolated flashes of genius. It advanced layer by layer through a quieter, more powerful force: diffusion. Innovation matters. But adoption—carefully engineered, economically rational adoption—is what reshapes civilization.
To understand how the modern world actually scaled, you have to look at the stack.

The Layers That Built Modernity
Start with electric light. In the late 19th century, incandescent lighting existed—but it was fragile and expensive. Early carbon filaments burned out quickly. For electrification to spread beyond novelty, durability had to improve, and costs had to fall. Engineers like Lewis Howard Latimer refined the production of carbon filaments and improved manufacturing processes, extending bulb life and reducing maintenance. Durability lowered the cost per use. Lower cost enabled adoption. Adoption built infrastructure.
That is the first rule of diffusion: a technology spreads when its relative advantage becomes economically obvious to those who control adoption decisions. Electrification didn’t simply illuminate rooms; it reorganized cities, factories, and productivity. And once reliable power was available, new systems could ride on it. Consider railroads. As rail networks expanded, they faced a scaling problem: communication between moving trains was unreliable. Collisions were catastrophic. Efficiency was limited. Granville T. Woods developed a multiplex railway telegraph system that enabled safer, real-time signalling between trains and stations.
Here, diffusion occurred through institutions. Rail companies adopted safety-enhancing technology not because it was fashionable, but because risk demanded it. Insurance premiums penalized unsafe operations. Regulatory bodies began requiring minimum safety standards. Public trust—and therefore ridership and revenue—depended on visible safety improvements.
This reveals the second rule: infrastructure grows when institutions create enforcement mechanisms that make risk intolerable without improvement. By the mid-20th century, another layer emerged—not glamorous, but decisive. Electronics were shrinking. Circuits were becoming more complex. And reliability became the difference between success and failure. Otis Boykin improved precision resistors, enhancing stability under heat and voltage stress. His contributions strengthened the backbone of consumer electronics, military systems, and life-saving medical devices like pacemakers.
Component-level reliability rarely captures headlines. But it governs whether an industry scales. Manufacturers adopt higher-quality components because failure rates decline and warranty costs fall. Diffusion here travels through supply chains, one procurement specification at a time. Meanwhile, another kind of infrastructure was taking shape—not mechanical or electrical, but biological.
By the 1940s, blood transfusion had been in use for decades, but preservation was unreliable and distribution was chaotic. Battlefield medicine and civilian surgery both faced the same constraint: blood was perishable, compatibility testing was inconsistent, and supply was unpredictable. Charles Drew developed techniques for plasma preservation and created the organizational systems that enabled large-scale blood banking. His work during World War II transformed blood from a scarce emergency resource into a storable, distributable infrastructure.
The technical innovation—plasma separation and preservation—mattered. But the diffusion happened through institutions. Hospitals adopted blood banking not because it was medically elegant, but because surgical capabilities came to depend on it. Malpractice liability demanded it. Accreditation bodies required it. Insurance frameworks covered it. Standards for collection, testing, storage, and distribution created protocols that reduced contamination, improved compatibility matching, and made transfusions predictable enough to build surgical practice around.

This reveals a crucial pattern:
When infrastructure enables capabilities that become baseline expectations, diffusion accelerates through institutional requirements. Once patients expected hospitals to have blood available, not having it became a competitive and legal liability. The American Red Cross formalized the system. Regulatory frameworks standardized the practices. What began as wartime innovation became peacetime infrastructure.
Blood banking diffused not because doctors were convinced, but because the institutional cost of not adopting it exceeded the cost of implementation. If electrification enabled power, reliable components enabled circuits, and blood banking enabled modern surgery, the next transformation required a different kind of leap: modularity.
In the early 1980s, personal computing faced a constraint. Systems were often closed, limiting compatibility and expansion. Mark E. Dean, working at IBM, helped develop the Industry Standard Architecture (ISA) bus, which enabled peripherals to connect via standardized interfaces.
Standards reduce switching costs. They increase compatibility. They invite third-party innovation. When you standardize how parts connect, you do more than ship a device. You create a platform. Platforms accelerate diffusion by transforming products into ecosystems. But computing power alone did not dissolve geographic constraints. For that, another layer was required—precision modelling of the planet itself.
The Earth is not a perfect sphere. Accurate satellite positioning requires precise geoid models to account for gravitational irregularities. Gladys West developed mathematical models that contributed to the accuracy underlying GPS systems. Satellite navigation initially diffused through military and research institutions. Only when receiver chips became affordable—dropping from thousands of dollars to tens of dollars—did GPS spill into civilian life, reshaping logistics, agriculture, aviation, emergency response, and daily commuting.
Diffusion often begins in closed systems before reaching the public. Cost and reliability determine when the crossover occurs.
Meanwhile, communication networks were undergoing their own shift. Analog cellular systems faced capacity and efficiency limits. Digital architectures improved spectrum use, scalability, and security. Engineers such as Jesse Russell contributed to foundational work on digital wireless systems that would support the expansion of mobile broadband. Digital networks unlocked exponential growth. The more users joined, the more valuable the network became. Diffusion moved from linear to network-driven.
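That shift from linear to network-driven growth can be sketched with a toy Metcalfe-style model (illustrative only, not a claim about actual cellular economics): a broadcast-style service's value grows with the number of users n, while a communication network's value grows with the number of possible connections, n(n-1)/2.

```python
def linear_value(n, per_user=1.0):
    """Value proportional to user count (e.g., a broadcast service)."""
    return per_user * n

def network_value(n, per_link=1.0):
    """Metcalfe-style value: proportional to possible pairwise links."""
    return per_link * n * (n - 1) / 2

for n in [10, 100, 1000]:
    print(f"{n} users: linear={linear_value(n):.0f}, network={network_value(n):.0f}")
```

At 10 users the two models are close; at 1,000 users the network model's value is roughly 500 times larger, which is why adoption feeds on itself.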
Finally, none of these layers scale without manufacturing capability and capital formation. Semiconductor advances compressed the cost and size of computing components. Venture capital financed the transition from the laboratory to the market. Leaders such as Frank S. Greene Jr., who bridged engineering and venture investment in the early days of Silicon Valley, helped ensure that innovation moved from prototype to production.
In other words, civilization scales when technology meets financing and distribution.

When Diffusion Fails
But diffusion is not inevitable. The graveyard of superior technologies is vast.
Consider the metric system in the United States. Decimal-based measurement is objectively more rational than imperial units. The economic advantages are clear: simplified manufacturing, easier international trade, and reduced conversion errors in medicine and engineering. Yet America remains metrically isolated, despite federal legislation in 1975 declaring metric the “preferred system.”
Why? Because economic rationality operates at multiple levels, and those levels don't always align. Individual companies saw only adoption costs. No single firm could justify the expense of retooling when competitors weren't doing the same. Coordination failure locked in an inferior standard.
Or consider Betamax versus VHS. Sony’s Betamax offered superior video quality, but JVC’s VHS won through a different kind of advantage: licensing openness. VHS manufacturers proliferated while Sony maintained tight control. More manufacturers meant lower prices, wider distribution, and more rental inventory. Technical superiority lost to ecosystem advantage.
These failures reveal a harder truth:
“Economically obvious” is contested terrain. The decision-makers who control adoption—procurement officers, regulators, standards bodies, incumbent firms—operate under different incentive structures than end users. What appears rational from one vantage point can be irrational from another.
Diffusion stalls when:
- Coordination costs exceed individual incentives (metric system)
- Incumbent lock-in creates switching barriers (QWERTY keyboard layout)
- Control mechanisms restrict ecosystem growth (Betamax)
- Regulatory capture protects inferior technologies (municipal broadband restrictions)
- Short-term individual rationality conflicts with long-term collective benefit (social media algorithms optimized for engagement over user welfare)

The lesson: diffusion requires more than technical merit. It requires aligning incentives across fragmented decision-making structures.
The Institutional Accelerator
Return to the railroads. Safety technology didn’t diffuse because engineers recognized its value. It diffused because institutions created credible enforcement mechanisms.
Insurance companies raised premiums for railroads with poor safety records. State regulatory commissions began requiring safety equipment as a condition of operation. Labour unions demanded protections for workers. Newspapers publicized accidents, creating reputational risk. Each institution independently pushed toward the same outcome: adopting safety technology became cheaper than resisting.
This pattern repeats across successful diffusions. Blood banking scaled when hospitals faced malpractice exposure without it. GDPR didn’t make privacy-by-design technically superior—it made non-compliance legally expensive. Building codes didn’t make fire sprinklers more effective—they made their absence a barrier to occupancy permits. Accessibility standards didn’t improve screen readers—they made inaccessible government websites a liability.
Institutions transform best practices into baseline requirements. They convert voluntary adoption into structural inevitability.
This matters because complex technologies with externalities—technologies whose benefits and harms extend beyond immediate users—rarely diffuse through market forces alone. They require institutional architecture.
The public sector doesn’t just regulate diffusion. It often is the diffusion mechanism.
The Pattern Emerges
Step back, and the layers reveal a structure:
- Electrification reduced darkness.
- Rail signalling reduced collision risk.
- Precision components reduced failure rates.
- Blood banking reduced surgical mortality.
- Open architecture reduced integration costs.
- Geoid modelling reduced positional error.
- Digital wireless reduced communication constraints.
- Semiconductor scaling reduced the cost per computation.
Each layer lowered switching costs for the next. But diffusion isn’t automatic.
It requires:
- Economic advantage visible to decision-makers (not just end users)
- Institutional mechanisms that make adoption structurally rational
- Compatibility with existing systems (or overwhelming value that justifies replacement)
- Coordination across fragmented adopters (standards, platforms, or mandates)
- Risk reduction that transforms procurement calculations
The modern world did not arise from isolated breakthroughs. It emerged from stacked S-curves of adoption—each dependent on improvements in relative advantage, compatibility, reduced complexity, and institutional support.
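Those stacked S-curves can be sketched with a standard logistic function. The layer names and midpoint years below are illustrative placeholders, not historical estimates:

```python
import math

def s_curve(t, t_mid, rate):
    """Logistic adoption fraction at time t: near 0 early, near 1 at saturation."""
    return 1.0 / (1.0 + math.exp(-rate * (t - t_mid)))

# Each layer's midpoint comes later because it rides on the
# infrastructure beneath it (illustrative years only).
layers = {"electrification": 1910, "precision components": 1955, "computing": 1985}
for name, t_mid in layers.items():
    print(f"{name}: {s_curve(2000, t_mid, 0.15):.3f} adopted by 2000")
```

Each curve starts slowly, accelerates once the layer below it saturates, and flattens as adoption approaches completion.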
What This Means for AI
We are living through the diffusion problem again. Artificial intelligence is not struggling for technical breakthroughs. Models improve monthly. Capabilities expand. But organizational adoption remains fragmented, hesitant, pilot-trapped. The barrier isn't technical sophistication. It's diffusion architecture.
Consider AI governance frameworks. Organizations don’t resist responsible AI because they disagree with the principles. They resist because:
- Compliance costs are immediate and visible; risk costs are distant and probabilistic
- Procurement processes aren't designed for algorithmic systems (RFPs ask for vendor references, not model cards)
- Liability frameworks are unclear (who's responsible when an AI system fails?)
- Integration with existing IT infrastructure requires custom engineering (high switching costs)
- Executive teams lack mental models for AI risk (hard to budget for threats you can't visualize)
The pattern from electrification to GPS applies directly.
Diffusion will accelerate when:
Compliance becomes cheaper than risk. Right now, “doing AI governance right” feels expensive because the costs are concentrated and visible (dedicated staff, new workflows, vendor assessments), while the benefits are diffuse and probabilistic (avoided regulatory fines, reduced reputational damage, better decision quality). This flips when enforcement mechanisms create credible consequences. Early signals: EU AI Act penalties, algorithmic accountability legislation, procurement requirements in government contracts.
Frameworks integrate with existing systems. Just as the ISA bus made peripherals compatible, AI governance needs to plug into the existing enterprise architecture. That means model cards that flow into existing security review processes. Audit trails that integrate with compliance management systems. Risk assessments that speak the language of enterprise risk management. The winners won’t be the most technically sophisticated frameworks—they’ll be the ones that require the least organizational retooling.
Switching costs become negligible. Organizations won’t adopt AI governance by ripping out existing systems. They’ll adopt when new capabilities can be layered on incrementally. Pre-built templates for impact assessments. Automated documentation tools that generate compliance artifacts. Vendor ecosystems that handle the integration complexity. The diffusion pattern suggests: make the first step trivially small.
Value becomes observable and trialable. GPS didn’t scale until chips became affordable enough for civilian experimentation. Blood banking spread when hospitals could pilot programs in elective surgery before scaling to emergency medicine. AI governance will scale when organizations can run low-stakes trials. Pilot programs on non-critical systems. Proof-of-concept implementations with clear before/after metrics. Sandbox environments where governance overhead is measured, not assumed.
Institutions create baseline requirements. This is already beginning. Ontario's AI governance framework. Federal procurement rules requiring algorithmic impact assessments. Industry-specific regulations (healthcare AI, financial services, hiring systems). Each mandate transforms voluntary best practices into table stakes. Blood banking didn't become universal because every hospital independently concluded it was valuable—it became universal because accreditation required it. The question isn't whether institutional enforcement will happen—it's whether your organization will be ready when it does.
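The first condition (compliance becoming cheaper than risk) reduces to a back-of-envelope expected-cost comparison. The dollar figures and probabilities below are hypothetical:

```python
def adoption_is_rational(compliance_cost, penalty, enforcement_prob):
    """Adopt when the expected cost of non-compliance exceeds the cost of compliance."""
    return enforcement_prob * penalty > compliance_cost

# Hypothetical figures: same penalty, different enforcement credibility.
print(adoption_is_rational(500_000, 20_000_000, 0.01))  # False: expected risk is 200k
print(adoption_is_rational(500_000, 20_000_000, 0.10))  # True: expected risk is 2M
```

Nothing about the technology changes between the two cases; only the credibility of enforcement does. That is the lever institutions pull.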

The organizations that will win the next decade aren’t the ones with the most sophisticated models. They’re the ones that make adoption feel inevitable—to procurement officers, legal teams, compliance departments, and risk managers.
They’ll reduce risk by providing clear audit trails and accountability mechanisms.
They’ll integrate with existing systems by speaking the language of enterprise IT and compliance frameworks.
They’ll make switching costs negligible by offering incremental adoption paths.
They’ll ensure value is observable by measuring governance overhead and demonstrating ROI.
This is how electrification scaled. How blood banking moved from battlefield innovation to civilian infrastructure. How GPS moved from the military to civilian use. How digital wireless transformed communication. And it’s how responsible AI will move from pilot projects to operational infrastructure.
The Infrastructure Mindset
The last hundred years of technological progress teach a counterintuitive lesson:
Invention is overrated. Not unimportant. Overrated.
The arc of civilization bends not toward the best technology, but toward the most adoptable technology. Toward systems that reduce switching costs, align institutional incentives, integrate with existing infrastructure, and make economic advantage obvious to fragmented decision-makers. We obsess over breakthroughs—the next model, the next capability, the next paradigm shift. But history suggests a different focus: diffusion architecture.
Who controls the adoption decision?
What makes it economically rational for them specifically?
Which institutions need to create enforcement mechanisms?
What existing systems must you integrate with?
Where are the coordination failures that prevent rational adoption?
These are not ancillary questions. They are the core questions. Because progress doesn’t spread through brilliance. It spreads when adoption feels inevitable. That is how the last hundred years of civilization actually happened.
It is how the next hundred will be built!
