Core Primitive
New tools can force systemic change by changing what is possible and what is easy. Technology is not a neutral instrument — it is a structural force that reshapes the systems in which it is deployed. Introducing a new tool changes the information flows (who knows what), the process flows (how work moves), the decision rights (who can act), and the incentive structures (what is visible and measurable). Technology can be the most powerful systemic intervention available — or the most expensive waste of resources — depending on whether it is deployed as a system change or as an automation of the existing system.
The automation trap
The most common technology deployment failure is automating the wrong thing. An organization with a dysfunctional process deploys technology to automate that process, producing a dysfunctional process that executes faster. Michael Hammer warned against this with characteristic bluntness: "Don't automate; obliterate." The goal of technology deployment should not be to make the current process faster — it should be to enable a fundamentally different process that produces fundamentally different outcomes (Hammer, 1990).
The automation trap is seductive because it is easy. Automating the existing process requires no system redesign — just implementation. The current process is known, the current roles are defined, and the current expectations are set. The technology vendor promises efficiency gains, the implementation team follows the current process as a blueprint, and the project is declared successful when the current process is running on the new platform. But the outcomes do not change because the system did not change — only the medium changed.
Technology as a structural force
When technology is deployed as a systemic intervention rather than an automation, it operates through the same structural mechanisms described in the preceding lessons — but with unique properties that make it an exceptionally powerful lever.
Technology changes what is possible
Before email, communicating with a colleague in another office required a phone call (synchronous, undocumented) or a memo (slow, formal). Email made asynchronous, informal, documented communication possible. Before version control, collaborating on code required careful coordination to avoid conflicts. Git made parallel, independent, mergeable collaboration possible. Before AI language models, generating draft content required a human writer. AI tools make instant, iterative, scalable content generation possible.
Each technology expansion of the possible creates new system design options that did not previously exist. The system designer who understands the technology's capabilities can design systems that leverage those capabilities — producing outcomes that were impossible under the previous technology.
Technology changes what is easy
Even more powerful than changing what is possible is changing what is easy. Behavior follows the path of least resistance. When a technology makes a desired behavior easier than the alternative, the desired behavior becomes the default — without incentives, training, or enforcement.
Before Slack, sharing information with a team required composing an email and selecting recipients. After Slack, sharing information with a channel is easier than composing an email — the information flows to the team as a side effect of doing work, not as a separate communication task. The technology changed what is easy, and behavior followed.
Technology creates new information
Technology generates data about organizational behavior that was previously invisible. A CRM records every customer interaction, creating a dataset that enables analysis of sales patterns, customer preferences, and team performance. A project management tool records every task transition, creating a dataset that enables analysis of workflow bottlenecks, estimation accuracy, and team capacity. An AI tool records every query and response, creating a dataset that reveals what information people need and how they use it.
This new information creates new feedback loops (Feedback loops in organizational systems) — mechanisms for the organization to see its own behavior and adjust. The information that technology generates about organizational performance is often more valuable than the process automation the technology provides.
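The feedback-loop idea can be made concrete. The sketch below is a hypothetical Python example — the event format and stage names are illustrative assumptions, not the export format of any particular tool — showing the kind of analysis that task-transition data enables: totaling how long work sits in each workflow stage to surface the bottleneck.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical task-transition log: (task_id, new_stage, timestamp).
# Project management tools record similar event streams as a side
# effect of normal work — information the old process never captured.
events = [
    ("T-1", "todo",   "2024-01-02T09:00"),
    ("T-1", "doing",  "2024-01-02T10:00"),
    ("T-1", "review", "2024-01-04T10:00"),
    ("T-1", "done",   "2024-01-04T12:00"),
    ("T-2", "todo",   "2024-01-02T09:00"),
    ("T-2", "doing",  "2024-01-03T09:00"),
    ("T-2", "review", "2024-01-03T15:00"),
    ("T-2", "done",   "2024-01-05T15:00"),
]

def stage_hours(events):
    """Total hours tasks spent in each stage before moving on."""
    last = {}                    # task_id -> (current stage, entered_at)
    totals = defaultdict(float)  # stage -> accumulated hours
    for task, stage, ts in events:
        t = datetime.fromisoformat(ts)
        if task in last:
            prev_stage, entered = last[task]
            totals[prev_stage] += (t - entered).total_seconds() / 3600
        last[task] = (stage, t)
    return dict(totals)

totals = stage_hours(events)
bottleneck = max(totals, key=totals.get)
```

The analysis itself is trivial; the point is that it was impossible before the tool existed, because the transitions were never recorded.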
The sociotechnical perspective
The sociotechnical systems tradition, originating with Eric Trist and the Tavistock Institute, established that organizations are jointly optimized technical and social systems. Optimizing the technical system alone (deploying the best technology) while ignoring the social system (the people, roles, and relationships) produces suboptimal outcomes because the two systems interact. A technology that is technically superior but socially disruptive may produce worse organizational outcomes than a technically inferior technology that aligns with the social system (Trist, 1981).
The sociotechnical perspective produces three practical principles for technology deployment.
Joint optimization. Design the technical system and the social system together — not sequentially. When the technology is designed first and the social system is forced to adapt, the adaptation is often suboptimal. When both systems are designed together, each can be optimized to complement the other.
Minimal critical specification. Specify the minimum constraints necessary for the technology to function, and leave the remaining design to the people who will use it. Over-specified technology (rigid workflows, mandatory fields, prescribed sequences) prevents the adaptive behavior that organizations need. Under-specified technology (no structure, no defaults, no guidance) produces chaos. The optimal specification provides structure where structure is necessary and flexibility where flexibility is valuable.
Boundary management. Pay special attention to the boundaries between the technical and social systems — the points where humans interact with technology. These boundaries determine the user experience, and the user experience determines whether the technology is used effectively, used grudgingly, or worked around entirely.
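The minimal critical specification principle has a direct software reading: validate only the constraints the system genuinely needs, and pass everything else through untouched. The sketch below is a hypothetical Python example (the field names are illustrative assumptions, not from any real system).

```python
def validate_task(record: dict) -> dict:
    """Minimal critical specification: enforce only the constraints
    the system needs to function; leave the rest to the users."""
    required = ("title", "owner")  # the minimum viable structure
    for field in required:
        if not record.get(field):
            raise ValueError(f"missing required field: {field}")
    # Everything else — tags, priority, free-text notes — passes
    # through unvalidated: flexibility where flexibility is valuable.
    return record

task = validate_task({"title": "Ship v2", "owner": "ana", "notes": "draft"})
```

An over-specified version of this function (mandatory priority, prescribed tag vocabulary, required due date) would force users to invent values just to satisfy the form — the rigid-workflow failure described above.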
Deploying technology as system change
Technology deployment as systemic intervention follows a different playbook than technology deployment as automation.
Start with the system change, not the technology. Define the system change you want to make (different information flows, different processes, different decision rights, different metrics) and then select the technology that enables that change. The technology serves the system change, not the other way around.
Redesign roles alongside deploying tools. Technology changes what people do — it automates some tasks, enables new tasks, and makes existing tasks easier or harder. Role redesign ensures that the human contribution evolves alongside the technical capability, rather than leaving people doing obsolete tasks alongside a powerful tool.
Plan for the transition. The period between the old system and the new system is the most dangerous — the old habits have been disrupted but the new habits have not yet formed. Plan for reduced productivity during the transition, provide support for people learning new ways of working, and resist the temptation to judge the new system's performance during the transition period.
The Third Brain
Your AI system is itself a technology that can serve as a systemic intervention. Beyond its use as a personal cognitive tool, consider how AI changes what is possible in your organizational systems: What decisions could be made faster with AI-assisted analysis? What information flows could be created by AI-generated summaries and syntheses? What processes could be redesigned with AI handling routine steps while humans handle exceptions? The question is not "What can AI do?" but "What system change does AI enable that was not possible before?"
From intervention to sustainability
Technology deployment is a moment. Systemic change is a duration. The next lesson, Sustaining systemic change, examines how to ensure that changes persist after the implementation energy dissipates and the organization's attention moves to the next initiative.
Sources:
- Hammer, M. (1990). "Reengineering Work: Don't Automate, Obliterate." Harvard Business Review, 68(4), 104-112.
- Trist, E. (1981). "The Evolution of Socio-Technical Systems." In A. H. Van de Ven & W. F. Joyce (Eds.), Perspectives on Organization Design and Behavior (pp. 19-75). Wiley.