Mental Pivot

Notes and observations from a lifelong pursuit of learning.


Book Notes: “Thinking in Systems: A Primer” by Donella Meadows


What is a system? A system is a set of interconnected things (people, cells, molecules, components, etc.) that produces a specific outcome over time. Donella Meadows’ engaging book, “Thinking in Systems: A Primer” (2008) looks at the composition of systems, their surprising and often counterintuitive behaviors, and the myriad ways we can interact with them to shape them to our benefit.

Why should we care about systems? Simply put, systems are everywhere. Understanding a system—its inputs, outputs, interconnections and causal relationships—is the first step to improving a system. An automobile is a system. An economy is a system. The plumbing in your house is a system. Remove a key part from a system and the system will cease to work. Not only are systems everywhere, but systems are endlessly connected with other systems in different ways. There are linear and overlapping connections. There are also hierarchical connections whereby subsystems are nested in other systems which are, in turn, part of larger systems. The human body is a good example of this with its constituent subsystems: a circulatory system, a respiratory system, a nervous system, a digestive system, and so on. Similarly, a single person is also part of many systems larger than itself: a household, a neighborhood, a city, a region, a state, a nation, and so on.

The book is neatly divided into three parts. Part 1 focuses on the nuts and bolts of systems. Here we learn about the essential parts of a system: elements, interconnections and a goal or purpose. Together these parts drive the behavior of a system. Meadows also teaches us about ways to read systems through stock-and-flow diagrams and common system structures. Meadows repeatedly emphasizes that system models are tools—simplified representations of complex mechanisms. Part 2 explores the surprising characteristics and behaviors of systems. Readers learn about system resilience, adaptability and self-organization. We also consider the potential pitfalls and errors in designing and managing these systems—errors that lead to suboptimal and even harmful outcomes (remedies are also proposed). In Part 3, Meadows offers strategies for influencing and changing suboptimal systems. She also reminds us that system behavior is messy, unpredictable, and non-linear. Because of this, we need to be equally flexible, patient, and adaptable in the ways we interact with them.

This is a short book, but it isn’t the easiest read. I’ve started it several times in the past only to abandon it halfway through (tip: not great bedtime reading). “Thinking in Systems” demands active engagement from readers, but the insights it yields are worth the effort. Those willing to put in the time will come away with a valuable addition to their cognitive toolkit. Adopting a systems mindset is not only fun and informative, it also offers a blueprint for effective change in the world—and who doesn’t want to learn more about that?

Pros: Systems thinking provides a generalized framework that is applicable to a wide-range of biological, institutional, social and mechanical phenomena.

Cons: Can be highly abstract. Author provides some concrete examples, but I wish she offered even more real-world examples and case-studies.

Verdict: 8/10


Introduction: The Systems Lens

  • The behavior of a thing is latent in its structure.

    • Example: A Slinky toy rests in a person’s palm and another hand grasps the top of the Slinky several coils down. Removing the hand results in the Slinky dropping and then bouncing up and down. The hand did not make the Slinky bounce. The Slinky itself possessed the necessary properties for that specific behavior.
  • Understanding the relationship between structure and behavior helps us understand how poor outcomes happen and how to generate better ones.

  • “It is a way of thinking that gives us the freedom to identify root causes of problems and see new opportunities.”

  • A system is an interconnected set of things (people, cells, molecules, components, etc.) that produces a specific outcome over time.

    • A system can be triggered or influenced by external forces.
    • The system’s response to external forces is a characteristic of the system itself (like the Slinky example).
    • An external force that triggers a system response will likely result in a different outcome when applied to a different system.
  • “The behavior of a system cannot be known just by knowing the elements of which the system is made.”

Part One: System Structure and Behavior
Chapter One: The Basics

  • A system consists of three things:

    • Elements
    • Interconnections
    • A function or purpose
  • Example: The digestive system.

    • Elements: Teeth, enzymes, stomach, intestines.
    • Interconnections: The physical flow of food, chemical signals to regulate the process.
    • Function: To separate nutrients from food (to maintain and provide energy to the body) and to collect and discard unusable waste.
  • Systems don’t exist in isolation.

    • Systems can be interconnected with other systems.
    • Systems can be embedded within other systems.
  • Conglomeration: A collection of things that LACK interconnections or function. Example: Sand scattered on a road is not a system.

    • “When a living creature dies, it loses its ‘systemness.’ The multiple interrelations that held it together no longer function, and it dissipates, although its material remains part of a larger food-web system.”
  • “A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.”

  • The flow of information, from one part to another, is a common way that system interconnections are manifest.

  • “The best way to deduce the system’s purpose is to watch for a while to see how the system behaves.”

    • Watch what the system does, not what it says or advertises itself to be doing.
    • Example: “If a government proclaims its interest in protecting the environment but allocates little money or effort toward that goal, environmental protection is not, in fact, the government’s purpose.”
  • Self-perpetuation is an important function for almost every system.

  • System purposes need not be human purposes nor need they be the intended purpose. It is common for a well-meaning system to result in unintended outcomes and behaviors.

  • Successful systems keep “sub-purposes and overall system purposes in harmony.”

  • Changing elements in a system often has the least impact on the overall system behavior. Modifying the interconnections and purpose often results in dramatic or fundamental changes.

    • “The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.”
  • System basics:

    • A stock: This is an element in the system that is accumulated, depleted or stored over time. Examples: water in a bathtub, the population of a city, books in a bookstore, money in a bank.

      • “A stock is the memory of the history of changing flows within the system.”
      • A stock changes over time based on flow. A flow represents elements entering the system stock as well as elements leaving or being depleted from a system stock. Example: For a population, an inflow might be represented by births (these ADD to the population stock). Outflows might be represented by deaths (these result in a reduction in the population stock).
      • Stock-and-flow diagram: A visual representation of a system that shows inflows, outflows, reinforcing and balancing loops, stocks and interconnections with other systems.
      • Remember: System diagrams are simplified representations of the real world [me: “The map is not the territory.”]
    • “If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems.”

    • Dynamic equilibrium: A state in which inflows and outflows are equal resulting in an unchanged stock.

    • “The human mind seems to focus more easily on stocks than on flows...when we do focus on flows, we tend to focus on inflows more easily than on outflows.”

      • Example: When considering petroleum, governments often focus on obtaining more oil through more discovery of reserves and drilling (inflows). But fuel economy and improved efficiency of consumption (outflows) can also make for effective policy.
      • A stock can be increased by increasing the inflow rate as well as by decreasing the outflow rate.
    • A stock takes time to change because flows take time to flow.

      • This is important because people underestimate the time needed to change large or complex systems.
    • “Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.”

      • Example: A water reservoir allows us to maintain stability in the availability of water. People can receive steady flow of water whether there is a temporary drought or if it’s the rainy season.
    • “Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows.”
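The stock-and-flow mechanics above are simple enough to sketch in code. This is my own illustrative model (not from the book), with made-up rates: a stock updated each time step by the difference between inflow and outflow.

```python
def simulate_stock(initial, inflow, outflow, steps):
    """Track a stock over discrete time steps.

    Each step, the stock changes by (inflow - outflow)."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow
        history.append(stock)
    return history

# Dynamic equilibrium: inflow equals outflow, so the stock never changes.
steady = simulate_stock(initial=50, inflow=5, outflow=5, steps=10)

# The same growth can come from raising the inflow or cutting the outflow.
more_in = simulate_stock(initial=50, inflow=7, outflow=5, steps=10)
less_out = simulate_stock(initial=50, inflow=5, outflow=3, steps=10)
```

Note that `more_in` and `less_out` trace identical histories, which is the point about petroleum policy above: drilling more (inflow) and improving fuel economy (outflow) act on the same stock.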

  • Feedback loops:

    • Feedback loop: A mechanism for regulating a system’s behavior. Feedback loops occur when a change in stock affects the inflow or outflow of that same stock.

    • Feedback loops can affect stocks in several ways:

      • Maintain the level of a stock within a narrow range.
      • Cause a stock to grow.
      • Cause a stock to decline.
    • Balancing feedback loops are loops that aim to stabilize a stock within a desired target or range.

      • The discrepancy or gap between actual and desired levels in a stock causes a decision to increase or decrease additional inflows into said stock.
      • Feedback loops can work in both directions (inflows/outflows) and can regulate increased or decreased flow rates.
      • Example: A coffee drinker might drink a cup when their energy is running low (in this example, the stock is energy). The drinker notices their energy level is ebbing and opts to drink a cup of coffee. The inflow of caffeine results in an increased stock of energy.
      • “Balancing feedback loops are goal-seeking or stability-seeking... A balancing feedback loop opposes whatever direction of change is imposed on the system.”
      • Remember: the mere presence of a feedback loop isn’t sufficient to ensure that a system is working well (the loop may be insufficient or too weak to maintain the desired result).
    • Reinforcing feedback loops are loops that amplify or reinforce growth or destruction within a system.

      • “A reinforcing feedback loop enhances whatever direction of change is imposed on it.”
      • Example: Inflationary pressures. As prices for goods increase, wages increase. As wages increase, prices also must increase to maintain profits.
      • Example: Compound interest. Money is saved and as interest is earned, the interest is added to the stock of savings which increases the overall stock that is earning interest.
      • “Reinforcing loops are found wherever a system element has the ability to reproduce itself or to grow as a constant fraction of itself. Those elements include populations and economies.”
      • Reinforcing loops can lead to exponential growth but that can also result in runaway collapse.
    • Consider: If A causes B, is it possible that B also causes A?

      • Example: If someone says that population growth causes poverty, ask yourself if poverty causes population growth.
      • “The concept of feedback opens up the idea that a system can cause its own behavior.”
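The compound-interest example above is the easiest reinforcing loop to simulate. A minimal sketch of my own (the 7% rate is an arbitrary assumption):

```python
def compound(savings, rate, years):
    """Reinforcing loop: interest earned is added back to the very stock
    that earns interest, so growth accelerates over time."""
    history = [savings]
    for _ in range(years):
        savings += savings * rate  # the stock feeds its own inflow
        history.append(savings)
    return history

balances = compound(savings=1000.0, rate=0.07, years=30)
# Each year's gain is larger than the last: exponential, not linear, growth.
gains = [b - a for a, b in zip(balances, balances[1:])]
```

The widening yearly gains are the signature of a reinforcing loop; run the same structure downward (a negative rate on a depleting stock) and you get runaway collapse instead.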

Chapter Two: A Brief Visit to the Systems Zoo

  • One-Stock Systems:

    • A Stock with Two Competing Balancing Loops

      • Example: A thermostat (see stock and flow diagram below).
      • Stock is the temperature of the room which is influenced by inflows and outflows.
      • Inflow comes from heat from the furnace.
      • Outflow results from temperature outside (which is presumably colder).
      • One balancing feedback loop regulates the inflow: A desired temperature compared against the real temperature of the system stock. The delta between desired and actual influences the subsequent flow of more heat (or reduced inflow).
      • One balancing feedback loop regulates the outflow: The outside temperature influences the inside temperature based on the delta between outside temp and inside temp. A higher delta (e.g. between high room temp and low outdoor temp) results in increased outflow of indoor heat.
      • The two balancing loops compete.
      • Note that owing to the flow of information and time needed to impact the stock, there are inherent delays to rebalancing the stock. Any actions taken (e.g. to increase inflow of heat) can only affect future behavior and stock.
      • Because of the competing outflow, the thermostat inflow needs to be set HIGHER than the target or desired temperature.
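The two competing balancing loops can be sketched numerically. This is my own toy model (the gain and leak coefficients are invented): furnace heat flows in proportionally to the gap below the thermostat setting, while heat leaks out proportionally to the gap with the outdoors.

```python
def thermostat(room, setting, outside, steps, furnace_gain=0.5, leak_rate=0.1):
    """Room-temperature stock with two competing balancing loops."""
    history = [room]
    for _ in range(steps):
        heat_in = furnace_gain * max(setting - room, 0.0)  # balancing loop on the inflow
        heat_out = leak_rate * (room - outside)            # balancing loop on the outflow
        room += heat_in - heat_out
        history.append(room)
    return history

temps = thermostat(room=10.0, setting=20.0, outside=0.0, steps=200)
```

With these coefficients the room settles near 16.7 degrees, well below the 20-degree setting, because the leak keeps draining heat at equilibrium. That is exactly why the thermostat must be set higher than the temperature you actually want.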
    • A Stock with One Reinforcing Loop and One Balancing Loop

      • Examples: Population and industrial economies (see stock and flow diagram below).

      • Stock is the population (city, nation, world).

      • Systems have key driving variables. For a population these are fertility and mortality.

      • Inflow is the result of births which add to the population stock.

      • Inflow is governed by a reinforcing loop. As births increase (fertility levels), the population stock increases which drives more births and adds more rapidly to the population stock.

      • Outflow is the result of deaths which reduces the population stock.

      • Outflow is regulated by a balancing loop—mortality rate.

      • Population grows when the birth rate outpaces the mortality rate. Population declines when the reverse is true.

      • Changes in the flows change, over time, the behavior of the stock.

      • Shifting dominance refers to situations where the feedback loop that dominates the system changes over time. Whichever loop dominates determines the system’s behavior.

      • “A stock governed by linked reinforcing and balancing loops will grow exponentially if the reinforcing loop dominates the balancing one. It will die off if the balancing loop dominates the reinforcing one. It will level off if the two loops are of equal strength.”

      • In reality, loop dominance will shift back and forth in sequence over time.

      • Remember: fertility and mortality are governed by their own feedback loops (they can be modeled as discrete systems that interconnect with the population system).

      • An economy bears similar behavior to the population loop.

        • Stock = capital
        • Inflow = investment
        • Reinforcing feedback loop: increased capital stock leads to reinvestment and increasing capital over time.
        • Outflow = depreciation
        • Balancing loop: lifetime of the capital affects depreciation. The longer the lifetime, the smaller fraction of capital needs to be retired/replaced annually.
        • “Systems with similar feedback structures produce similar dynamic behaviors.”
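Meadows’ grow/decline/level-off rule can be verified directly with a toy population model. This sketch is my own and the rates are invented:

```python
def population(people, birth_rate, death_rate, years):
    """Births form a reinforcing loop, deaths a balancing loop;
    whichever loop dominates determines the behavior."""
    history = [people]
    for _ in range(years):
        births = birth_rate * people   # reinforcing: more people -> more births
        deaths = death_rate * people   # balancing: more people -> more deaths
        people += births - deaths
        history.append(people)
    return history

growing   = population(1000.0, birth_rate=0.03, death_rate=0.01, years=50)
shrinking = population(1000.0, birth_rate=0.01, death_rate=0.03, years=50)
leveled   = population(1000.0, birth_rate=0.02, death_rate=0.02, years=50)
```

Swap the variable names (capital for population, investment for births, depreciation for deaths) and the same code describes the economy example, which is the point of the quote above: similar feedback structures produce similar dynamic behaviors.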
    • A System with Delays—Business Inventory

      • Example: A store like a car dealership.

      • Stock is the product inventory itself—vehicles for sale.

      • Inflows are the deliveries from factories.

      • Balancing feedback loop regulates the inflow to ensure that there are 10 days of vehicle inventory in stock.

      • Outflows are the sale of new cars to consumers.

      • Balancing feedback loop regulates the outflow (customer demand). Dealer can monitor sales and sales trends. If forecast is higher, the dealer can modify the inflow of new vehicles accordingly.

      • Delays are inherent to the system. The response to each balancing loop is not immediate or always accurate.

        • Perception delay: The dealer bases their ordering decisions on a 5-day average to smooth the temporary spikes and dips in demand.
        • Response delay: The dealer doesn’t adjust inflows in a single order. They make up a fraction of any shortfall with each subsequent order. Changes in inflows occur over several days (rather than as a one-time, immediate response).
        • Delays in balancing feedback loops result in system oscillations (fluctuations over time of inventory levels).
        • Other delays: production delays, delivery delays, construction delays.
      • “Delays are pervasive in systems, and they are strong determinants of behavior. Changing the length of a delay may make a large change in the behavior of a system.”
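To see how delays alone produce oscillation, here is a toy dealership model of my own (all numbers are invented, and deliveries are assumed to take five days to arrive): demand steps up, and the perception, response, and delivery delays make inventory overshoot and swing around the new target rather than settle directly.

```python
from collections import deque

def dealership(days, demand_shift_day=20, delivery_delay=5):
    """Inventory stock with a perception delay (5-day sales average),
    a response delay (one-third of the gap corrected per order),
    and a delivery delay (orders arrive days after they are placed)."""
    inventory = 200.0
    pipeline = deque([20.0] * delivery_delay)   # orders already in transit
    recent_sales = deque([20.0] * 5, maxlen=5)  # perception delay
    history = []
    for day in range(days):
        demand = 20.0 if day < demand_shift_day else 25.0
        sales = min(demand, inventory)          # outflow: cars sold
        inventory -= sales
        inventory += pipeline.popleft()         # inflow: today's delivery
        recent_sales.append(sales)
        perceived = sum(recent_sales) / 5       # smoothed view of demand
        desired = perceived * 10                # keep ~10 days of inventory
        gap = desired - inventory
        pipeline.append(max(perceived + gap / 3, 0.0))  # partial correction
        history.append(inventory)
    return history

levels = dealership(days=120)
```

Before the demand shift the system sits flat at 200 cars; afterward, inventory shoots well past the new 250-car target and then oscillates around it, even though every individual decision in the loop is sensible.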

  • Two-Stock Systems:

    • A Renewable Stock Constrained by a Nonrenewable Stock

      • Example: An oil company

      • This model considers environmental constraints (which is lacking in the simpler single stock models).

      • Entities that exchange things with the environment need both a supply of the resource and a way to dispose of waste and byproducts of the process.

      • “Any physical, growing system is going to run into some kind of constraint sooner or later. That constraint will take the form of a balancing loop that in some way shifts the dominance of the reinforcing loop driving the growth behavior, either by strengthening the outflow or by weakening the inflow.”

      • Systems that exhibit growth: look for a reinforcing loop driving inflows and a balancing loop that constrains it.

      • Renewable vs. nonrenewable pollution constraint (outflow):

        • Nonrenewable: The environment cannot absorb the pollutant.
        • Renewable: The environment can absorb the pollutant and make it harmless.
      • Two constraints:

        • Resource-constrained systems: Inflow supply constraints.
        • Pollution-constrained systems: Outflow constraints (production is not feasible because the damage to the environment is too great or hazardous).
      • Constraints on a system can be either temporary or permanent.

      • “A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.” The higher and faster the growth, the faster and more pronounced the fall (when building capital stock against a nonrenewable resource).

      • The size of the constraining resource is the key variable in this system.

    • A Renewable Stock Constrained by a Renewable Stock

      • Example: A fishing economy.

      • Living vs. nonliving renewable resources:

        • Nonliving renewable resources: sunlight, wind, water in a river are regenerated through steady input regardless of the state of that stock.
        • Living renewable resources: fish, trees, cattle. These resources can regenerate and regrow. Note that the regeneration rate of these resources is not constant and is constrained by their stock and environment.
      • The fishing economy is governed by three variables:

        • Price (scarcity determines the price).
        • Regeneration rate (scarcer fish stocks are replenished more slowly).
        • Yield per unit of capital (efficiency of the fishing technology and methods).
      • “Renewable resources are flow limited. They can support extraction or harvest indefinitely but only at a finite flow rate equal to their regeneration rate.”

      • Some over-extracted renewable resources can be driven below a critical threshold and effectively become nonrenewable (e.g. fish stocks).
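The flow-limited nature of a living renewable resource can be sketched with a logistic regeneration term. This toy fishery is my own (rates are invented): harvesting below the regeneration rate is sustainable indefinitely; harvesting above it collapses the stock.

```python
def fishery(fish, capacity, regen_rate, harvest_rate, years):
    """Living renewable resource: regeneration depends on the stock itself
    and slows as the stock approaches the environment's capacity."""
    history = [fish]
    for _ in range(years):
        regen = regen_rate * fish * (1 - fish / capacity)  # slower when scarce or crowded
        catch = harvest_rate * fish                        # extraction outflow
        fish = max(fish + regen - catch, 0.0)
        history.append(fish)
    return history

sustainable = fishery(500.0, capacity=1000.0, regen_rate=0.3, harvest_rate=0.1, years=100)
overfished  = fishery(500.0, capacity=1000.0, regen_rate=0.3, harvest_rate=0.4, years=100)
```

The sustainable run settles at a steady stock and yields a catch forever; the overfished run, where the harvest fraction exceeds the maximum regeneration rate, drives the stock toward zero, effectively turning a renewable resource into a nonrenewable one.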

Part Two: Systems and Us
Chapter Three: Why Systems Work So Well

  • Many systems possess one or more of the following characteristics: resilience, self-organization, or hierarchy.

  • Resilience: A measure of a system’s ability to adapt and survive in a changing environment. Brittleness and rigidity are the opposite of resilience.

    • Feedback loops that regulate the balance in a system are essential to system resilience.
    • Example: The human body can tolerate different temperatures, variations in food supply, and repair itself and compensate for missing parts.
    • Resilience is not the same as being static or constant.
    • “Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.”
  • Self-Organization: The characteristic whereby some systems can learn, evolve, self-regulate, and develop complexity on their own.

    • “Self-organization produces heterogeneity and unpredictability.”
    • Freedom, experimentation, and disorder are necessary conditions for self-organization (many people are uncomfortable with these properties).
  • Hierarchy: Self-organizing systems generate hierarchies of nested systems: subsystems as you move down, and systems aggregated into larger systems as you move up.

    • Example: A cell in your heart is a subsystem of the organ. The organ is a subsystem of a larger circulatory system. The circulatory system, along with other physiological systems, comprises a person. In turn, a person is a subsystem of a family or group of people. Groups of people are subsystems of a larger societal organization (neighborhood, city, state, nation, etc.).

    • Hierarchies give structure to systems (yielding stability and resilience) and also allow for efficiency and specialization. Certain subsystems are tasked with certain goals and subsystems only need information essential to those goals and behaviors.

    • Balance in the hierarchy is necessary for optimal behaviors:

      • Sub-optimization occurs when a subsystem’s goals dominate at the expense of the overall system goals.
      • Central control occurs when the system at the top of the hierarchy prevents subsystems from operating efficiently or freely.
    • “Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.”

Chapter Four: Why Systems Surprise Us

  • Much of our thinking makes use of models to explain and make sense of the world. These models may or may not accurately reflect reality.

  • “System structure is the source of system behavior. System behavior reveals itself as a series of events over time.”

  • Remember: many system relationships and interconnections are NOT linear. Moreover, the relative strength or influence of these relationships is dynamic.

  • Remember: the language we use to describe a system biases our understanding of the system.

    • Example: Labeling an undesirable outcome of a system as a “side effect” suggests that the outcome was unanticipated, poorly considered, and unwanted.
    • Garrett Hardin (ecologist): “Side effects no more deserve the adjective ‘side’ than does the ‘principal’ effect. It is hard to think in terms of systems, and we eagerly warp our language to protect ourselves from the necessity of doing so.”
  • Simplified systems models can lead to errors. For instance, with simple, single stock-and-flow diagrams, we make inherent assumptions about the nature and availability of the source for a flow or the sink for an outflow. This may be necessary to examine a system closely for a given task, but it’s worth noting that this activity opens us up to blindspots and assumptions.

  • Remember: Systems form a continuum. Systems are connected with other systems (on both ends of the model). The boundaries we draw around a system are arbitrary and serve the purposes of our analysis and the questions we want answered.

  • Limiting factor: “A necessary system input that is the one limiting the activity of the system at a particular moment.”

    • “Any physical entity with multiple inputs and outputs is surrounded by layers of limits.”
    • “There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.”
  • Remember: Do not underestimate the importance of system delays.

  • Bounded rationality: People make reasonable decisions based on the information available, but they don’t always have complete or correct information available.

    • “The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.”

Chapter Five: System Traps and Opportunities

  • Archetypes: System structures that exhibit standard or characteristic behaviors. These include problematic patterns of behavior.

  • These systemic problems are often blamed on specific actors or events but the problems are more likely to be inherent to the system.

  • “Blaming, disciplining, firing, twisting policy levers harder, hoping for a more favorable sequence of driving events, tinkering at the margins—these standard responses will not fix structural problems.”

  • Archetype: Policy Resistance (aka “fixes that fail”)

    • “Policy resistance comes from the bounded rationalities of the actors in a system, each with his or her own goals.”

    • Example: Wars on drugs that fail to reduce the use or prevalence of drugs.

      • Consider a single-system stock of the drug supply in a city.

      • Consider the divergent goals of different subsystems or actors within the system:

        • Addicts want to keep the drug stock high.
        • Law enforcement wants to keep the drug stock low.
        • Pushers want to keep it in the middle so prices are moderate (not too high and not too low).
        • Average citizens want to be safe from crime.
      • When one actor gains an advantage that moves the system stock (drug supply) to their advantage, the other actors will increase their efforts to move it in the opposite direction. The result is a standoff in which the stock remains constant (which nobody wants).

      • Two possible resolutions:

        • Overpower the system with one of the goals: this requires tremendous resources (say from law enforcement) and carries high costs even when it succeeds (resentment from the population, mass incarcerations, etc.).
        • Capitulate (counterintuitive): stop spending resources and energy on enforcing or resisting. Example: Ending Prohibition in 1933 in the USA.
  • Archetype: The Tragedy of the Commons

    • This trap occurs when there is a shared resource that is limited and can be damaged by overuse.

      • Common sinks are also possible (i.e. shared resources where waste, pollution and byproducts are dumped).
    • “The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.”

    • Example: Uncontrolled access to a popular national park or reef system. Large crowds will ultimately destroy the resource.

    • “The structure of a commons system makes selfish behavior much more convenient and profitable than behavior that is responsible to the whole community and to the future.”

    • Three possible resolutions:

      • Educate and exhort: Let people see the consequences of their actions and appeal to their morality.
      • Privatize the commons.
      • Regulate the commons.
  • Archetype: Drift to Low Performance

    • This trap results when a system’s normal state is bad and only gets worse over time.

    • Examples: Falling market share in a business, eroding morale in an organization, increasing pollution in the water, increasing obesity rates, etc.

    • Some characteristic features of these systems:

      • Actors tend to emphasize bad news more than good (i.e. they see things as worse than they really are).
      • The desired state of the system is influenced by the perceived state.
      • Standards aren’t absolute and when system performance slips, the system goals also slip.
      • The balancing feedback loop meant to keep the system stable is dominated by a downward trending reinforcing feedback loop.
    • Two possible solutions:

      • Maintain fixed standards regardless of performance.
      • Base goals on the best results of the past rather than the worst ones.
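The eroding-goals structure behind this archetype can be sketched in a few lines. This is my own illustrative model (the coefficients are invented): perception discounts performance, the standard drifts toward the perceived state, and effort only closes the gap to the already-eroded standard.

```python
def drift(performance, goal, years, erosion=0.5, repair=0.2, pessimism=1.05):
    """Drift to low performance: a downward reinforcing spiral
    overwhelms the balancing repair loop when goals are allowed to slip."""
    history = [performance]
    for _ in range(years):
        perceived = performance / pessimism           # bad news emphasized: things look worse than they are
        goal += erosion * (perceived - goal)          # the standard slips toward the perceived state
        performance += repair * (goal - performance)  # effort targets the eroded goal, not the original one
        history.append(performance)
    return history

drifting = drift(performance=100.0, goal=100.0, years=60)
held     = drift(performance=100.0, goal=100.0, years=60, erosion=0.0)  # fixed standard
```

Holding the standard fixed (`erosion=0`) keeps performance at 100; letting it slip produces a slow, steady decline, which is why the first proposed fix is to maintain fixed standards regardless of performance.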
  • Archetype: Escalation

    • This trap results when reinforcing loops set up by competing actors try to outdo one another.

    • Example: Siblings fighting. One hits the other and the other hits back even harder. The result is a cycle of increasing escalation. Compare this with the example of the USA and USSR nuclear arms race of the Cold War. Alternatively, consider social media personalities and the never-ending efforts to capture consumer attention with increasingly outlandish behavior.

    • “Each actor takes its desired state from the other’s perceived system state—and ups it! Escalation is not just keeping up with the Joneses, but keeping slightly ahead of the Joneses.”

    • “Escalation in morality can lead to holier-than-thou sanctimoniousness. Escalation in art can lead from baroque to rococo to kitsch.”

    • Possible solutions:

      • Unilateral disarmament: deliberately deescalate, refuse to compete.
      • Negotiate a disarmament: create a new system with balancing loops to mitigate future escalations.
  • Archetype: Success to the Successful (aka competitive exclusion)

    • This trap occurs when wealth, power and advantage create yet more wealth, power and advantage for incumbent parties. It is, in effect, a reinforcing feedback loop that favors existing winners (and ensures continued success into the future).

    • Some examples:

      • The rich have more ways to avoid taxation than the poor.
      • The rich have access to better education than the poor ensuring that future generations of rich offspring will fare better.
      • The rich have better access to bank credit.
      • The rich live in safer, less polluted, less crime afflicted areas.
    • Possible solutions:

      • Policies that level the playing field.
      • Competitive policies like antitrust laws.
  • Archetype: Addiction (aka dependence and “shifting the burden to the intervenor”)

    • This trap occurs when a system creates dependencies that result in behaviors meant to ensure the status quo at all costs.

    • Examples: Dependence of industry on government subsidies, dependence of Western economies on cheap oil, the dependence of weapons manufacturers on government contracts.

    • “The trap is formed if the intervention, whether by active destruction or simple neglect, undermines the original capacity of the system to maintain itself. If that capability atrophies, then more of the intervention is needed to achieve the desired effect.”

    • “Addiction is finding a quick and dirty solution to the symptom of the problem, which prevents or distracts one from the harder and longer-term task of solving the real problem. Addictive policies are insidious because they are so easy to sell, so simple to fall for.”

      • Example: When the price of oil rises, instead of addressing conservation, efficiency and alternatives, it’s easy to resort to PRICE FIXING to solve the problem in the short-run.
    • Possible solutions:

      • Avoid the trap in the first place (i.e. avoid symptom-relieving policies or signal-denying policies).
      • Focus on long-term restructuring over short-term fixes.
      • Rebuild the system’s capabilities and resilience before eliminating the intervention.
  • Archetype: Rule Beating

    • Whenever there are rules, there will be people and schemes developed to beat, cheat and circumvent the rules. Often it is the intent of the rules, the “spirit” of the law, that is circumvented.
    • “Rule beating produces the appearance of rules being followed.”
    • “Rule beating is usually a response of the lower levels in a hierarchy to over-rigid, deleterious, unworkable, or ill-defined rules from above.”
    • Trying to enforce the rule or strengthen the rule will usually aggravate the problem of rule beating and create further distortions in the system.
    • Treat rule beating as an important signal that something is structurally suboptimal in the system. Use it to revise, improve or better explain the rules.
  • Archetype: Seeking the Wrong Goal

    • Systems produce exactly what is asked of them. If a system is yielding the wrong outcome, it’s likely that the system’s goals are wrong.
    • Example: “If the desired system state is national security, and that is defined as the amount of money spent on the military, the system will produce military spending.”
    • Example: “If the quality of education is measured by performance on standardized tests, the system will produce performance on standardized tests.”
    • Remember: Do not confuse effort with results.
    • [Note: This rule is related to Goodhart’s Law: When a measure becomes a target, it ceases to be a good measure. In part, this is because, from a system perspective, it becomes the output.]

Part Three: Creating Change—in Systems and in Our Philosophy
Chapter Six: Leverage Points—Places to Intervene in a System

  • Leverage points are specific aspects of systems where a small change can yield large results.

  • The key to manipulating leverage points:

    1. Identifying the leverage point (easy).
    2. Properly influencing the leverage point (hard). People usually push in the wrong direction or at the wrong speed.
  • Leverage points (from low to high leverage):

    1. Parameters (lowest leverage)

      • Examples: subsidies, taxes, standards.
      • Example: The US government fixates on the budget deficit by tweaking parameters: increasing spending, decreasing taxes. Regardless of the party in charge, the deficit increases (albeit at different rates).
      • “Putting different hands on the faucets may change the rate at which the faucets turn, but if they’re the same old faucets plumbed into the same old system...the system behavior isn’t going to change much.”
      • Parameters can make a bigger impact if they are tied to reinforcing feedback loops.
    2. Buffers

      • Example: a reservoir system for drinking water.
      • A buffer is a particularly large stock that can stabilize a system by reducing the short-run impact of variable or asynchronous inflows and outflows.
      • Note that overly large buffers carry a downside: the system becomes less responsive to changes. It reacts slowly.
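A minimal sketch of the idea (all numbers are hypothetical, not from the book): a stock facing lumpy inflows and a steady per-step demand. A large starting buffer rides out the dry steps; a small one runs short repeatedly.

```python
def simulate(initial_stock, inflows, demand):
    """Step a stock through variable inflows and a constant per-step demand.

    Returns (final_stock, shortfall_steps): how often demand went unmet.
    """
    stock = initial_stock
    shortfalls = 0
    for inflow in inflows:
        stock += inflow
        if stock >= demand:
            stock -= demand
        else:
            shortfalls += 1  # demand unmet this step
            stock = 0
    return stock, shortfalls

# Alternating dry/wet inflows (average 10 per step) against a demand of 12.
inflows = [0, 20] * 10
simulate(5, inflows, 12)   # small buffer: shortfall on nearly every dry step
simulate(50, inflows, 12)  # large buffer absorbs the variability: no shortfalls
```

The trade-off from the note above also shows up here: the large buffer hides the fact that demand exceeds average inflow, so the underlying problem stays invisible for longer.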
    3. Stock and Flow Structures

      • The sequencing and physical arrangement of a system via the intersections and ordering of flows influences behavior.
      • Example: The Hungarian road system was designed so that all traffic from one side of the country to the other passed through central Budapest. This design has had big impacts on traffic and pollution.
      • Poorly laid out systems must (unfortunately) be rebuilt either in total or in part.
      • The problem with physical structure changes is that they are not easy, can be capital intensive and are slow to roll out.
    4. Delays

      • Delays are caused by uncertain flows as well as by response delays from feedback loops.
      • Systems cannot respond quickly if there are long-term delays built into the system structure.
      • “A delay in a feedback process is critical relative to rates of change in the stocks that the feedback loop is trying to control.” Short delays can cause overreactions. Long delays can result in prolonged or sustained system oscillations.
      • “It is usually easier to slow down the change rate, so that inevitable feedback delays won’t cause so much trouble.”
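The oscillation effect is easy to reproduce in a toy model (parameter values are hypothetical): a balancing loop corrects a stock toward a target, but acts on a reading of the stock that is several steps old. With no delay it converges smoothly; with a delay it overshoots and oscillates.

```python
def run(delay, steps=60, target=100.0, gain=0.5):
    """Balancing loop that acts on a `delay`-step-old reading of the stock."""
    stocks = [0.0]
    for _ in range(steps):
        perceived = stocks[max(0, len(stocks) - 1 - delay)]  # stale reading
        stocks.append(stocks[-1] + gain * (target - perceived))
    return stocks

smooth = run(delay=0)  # converges monotonically toward the target
wobbly = run(delay=3)  # overshoots the target, then oscillates around it
```

The fix Meadows suggests maps directly onto the model: slowing the rate of change (a smaller `gain`) tames the oscillation more easily than trying to remove the delay.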
    5. Balancing Feedback Loops

      • Balancing feedback loops need:

        • A goal (e.g. compare stock vs. target/desired value).
        • A monitoring mechanism (observe stock on a periodic basis, trailing average basis, etc.).
        • A response mechanism (take action to change the flow state, if possible, to get system stock closer to target based on available information).
      • Complex systems contain many balancing feedback loops. Many of these loops are only activated in specific situations.

      • “One of the big mistakes we make is to strip away these ‘emergency’ response mechanisms because they aren’t often used, and they appear to be costly.” In the long run, this reduces the resilience of the system.

      • The signals that a feedback loop depends on are only as useful as the quality of the information given. If the loop is receiving FLAWED information, the system will not operate properly.
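The three parts listed above can be sketched as a thermostat-style loop (a standard systems example; the numbers here are made up): the goal is a target temperature, monitoring is a comparison against the current stock, and the response is a heating inflow, while heat continually leaks to the outside.

```python
def thermostat(room_temp, target, steps, heat=1.0, leak=0.1, outside=10.0):
    """One balancing loop: goal (target), monitor (compare), respond (heat)."""
    for _ in range(steps):
        if room_temp < target:   # monitor the stock against the goal
            room_temp += heat    # response mechanism: heating inflow
        room_temp -= leak * (room_temp - outside)  # balancing loss to outside
    return room_temp
```

With these numbers the room settles slightly below the target, because the response has to fight the leak: the loop holds the stock near the goal rather than exactly at it.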

    6. Reinforcing Feedback Loops

      • “A reinforcing feedback loop is self-reinforcing. The more it works, the more it gains power to work some more, driving system behavior in one direction.”
      • Reinforcing feedback loops cause powerful moves in either positive or negative directions: rapid growth vs. rapid collapse.
      • “Reducing the gain around a reinforcing loop—slowing the growth—is usually a more powerful leverage point in systems than strengthening balancing loops...”
    7. Information Flows

      • “Missing information flows is one of the most common causes of system malfunction.”
      • Look for actors in a system who have (or don’t have) access to critical information.
      • Adding or restoring information to a system can be an easier and faster fix than rebuilding the system infrastructure. This can be done, in part, by adding new feedback loops to the system.
      • “There is a systematic tendency for human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops.”
    8. System Rules

      • System rules are used to establish boundaries, constraints, incentives, and direct ideal behaviors within a system.
      • Thought experiment: imagine restructured rules and how behavior might change under them. Example: What if colleges granted no degrees? Students would go to college and learn until they finished learning what they needed to learn. Example: What if students were graded not individually but as groups?
      • “Power over the rules is real power.” Pay attention to the rules and to who has power over them.
      • If you have a system with rules designed by corporations, maintained by corporations, and benefiting corporations, you will, unsurprisingly, find a system built to cater to the needs of corporations.
    9. Self-Organization

      • This represents the ability of a system to evolve and adapt on its own. The system can develop new capabilities and structures to address new challenges and goals as it sees fit.
      • Constraints can be imposed on the system: boundaries for what and how the system can reorganize and evolve. Even with a finite set of rules, the system can create an infinite set of new combinations and innovations.
      • “Any system, biological, economic, or social that gets so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on this highly variable planet.”
      • Promoting self-organizing systems, in part, requires letting go of control and letting creativity and chaos flourish to create new, unanticipated outcomes.
    10. Goals

      • Changing system elements has low impact on a system, but changing its goal has significant impacts.
      • Example: What is the goal of a corporation? To make profits? The author argues that profit is merely what is necessary to play the game; the real goal is to gain power so that customers, competitors, and governments are under the corporation’s control and it can tend to its business as it sees fit.
      • Example: Ronald Reagan changed the goal of the country from getting people to help the government to getting the government “off our backs.”
    11. Paradigms

      • Paradigms are the ideas and world-views we share as a society.
      • The paradigms we hold form the moral and intellectual foundation from which society’s goals and aspirations stem. Paradigms are the source of our systems.
      • “The ancient Egyptians built pyramids because they believed in an afterlife. We build skyscrapers because we believe that space in downtown cities is enormously valuable.”
    12. Transcending Paradigms

      • “Keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is ‘true,’ that every one is a tremendously limited understanding of an immense and amazing universe...”
      • To believe in paradigms is a paradigm in itself.
      • Choose the paradigms that serve your purpose.

Chapter Seven: Living in a World of Systems

  • The author urges the reader to forgo the illusion of control. Learn to dance with systems by being adaptable and prepared for surprises, change, and variability. Learn to listen and read the signs from the system rather than fight it.

  • General tenets for “dancing with a system”:

    • Observe a system and understand its behavior before trying to change it. Lead with facts, not theories (the theories can develop from the observed facts).
    • Make your assumptions visible and subject them to the scrutiny of others to strengthen and improve them. The goal is to discover a rigorous and resilient solution, NOT to enrich your ego. Be open to experimentation, adaptation, criticism, and many iterations.
    • Ensure that the information in a system is not biased, delayed, inaccurate or incomplete.
    • Use language with care. Language should be clear and specific. Remember: “We don’t talk about what we see; we see only what we can talk about.” Language influences and shapes our paradigms and systems.
    • Pay attention to what is important, not just what is quantifiable. It’s easier to measure things that are quantifiable. It’s harder to measure quality and subjective metrics. The result is that we often focus on the former and ignore the latter. Just because something is difficult to measure doesn’t mean it is less important or doesn’t exist.
    • Create feedback policies for feedback systems. Policies need to build learning and evaluation into their system structures.
    • “Remember that hierarchies exist to serve the bottom layers, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole.”
    • Listen to what the system has to say. Before changing a system, be sure you understand what is present in the system and what strengths are already present (and can be leveraged).
    • Identify responsibility within the system. Determine if there are external triggering events that can be controlled. Understand the consequences resulting from the system. Example: having companies place intake pipes downstream from their outflow pipes in a river might make the companies more responsible about the wastewater they discharge.
    • Stay humble and keep learning.
    • Celebrate complexity and be comfortable with nonlinear, messy and dynamic systems.
    • Expand time horizons: adopt long-term thinking.
    • Adopt interdisciplinary thinking: adopt useful ideas, discard narrow-minded ones, embrace the diversity of thought.
    • Expand the boundary of caring. Remember that systems are endlessly interconnected.
    • Don’t erode the goal of goodness.
