On Systems

everything is connected

Out of intense complexities, intense simplicities emerge. - Winston Churchill

This note is a primer on problem solving at scales from micro to macro – how systems exist and react in the world. All models are false, but some are useful: they help us simplify and, at times, make better predictions.

A system is an interconnected set of elements that is coherently organized in a way that delivers something (elements, interconnections, function/purpose). Systems can be self-organizing, self-repairing (up to a point), resilient and many are evolutionary (adaptive).

Intangibles (such as school pride) are also part of systems. The best way to deduce a system's purpose is to watch it for some amount of time to see how it behaves (avoid rhetoric and stated goals). An important function of nearly every system is its own perpetuation.

Systems thinking transcends disciplines and cultures and, when done right, it overarches history as well.

Systems work well

Resilience is the ability to survive and persist in a variable environment. Resilience in a system is restored through balancing feedback loops through different mechanisms, at different time scales and with redundancy. A set of feedback loops that can restore or rebuild feedback loops is resilience at a higher level – meta-resilience. Even higher meta-meta-resilience comes from feedback loops that can learn, create, design and evolve ever more complex restorative structures. Systems that can do this are self-organizing.

Our greatest glory is not in never falling, but in rising every time we fall. - Confucius

A resilient system has a big plateau, a lot of space over which it can wander, with gentle, elastic walls that will bounce it back if it comes near a dangerous edge. As a system loses resilience, this plateau shrinks. Resilience is often coupled with dynamism, as static systems tend to become fragile.

Self Organization

Self-organization leads to complexity, heterogeneity and unpredictability. Like resilience, it is often sacrificed for productivity or short-term gain, which drastically increases the fragility of the system overall. A few simple organizing principles can lead to wildly different self-organizing outcomes.

Hierarchy is the arrangement of systems and subsystems.

Complex systems can evolve from simple systems only if there are stable intermediate forms. The resulting complex forms will naturally be hierarchical. That may explain why hierarchies are so common in the systems nature presents to us. Among all possible complex forms, hierarchies are the only ones that have had the time to evolve.

Hierarchies are beautiful systems inventions, not only because they give a system stability and resilience, but also because they reduce the amount of information that any part of the system has to keep track of. In hierarchical systems relationships within each subsystem are denser and stronger than relationships between subsystems.

Everything is still connected to everything else, but not equally strongly. If these differential information links within and between each level of the hierarchy are designed right, feedback delays are minimized. No level is overwhelmed with information. The system works with efficiency and resilience.

Hierarchies are partially decomposable, and much can be learned by taking apart systems at different hierarchical levels and studying them separately.

Hierarchies evolve from the lowest level up. The original purpose of a hierarchy is always to help its originating subsystems do their jobs better. This is something which is easily forgotten and leads to malfunctioning hierarchies (suboptimal systems).

External solutions help solve many problems (such as vaccines), but problems deeply embedded in the internal structure of systems won't go away unless we see them holistically, i.e. see the system as the cause of the problem and restructure it.


Individual rationalism can lead to collective insanity – why things happen much faster or slower than people expect and why systems can unexpectedly jump into a behavior you've never seen before (leaping emergent effects).

The behavior of a system cannot be known just by knowing the elements of which the system is made.

Terms: Archetypes are common structures which produce characteristic behaviors. Stock is accumulation of material over time, a memory of the history of changing flows in the system. Dynamics is behavior over time. Dynamic equilibrium stays the same though it is always changing (inflows exactly equal outflows).

People tend to focus more on stocks than on flows, and more on inflows than outflows (a stock can be raised by increasing inflow or by decreasing outflow). Stocks take time to change because flows take time to flow. Changes in stocks set the pace of the dynamics in the system. Stocks allow inflows and outflows to be decoupled, independent and temporarily out of balance.
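The stock-and-flow idea above can be sketched in a few lines of code (a hypothetical illustration, not from the text): a stock changes only through its flows, so it integrates, and lags, whatever the flows do.

```python
# Hypothetical sketch: a stock changes only through its flows.
def simulate_stock(initial, inflows, outflows):
    """Return the stock level after each time step."""
    stock = initial
    history = []
    for inflow, outflow in zip(inflows, outflows):
        stock += inflow - outflow  # the only way a stock ever changes
        history.append(stock)
    return history

# The inflow stops abruptly, but the stock drains only gradually:
# flows can be decoupled and out of balance while the stock buffers them.
levels = simulate_stock(100, inflows=[10] * 5 + [0] * 5, outflows=[10] * 10)
```

While inflow matches outflow the stock holds steady; when the inflow stops, the stock declines one step at a time – the memory of the changed flow shows up only gradually.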

The world is a collection of feedback processes. The gap, or discrepancy, between the current and desired state drives feedback loops, and the bigger the gap, the stronger the feedback loop.

A thermostat is a stock system with two competing balancing loops. The bigger the gap (between the actual and desired temperature in this case), the bigger the flow. Shifting dominance occurs when one loop dominates and therefore drives the behavior, producing oscillations and complex behavior.
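As a rough sketch of the thermostat example (the gains and temperatures are made-up numbers, not from the text): two balancing loops act on one stock, and the system settles where they balance.

```python
# Illustrative sketch: two balancing loops acting on one stock (room temperature).
def thermostat(temp, setpoint=20.0, outside=0.0, steps=50,
               furnace_gain=0.3, leak_gain=0.1):
    for _ in range(steps):
        heating = furnace_gain * (setpoint - temp)  # balancing loop 1: close the gap
        leak = leak_gain * (temp - outside)         # balancing loop 2: heat leaks out
        temp += heating - leak
    return temp

# The room settles where the two loops balance (15 degrees with these gains),
# below the setpoint, because the leak loop never stops pulling the other way.
final = thermostat(temp=10.0)
```

Each loop's flow is proportional to its gap, so whichever loop faces the bigger gap momentarily dominates – a tiny instance of shifting dominance.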

Systems with similar feedback structures produce similar dynamic behavior. Typical delays include perception, response and delivery; these delays can cause small changes to turn into massive oscillations.
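One way to see delay-driven oscillation in code (the correction rule and parameters here are illustrative, not from the text): apply the same balancing rule twice, once with current information and once reacting to a stale reading.

```python
# Illustrative sketch: the same balancing rule, with and without a perception delay.
def regulate(initial, target=100.0, gain=0.3, delay=0, steps=60):
    history = [initial]
    for _ in range(steps):
        # react to stale information: the stock as it was `delay` steps ago
        perceived = history[max(0, len(history) - 1 - delay)]
        history.append(history[-1] + gain * (target - perceived))
    return history

smooth = regulate(50.0, delay=0)  # converges steadily toward the target
wobbly = regulate(50.0, delay=3)  # the same rule with stale data overshoots
```

The undelayed run approaches the target smoothly; with a three-step perception delay, the controller keeps correcting for a gap that has already closed and overshoots past the target.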

Stock systems

A renewable stock constrained by a non-renewable one (oil). Look for the loops driving the system and the loop that will ultimately constrain it (the constraint can be temporary, permanent and/or more than one). A renewable stock constrained by a renewable one (fishing).
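The fishing case can be sketched as a toy model (all numbers are made up for illustration): the fish regenerate through a balancing loop that weakens near capacity, and the harvest either stays under the peak regeneration rate or drives the stock to collapse.

```python
# Toy sketch: logistic regeneration vs. a constant harvest.
def fishery(stock=1000.0, capacity=1000.0, regen=0.2, harvest=50.0, years=100):
    for _ in range(years):
        growth = regen * stock * (1 - stock / capacity)  # balancing loop near capacity
        stock = max(0.0, stock + growth - harvest)
    return stock

# Peak regeneration here is regen * capacity / 4 = 50 per year.
sustainable = fishery(harvest=40.0)  # below the peak: the stock persists
collapsed = fishery(harvest=60.0)    # above it: the stock is driven to zero
```

The constraining loop is the regeneration term: once harvest exceeds the most the stock can ever regrow, no equilibrium exists and the stock can only decline.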

Important questions to ask to test the value of any model: 1. Are the driving factors likely to unfold this way? 2. If they did, would the system react this way? 3. What is driving the driving factors?

Model utility depends not on whether its driving scenarios are realistic (since no one can know for sure), but on whether it responds with a realistic pattern of behavior.


Everything we think we know about the world is a model (language, maps, books, databases, equations, computer programs, mental models) – nothing will ever be the real world. Our models usually have a strong congruence with the real world.

Systems fool us by presenting themselves as a series of events. Like the tip of the iceberg above the water, events are the most visible aspect of a larger complex, but not always the most important. We are less likely to be surprised if we can see how events accumulate into dynamic patterns of behavior. Look below the surface.

The behavior of a system is its performance over time – growth, stagnation, decline, oscillation, randomness, evolution.

The ideal of behaviorism is to eliminate coercion: to apply controls by changing the environment in such a way as to reinforce the kind of behavior that benefits everyone. - BF Skinner


When a systems thinker encounters a problem, the first thing he does is look for data, time graphs, the history of the system. That's because long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening but why.

  • Systems thinkers try to understand the connections between events and the resulting behavior and the mechanical characteristics of the structure.

  • Behavior-based models are more useful than event-based models but still flawed, as they overemphasize flows and underemphasize stocks. There is also no reason to expect any flow to bear a stable relationship to any other flow.

  • Humans are insufficiently skilled at seeing in a system's history the clues to the structures from which behavior and events flow.

  • Non-linear relationships do not change in proportion; they change the relative strength of the feedback loops (shifting dominance).
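Shifting dominance through a nonlinearity can be illustrated with a toy logistic-growth sketch (the parameters are made up): a reinforcing birth loop dominates while the population is small, then a nonlinear crowding term takes over.

```python
# Toy sketch: births form a reinforcing loop; crowding is a nonlinear balancing loop.
def grow(pop=10.0, rate=0.5, capacity=1000.0, steps=40):
    history = [pop]
    for _ in range(steps):
        births = rate * pop                       # reinforcing loop: grows with pop
        crowding = rate * pop * (pop / capacity)  # nonlinear: negligible when small
        pop += births - crowding
        history.append(pop)
    return history

history = grow()
# Early on the reinforcing loop dominates (near-exponential growth); as the
# population nears capacity the crowding term takes over and growth stops.
```

Neither loop's equation changes along the way; only their relative strengths shift, which is what produces the S-shaped curve.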


The greatest complexities occur exactly at the boundaries – sources of diversity and creativity. Boundaries are false, man-made but necessary to simplify and comprehend.

The most important input in a system is the one that is most limiting. Growth itself depletes or enhances limits and therefore changes the limits themselves. Bounded rationality is when people make reasonable decisions based on information they have but since it is imperfect it leads to bad outcomes. Think of Wittgenstein.

The limits of my language mean the limits of my world. - Wittgenstein

Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives and disincentives can be restructured so that separate, bounded rational actions do add up to results that everyone desires.

It's amazing how quickly and easily behavior changes can come, with even the slightest enlargement of bounded rationality, by providing better, more complete, timelier information.

What makes a difference is redesigning the system to improve the information, incentives, disincentives, goals, stresses, and constraints that have an effect on specific actors. One must change the structure to change the behaviors.

Conversely, our models fall far short of representing the world fully. You can't navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, mis-design, or misread systems if you don't respect their properties of resilience, self-organization and hierarchy.


The way to deal with policy resistance is to overpower it, to let go entirely, or to find ways to align the goals of all the subsystems involved.

The tragedy of the commons stems from feedback that is invisible or too long delayed (remedies: educate/exhort, privatize or regulate the commons). Humans have difficulty with low time preference decision making.

Drift to low performance

There is always a trap in allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance. It sets up a reinforcing feedback loop of eroding goals that sets a system drifting to low performance. Fear is the mind killer…

The solution is to keep performance standards absolute and let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift to high performance.

Escalation – the best remedy is to avoid falling into it in the first place, but if you do, refuse to compete or negotiate a new system with balancing loops to control the escalation.


Success to the successful – winners keep winning and enhance their prospects of future prosperity. Diversification, strict limits on the fraction of the pie any one winner may win (antitrust laws), policies leveling the playing field, and rewards for success that do not bias the next round of competition are all good solutions.

Addiction shouldn't be ignored: be aware of symptom-relieving or signal-denying policies or practices that don't really address the problem. Take the focus off short-term relief and put it on long-term restructuring.

Rule beating – design, or redesign, rules to release creativity. Head in the direction of beating the rules, but also be mindful of achieving the purpose of the rules in their original form.

Seeking the wrong goals can be fatal. Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not results.


A leverage point is a point in a system where a small change can lead to a big shift in behavior. The leverage point is often hidden and counterintuitive. Examples of leverage points (from least to most effective):

  • Numbers – constants and parameters such as subsidies, taxes and standards

  • Least effective as changing these variables rarely changes the behavior of the system

  • Buffers – the sizes of stabilizing stocks relative to their flows

  • Big stocks relative to their flows are more stable than small ones. Often stabilize a system by increasing the capacity of the buffer but if the buffer gets too big, the system gets inflexible

  • Stock and flow structures – physical systems and their nodes of intersection

The stocks and flows and their physical arrangement can have a tremendous effect on how the system operates. The only way to fix a system that is laid out poorly is to rebuild it.
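The buffer point above – big stocks relative to their flows are more stable – can be illustrated with a small sketch (the names and numbers are hypothetical): the same shocks swing a small buffer proportionally much harder than a large one.

```python
import random

# Hypothetical sketch: identical shocks hit buffers of different sizes.
def relative_swing(buffer, net_flows):
    """Worst deviation from the starting level, as a fraction of the buffer."""
    stock, worst = buffer, 0.0
    for net in net_flows:
        stock += net
        worst = max(worst, abs(stock - buffer) / buffer)
    return worst

random.seed(0)  # deterministic shocks, for illustration only
shocks = [random.uniform(-10.0, 10.0) for _ in range(200)]
small = relative_swing(100.0, shocks)   # small buffer: large relative swings
large = relative_swing(1000.0, shocks)  # big buffer: same shocks barely register
```

The absolute deviations are identical in both runs; only the buffer size changes, which is exactly why oversizing a buffer buys stability at the cost of flexibility.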

Delays are the lengths of time relative to the rates of system changes. A delay in the feedback process is critical relative to rates of change in the stocks that the feedback loop is trying to control. It is usually easier to slow down the change rate so that inevitable feedback delays won't cause much trouble or oscillations.


Balancing feedback loops – the strength of the loop relative to the impact it is trying to correct. One of the big mistakes is removing these "emergency" response mechanisms because they aren't often used and appear costly. There may be no effect in the short term, but in the long term you drastically reduce the range of conditions over which the system can survive. Black swan scenarios are unidentifiable, and one should always protect against total ruin or death. For people, this mistake looks like sacrificing personal rest, recreation, socialization, meditation, etc. for short-term productivity at the expense of long-term health.

Reinforcing feedback loops are the strength of the gain of driving loops. Reinforcing loops are sources of growth, explosion, erosion and collapse in systems. Slowing the growth is usually a more powerful leverage point in systems than strengthening balancing loops and far more preferable than letting the reinforcing loop run unencumbered.


Information flows are the structure of who does and does not have access to information. Adding a new feedback loop delivers information to a place it wasn't going before.

Rules include incentives, punishments, constraints. Rules are high leverage points. Power over rules is real power. If you want to understand the deepest malfunctions of systems, pay attention to the rules and who has power over them.

Self-organization – the power to add, change or evolve system structure. The ability to self-organize is the strongest form of system resilience as it can evolve and survive almost any change, by changing itself.

Goals are the purpose or function of the system. Everything earlier on this list – physical stocks and flows, feedback loops, information flows, even self-organizing behavior – will be twisted to conform to the goal.

A single player who can change the system goal can affect the whole system. Dictator, anyone?

Paradigms are the mind-set out of which the system (its goals, structure, rules, delays, parameters) arises. Paradigms are the source of systems and harder to change than anything else about the system. The best chance to change paradigms is to keep pointing at the anomalies and failures in the old paradigm. One must get outside the system and force oneself to see the system as a whole (Galilean Relativity).

Transcending paradigms is difficult. But keeping oneself unattached in the arena of paradigms, staying flexible, and realizing that no paradigm is "true" gives a tremendous source of perspective when dealing with systems.


Systems can't be controlled but they can be designed and redesigned.

There are guidelines for living in a world of systems. Get the beat of the system by observing how it behaves before disturbing it. This forces you to focus on facts and long-term behavior rather than rhetoric and theories.

Expose your mental models to the light of day–judicious testing of theories allows you to faster admit uncertainties and correct mistakes leading to more flexibility. Mental flexibility, the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure, is a necessity when you live in a world of flexible systems.

Well, the first rule is that you can’t really know anything if you just remember isolated facts and try to bang ‘em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have mental models in your head. And you’ve got to array your experience, both vicarious and direct, on this latticework of models. - Charlie Munger

In sum, one should honor, respect and distribute information. Use language with care and enrich it with systems concepts – keep it concrete, meaningful and truthful. Pay attention to what is important, not just what is quantifiable – quality over quantity – and never ignore a part of a system just because it can't be counted.

Make feedback policies for feedback systems. Go for the good of the whole – don't optimize something which shouldn't be done at all. Listen to the wisdom of the system. Locate responsibility within the system – design systems that are accountable for their own actions.

Remain humble, remain hungry. Acknowledging uncertainty leads to more credibility. Celebrate complexity and expand time horizons. Defy the disciplines, be a multidisciplinary learner and thinker. Systems are incredibly complex and incredibly useful to try to understand. Step back and see the whole picture, observe the behavior of the system before attempting to change anything. What one does in a system has an impact on the whole system. Study intently and act wisely.