Donella Meadows

From Wikiquote

Donella "Dana" Meadows (13 March 1941 – 20 February 2001) was a pioneering American environmental scientist, teacher, and writer. She is best known as the lead author of the influential book The Limits to Growth.

Quotes

  • Models can easily become so complex that they are impenetrable, unexaminable, and virtually unalterable.
    • Meadows (1980), "The unavoidable a priori", in: J. Randers (ed.), Elements of the System Dynamics Method, page 27.
  • The world is a complex, interconnected, finite, ecological–social–psychological–economic system. We treat it as if it were not, as if it were divisible, separable, simple, and infinite. Our persistent, intractable global problems arise directly from this mismatch.
  • Calculating how much carbon is absorbed by which forests and farms is a tricky task, especially when politicians do it.

Thinking in Systems: A Primer (2008)

Donella H. Meadows, edited by Diana Wright, Thinking in Systems: A Primer, Chelsea Green Publishing, 2008 (ISBN 9781603580557).

  • A system is a set of things – people, cells, molecules, or whatever – interconnected in such a way that they produce their own pattern of behavior over time. [...] The system, to a large extent, causes its own behavior!
    • Page 2.
  • Ever since the Industrial Revolution, Western society has benefited from science, logic, and reductionism over intuition and holism. Psychologically and politically we would much rather assume that the cause of a problem is “out there,” rather than “in here.” It’s almost irresistible to blame something or someone else, to shift responsibility away from ourselves, and to look for the control knob, the product, the pill, the technical fix that will make a problem go away.
    Serious problems have been solved by focusing on external agents – preventing smallpox, increasing food production, moving large weights and many people rapidly over long distances. Because they are embedded in larger systems, however, some of our “solutions” have created further problems. And some problems, those most rooted in the internal structure of complex systems, the real messes, have refused to go away.
    Hunger, poverty, environmental degradation, economic instability, unemployment, chronic disease, drug addiction, and war, for example, persist in spite of the analytical ability and technical brilliance that have been directed toward eradicating them. No one deliberately creates those problems, no one wants them to persist, but they persist nonetheless.
    That is because they are intrinsically systems problems – undesirable behaviors characteristic of the system structures that produce them. They will yield only as we reclaim our intuition, stop casting blame, see the system as the source of its own problems, and find the courage and wisdom to restructure it.
    • Pages 3-4.

Part one: systems structure and behavior

  • If a government proclaims its interest in protecting the environment but allocates little money or effort toward that goal, environmental protection is not, in fact, the government's purpose. Purposes are deduced from behavior, not from rhetoric or stated goals.
    • Page 14.
  • In fact, one of the most frustrating aspects of systems is that the purposes of subunits may add up to an overall behavior that no one wants.
    • Page 15.
  • Dynamic systems studies usually are not designed to predict what will happen. Rather, they're designed to explore what would happen, if a number of driving factors unfold in a range of different ways.
    • Page 46.
  • That very large system, with interconnected industries responding to each other through delays, entraining each other in their oscillations, and being amplified by multipliers and speculators, is the primary cause of business cycles. Those cycles don't come from presidents, although presidents can do much to ease or intensify the optimism of the upturns and the pain of the downturns. Economies are extremely complex systems; they are full of balancing feedback loops with delays and they are inherently oscillatory.
    • Page 58.
  • Whenever we see a growing entity, whether it be a population, a corporation, a bank account, a rumor, an epidemic, or sales of a new product, we look for the reinforcing loops that are driving it and for the balancing loops that ultimately will constrain it. We know those balancing loops are there, even if they are not yet dominating the system's behavior, because no real physical system can grow forever. [...] An economy can be constrained by physical capital or monetary capital or labor or markets or management or resources or pollution.
    • Page 59.
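The oscillation mechanism described in the quotes above (a balancing loop acting on delayed information) is easy to demonstrate numerically. The following is a minimal sketch, not taken from the book; the target, adjustment rate, and delay length are illustrative assumptions.

```python
# A stock adjusts toward a target, but each decision sees the stock
# `delay` steps late: x[t+1] = x[t] + k * (target - x[t - delay]).
# All numbers here are illustrative, not taken from the book.

def simulate(delay, k=0.5, target=100.0, x0=60.0, steps=40):
    xs = [x0] * (delay + 1)          # seed the perception history
    for _ in range(steps):
        perceived = xs[-1 - delay]   # stale information in the loop
        xs.append(xs[-1] + k * (target - perceived))
    return xs[delay:]                # drop the seed padding

no_delay = simulate(delay=0)         # converges smoothly to the target
with_delay = simulate(delay=4)       # overshoots and oscillates
print(round(no_delay[-1], 2))        # → 100.0
print(max(with_delay) > 100.0)       # → True: the delay causes overshoot
```

With no delay, the balancing loop settles monotonically; with the very same loop and a four-step perception delay, the stock repeatedly overshoots and undershoots its target, the behavior Meadows attributes to delayed balancing feedback.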

Part two: systems and us

  • Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability.
    • Page 79.
  • There always will be limits to growth. They can be self-imposed. If they aren't, they will be system-imposed. No physical entity can grow forever. If company managers, city governments, the human population do not choose and enforce their own limits to keep growth within the capacity of the supporting environment, then the environment will choose and enforce limits.
    • Page 103.
  • Changing the length of a delay may utterly change behavior. [...] Overshoots, oscillations, and collapses are always caused by delays.
    • Pages 104-105.
  • Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don't have perfect information, especially about more distant parts of the system. [...] We don't even interpret perfectly the imperfect information that we do have, say behavioral scientists. [...] Which is to say, we don't even make decisions that optimize our own individual good, much less the good of the system as a whole.
    • Pages 106-107.
  • Economic theory as derived from Adam Smith assumes first that homo economicus acts with perfect optimality on complete information, and second that when many of the species homo economicus do that, their actions add up to the best possible outcome for everybody. Neither of these assumptions stands up long against the evidence.
    • Page 107.
  • These examples confuse effort with result, one of the most common mistakes in designing systems around the wrong goal. Maybe the worst mistake of this kind has been the adoption of the GNP as the measure of national economic success. [...] If you define the goal of a society as GNP, that society will do its best to produce GNP. It will not produce welfare, equity, justice, or efficiency unless you define a goal and regularly measure and report the state of welfare, equity, justice, or efficiency.
    • Pages 139-140.

Part three: creating change – in systems and in our philosophy

  • Growth has costs as well as benefits, and we typically don't count the costs – among which are poverty and hunger, environmental destruction, and so on – the whole list of problems that we are trying to solve with growth! What is needed is much slower growth, very different kinds of growth, and in some cases no growth or negative growth. The world's leaders are correctly fixated on economic growth as the answer to virtually all problems, but they're pushing it with all their might in the wrong direction.
    • Page 146.
  • Power over the rules is real power. That's why lobbyists congregate when Congress writes laws, and why the Supreme Court, which interprets and delineates the Constitution – the rules for writing the rules – has even more power than Congress. If you want to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them.
    • Page 158.
  • Even people within systems don't often recognize what whole-system goal they are serving. "To make profits", most corporations would say, but that's just a rule, a necessary condition to stay in the game. What is the point of the game? To grow, to increase market share, to bring the world (customers, suppliers, regulators) more and more under the control of the corporation, so that its operations become ever more shielded from uncertainty.
    • Page 161.
  • The shared idea in the minds of society, the great big unstated assumptions, constitute that society's paradigm, or deepest beliefs about how the world works. [...] people who have managed to intervene in systems at the level of paradigm have hit a leverage point that totally transforms systems.
    • Pages 162-163.
  • How is it that one way of seeing the world becomes so widely shared that institutions, technologies, production systems, buildings, cities, become shaped around that way of seeing?
    • Page 169.
  • Why are they [people] more likely to listen to people who tell them they can't make changes than they are to people who tell them they can?
    • Page 169.
  • You've seen how information holds systems together and how delayed, biased, scattered, or missing information can make feedback loops malfunction. Decision-makers can't respond to information they don't have, can't respond accurately to information that is inaccurate, and can't respond in a timely way to information that is late. I would guess that most of what goes wrong in systems goes wrong because of biased, late, or missing information. [...] Information is power.
    • Page 173.
  • Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can't measure. Think about that for a minute. It means that we make quantity more important than quality.
    • Pages 175-176.
  • No one can define or measure justice, democracy, security, freedom, truth, or love. [...] But if no one speaks up for them, if systems aren't designed to produce them, if we don't speak about them and point toward their presence or absence, they will cease to exist.
    • Page 176.

Appendix (summary)

  • The least obvious part of a system, its function or purpose, is often the most crucial determinant of the system's behavior.
  • A delay in a balancing feedback loop makes the system likely to oscillate.
  • In physical, exponentially growing systems, there must be at least one reinforcing loop driving growth and at least one balancing feedback loop constraining growth, because no system can grow forever in a finite environment.
  • There always will be limits to growth.
  • A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time.
  • When there are long delays in feedback loops, some sort of foresight is essential.
  • The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.
    • Pages 188-191.
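The summary point that a quantity growing exponentially toward a limit reaches it in a surprisingly short time can be checked with a toy calculation. The stock size and doubling rate below are illustrative assumptions, not figures from the book.

```python
# A fixed stock of 1000 units is drawn down by a use rate that starts
# at 1 unit per period and doubles every period (illustrative numbers).

def deplete(stock=1000.0, use=1.0, growth=2.0):
    remaining = []        # stock left at the start of each period
    while stock > 0:
        remaining.append(stock)
        stock -= use      # this period's consumption
        use *= growth     # consumption doubles next period
    return remaining

trace = deplete()
print(len(trace))         # → 10: the stock lasts only ten doubling periods
print(trace[-1])          # → 489.0: nearly half the stock is still left
                          #   one period before exhaustion
```

Nearly half of the stock remains one period before it is gone, which is why exponential approach to a limit routinely takes observers by surprise.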

Quotes about Donella Meadows

  • Mesarovic and Pestel are critical of the Forrester-Meadows world view, which is that of a homogeneous system with a fully predetermined evolution in time once the initial conditions are specified.
    • New Scientist, Vol. 66, No. 947, 1 May 1975, p. 272.
