John Gall

John Gall (September 18, 1925 – December 15, 2014) was an American pediatrician, known for his 1975 book General systemantics.

Quotes

General systemantics, an essay on how systems work, and especially how they fail..., 1975

John Gall (1975). General systemantics: an essay on how systems work, and especially how they fail, together with the very first annotated compendium of basic systems axioms: a handbook and ready reference for scientists, engineers, laboratory workers, administrators, public officials, systems analysts, etc., etc., etc., and the general public. General Systemantics Press, Ann Arbor, Michigan.

  • There is a world of difference, psychologically speaking, between the passive observation that Things Don't Work Out Very Well, and the active, penetrating insight that: Complex Systems Exhibit Unexpected Behavior.
    • p. 33 cited in: Stanley A. Clayes, David Gelvin Spencer, Martin S. Stanford (1979) Contexts for composition. p. 94
  • The Aswan Dam, built at enormous expense to improve the lot of the Egyptian peasant, has caused the Nile to deposit its fertilizing sediment in Lake Nasser, where it is unavailable. Egyptian fields must now be artificially fertilized. Gigantic fertilizer plants have been built to meet the new need. The plants require enormous amounts of electricity. The dam must operate at capacity merely to supply the increased need for electricity which was created by the building of the dam.
    • p. 18. Cited in: Harvey J. Bertcher (1988) Staff development in human service organizations. p. 45
  • A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
    • p. 65, cited in: Grady Booch (1991) Object oriented design with applications. p. 11
  • A selective process goes on, whereby systems attract and keep those people whose attributes are such as to make them adapted to life in the system: Systems attract systems-people.
    • p. 66
  • A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.
  • A system represents someone's solution to a problem. The system doesn't solve the problem.
    • p. 74 Cited in: Roger Kaufman and Fenwick W. English (1979) Needs Assessment: Concept and Application, p. 94
  • Systems Are Seductive. They promise to do a hard job faster, better, and more easily than you could do it by yourself. But if you set up a system, you are likely to find your time and effort now being consumed in the care and feeding of the system itself. New problems are created by its very presence. Once set up, it won't go away, it grows and encroaches. It begins to do strange and wonderful things. Breaks down in ways you never thought possible. It kicks back, gets in the way, and opposes its own proper function. Your own perspective becomes distorted by being in the system. You become anxious and push on it to make it work. Eventually you come to believe that the misbegotten product it so grudgingly delivers is what you really wanted all the time. At that point encroachment has become complete. You have become absorbed. You are now a systems person.
    • p. 90 as cited in: Robert B. Seidensticker (2006) Future Hype: The Myths of Technology Change. p. 45
  • A complex system can fail in an infinite number of ways.
    • p. 92, cited in: Erik Hollnagel (2004) Barriers and accident prevention. p. 182
  • Loose systems last longer and function better.
    • p. 93, cited in: Paul F. Downton (2008) Ecopolis: Architecture and Cities for a Changing Climate. p. 580

Systemantics: the underground text of systems lore, 1986

John Gall, D.H. Gall (1986) Systemantics: the underground text of systems lore.

  • The system always kicks back — Systems get in the way — or, in slightly more elegant language: Systems tend to oppose their own proper functions. Systems tend to malfunction conspicuously just after their greatest triumph.
    • p. 27 cited in: Kevin Kelly (1988) Signal: communication tools for the information age. p. 7
  • Even Toynbee, floundering through his massive survey of 20-odd civilizations, was finally able to discern only that: Systems tend to malfunction conspicuously just after their greatest triumph. Toynbee explains this effect by pointing out the strong tendency to apply a previously successful strategy to the new challenge.
    • p. 35 cited in: Kevin Kelly (1988) Signal: communication tools for the information age. p. 7
  • The field of Architecture has given rise to a second major principle relating to the Life Cycle of Systems. This principle has emerged from the observation that temporary buildings erected to house Navy personnel in World War I continued to see yeoman service in World War II as well as in subsequent ventures, and are now a permanent, if fading, feature of Constitution Avenue in Washington... We conclude: A temporary patch will very likely be permanent.
    • p. 36
  • But how does it come about, step by step, that some complex Systems actually function? This question, to which we as students of General Systemantics attach the highest importance, has not yet yielded to intensive modern methods of investigation and analysis. As of this writing, only a limited and partial breakthrough can be reported, as follows: A COMPLEX SYSTEM THAT WORKS IS INVARIABLY FOUND TO HAVE EVOLVED FROM A SIMPLE SYSTEM THAT WORKED
    • p. 65, cited in: Mishkin Berteig, "Quotes from Systemantics – Funny, But Scary Too", agileadvice.com, March 3, 2006. This quote was mentioned in General systemantics (1975, p. 71).

Quotes about John Gall

  • The largest building in the world, the space vehicle preparation shed at Cape Kennedy, generates its own weather, including clouds and rain. This and other system principles are explained in a delightful and amusing book by John Gall (1986) entitled Systemantics: The underground text of systems lore; how systems really work and how they fail, and is recommended for anyone who designs systems. One can choose to ignore the principles by which systems operate and continue to be puzzled as to why they do not seem to act as we intend, or recognize the principles and thus improve the ability to design systems that work.
    • Wayne P. Stevens (1991) Software design: concepts and methods. p. 187
  • John Gall's Systemantics: How Systems Work and Especially How They Fail has several suggestions from 1975 that are still relevant here:
    • In general, systems work poorly or not at all.
    • New systems mean new problems.
    • Complex systems usually operate in failure mode.
    • When a fail-safe system fails, it fails by failing to fail safe.
    • Peter G. Neumann (1994) Computer-Related Risks. p. 316
  • Some years ago, many problems encountered by system developers were brought together in a pithy book by John Gall called Systemantics (Gall 1975). The book applies equally to computer systems and to the encompassing systems of coordinated human enterprise. The book's style is droll but its purpose is serious; it should be required reading. Among the many important rules and admonitions the book advances are several worth repeating here for anyone contemplating biodiversity information systems development:
    A complex system that works is invariably found to have evolved from a simple system that worked
    A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system
    In setting up a system, tread softly. You may be disturbing another system that is actually working
    A system can fail in an infinite number of ways
    In complex systems malfunction and even total nonfunction may not be detectable for a long period, if ever
    • Robert C. Szaro, David W. Johnston (1996) Biodiversity in Managed Landscapes: Theory and Practice. p. 182
  • Gall (1975) mentions numerous examples of malfunctioning systems. For example, at Cape Canaveral there is an enormous hangar that shelters the rockets being constructed. It is so large that it produces its own climate, including clouds and rain. Thus, the very structure that is supposed to shelter rockets and people sprinkles them with its own rain.
    • Pierre Moessinger (2000) The Paradox of Social Order: Linking Psychology and Sociology. pp. 20-21
  • The following four propositions, which appear to the author to be incapable of formal proof, are presented as Fundamental Postulates upon which the entire superstructure of General Systemantics... is based...
    EVERYTHING IS A SYSTEM
    EVERYTHING IS PART OF A LARGER SYSTEM
    THE UNIVERSE IS INFINITELY SYSTEMATIZABLE BOTH UPWARDS (LARGER SYSTEMS) AND DOWNWARDS (SMALLER SYSTEMS)
    ALL SYSTEMS ARE INFINITELY COMPLEX (The illusion of simplicity comes from focusing attention on one or a few variables.)
    John Gall, Systemantics, 1975
    • Yoram Moses, Moshe Y. Vardi, Ronald Fagin, Joseph Y. Halpern (2003) Reasoning About Knowledge. p. 106
  • In one of my favorite books, Systemantics: How Systems Work and Especially How They Fail, John Gall (1977) warns against the rising tide of “systemism” — “the state of mindless belief in systems; the belief that systems can be made to function to achieve desired goals.” Gall’s point is that “the fundamental problem does not lie in any particular system but rather in systems as such.” These systems become the goal rather than the means to a goal.
    Adherents of these “systemisms” would argue that implementing these programs should not result in losing track of the primary goal (results rather than process). But Gall points out how this subversion becomes inevitable through two of his axioms: 1) “Systems Tend to Expand to Fill the Known Universe” and 2) “Systems Tend to Oppose Their Own Proper Functions, Especially in Connection with the Phenomenon of ‘Administrative Encirclement’” (Gall 1977).
    • Jim Highsmith (2004) Agile Project Management: Creating Innovative Products. p. 33

External links

Wikipedia has an article about: John Gall