W. Ross Ashby

W. Ross Ashby (September 6, 1903 – November 15, 1972) was an English psychiatrist and a pioneer in cybernetics and the study of complex systems.

Quotes

  • During the last few years it has become apparent that the concept of "machine" must be very greatly extended if it is to include the most modern developments. Especially is this true if we are studying the brain and attempting to identify the type of mechanism that is responsible for the brain's outstanding powers of thought and action. It has become apparent that when we used to doubt whether the brain could be a machine, our doubts were due chiefly to the fact that by "machine" we understood some mechanism of very simple type. Familiar with the bicycle and the typewriter, we were in great danger of taking them as the type of all machines. The last decade, however, has corrected this error. It has taught us how restricted our outlook used to be; for it developed mechanisms that far transcended the utmost that had been thought possible, and taught us that "mechanism" was still far from exhausted in its possibilities. Today we know only that the possibilities extend beyond our farthest vision.
  • The invasion of psychology by cybernetics is making us realize that the ordinary concepts of psychology must be reformulated in the language of physics if a physical explanation of the ordinary psychological phenomena is to become possible. Some psychological concepts can be re-formulated more or less easily, but others are much more difficult, and the investigator must have a deep insight if the physical reality behind the psychological phenomena is to be perceived.
    • W. Ross Ashby, "Review of Analytical Biology, by G. Sommerhoff." In: Journal of Mental Science, Vol. 98 (1952), p. 88; as cited in Peter M. Asaro (2008)
  • If intellectual power is to be developed, we must somehow construct amplifiers for intelligence — devices that, supplied with a little intelligence, will emit a lot.
    • Ashby, "Design for an intelligence amplifier." Automata Studies (1956): 215-234, p. 216
  • Two main lines are readily distinguished. One already well developed in the hands of von Bertalanffy and his co-workers, takes the world as we find it, examines the various systems that occur in it - zoological, physiological, and so on - and then draws up statements about the regularities that have been observed to hold. This method is essentially empirical. The second method is to start at the other end. Instead of studying first one system, then a second, then a third, and so on, it goes to the other extreme, considers the set of all conceivable systems and then reduces the set to a more reasonable size. This is the method I have recently followed.
  • Every isolated determinate dynamic system, obeying unchanging laws, will ultimately develop some sort of organisms that are adapted to their environments.
    • Ashby (1962), quoted in: V. Lawrence Parsegian (1972) This cybernetic world of men, machines, and earth systems. p. 178. About the principle of self-organization.
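
A minimal sketch of the idea behind this last quote, assuming nothing more than a determinate system on a finite set of states (the ten-state transition map below is an arbitrary illustration, not Ashby's own construction): iterated long enough, such a system must revisit a state and so fall into a fixed point or cycle, leaving only a few persistent, "adapted" states.

  import random

  # Illustration only: a determinate system as a fixed transition map
  # on ten states. Iterating it must eventually revisit a state,
  # trapping the trajectory in a cycle or fixed point.
  random.seed(1)
  states = list(range(10))
  step = {s: random.choice(states) for s in states}

  def settle(s, n=25):
      """Run the system n steps from state s and return the path."""
      path = [s]
      for _ in range(n):
          s = step[s]
          path.append(s)
      return path

  for s in states:
      print(s, "->", settle(s)[-5:])  # every run ends circling the same few states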

Design for a Brain: The Origin of Adaptive Behavior (1952)

Source: Ashby (1952) Design for a Brain: The Origin of Adaptive Behavior. New York, Wiley. (1960 edition online)
  • Every stable system has the property that if displaced from a state of equilibrium and released, the subsequent movement is so matched to the initial displacement that the system is brought back to the state of equilibrium. A variety of disturbances will therefore evoke a variety of matched reactions.
    • p. 54; see the sketch at the end of this section
  • The primary fact is that all isolated state-determined dynamic systems are selective: from whatever state they have initially, they go towards states of equilibrium. These states of equilibrium are always characterised, in their relation to the change-inducing laws of the system, by being exceptionally resistant.
    • p. 238
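
A toy reading of the first quote above (p. 54), not Ashby's own example: the linear system dx/dt = -kx has the single equilibrium x = 0, and Euler-stepping it shows each displacement evoking a matched return. All constants are arbitrary choices.

  # Arbitrary constants; any k > 0 gives a stable system.
  def relax(x0, k=0.5, dt=0.1, steps=100):
      """Euler-integrate dx/dt = -k*x from the displacement x0."""
      x = x0
      for _ in range(steps):
          x += -k * x * dt
      return x

  for x0 in (-2.0, 0.5, 3.0):               # a variety of disturbances...
      print(x0, "->", round(relax(x0), 4))  # ...each brought back near 0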

An Introduction to Cybernetics (1956)

Source: W. Ross Ashby (1956) An Introduction to Cybernetics. Chapman & Hall.
  • Many workers in the biological sciences — physiologists, psychologists, sociologists — are interested in cybernetics and would like to apply its methods and techniques to their own specialty. Many have, however, been prevented from taking up the subject by an impression that its use must be preceded by a long study of electronics and advanced pure mathematics; for they have formed the impression that cybernetics and these subjects are inseparable.
    The author is convinced, however, that this impression is false. The basic ideas of cybernetics can be treated without reference to electronics, and they are fundamentally simple; so although advanced techniques may be necessary for advanced applications, a great deal can be done, especially in the biological sciences, by the use of quite simple techniques, provided they are used with a clear and deep understanding of the principles involved. It is the author’s belief that if the subject is founded in the common-place and well understood, and is then built up carefully, step by step, there is no reason why the worker with only elementary mathematical knowledge should not achieve a complete understanding of its basic principles. With such an understanding he will then be able to see exactly what further techniques he will have to learn if he is to proceed further; and, what is particularly useful, he will be able to see what techniques he can safely ignore as being irrelevant to his purpose.
    • Preface

Part I: Mechanism

  • Cybernetics was defined by Wiener as “the science of control and communication, in the animal and the machine” — in a word, as the art of steersmanship, and it is to this aspect that the book will be addressed. Co-ordination, regulation and control will be its themes, for these are of the greatest biological and practical interest.
    We must, therefore, make a study of mechanism; but some introduction is advisable, for cybernetics treats the subject from a new, and therefore unusual, angle... The new point of view should be clearly understood, for any unconscious vacillation between the old and the new is apt to lead to confusion.
    • p. 1: Lead paragraph
  • Cybernetics treats not things but ways of behaving. It does not ask “what is this thing?” but “what does it do?”... It is thus essentially functional and behaviouristic. Cybernetics deals with all forms of behavior in so far as they are regular, or determinate, or reproducible. The materiality is irrelevant... The truths of cybernetics are not conditional on their being derived from some other branch of science. Cybernetics has its own foundations.
    • p. 1; As cited in: Stuart A. Umpleby, "Ross Ashby's general theory of adaptive systems." International Journal of General Systems 38.2 (2009): 231-238.
  • Cybernetics is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society. And it can provide the common language by which discoveries in one branch can readily be made use of in the others... [There are] two peculiar scientific virtues of cybernetics that are worth explicit mention. One is that it offers a single vocabulary and a single set of concepts suitable for representing the most diverse types of system... The second peculiar virtue of cybernetics is that it offers a method for the scientific treatment of the system in which complexity is outstanding and too important to be ignored. Such systems are, as we well know, only too common in the biological world!
    • pp. 4-5
  • The most fundamental concept in cybernetics is that of "difference", either that two things are recognisably different or that one thing has changed with time. Its range of application need not be described now, for the subsequent chapters will illustrate the range abundantly. All the changes that may occur with time are naturally included, for when plants grow and planets age and machines move some change from one state to another is implicit. So our first task will be to develop this concept of "change", not only making it more precise but making it richer, converting it to a form that experience has shown to be necessary if significant developments are to be made.
    • p. 9: Chapter 2 Change, lead paragraph.
  • By a state of a system is meant any well-defined condition or property that can be recognised if it occurs again. Every system will naturally have many possible states.
    • p. 25
  • [T]he concept of “feedback”, so simple and natural in certain elementary cases, becomes artificial and of little use when the interconnexions between the parts become more complex. When there are only two parts joined so that each affects the other, the properties of the feedback give important and useful information about the properties of the whole. But when the parts rise to even as few as four, if every one affects the other three, then twenty circuits can be traced through them; and knowing the properties of all the twenty circuits does not give complete information about the system. Such complex systems cannot be treated as an interlaced set of more or less independent feedback circuits, but only as a whole.
    For understanding the general principles of dynamic systems, therefore, the concept of feedback is inadequate in itself. What is important is that complex systems, richly cross-connected internally, have complex behaviours, and that these behaviours can be goal-seeking in complex patterns.
    • p. 54; as cited in: Margaret A. Boden (2006) Mind as Machine: A History of Cognitive Science, Volume 1, p. 229. The twenty-circuit count is verified in the sketch at the end of this section.
  • As shorthand, when the phenomena are suitably simple, words such as equilibrium and stability are of great value and convenience. Nevertheless, it should be always borne in mind that they are mere shorthand, and that the phenomena will not always have the simplicity that these words presuppose.
    • p. 85
  • There comes a stage, however, as the system becomes larger and larger, when the reception of all the information is impossible by reason of its sheer bulk. Either the recording channels cannot carry all the information, or the observer, presented with it all, is overwhelmed. When this occurs, what is he to do? The answer is clear: he must give up any ambition to know the whole system. His aim must be to achieve a partial knowledge that, though partial over the whole, is none the less complete within itself, and is sufficient for his ultimate practical purpose.
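
The "twenty circuits" in the feedback quote above can be verified by counting the directed simple cycles among four parts, each acting on the other three. A short sketch (ours, not Ashby's):

  from itertools import combinations, permutations

  def circuits(n):
      """Count directed simple cycles among n mutually connected parts."""
      total = 0
      for k in range(2, n + 1):                 # loop length
          for loop in combinations(range(n), k):
              # fix loop[0] as the start; (k-1)! orderings of the rest
              total += sum(1 for _ in permutations(loop[1:]))
      return total

  print(circuits(4))  # -> 20, as the quote states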

Part II: Variety

  • The fundamental questions in regulation and control can be answered only when we are able to consider the broader set of what it might do, when “might” is given some exact specification.
    • p. 121
  • [Constraint] is a relation between two sets, and occurs when the variety that exists under one condition is less than the variety that exists under another.
    • p. 127
  • When a constraint exists advantage can usually be taken of it.
    • p. 130
  • Further, as every law of nature implies the existence of an invariant, it follows that every law of nature is a constraint.
    • p. 130
  • The concept of "variety" [is] inseparable from that of "information."
    • p. 140
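
The quotes on constraint and variety above fit together in a small worked example (all numbers ours): measuring the variety of a set as log2 of its size, in bits, a constraint shows up as a drop in variety.

  from math import log2

  def variety(states):
      """Variety of a set of distinguishable states, in bits."""
      return log2(len(set(states)))

  # Three unconstrained binary lamps: 2**3 = 8 states, 3 bits of variety.
  free = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

  # Under the constraint "exactly one lamp is on", less variety exists.
  constrained = [s for s in free if sum(s) == 1]

  print(variety(free))         # 3.0
  print(variety(constrained))  # ~1.585: the constraint can be exploited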

Part III: Regulation and control

  • The most basic facts in biology are that this earth is now two thousand million years old, and that the biologist studies mostly that which exists today.
    • p. 196
  • Variety can destroy variety.
    • p. 207
  • Its importance is that if R[egulator] is fixed in its channel capacity, the law places an absolute limit to the amount of regulation (or control) that can be achieved by R, no matter how R is re-arranged internally, or how great the opportunity in T. Thus the ecologist, if his capacity as a channel is unchangeable, may be able at best only to achieve a fraction of what he would like to do. This fraction may be disposed in various ways — he may decide to control outbreaks rather than extensions, or virus infections rather than bacillary — but the quantity of control that he can exert is still bounded. So too the economist may have to decide to what aspect he shall devote his powers, and the psychotherapist may have to decide what symptoms shall be neglected and what controlled.
    • p. 245: Regarding the law of requisite variety; see the sketch at the end of this section
  • Throughout, we shall be exemplifying the thesis of D. M. MacKay: that quantity of information, as measured here, always corresponds to some quantity, i.e. intensity, of selection, either actual or imaginable.
    • p. 252
  • Duration of selection. At this point a word should be said about how long a given act of selection may take, for when actual cases are examined, the time taken may, at first estimate, seem too long for any practical achievement. The question becomes specially important when the regulator is to be developed for regulation of a very large system. Approximate calculation of the amount of selection likely to be necessary may suggest that it will take a time far surpassing the cosmological; and one may jump to the conclusion that the time taken in actually achieving the selection would have to be equally long. This is far from being the case, however.
    • p. 260
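
A toy instance of the law of requisite variety referred to above (the outcome table and all numbers are our own illustration, not Ashby's): with 8 equally likely disturbances and a regulator restricted to 2 moves, no strategy leaves fewer than 8/2 = 4 distinct outcomes, so at least V(D) - V(R) = 3 - 1 = 2 bits of variety get through.

  from math import log2

  # Outcome T(d, r) = (d - r) % 8: the regulator's move r offsets the
  # disturbance d. Eight disturbances (3 bits), two moves (1 bit).
  disturbances = range(8)

  def strategy(d):
      """One best-possible 2-move strategy: absorb the low bit of d."""
      return d % 2

  outcomes = {(d - strategy(d)) % 8 for d in disturbances}
  print(len(outcomes))        # 4 outcomes: the floor set by the law
  print(log2(len(outcomes)))  # 2.0 bits of disturbance variety survive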

Quotes about W. Ross Ashby

  • General systems theory is considered as a formal theory (Mesarovic, Wymore), a methodology (Ashby, Klir), a way of thinking (Bertalanffy, Churchman), a way of looking at the world (Weinberg), a search for an optimal simplification (Ashby, Weinberg), didactic method (Boulding, Klir, Weinberg), metalanguage (Löfgren), and profession (Klir).
    • George Klir cited in: James T. Ziegenfuss (1983) Patients' rights and organizational models: sociotechnical systems research on mental health programs. p. 104
  • The brilliant British psychiatrist, neuroscientist, and mathematician Ross Ashby was one of the pioneers in early and mid-phase cybernetics and thereby one of the leading progenitors of modern complexity theory. Not one to take either commonly used terms or popular notions for granted, Ashby probed deeply into the meaning of supposedly self-organizing systems. At the time of the following article, he had been working on a mathematical formalism of his homeostat, a hypothetical machine established on an axiomatic, set theoretical foundation that was supposed to offer a sufficient description of a living organism's learning and adaptive intelligence. Ashby's homeostat had a small number of essential variables serving to maintain its operation over a wide range of environmental conditions so that if the latter changed and thereby shifted the variables beyond the range where the homeostat could safely function, a new 'higher' level of the machine was activated in order to randomly reset the lower level's internal connections or organization... Like the role of random mutations during evolution, if the new range set at random proved functional, the homeostat survived, otherwise it expired...
    • Jeffrey Goldstein (2004) in: Emergence: Complexity and Organization, January 1, 2004 (online) (on Goldstein, see [1])
  • [A] famous photograph... showing McCulloch (1898–1969) and Norbert Wiener (1894–1964) with British Cyberneticians Ross Ashby (1903–1972) and Grey Walter (1910–1977), first appeared in de Latil (1953) with the caption "The four pioneers of Cybernetics get together in Paris", and encapsulates a view of the development of cybernetics that has slowly become more accepted: that there were important British contributions from the outset.
  • What is the use of ultra-stable systems which have the property that if you subject them to some influence, they change to an equilibrium state but don't even remember where they were? Warren McCulloch was a great fan of Ashby, and so finally I asked him, "Well, why do you think this is so important?" And he said, "Because he explains it so clearly." I went back and read Design for a Brain again and couldn't but agree with McCulloch that Ashby had managed to explain something more clearly than everyone else put together. The fact that the systems didn't do anything was a little bit disappointing, but that was as nothing compared to the clarity. So, if you want to explain something, you should read Design for a Brain and use that as a model for your next paper.
    • Marvin L. Minsky, "Adaptive control: from feedback to debugging." Adaptive Control of Ill-Defined Systems. Boston, MA: Springer US, 1984, pp. 115-125.