Systemantics

A 1975 Systems Thinking book by John Gall: https://en.wikipedia.org/wiki/Systemantics

Gall's Law: A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.
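
In software terms, Gall's Law is the case for incremental development: grow a complex system out of a simple one that works, keeping it working at every step. A playful sketch of that build loop in Python (all names hypothetical, not from the book):

```python
def grow_system(base, features, still_works):
    """Gall's Law as a build loop: start from a simple system that works,
    add one feature at a time, and keep a change only if the result still works."""
    system = base
    assert still_works(system), "you have to start with a working simple system"
    for add_feature in features:
        candidate = add_feature(system)
        if still_works(candidate):
            system = candidate  # complexity earned one working step at a time
        # else: discard the patch; a broken complex system cannot be patched up
    return system

# Toy usage: a "system" is a set of capabilities; "works" means the core survives.
base = {"core"}
features = [
    lambda s: s | {"logging"},
    lambda s: (s - {"core"}) | {"rewrite"},  # an ambitious change that breaks the core
    lambda s: s | {"cache"},
]
result = grow_system(base, features, lambda s: "core" in s)
# result == {"core", "logging", "cache"}; the core-breaking rewrite was discarded
```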

(Every Human System is a Complex System.)

quotes - 36 laws: https://devshop.wordpress.com/2008/05/15/laws-of-systemantics-by-john-gall/

https://www.panarchy.org/gall/systemantics.html

https://en.wikiquote.org/wiki/John_Gall

Premise: Systems in general work poorly or not at all.

Excerpts

Preface to the First Edition

Jones received an urgent invitation to join an ambitious federally-funded project for a systematic attack upon the “problem” of mental retardation

within a year he was unable to speak or think intelligently in the field of mental retardation

Preface to the Second Edition

We are forced to report that in the interval since the original edition of Systemantics, no significant improvement in Systems-behavior has taken place

A question frequently asked is: does not the persistent occurrence of Horrible Examples of Systems-function (or Malfunction) prove something about human nature? If humans were rational, wouldn’t they act otherwise than they do? We reply: Systems-functions are not the result of human intransigence. We take it as given that people are generally doing the very best they know how. Our point, repeatedly stressed in this text, is that Systems operate according to Laws of Nature, and that Laws of Nature are not suspended to accommodate our human shortcomings

Charles Darwin made it a rule to write down immediately any observation or argument that seemed to run counter to his theories

One need only follow this precept for a few days to become aware of the true dimensions of the discrepancy between the world as advertised and as it actually functions or fails to function

When Memory is thus deliberately frustrated in its basic task of protecting us from too much awareness, we see what we had hitherto failed to notice: that malfunction is the rule and flawless operation the exception

The advent of the Computer Revolution merely provides new opportunities for errors at levels of complexity and grandiosity not previously attainable

The reader who is familiar with the First Edition will note, in the Second Edition, a very slight and subtle shift of focus, a change of emphasis, in the direction of Pragmatism. Some of the later Chapters, if read uncritically, could even lead to a sort of optimism

The author, for his own part, remains firmly convinced that the height and depth of practical wisdom lies in the ability to recognize and not to fight against the Laws of Systems

we here spell out our basic strategy, which was implicit in the pages of the First Edition

THE MOST EFFECTIVE APPROACH TO COPING IS TO LEARN THE BASIC LAWS OF SYSTEMS-BEHAVIOR

As Virginia Satir has aptly reminded us:
PROBLEMS ARE NOT THE PROBLEM; COPING IS THE PROBLEM

Preface to the Third Edition

the passage of time has brought to light not only new Horrible Examples, but also new Axioms and above all, deeper insights, that justify us in bringing out a new essay

A System, after all, is a partial Intelligence; it participates in the great Mind of the Universe; and unless we ourselves have a direct pipeline into that Mind, we had jolly well better watch our step

Systems don’t appreciate being fiddled and diddled with. They will react to protect themselves; and the unwary intervenor may well experience an unexpected shock

We live in a new age of faith, an age of faith in systems. If there is any one belief which is not challenged anywhere in the world, it is this faith in systems.

one thing they all agree on is that whatever the problem may be, the answer lies in setting up some system to deal with it.

if we remain unaware of what we are actually doing when we use a tool, when we make a plan, when we interact with our surroundings—well, such unawareness is usually rewarded with an unpleasant surprise at some point in the future, if not immediately.

Just before a dolphin actually understands the point of a new routine, it gets more and more irritable, swims in circles and finally loses its temper, leaps out of the water and sends a giant splash over the trainer. Psychologists call it Cognitive Dissonance—that irritating feeling that something just doesn't fit

Karen Pryor, the Dolphin Lady, whose animal-training methods are used world-wide, calls it the Learning Tantrum

Down through history, the Learning Tantrum recurs at moments of crisis

error correction is a big part of what we do. We notice (or, worse, fail to notice) the difference between our expectations and our actual sensory feedback. When we notice it, we register it as an error, a defect, a failure, a shortcoming, something to be corrected. If we fail to notice it, or if, instead of acknowledging the discrepancy we have noticed, we try to shut it out (see Taboo on Failure, in Preface to the Second Edition) and proceed as if it weren't there—then we have the bizarre, funny, sometimes excruciating results catalogued in Systemantics.
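
The loop described here (compare expectation to feedback, register the discrepancy as an error, correct toward it) is the classic negative-feedback pattern. A minimal sketch in Python, with all names illustrative rather than from Gall:

```python
def correction_step(expectation, feedback, gain=0.5):
    """One pass of the loop: notice the discrepancy, then nudge the state toward it.
    Returns the corrected state and the error that was noticed."""
    error = expectation - feedback
    return feedback + gain * error, error

# Repeated noticing-and-correcting converges on the expectation.
state = 0.0
for _ in range(20):
    state, err = correction_step(10.0, state)
# state is now within ~1e-5 of 10.0

# Failing to notice (gain=0) leaves the discrepancy exactly where it was.
stuck, err = correction_step(10.0, 3.0, gain=0.0)
# stuck == 3.0, err == 7.0: the error is "shut out" and persists
```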

Introduction

How does it come about that things turn out so differently from what common sense would expect?

the fundamental problem does not lie in any particular System but rather in Systems As Such

Historical Overview

people everywhere are struggling with a Problem:
THINGS AREN’T WORKING VERY WELL

Because of its central role in all that follows (being the fundamental observation upon which all further research into Systems has been based) it is known as the Primal Scenario. We give it here in full:
THINGS (THINGS GENERALLY / ALL THINGS / THE WHOLE WORKS) ARE INDEED NOT WORKING VERY WELL. IN FACT, THEY NEVER DID

In more formal terminology:
SYSTEMS IN GENERAL WORK POORLY OR NOT AT ALL

this fact, repeatedly observed by men and women down through the ages, has always in the past been attributed to various special circumstances.

No history of the subject would be complete without some reference to the semi-legendary, almost anonymous Murphy (Murphy's Law)

The universe is not actually malignant, it only seems so.

Shortly after Murphy, there appeared upon the scene a new and powerful mind, that of Count Alfred Korzybski

Korzybski seemed to have convinced himself that all breakdowns of human Systems are attributable to misunderstandings—in brief, to failures of communication. Our position, on the contrary, is that human Systems differ only in degree, not in kind, from other types of Systems

SYSTEMS DISPLAY ANTICS. They “act up.”

Nevertheless, as we shall see, Korzybski, by stressing the importance of precise definitions, laid the groundwork for the Operational Fallacy, which is the key to understanding the paradoxical behavior of Systems

System Theory is a respectable academic subject, elaborated at leisure by professional scholars (mostly with tenure) who have the time and security to make sure that their researches turn out the way they should. Systemantics, by contrast, is almost a form of Guerilla Theater.

After Korzybski, a brilliant trio of founders established the real basis of the field. Of these, the earliest was Stephen Potter, who, in the masterly work entitled One-Upmanship, painstakingly elaborated a variety of elegant methods for bending recalcitrant Systems to the needs of personal advancement. Although Potter’s goals were essentially utilitarian, lacking the broad generality of Parkinson’s or Peter’s approach, he is rightly regarded as one of the pioneers of Intervention into the operations of Systems.

Following Potter, C. Northcote Parkinson established an undying claim to fame by prophesying, as early as 1957, the future emergence of the problem of Table Shape in diplomatic conferences. He was triumphantly vindicated in the Paris Peace Talks of 1968, when an entire season was devoted to just this topic before discussion of an end to the war in Vietnam could even begin. No clearer demonstration of the Generalized Uncertainty Principle could have been asked. (Parkinson's Law)

Third in the brilliant trio of founders is Doctor Laurence J. Peter, whose Principle of Incompetence lies at the heart of Administrative Systemantics. (Peter Principle)

Here, then, is the very first book of Systems-Axioms, the very first attempt to deal with the cussedness of Systems in a fundamental, logical way, by getting at the basic rules of their behavior

Part One: Basic Theory

A. The Mysterious Ways of Systems

1. First Principles

We begin at the beginning, with the Fundamental Theorem:
NEW SYSTEMS MEAN NEW PROBLEMS

When a system is set up to accomplish some goal, a new entity has come into being—the system itself

Now the system itself has to be dealt with. Whereas before there was only the Problem—such as warfare between nations, or garbage collection—there is now an additional universe of problems associated with the functioning or merely the presence of the new system.

In the case of Garbage Collection, the original problem could be stated briefly as “What do we do with all this garbage?” After setting up a garbage-collection system, we find ourselves faced with a new Universe of Problems.

Some garbage does get collected. The original problem is thereby somewhat reduced in magnitude and intensity. Over against this benefit, however, one must balance the new problems.

The sum total of problems facing the community has not changed.

ANERGY. ANERGY-STATE. Any state or condition of the Universe, or of any portion of it, that requires the expenditure of human effort or ingenuity to bring it into line with human desires, needs, or pleasures is defined as an ANERGY-STATE. Anergy is measured in units of effort required to bring about the desired change.

THE TOTAL AMOUNT OF ANERGY IN THE UNIVERSE IS CONSTANT

Law of Conservation of Anergy

SYSTEMS OPERATE BY REDISTRIBUTING ANERGY INTO DIFFERENT FORMS AND INTO ACCUMULATIONS OF DIFFERENT SIZES

One school of mathematically-oriented Systems-theoreticians holds that the Law of Conservation of Anergy is only approximately true. According to them, in very large Systems a Relativistic Shift occurs, whereby the total amount of Anergy increases exponentially. In really large and ambitious Systems, the original Problem may persist unchanged and at the same time a multitude of new Problems may arise to fester (or ferment) unresolved

2. Laws of Growth

Systems are like babies: once you get one, you have it. They don’t go away

SYSTEMS TEND TO EXPAND TO FILL THE KNOWN UNIVERSE

C. Function and Failure

Even in the extreme case where the System seems bent on self-destruction, the opposing tendency guarantees that the Ghost of the Old System will remain in evidence to Haunt the New. These Inner Goals quite obviously bear little or no relation to the Stated Purpose of the System, which in reality is only the Wish of the designer or manager.

The main usefulness of the Stated Purpose is in making a realistic assessment of the Delusion System within which, and out of which, the designers, operators, or managers of the System may be working. The more quickly the Student learns to identify such phrases as “creating the new Socialist Man,” “making the world safe for Democracy,” or “better living through asbestos products,” the less Delusion there will be in his/her own dealings with Systems.

As for the Ultimate Purpose of the System, above and beyond the Inner Goals conditioned by the structure of the System: if there is such a thing, it is beyond the author’s ken, and perhaps unknowable. It is hard enough to find out a few fragmentary aspects of what the System is really doing.
We therefore retreat from metaphysics to the simple, down-to-earth attitude: the Purpose of the System is—whatever it can be used for.
(vs POSIWID: Stafford Beer's dictum that "the Purpose Of a System Is What It Does")


