Dependable Systems – why is psychology possible?

In this post, I review some of the arguments developed in the 1960s and 1970s by authors like Donald Campbell and Paul Meehl for what makes psychological laws possible. Their ideas go back to what John Stuart Mill wrote about causality in his A System of Logic (1843).

Overdetermination

John Stuart Mill (1806-1873) noticed that causation is a complex phenomenon. First, he drew attention to the frequent occurrence of multiple causation: several causal factors together play a role in bringing about an effect. A cause is a collection of causal factors that is jointly sufficient for the occurrence of the effect. Second, the same effect can be caused in different ways: an effect can have several possible causes. Mill called this the plurality of causes. A consequence of the plurality of causes is that more than one cause can occur at the same time, each of them sufficient for the occurrence of the effect. The effect is then overdetermined[1]. In a situation of overdetermination, the real cause of the effect is unclear. Often, the solution is to demand a continuous process or causal chain between cause and effect: the continuity condition. Scriven gives this example:

[C]onditions (perhaps unusual excitement plus constitutional inadequacies) [are] present at 4.0 p.m. that guarantee a stroke at 4.55 p.m. and consequent death at 5.0 p.m.; but an entirely unrelated heart attack at 4.50 p.m. is still correctly called the cause of death, which as it happens, does occur at 5.0 p.m. (Scriven, 1964, pp. 410-411)

In this example, there are two possible causes of death at 5.0 p.m.: excitement and a heart attack. Scriven distinguishes between continuous and interrupted causal chains:

The heart attack was, and the excitement was not the cause of death because the “causal chain” between the latter and death was interrupted, while the former’s “went to completion”. (Scriven, 1964, p. 411)

In this example, we can speak of independent overdetermination. Another example is:

A man sets out on a trip through the desert. He has two enemies. One of them puts a deadly poison in his reserve can of drinking water. The other (not knowing this) makes a hole in the bottom of the can. The poisoned water all leaks out before the traveller needs to resort to this reserve can; the traveller dies of thirst. (Mackie, 1974, p. 44)[2]

In the case of independent overdetermination, two (or more) potential causes occur, of which just one is the actual cause[3] that leads (via a causal chain) to the effect; the other potential cause would have led to the effect had the actual cause not occurred.

There are cases where the continuity condition does not help, especially in the case of concurrent overdetermination. A well-known example is the man who is executed by a firing squad: at least two bullets enter his heart at once, while one bullet alone would have been sufficient for his death. Several sufficient conditions are fulfilled simultaneously, but we cannot say which one is responsible for the death of the executed man.

The last form of overdetermination is what we call coupled overdetermination. Example:

Smith and Jones commit a crime, but if they had not done so the head of the criminal organization would have sent other members to perform it in their stead, and so it would have been committed anyway. (Mackie, 1974, p. 44)

In the case of coupled overdetermination there is, in contrast to independent and concurrent overdetermination, just one potential cause present, which is also the actual cause of the effect. But if this potential cause had not occurred for one reason or another, its absence would have led to the occurrence of another cause responsible for the effect. The alternative causes are coupled, so that the non-occurrence of the one leads to the occurrence of the other. However, it is odd to speak of the non-occurrence of a potential cause as the cause of another event that, in turn, causes the intended effect, as if negative causes (non-occurring causes) were causally effective. If we turn to the example, we realise that it is not so much Smith and Jones’s failure to commit the crime that leads to the assignment of another member of the organisation to perform it in their stead; it is the head of the organisation noticing that they did not commit it. Coupled overdetermination is therefore only possible when the situation is controlled by a system with some form of perception, informing the system whether a potential cause has occurred.
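The three forms can be made concrete with a minimal sketch in Python. The names (CausalChain, actual_cause, coupled) are my own illustrative inventions, not anything from the authors discussed; the sketch treats Scriven’s continuity condition as a simple completion flag on each chain, and models coupled overdetermination as a monitor that dispatches a backup only after perceiving that the primary cause did not occur.

```python
from dataclasses import dataclass

@dataclass
class CausalChain:
    name: str
    completes: bool  # did the chain run to completion (the continuity condition)?

def actual_cause(chains):
    """The actual cause is the chain that 'went to completion';
    interrupted chains remain merely potential causes."""
    completed = [c for c in chains if c.completes]
    if len(completed) == 1:
        return completed[0].name
    if len(completed) > 1:
        return "undecidable (concurrent overdetermination)"
    return "no cause completed"

# Independent overdetermination (Scriven's stroke vs. heart attack):
print(actual_cause([CausalChain("excitement -> stroke", completes=False),
                    CausalChain("heart attack", completes=True)]))

# Concurrent overdetermination (the firing squad): both chains complete.
print(actual_cause([CausalChain("bullet 1", completes=True),
                    CausalChain("bullet 2", completes=True)]))

# Coupled overdetermination: a monitor with some form of "perception"
# dispatches the backup cause only after noticing the primary did not occur.
def coupled(primary_occurs: bool) -> str:
    if primary_occurs:
        return "primary cause produces the effect"
    return "monitor notices the omission and dispatches a backup cause"

print(coupled(True))
print(coupled(False))
```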

Redundant and convergent causality

The stability of (complex) systems can often be explained by “in-built” overdetermination. Garfinkel (1981) introduced a special term for this: redundant causality.

We may explain why a child has certain attitudes by pointing out that it had certain experiences. This teacher said that on such-and-such day, they saw such-and-such movie, all of which had the effect of engendering a certain attitude. But if the attitude is relatively important to society, the means of generating that attitude will not be left to chance; there will be a multiplicity, a redundancy, of mechanisms to ensure that the child developed the “right” attitude.
    So the causality with which the effect is produced has a strong resiliency. The very fact that the child did not have those experiences calls forth other experiences to do the job of producing the effect. […]
    I want to call this “redundant causality.” Systems which exhibit redundant causality therefore have, for every consequent Q, a bundle of antecedents (Pi) such that:
1. If any one of the Pi is true, so will Q.
2. If one Pi should not be the case, some other will.
(Garfinkel, 1981, pp. 57-58)

In a footnote, Garfinkel (1981, p. 58) notes that systems characterized by redundant causality “act as if they are goal directed because, should one means to the end be blocked, the system will shift to an alternative.”[4]
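Garfinkel’s two conditions can be stated schematically. The notation below is my own gloss of the passage, not Garfinkel’s:

```latex
% A gloss (mine, not Garfinkel's) of redundant causality for a
% consequent Q with a bundle of antecedents P_1, ..., P_n:
\begin{align*}
&\text{(1) each antecedent suffices:} && \forall i\, (P_i \rightarrow Q)\\
&\text{(2) back-up:} && \forall i\, \bigl(\neg P_i \rightarrow \exists j \neq i\; P_j\bigr)
\end{align*}
% Together (1) and (2) make Q robust: by (2) some antecedent in the
% bundle always holds, and by (1) any holding antecedent produces Q.
```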

In psychology, half a century ago, Paul Meehl (1954; 1970; 1978) and Donald Campbell, together with Thomas Cook (Cook & Campbell, 1979), pointed out that one factor in the success of psychology as a science is the degree of convergent causality (Meehl) or of “dependable intermediate mediational units” (Campbell) in psychological phenomena. Both concepts are related to redundant causality. Meehl (1978) mentions that one of the reasons why there is so little progress in the “soft” sectors of psychology, like clinical, social, personality and educational psychology, is the prevalence of what he, following Langmuir, calls divergent causality (or divergent causal chains), which Meehl contrasts with convergent causality (or convergent causal chains).

Roughly, in divergent causal chains, small initial-condition fluctuations determine very different remote outcomes; in convergent situations, small fluctuations “average out” so that whether any one individual initial event is E or ~E has a negligible effect on the system’s direction of movement. […] Langmuir’s distinction is of course implicit in numerous historical and fictional treatments of the theme “small causes, great effects.” A familiar example is speculation about whether World War I would have broken out if the obstetrician who delivered Wilhelm II had been more skilful, as a result of which the Kaiser would have been spared his withered arm, hence would have felt less need to compensate, etc. “For want of a nail the shoe was lost; for want of a shoe the horse was lost; for want of a horse the rider was lost; for want of a rider the battle was lost; for want of a victory the kingdom was lost.”
(Meehl, 1970, footnote 10, p. 395)

[T]here are complex systems whose causal structure and boundary conditions are such that slight differences … tend to “wash out,” “cancel each other,” or “balance” over the long run. On the other hand, there are other systems in which such slight perturbations or differences in the exact character of the initial conditions are, so to speak, amplified over the long run. Langmuir christened the former kind of causality as “convergent,” as when we say that the average errors in making repeated measurements of a table tend to cancel out and leave us with a stable and highly trustworthy mean value of the result. On the other hand, an object in unstable equilibrium can lean slightly toward the right instead of the left, as a result of which a deadly avalanche occurs burying a whole village.
(Meehl, 1978, p. 809)

Redundant and convergent causality are, in a certain sense, complementary. Redundant causality ensures that when a cause for the realisation of a particular effect does not occur, there will be an alternative cause for the occurrence of the effect. Through convergent causality, influences that oppose the realisation of this effect are cancelled by influences in the opposite direction. Together, redundant and convergent causality make systems behave in an orderly and predictable manner. Note that Meehl suggests that systems can vary in the degree to which they are sensitive to divergent causality and that this is expressed in the error variance. More specifically, ceteris paribus, the error variance increases in a system more sensitive to divergent causation; one of the consequences is that confidence intervals will be larger and less informative (less specific).
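The contrast can be illustrated with a toy simulation; this is a sketch under my own assumptions (Gaussian noise, multiplicative amplification), not anything taken from Meehl. In the convergent system many small independent influences are averaged, so the outcome barely varies; in the divergent system each step slightly amplifies the current state, so the same small perturbations compound into a large error variance, and any confidence interval around the mean is correspondingly wide and uninformative.

```python
import random
import statistics

random.seed(1)

def convergent_outcome(n_influences: int = 100) -> float:
    # Many small independent influences; averaging "washes out" the noise.
    return statistics.fmean(10.0 + random.gauss(0, 1) for _ in range(n_influences))

def divergent_outcome(steps: int = 100) -> float:
    # Each step slightly amplifies the state: "small causes, great effects".
    state = 10.0 + random.gauss(0, 0.01)  # tiny initial-condition fluctuation
    for _ in range(steps):
        state *= 1.0 + random.gauss(0, 0.05)
    return state

conv = [convergent_outcome() for _ in range(1000)]
div = [divergent_outcome() for _ in range(1000)]

# The divergent system's error variance dwarfs the convergent one's,
# so confidence intervals around its mean are far wider (less specific).
print(f"convergent: mean={statistics.fmean(conv):.2f}, sd={statistics.stdev(conv):.3f}")
print(f"divergent:  mean={statistics.fmean(div):.2f}, sd={statistics.stdev(div):.3f}")
```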

According to Cook and Campbell (1979), the causal laws psychology tries to discover are molar. A law is molar if it concerns large and often complex objects. The causes and effects connected in such a molar law are generally separated in place and time. Cook and Campbell speak in this respect of micromediation: the specification of causal connections at a “lower” level, among the smaller parts that make up the molar objects, and on a finer timescale. Cook and Campbell defend the following thesis:

Dependable intermediate mediational units are involved in most strong laws.
[I]n biological systems, some complex molar units have acquired such dependability that they may operate as unproblematic units in still more molar causal chains. The individual nerve cell is such a unit. Until one is sick, it is an unproblematic unit that dependably and causally contributes to human functioning at more molar levels. Larger, more complex servosystem units may similarly function. For instance, a traditional goal for military training is to turn the individual soldier into an unproblematic unit who dependably responds to orders. Dependable intermediate mediation is involved in all strong causal laws. Undependable mediation – as when nerve cells malfunction or soldiers refuse to respond to orders – results in weaker and less dependable molar laws.
    In biological and social units, where purposes, goals, cybernetic systems or servomechanisms are involved, molar dependability may be achieved by substituting alternative micromediational links. Thus, if an extra homework assignment causes the room to be lighted at midnight, this effect can be mediated by candles if the electric power is out. If the rat’s hunger causes the animal to press a lever, this can be mediated by pulling the lever with its jaws when its paws are tired.
(Cook & Campbell, 1979, p. 35; italics added)

“Dependable” systems are typically orderly and predictable in their molar behaviour. Redundant causality, in the form of alternative micromediation, is responsible for this. In other words, because there is overdetermination at the lower level, molar causal laws are possible.
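Cook and Campbell’s lamp-and-candles example can be sketched as a trivial program; the function and variable names are mine, and the point is purely structural: the molar regularity survives while the micromediational link is swapped.

```python
def light_room(power_out: bool) -> str:
    # Molar effect: the room gets lit. Which micro-mechanism mediates it
    # depends on circumstances; the molar law holds either way.
    mediators = [
        ("electric lamp", not power_out),  # available unless the power is out
        ("candles", True),                 # fallback, always available
    ]
    for name, available in mediators:
        if available:
            return f"room lit via {name}"
    return "room dark"

# The same molar regularity under two micro-level realisations:
print(light_room(power_out=False))  # room lit via electric lamp
print(light_room(power_out=True))   # room lit via candles
```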


[1] In the literature, overdetermination is also known as preemption or fail-safe causes.

[2] Mackie attributes this example to Hart & Honoré (1959), Causation in the law. This example is also discussed by Pearl (2009, p. 312), who attributes it to P. Suppes.

[3] The Actual Cause is the title of chapter 10 of Pearl (2009). I refer the reader to this chapter for a more technical treatment of actual causation and overdetermination.

[4] It is not completely clear which form or forms of overdetermination Garfinkel has in mind. Some formulations point clearly to coupled overdetermination. I see no reason to exclude the two other forms of overdetermination from redundant causality.

References

Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design & analysis issues for field settings. Houghton Mifflin.

Garfinkel, A. (1981). Forms of explanation: Rethinking the questions of social theory. Yale University Press.

Hart, H. L. A., & Honoré, T. (1959). Causation in the law. Clarendon Press.

Mackie, J. L. (1974). The cement of the universe: A study of causation. Oxford University Press.

Meehl, P. E. (1954). Clinical and statistical prediction: A theoretical analysis and review of the evidence. University of Minnesota Press.

Meehl, P. E. (1970). Nuisance variables and the ex post facto design. In M. Radner, & S. Winokur (Eds.), Minnesota Studies in the Philosophy of Science. Volume IV (pp. 373-402). University of Minnesota Press.

Meehl, P. E. (1978). Theoretical risks and tabular asterisks: Sir Karl, Sir Ronald, and the slow progress of soft psychology. Journal of Consulting and Clinical Psychology, 46, 806-834.

Mill, J. S. (1843). A system of logic.

Pearl, J. (2009). Causality: Models, reasoning, and inference (Second Edition) [First Edition is from 2000]. Cambridge University Press.

Scriven, M. (1964). Review of The structure of science by E. Nagel. Review of Metaphysics, 17(3), 403-424.
