Thursday 14 January 2016

Ladyman & Ross - Every Thing Must Go chapter 1 (part 2)

Continuing my notes on Ladyman & Ross's book Every Thing Must Go (earlier post here).


1.2 Neo-Scholastic Metaphysics

In this section, L&R detail the sins of contemporary metaphysics. The main problems they identify are: (1) the appeal to intuitions and common sense; (2) the assumption by many metaphysicians that metaphysics is an a priori discipline that can proceed without any reliance on science; and (3) that where metaphysicians do consult science, the science is often badly outdated: they assume the "billiard-ball universe" of A-level chemistry rather than engaging with the best contemporary scientific work.


1.3 The Principle of Naturalistic Closure 

This section is concerned with L&R's principle for distinguishing useful from useless metaphysics. But first they ask, why should a naturalist think that there's anything useful for metaphysics to do? Why not go along with e.g. the positivists in rejecting metaphysics entirely? L&R suggest that unification/consilience is an important part of the history of science, but that no particular science is devoted to evaluating consilience, that is, to evaluating how different sciences are connected to each other. Hence:

(1) Unification/consilience is a worthwhile project;
(2) No particular science is devoted to evaluating and elucidating unification/consilience;
(3) This is something that metaphysicians can do.

The immediately obvious problem with this justification of metaphysics is that scientists can and do, in fact, evaluate how well their theories accord with others. Where unification happens, this is usually due to the work of scientists. Indeed, I have difficulty thinking of even a single major scientific unification that occurred thanks to the efforts of metaphysicians. Just because no particular science is devoted to unification, it doesn't follow that we're in any need of specialists to help see it through. Unifications happen "automatically," so to speak.

L&R then propose a non-positivist verificationism, dubbed the Principle of Naturalistic Closure (PNC), which rests on two claims: (1) "no hypothesis that the approximately consensual current scientific practice declares to be beyond our ability to investigate should be taken seriously"; and (2) "any metaphysical hypothesis that is to be taken seriously should have some identifiable bearing on the relationship between at least two relatively specific hypotheses that are either regarded as confirmed by institutionally bona fide current science or are regarded as motivated and in principle confirmable by such science." The requirement that a hypothesis bear on two scientific hypotheses reflects their view that the function of metaphysics is to further scientific unification. They spend several pages spelling out exactly what the PNC means. I won't comment on this much; as I noted in the last post, while I agree with them that there's plenty of bad metaphysics out there, I don't think we need a principle to demarcate the good from the bad.

There is however one important point: they later (pg 37) make their verificationism even more restrictive, requiring that at least one of the two scientific hypotheses upon which a metaphysical hypothesis must have a bearing be a hypothesis from fundamental physics. This is due to the Primacy of Physics Constraint (PPC), which they discuss in the next part. They also suggest (pg 37) that what's important about physics is that it has the widest scope of all the sciences; since metaphysics is concerned with unification, it shares this maximum scope. If this is supposed to be an argument for the view that a metaphysical hypothesis must have bearing on at least one fundamental physics hypothesis, it just seems like a non sequitur to me.

Now, L&R's position here is a little unclear to me. Is a purported metaphysical hypothesis that doesn't have a bearing on fundamental physics bad metaphysics, or is it just not metaphysics at all? I think that L&R opt for the latter. They say that a hypothesis that has a bearing on hypotheses all drawn from sciences other than fundamental physics would be a special science hypothesis, not a metaphysical hypothesis. Presumably they don't object to special sciences. So there's not necessarily anything wrong with a "metaphysical" hypothesis that has nothing to do with physics; it's just misnamed. With that in mind, their requirement that metaphysics must be concerned with physics seems like a merely semantic point.


1.4 The Primacy of Physics

The PPC: "Special science hypotheses that conflict with fundamental physics, or such consensus as there is in fundamental physics, should be rejected for that reason alone. Fundamental physical hypotheses are not symmetrically hostage to the conclusions of the special sciences." Three arguments for this are proposed: (1) In the history of science, hypotheses that postulate irreducibly nonphysical processes have failed. Of course, it might be objected that these processes are considered nonphysical precisely because they failed to be confirmed; if we had discovered some special élan vital, say, perhaps this would simply have been captured within the fold of physics. Still, I think L&R are basically right here. (2) Conversely, physical hypotheses have been successful; processes in e.g. biology can now be understood largely or entirely in physical terms. (3) The PPC is regulative in science, so it must also be respected by naturalistic metaphysics. All special sciences assume the primacy of physics; if metaphysics is to be scientifically informed, it must do so as well.

In the last section L&R say that the PPC is what justifies the requirement that any metaphysical hypothesis must have a bearing on at least one fundamental physics hypothesis. It's not clear to me why this is so. Surely all that follows from the PPC is simply that metaphysical hypotheses must not conflict with fundamental physics, and this is hardly controversial.

In any case, I'm not sure I buy the PPC. Suppose there were a hypothesis in biology that seemed to conflict with some accepted consensus in fundamental physics, but that had an enormous amount of evidence in its favour. Tests repeatedly confirmed it, none refuted it. Surely such a hypothesis wouldn't (and shouldn't) be rejected simply because it conflicts with physics. Now, it's true that scientists do assume something like the PPC, but this is just as a heuristic. Any scientist who realizes that a hypothesis conflicts with fundamental physics probably wouldn't bother developing it, simply because it's very, very unlikely that any hypothesis in conflict with fundamental physics would in fact fare so well. (Though it's occasionally happened before: cf. Kelvin's argument that Darwinian evolution was incompatible with the age of the Earth estimated by the best 19th-century physics.)


1.5 Unity of Science and Reductionism

Metaphysics is concerned with unifying scientific hypotheses. One major strategy for unification is reduction. In this section, L&R discuss various forms of reductionism.

Oppenheim & Putnam's microreductions: O&P's theory of reduction is based on the part/whole relation. Lower-level sciences deal with parts of the objects of higher-level sciences. For instance, biological entities are composed of physical parts. Microreduction involves decomposing the entities of a higher-level science into the entities of a lower-level science. Microreduction is transitive: if T1 microreduces T2, and T2 microreduces T3, then T1 microreduces T3. O&P add the assumption that "there does not exist an infinite descending chain of proper parts, i.e., a series of things x1, x2, x3 ... such that x2 is a proper part of x1, x3 is a proper part of x2, etc" (quoted in L&R), so microreductions will stop somewhere. There is a lowest level of reality.
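For concreteness, here's a minimal LaTeX sketch of the two structural assumptions doing the work in O&P's picture; the notation is mine, not O&P's (T_i ≺ T_j for "T_i microreduces T_j", x ⊏ y for "x is a proper part of y").

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% The two structural assumptions described above, in my notation (not O&P's):
% T_i \prec T_j reads "T_i microreduces T_j"; x \sqsubset y reads "x is a proper part of y".
\begin{align*}
  &\text{transitivity of microreduction:} && (T_1 \prec T_2) \wedge (T_2 \prec T_3) \rightarrow (T_1 \prec T_3)\\
  &\text{no infinite descending chain of parts:} && \neg\,\exists\,(x_n)_{n \in \mathbb{N}}\;\forall n\;(x_{n+1} \sqsubset x_n)
\end{align*}
\end{document}
```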

There is a trend towards microreduction across six levels: (6) social groups; (5) multicellular living things; (4) cells; (3) molecules; (2) atoms; (1) elementary particles. The unity of science is an ideal state where all higher-level entities can be decomposed into lower-level entities, and ultimately elementary particles. Fundamental physics can indirectly explain everything on the higher levels through cumulative microreductions.

L&R note, but leave further explanation to later chapters, that they reject O&P's model of reduction on two grounds: (1) they deny O&P's atomism; (2) they think that O&P's model rests on a denial of the "scale relativity of ontology" (again, this isn't explained at all at this point).

Nagelian reduction: For Ernest Nagel, "reduction is the (deductive) explanation of a theory by another theory", i.e. reduction involves showing that the reduced theory is entailed by the reducing theory. Note that contra O&P, the reducing theory need not be concerned with proper parts of the entities of the reduced theory. The popular understanding of Nagel emphasizes the role of "bridge laws," laws which relate terms of the reducing theory to terms of the reduced theory. Here's the IEP's account of the form of a Nagelian reduction:
(1) The occurrence of a B1 causes the occurrence of a B2 (a law in the base science).
(2) If something is a B1, then it is a T1 (bridge law).
(3) If something is a B2, then it is a T2 (bridge law).
(4) Therefore, the occurrence of a T1 causes the occurrence of a T2 (a law in the target science).
Much of the controversy about Nagelian reduction has centred on the bridge laws, but following Marras 2005, L&R downplay the importance of bridge laws for Nagelian reduction. Marras considers Nagel's example of the reduction of the Boyle-Charles law of thermodynamics to statistical mechanics. On Nagel's account, this reduction involved showing that the Boyle-Charles law can be derived from statistical mechanics. As Marras sees it, there are three steps to Nagel's account of this derivation (I'm paraphrasing slightly):
(1) The formulation of a number of limiting assumptions and initial conditions (LA/IC) centring on the identification of a fixed volume of an ideal gas with a fixed number of molecules. 
(2) The derivation, from the principles of statistical mechanics together with LA/IC, of a law L*, pV=2E/3, which is the mechanical counterpart (an "image" or "close analogue") of the Boyle-Charles law pV=kT (call this L). L* is of course entirely in the vocabulary of statistical mechanics.
(3) The postulation of a bridge law, 2E/3=kT, consequent upon a "comparison" of thermodynamics with statistical mechanics, enabling the formal derivation of L from L*. [Given that 2E/3=kT, we can derive pV=kT from pV=2E/3.]
The bridge laws enter in step (3), but it's steps (1) and (2), Marras suggests, that are actually important. What's important is not the deduction of pV=kT from statistical mechanics, which requires a bridge law, but the deduction of an "image" of pV=kT from statistical mechanics, and pV=2E/3 is just that. Indeed, all the bridge law can do is "serve the logical/expository function for formally exhibiting a result of the reduction ... rather than the scientific/methodological function of actually effecting the reduction." The point is that the bridge law is based on a "comparison" of pV=kT with pV=2E/3 (in the context of a "comparison" of thermodynamics and statistical mechanics more generally). It's only when we can say that pV=2E/3 is an image of pV=kT that we can postulate a bridge law linking pV=2E/3 with pV=kT - but once we know that pV=2E/3 is an image of pV=kT, all the important work of reduction is done.
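To make the formal shape of this explicit, here's a minimal LaTeX sketch of the derivation as I read it, in the post's notation (E for the kinetic energy of the gas's molecules, k for the gas constant for the fixed sample); this is my gloss on the structure, not L&R's or Marras's own rendering.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of the derivational structure described above (my gloss, not a quotation
% from L&R or Marras). Notation follows the post: L* is pV = 2E/3, L is pV = kT.
\begin{align*}
  pV &= \tfrac{2}{3}E && \text{step (2): the image $L^{*}$, derived from statistical mechanics plus LA/IC}\\
  \tfrac{2}{3}E &= kT && \text{step (3): the bridge law, postulated on ``comparison'' of the two theories}\\
  pV &= kT            && \text{the Boyle-Charles law $L$, obtained by substituting the bridge law into $L^{*}$}
\end{align*}
\end{document}
```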

Another caveat that L&R note is that Nagel was skeptical of identity claims. The basic worry with identity claims is that to say that e.g. temperature is identical to mean kinetic energy must involve changing the meaning of one or both of these terms; as L&R put it: "temperature is not a statistical property while mean kinetic energy is, and there are theoretically distinct ways of determining the mean within statistical mechanics." L&R share this skepticism about identity claims. The example just quoted obviously concerns a type-identity, but L&R are skeptical even of token identities, which is notable given that the token identity of higher-level entities and processes with physical entities and processes tends to be assumed even by antireductionists (L&R again cite "the scale relativity of ontology" as motivating their skepticism of token identities, and again leave the explication of that for later).

Anyway, L&R accept Nagelian reductions, bearing in mind that they downplay identity claims and the importance of bridge laws. Importantly, Nagelian reduction allows us to say e.g. that temperature has been reduced to mean kinetic energy, without saying that temperature just is mean kinetic energy.

Type reductionism: This is the kind of reductionism that's targeted by multiple realizability arguments. Type reduction occurs when, for some special science law S1 → S2, there is a true physical law P1 → P2 and true biconditionals linking P1 with S1 and P2 with S2. The multiple realizability argument is intended to show that, for some domain, such biconditionals are impossible. While L&R are skeptical of the standard presentation of the multiple realizability argument, given their skepticism even of token identities, type identities are obviously off the table.
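Here's a minimal LaTeX sketch of the type-reduction schema and of how multiple realizability is supposed to block it; the schematic predicates are my illustration, not drawn from L&R.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Schematic form of a type reduction as described above (my illustration, not
% L&R's presentation). S1, S2 are special-science predicates; P1, P2 are
% physical predicates.
\begin{align*}
  &\text{special science law:}   && S_1 \rightarrow S_2\\
  &\text{physical law:}          && P_1 \rightarrow P_2\\
  &\text{bridge biconditionals:} && S_1 \leftrightarrow P_1, \qquad S_2 \leftrightarrow P_2
\end{align*}
% Multiple realizability: if S1 can be realized by any of the physically
% heterogeneous kinds P1, P1', P1'', ..., then no single physical predicate is
% coextensive with S1, so the biconditional S1 <-> P1 is unavailable and the
% type reduction fails.
\end{document}
```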

Core-sense reduction: The form of reductionism found in Andrew Melnyk's realization physicalism; L&R reject it.

So while L&R emphasize the unity of science, they reject all but the weakest kinds of reductionism.


Marras, A. (2005) "Consciousness and Reduction", British Journal for the Philosophy of Science, vol. 56, no. 2, June, pp. 335-361.
