Laws of Nature

A notable fact about the world we inhabit (whether or not it is one of many) is that it has many regularities, universally quantified propositions that are true throughout its spatio-temporal extent. Some of these regularities, we think, are laws, whereas some of these regularities are, in some sense, "accidental." For example, it is probably true that there is no sphere of solid gold that is over one mile in diameter, nor will there ever be. But it would not be inconsistent with the laws of nature if there were such a sphere. Neither is there, nor will there ever be, a sphere of Uranium-235 over one mile in diameter. But this is not accidental in this sense: the laws of nature rule out this possibility, for such a sphere would exceed critical mass and explode immediately.

How are we to distinguish between regularities that are laws, and regularities that are mere accidents? This is not an epistemological question, i.e. how do we tell whether a regularity is a law, but a metaphysical one: what must the world be like, so that one regularity is a law, and the other a mere accident? Armstrong, Tooley, and Dretske have proposed that the metaphysical difference between laws and mere regularities must be given in terms of relations between universals. If a regularity (x)(Fx -> Gx) is a statement of a law, it is a law in virtue of a special relation between the universals Fness and Gness. We may call such a theory a "universalist" theory of laws. In contrast to universalist theories stand "regularity" theories of laws, theories that account for the distinction between laws and mere accidents without appeal to universals standing "over and above" the entities involved in the regularity.

The simplest sort of regularity theory with any degree of plausibility is the "lawlike" theory. According to this theory, a statement of a regularity (a universally quantified sentence) is a statement of a law just in case

(1) The statement is true,
(2) The statement is contingent, and
(3) The statement is lawlike,

where being lawlike is a semantic feature of the statement in question.

It is not exactly clear which statements count as lawlike, but at least this much can be said: a statement is not lawlike if it makes essential reference, explicit or implicit, to a particular thing, place, or time. Thus "All emeralds in Australia are green", although true, is not lawlike, and so not a law; nor is "All emeralds found after time t are green." I won't try to give any rigorous standards for determining when a statement makes implicit reference to a particular thing, place, or time. Roughly, no predicate essentially involved in the statement should be capable of distinguishing between indiscernible entities (such as the counterparts in a universe of eternal recurrence). This may rule out such predicates as "is grue", since that predicate makes implicit reference to a particular time t. It will probably also rule out such predicates as "is in a left-hand configuration", since such predicates can only be defined by reference to a particular object.

It is unlikely that this sort of regularity theory of laws, or any simple modification of it, can succeed, however. Suppose that F is a predicate that is permissible in lawlike sentences, but that in fact there are no Fs. Then for any predicate G that is also permissible, (x)(Fx -> Gx) will be a statement of a law, according to this theory. But it may well not be a law. For example, "is not acted on by any force" is presumably a lawlike predicate, as is "has electric charge -1". But there may well be no objects, now or ever, which are not acted on by any force: so the generalization (x)(x is not acted on by any force -> x has electric charge -1) is a true, lawlike generalization. But it presumably is not a law, for unlike laws, it does not support counterfactuals. Even though the generalization is true, the counterfactual "If there were an object not acted on by any force, it would have electric charge -1" is not true.
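The vacuity problem turns on a simple point of logic: a universally quantified conditional is true whenever nothing satisfies its antecedent. As an illustration (my formalization, not part of the essay), the point can be checked mechanically in Lean:

```lean
-- If nothing satisfies F, then "every F is a G" holds for any G
-- whatsoever: the conditional is vacuously true.
example (α : Type) (F G : α → Prop) (h : ∀ x, ¬ F x) :
    ∀ x, F x → G x :=
  fun x hF => absurd hF (h x)
```

Since G was arbitrary, the empty predicate F yields a "law" pairing it with every permissible G at once, which is the absurdity the objection exploits.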

It might be thought that the easy fix is to require that the sentence in question not be vacuously true in order to qualify as lawlike. However, there are some vacuously true generalizations, such as Newton's first law, that we do wish to count as laws. So this "fix" is out of the question.

A second problem for this view arises if we presume that conjunctions of lawlike predicates will themselves be lawlike: unless we live in an eternal recurrence universe (or some other such universe in which everything has indiscernible counterparts), there will, in all likelihood, be some entities picked out uniquely by some conjunction of lawlike predicates. Suppose that {F1...Fn} is a set of lawlike predicates that are jointly true of exactly one thing, and that G is a lawlike predicate, not entailed by {F1...Fn}, that is also true of that thing. Then (x)(F1x & ... & Fnx -> Gx) will be a true generalization, and according to the theory of laws in question, it will be a law. But surely this could be a merely accidental generalization!

A third problem with the "lawlike" theory is that it rules out the possibility that some generalization that is actually a law could have been merely accidental, or that some generalization that is actually a mere accident could have been a law. It would seem that there are possible worlds in which there is no sphere of U-235 over a mile in diameter, but in which U-235, because of the laws of that world, is quite stable, so that the laws of that world do not rule out there being a sphere of U-235 of that size. And it would seem that there are possible worlds, like ours in that there is no sphere of gold over a mile in diameter, but in which this is ruled out by the laws of that world, because in that world gold is an unstable element. According to the lawlike theory, however, for any generalization S, if S is a law then necessarily (S is a law iff S is true), and if S is merely accidental then necessarily, S is not a law; for lawlikeness and contingency do not vary from world to world, only truth does.
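Since lawlikeness is a semantic feature of a statement, and contingency a modal one, neither varies across possible worlds; only truth does. The rigidity objection can therefore be put schematically (the notation is mine, not the essay's):

```latex
\text{If } S \text{ is lawlike and contingent:}\quad
  \Box\bigl(\mathrm{Law}(S) \leftrightarrow S\bigr)
\qquad
\text{If } S \text{ is not lawlike:}\quad
  \Box\,\neg\mathrm{Law}(S)
```

On the lawlike theory, then, a generalization can change status across worlds only by changing its truth value; it can never remain true while switching between law and accident, which is exactly what the U-235 and gold cases seem to require.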

An alternative theory of laws, which we may call the "systematic theory", has been offered by David Lewis, based on suggestive remarks of F. P. Ramsey. According to this view, laws are those generalizations entailed by whatever true theory of nature has the best combination of simplicity and strength. (Or, if more than one theory is tied for this honor, those generalizations entailed by all such theories; or if, as seems quite unlikely, there is an infinite sequence of better and better theories <T1, T2, ...>, those generalizations entailed by all theories above Tn, for some n.) Simplicity without strength can be had by a theory that entails only logical truths, strength without simplicity by the deductive closure of an almanac.

It should be noted that if we allow just any old predicates, no matter how gerrymandered, into the formulation of a theory, the simplicity requirement will be too easy to achieve. The adherent of the systematic theory of laws must be an inegalitarian about predicates, holding that what counts is simplicity when the theory is formulated using natural predicates.

The systematic theory avoids the above three problems of the "lawlike" theory. Since it draws no distinction between lawlike and non-lawlike predicates, it does not entail that those generalizations that are actually laws are laws in every universe in which they are true. And the best theory may well not entail the vacuous generalizations that, intuitively, shouldn't count as laws; nor need it entail those singular generalizations that, intuitively, shouldn't count as laws.

The largest problem facing the systematic theory is whether it can, without violating the spirit of regularity theories, deal with probabilistic laws, of the sort it seems we need to acknowledge if current physics is anywhere near right. Suppose that it is a law that, for any F, it is 99% probable that it is a G. This is entirely consistent with all Fs being non-G, and although epistemologically this would count against its being a law, we're dealing with metaphysics here, not epistemology. But in such a terribly improbable world, a better theory would entail that all Fs are non-Gs simpliciter, with no probabilistic proviso. If there is to be a solution to this problem, then the true theories in the running for the best theory must take account not only of the Fness and Gness of the entities in the universes to which they apply, but also of the objective, single-case probabilities that entities are Fs or Gs.
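The consistency point can be made numerically. Assuming independent trials and a hypothetical chance of 0.99 (numbers chosen for illustration, not drawn from the essay), the probability that every one of n Fs is nonetheless non-G is (1 - 0.99)^n: vanishingly small for large n, but never zero. A minimal sketch:

```python
# Hypothetical illustration: a probabilistic law says each F has a
# 99% chance of being G. The chance that ALL of n independent Fs
# are nonetheless non-G is (1 - 0.99)**n: improbable, not impossible.

def prob_all_non_g(n, p_g=0.99):
    """Probability that every one of n independent Fs is non-G."""
    return (1 - p_g) ** n

for n in (1, 5, 10):
    print(n, prob_all_non_g(n))
```

However small this number gets, it is positive for every finite n, which is why the all-Fs-are-non-G world remains metaphysically possible even under the probabilistic law.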


References

Armstrong, D. M. What is a Law of Nature? Cambridge, 1993.

Tooley, Michael. Causation: A Realist Approach. Oxford, 1987.



Copyright © 1997 Carl Brock Sides.
Permission granted to distribute in any medium, commercial or non-commercial, provided all copyright notices remain intact.