
The Universal Plausibility Metric (UPM) & Principle (UPP)



Mere possibility is not an adequate basis for asserting scientific plausibility. A precisely defined universal bound is needed beyond which the assertion of plausibility, particularly in life-origin models, can be considered operationally falsified. But can something so seemingly relative and subjective as plausibility ever be quantified? Amazingly, the answer is, "Yes." A method of objectively measuring the plausibility of any chance hypothesis (The Universal Plausibility Metric [UPM]) is presented. A numerical inequality is also provided whereby any chance hypothesis can be definitively falsified when its UPM metric of ξ is < 1 (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set.


No low-probability hypothetical plausibility assertion should survive peer-review without subjection to the UPP inequality standard of formal falsification (ξ < 1).

The seemingly subjective liquidity of "plausibility"

Are there any objective standards that could be applied to evaluate the seemingly subjective notion of plausibility? Can something so psychologically relative as plausibility ever be quantified?

Our skepticism about defining a precise, objective Universal Plausibility Metric (UPM) stems from a healthy realization of our finiteness [1], subjectivity [2], presuppositional biases [3, 4], and epistemological problem [5]. We are rightly wary of absolutism. The very nature of probability theory emphasizes gray-scales more than the black and white extremes of p = 0 or 1.0. Our problem is that extremely low probabilities can only asymptotically approach impossibility. An extremely unlikely event's probability always remains at least slightly > 0. No matter how large the negative exponent of an event's probability, that event or scenario technically cannot be considered impossible. Not even a Universal Probability Bound [6-8] seems to establish absolute theoretical impossibility. The fanatical pursuit of absoluteness by finite subjective knowers is considered counterproductive in postmodern science. Open-mindedness to all possibilities is encouraged [9].

But at some point our reluctance to exclude any possibility becomes stultifying to operational science [10]. Falsification is critical to narrowing down the list of serious possibilities [11]. Almost all hypotheses are possible. Few of them wind up being helpful and scientifically productive. The mere possibility of a hypothesis should not grant it scientific respectability. More attention to the concept of "infeasibility" has been suggested [12]. Millions of dollars in astrobiology grant money have been wasted on scenarios that are possible but plausibility-bankrupt. The question for scientific methodology should not be, "Is this scenario possible?" The question should be, "Is this possibility a plausible scientific hypothesis?" One chance in 10^200 is theoretically possible, but given maximum cosmic probabilistic resources, such a possibility is hardly plausible. With funding resources rapidly drying up, science needs a foundational principle by which to falsify the myriad theoretical possibilities that are not worthy of serious scientific consideration and modeling.

Proving a theory is considered technically unachievable [11]. Few bench scientists realize that falsification has also been shown by philosophers of science to be at best technically suspect [13]. Nevertheless, operational science has no choice but to proceed primarily by a process of elimination through practical falsification of competing models and theories.

Which model or theory best corresponds to the data? [[14] (pg. 32-98)] [8]. Which model or theory best predicts future interactions? Answering these questions is made easier by eliminating implausible possibilities from the list of theoretical possibilities. Great care must be taken at this point, especially given the many non-intuitive aspects of scientifically addressable reality. But operational science must proceed on the basis of best-thus-far tentative knowledge. The human epistemological problem is quite real. But we cannot allow it to paralyze scientific inquiry.

If it is true that we cannot know anything for certain, then we have all the more reason to proceed on the basis of the greatest "plausibility of belief" [15-19]. If human mental constructions cannot be equated with objective reality, we are all the more justified in pursuing the greatest likelihood of correspondence of our knowledge to the object of that knowledge--presumed ontological being itself. Can we prove that objectivity exists outside of our minds? No. Does that establish that objectivity does not exist outside of our minds? No again. Science makes its best progress based on the axioms that 1) an objective reality independent of our minds does exist, and 2) scientists' collective knowledge can progressively correspond to that objective reality. The human epistemological problem is kept in its proper place through a) double-blind studies, b) groups of independent investigators all repeating the same experiment, c) prediction fulfillments, d) the application of pristine logic (taking linguistic fuzziness into account), and e) the competition of various human ideas for best correspondence to repeated independent observations.

The physical law equations and the deductive system of mathematical rules that govern the manipulations of those equations are all formally absolute. But the axioms from which formal logic theory flows, and the decision of when to consider mathematical equations universal "laws," are not absolute. Acceptance of mathematical axioms is hypothetico-deductively relative. Acceptance of physical laws is inductively relative. The pursuit of correspondence between presumed objective reality and our knowledge of objective reality is laudable in science. But not even the axioms of mathematics or the laws of physics can be viewed as absolute. Science of necessity proceeds tentatively on the basis of best-thus-far subjective knowledge. At some admittedly relative point, the scientific community agrees by consensus to declare certain formal equations to be reliable descriptors and predictors of future physicodynamic interactions. Eventually the correspondence level between our knowledge and our repeated observations of presumed objective reality is considered adequate to make a tentative commitment to the veracity of an axiom or universal law until proven otherwise.

The same standard should apply in falsifying ridiculously implausible life-origin assertions. Combinatorial imaginings and hypothetical scenarios can be endlessly argued simply on the grounds that they are theoretically possible. But there is a point beyond which arguing the plausibility of an absurdly low probability becomes operationally counterproductive. That point can actually be quantified for universal application to all fields of science, not just astrobiology. Quantification of a UPM and application of the UPP inequality test to that specific UPM provides for definitive, unequivocal falsification of scientifically unhelpful and functionally useless hypotheses. When the UPP is violated, declaring falsification of that highly implausible notion is just as justified as the firm commitment we make to any mathematical axiom or physical "law" of motion.

Universal Probability Bounds

"Statistical prohibitiveness" in probability theory and the physical sciences has remained a nebulous concept for far too long. The importance of probabilistic resources as a context for consideration of extremely low probabilities has been previously emphasized [[20] (pg. 13-17)] [68, 21]. Statistical prohibitiveness cannot be established by an exceedingly low probability alone [6]. Rejection regions and probability bounds need to be established independent of (preferably prior to) experimentation in any experimental design. But the setting of these zones and bounds is all too relative and variable from one experimental design to the next. In the end, however, probability is not the critical issue. The plausibility of hypotheses is the real issue. Even more important is the question of whether we can ever operationally falsify a preposterous but theoretically possible hypothesis.

The Universal Probability Bound (UPB) [6, 7] quantifies the maximum cosmic probabilistic resources (Ω, upper case omega) as the context of evaluation of any extremely low probability event. Ω corresponds to the maximum number of possible probabilistic trials (quantum transitions or physicochemical interactions) that could have occurred in cosmic history. The value of Ω is calculated by taking the product of three factors:

  1) The number of seconds that have elapsed since the Big Bang (10^17), assuming a cosmic age of around 14 billion years: 60 sec/min × 60 min/hr × 24 hrs/day × 365 days/year × 14 billion years = 4.4 × 10^17 seconds since the Big Bang.

  2) The number of possible quantum events/transitions per second, derived from the amount of time it takes for light to traverse the minimum unit of distance. The minimum unit of distance (a quantum of space) is the Planck length (10^-33 centimeters). The minimum amount of time required for light to traverse the Planck length is the Planck time (10^-43 seconds) [[8] (pg. 215-217)]. Thus a maximum of 10^43 quantum transitions can take place per second. Since 10^17 seconds have elapsed since the Big Bang, the number of possible quantum transitions since the Big Bang would be 10^43 × 10^17 = 10^60.

  3) Sir Arthur Eddington's estimate of the number of protons, neutrons and electrons in the observable cosmos (10^80) [22], which has been widely respected throughout the scientific literature for decades.

Some estimates of the total number of elementary particles have been slightly higher. The Universe is 95 billion light years (30 gigaparsecs) across. We can convert this to cubic centimeters using the equation for the volume of a sphere (5 × 10^86 cc). If we multiply this by 500 particles (100 neutrinos and 400 photons) per cc, we get 2.5 × 10^89 elementary particles in the visible universe.
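The particle-count arithmetic above can be cross-checked numerically. This is a hedged sketch: the parsec-to-centimeter constant is standard, while the 30-gigaparsec diameter and the 500 particles/cc density are the round figures quoted in the text, so the result agrees only to within rounding.

```python
import math

CM_PER_PARSEC = 3.0857e18            # centimeters per parsec (standard value)
radius_cm = 15e9 * CM_PER_PARSEC     # 30 Gpc across -> 15 Gpc radius

# Volume of a sphere, in cubic centimeters (text's round figure: ~5e86 cc)
volume_cc = (4.0 / 3.0) * math.pi * radius_cm ** 3

# ~100 neutrinos + ~400 photons per cc (text's figure: ~2.5e89 particles)
particles = volume_cc * 500

print(f"volume ~ {volume_cc:.1e} cc, particles ~ {particles:.1e}")
```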

A Universal Probability Bound could therefore be calculated as the product of these three factors: 10^17 × 10^43 × 10^80 = 10^140.

If the highest estimate of the number of elementary particles in the Universe is used (e.g., 10^89), the UPB would be 10^149.
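Because these bounds are pure products of powers of ten, the arithmetic is easiest in exponent form. A minimal sketch of the UPB calculation described above (the variable names are my own, not the paper's):

```python
SECONDS_EXP = 17        # ~10^17 seconds since the Big Bang
TRANSITIONS_EXP = 43    # ~10^43 quantum transitions per second (1 / Planck time)

def upb_exponent(particles_exp: int) -> int:
    """Exponent n of a Universal Probability Bound 10^n."""
    return SECONDS_EXP + TRANSITIONS_EXP + particles_exp

print(upb_exponent(80))  # Eddington's particle count -> 140, i.e. UPB = 10^140
print(upb_exponent(89))  # higher particle estimate  -> 149, i.e. UPB = 10^149
```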

The UPBs discussed above are the highest calculated universal probability bounds ever published by many orders of magnitude [7, 8, 12]. They are the most permissive of (favorable to) extremely low-probability plausibility assertions in print [6] [[8] (pg. 216-217)]. All other proposed metrics of probabilistic resources are far less permissive of low-probability chance-hypothesis plausibility assertions. Emile Borel's limit of cosmic probabilistic resources was only 10^50 [[23] (pg. 28-30)]. Borel based this probability bound in part on the product of the number of observable stars (10^9) and the number of possible human observations that could be made of those stars (10^20). Physicist Bret Van de Sande at the University of Pittsburgh calculates a UPB of 2.6 × 10^92 [8, 24]. Cryptographers tend to use the figure of 10^94 computational steps as the resource limit of any cryptosystem's decryption [25]. MIT's Seth Lloyd has calculated that the universe could not have performed more than 10^120 bit operations in its history [26].

Here we must point out that a discussion of the number of cybernetic or cryptographic "operations" is totally inappropriate in determining a prebiotic UPB. Probabilistic combinatorics has nothing to do with "operations." Operations involve choice contingency [27-29]. Bits are "Yes/No" question opportunities [[30] (pg. 66)], each of which could potentially reduce the total number of combinatorial possibilities (2^NH possible biopolymers: see Appendix 1) by half. But of course asking the right question and getting an answer is not a spontaneous physicochemical phenomenon describable by mere probabilistic uncertainty measures [31-33]. Any binary "operation" involves a bona fide decision node [34-36]. An operation is a formal choice-based function. Shannon uncertainty measures do not apply to specific choices [37-39]. Bits measure only the number of non-distinct, generic, potential binary choices, not actual specific choices [37]. Inanimate nature cannot ask questions, get answers, and exercise choice contingency at decision nodes in response to those answers. Inanimate nature cannot optimize algorithms, compute, pursue formal function, or program configurable switches to achieve integration and shortcuts to formal utility [28]. Cybernetic operations therefore have no bearing whatsoever on determining universal probability bounds for chance hypotheses.

Agreement on a sensible UPB in advance of (or at least totally independent of) any specific hypothesis, suggested scenario, or theory of mechanism is critical to experimental design. No known empirical or rational considerations exist to preclude acceptance of the above UPB. The only exceptions in print seem to come from investigators who argue that the above UPB is too permissive of the chance hypothesis [8, 12]. Hypothetical scenarios of extremely low probability enjoy faddish acceptance simply because they are in vogue and theoretically possible. Not only is a UPB needed; a fixed universal mathematical standard of plausibility is needed as well. This is especially true for complex hypothetical scenarios involving joint and/or conditional probabilities. Many imaginative hypothetical scenarios propose constellations of highly cooperative events that are theorized to self-organize into holistic formal schemes. Whether joint, conditional or independent, multiple probabilities must be factored into an overall plausibility metric. In addition, a universal plausibility bound is needed to eliminate overly imaginative fantasies from consideration for the best inference to causation.

The Universal Plausibility Metric (UPM)

To be able to definitively falsify ridiculously implausible hypotheses, we need first a Universal Plausibility Metric (UPM) to assign a numerical plausibility value to each proposed hypothetical scenario. Second, a Universal Plausibility Principle (UPP) inequality is needed as a plausibility bound on this measurement for falsification evaluation. We need a cut-off point beyond which no extremely low probability scenario can be considered a "scientifically respectable" possibility. What is needed more than a probability bound is a plausibility bound. Any "possibility" that exceeds the ability of its probabilistic resources to generate it should immediately be considered a "functional non-possibility," and therefore an implausible scenario. While such a scenario may not be a theoretically absolute impossibility, if it exceeds its probabilistic resources it is a gross understatement to say that it is simply not worth the expenditure of serious scientific consideration, pursuit, and resources. Every field of scientific investigation, not just biophysics and life-origin science, needs the application of the same independent test of credibility to judge the plausibility of its hypothetical events and scenarios. The application of this standard should be an integral component of the scientific method itself for all fields of scientific inquiry.

To arrive at the UPM, we begin with the maximum available probabilistic resources discussed above (Ω, upper case Omega) [6, 7]. But Ω could be considered from a quantum or a classical molecular/chemical perspective. Thus this paper proposes that the Ω quantification be broken down first according to the Level (L) or perspective of physicodynamic analysis (LΩ), where the perspective at the quantum level is represented by the superscript "q" (qΩ) and the perspective at the classical level is represented by "c" (cΩ). Each represents the maximum probabilistic resources available at each level of physical activity being evaluated, with the total number of quantum transitions being much larger than the total number of "ordinary" chemical reactions since the Big Bang.

Second, the maximum probabilistic resources LΩ (qΩ for the quantum level and cΩ for classical molecular/chemical level) can be broken down even further according to the astronomical subset being addressed using the general subscript "A" for Astronomical: LΩA (representing both qΩA and cΩA). The maximum probabilistic resources can then be measured for each of the four different specific environments of each LΩ, where the general subscript A is specifically enumerated with "u" for universe, "g" for our galaxy, "s" for our solar system, and "e" for earth:

To include meteorite and panspermia inoculations in the earth metrics, we use the Solar System metrics LΩs (qΩs and cΩs).

As examples, for quantification of the maximum probabilistic resources at the quantum level for the astronomical subset of our galactic phase space, we would use the qΩg metric. For quantification of the maximum probabilistic resources at the ordinary classical molecular/chemical reaction level in our solar system, we would use the cΩs metric.

The most permissive UPM possible would employ the probabilistic resources symbolized by qΩu where both the quantum level perspective and the entire universe are considered.

The subdivision between the LΩA for the quantum perspective (quantified by qΩA) and that for the classical molecular/chemical perspective (quantified by cΩA), however, is often not as clear and precise as we might wish. Crossovers frequently occur. This is particularly true where quantum events have direct bearing on "ordinary" chemical reactions in the "everyday" classical world. If we are going to err in evaluating the plausibility of any hypothetical scenario, let us err in favor of maximizing the probabilistic resources of LΩA. In cases where quantum factors seem to directly affect chemical reactions, we would want to use the four quantum-level metrics of qΩA (qΩu, qΩg, qΩs and qΩe) to preserve the plausibility of the lowest-probability explanations.

Quantification of the Universal Plausibility Metric (UPM)

The computed Universal Plausibility Metric (UPM) objectively quantifies the level of plausibility of any chance hypothesis or theory. The UPM employs the symbol ξ (Xi, pronounced zai in American English, sai in UK English, ksi in modern Greek) to represent the computed UPM according to the following equation:

ξ = (f/ω) × LΩA     (Equation 1)
where f represents the number of functional objects/events/scenarios that are known to occur out of all possible combinations (lower-case omega, ω) (e.g., the number [f] of functional protein family members of varying sequence known to occur out of sequence space [ω]), and LΩA (upper-case Omega, Ω) represents the total probabilistic resources for any particular probabilistic context. The "L" superscript context of Ω describes the perspective of analysis, whether quantum (q) or classical (c), and the "A" subscript context of Ω enumerates which subset of astronomical phase space is being evaluated: "u" for universe, "g" for our galaxy, "s" for our solar system, and "e" for earth. Note that the basic generic UPM (ξ) equation's form remains constant despite changes in the variables of level of perspective (L: whether q or c) and astronomical subset (A: whether u, g, s, or e).

The calculations of probabilistic resources in LΩA can be found in Appendix 2. Note that the upper and lower case omega symbols used in this equation are case sensitive and each represents a completely different phase space.

The UPM from both the quantum (qΩA) and classical molecular/chemical (cΩA) perspectives/levels can be quantified by Equation 1. This equation incorporates the number of possible transitions or physical interactions that could have occurred since the Big Bang. Maximum quantum-perspective probabilistic resources qΩu were enumerated above in the discussion of a UPB [6, 7] [[8] (pg. 215-217)]. Here we use basically the same approach with slight modifications to the factored probabilistic resources that comprise Ω.
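As a minimal executable sketch of Equation 1 (the function name and sample numbers are illustrative only; the 23-ribozyme figures echo the RNA-World example later in the text):

```python
def upm(f: float, omega: float, big_omega: float) -> float:
    """Universal Plausibility Metric, Equation 1: xi = (f / omega) * Omega.

    f         -- number of known functional objects/events/scenarios
    omega     -- size of the combinatorial space (lower-case omega)
    big_omega -- probabilistic resources LΩA for the chosen level of
                 perspective (q or c) and astronomical subset (u/g/s/e)
    """
    return (f / omega) * big_omega

# 23 functional ribozymes out of 10^15 ensembles, judged against qΩu = 10^140:
xi = upm(23, 1e15, 1e140)
print(f"{xi:.1e}")   # far greater than 1, so not falsified by the UPP
```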

Let us address the quantum level perspective (q) first for the entire universe (u) followed by three astronomical subsets: our galaxy (g), our solar system (s) and earth (e).

Since approximately 10^17 seconds have elapsed since the Big Bang, we factor that total time into the following calculations of quantum-perspective probabilistic resource measures. Note that the difference between the age of the earth and the age of the cosmos is only a factor of 3, which is negligible at the order of magnitude of 10^17 seconds. Thus, 10^17 seconds is used for all three astronomical subsets:

These above limits of probabilistic resources exist within the only known universe that we can repeatedly observe--the only universe that is scientifically addressable. Wild metaphysical claims of an infinite number of cosmoses may be fine for cosmological imagination, religious belief, or superstition. But such conjecturing has no place in hard science. Such claims cannot be empirically investigated, and they certainly cannot be falsified. They violate Ockham's (Occam's) Razor [40]. No prediction fulfillments are realizable. They are therefore nothing more than blind beliefs that are totally inappropriate in peer-reviewed scientific literature. Such cosmological conjectures are far closer to metaphysical or philosophic enterprises than they are to bench science.

From a more classical perspective at the level of ordinary molecular/chemical reactions, we will again provide metrics first for the entire universe (u) followed by three astronomical subsets, our galaxy (g), our solar system (s) and earth (e).

The classical molecular/chemical perspective makes two primary changes from the quantum perspective. With the classical perspective, the number of atoms is used rather than the number of protons, neutrons and electrons. In addition, the total number of classical chemical reactions that could have taken place since the Big Bang is used rather than Planck-time quantum transitions. The shortest time any transition requires before a chemical reaction can take place is 10 femtoseconds [41-46]. A femtosecond is 10^-15 seconds. Complete chemical reactions, however, rarely take place faster than the picosecond range (10^-12 secs). Most biochemical reactions, even with highly sophisticated enzymatic catalysis, take place no faster than the nano (10^-9) and usually the micro (10^-6) range. To be exceedingly generous (perhaps overly permissive of the capabilities of the chance hypothesis), we shall use 100 femtoseconds (10^-13 seconds) as the shortest chemical reaction time. Thus 10^13 of the simplest and fastest chemical reactions could conceivably take place per second in the best of theoretical pipe-dream scenarios. The four cΩA measures are as follows:

Remember that LΩe excludes meteorite and panspermia inoculations. To include meteorite and panspermia inoculations, we use the metric for our solar system cΩs.
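The reaction-rate arithmetic above is easy to verify in code (a trivial sketch; the 100-femtosecond figure is the text's deliberately permissive assumption):

```python
reaction_time_s = 100e-15          # 100 femtoseconds = 10^-13 seconds
reactions_per_sec = 1 / reaction_time_s

print(f"{reactions_per_sec:.0e}")  # 1e+13 fastest chemical reactions per second
```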

These maximum metrics of the limit of probabilistic resources are based on the best-thus-far estimates of a large body of collective scientific investigations. We can expect slight variations up or down in our best estimates of the number of elementary particles in the universe, for example. But the basic formula presented as the Universal Plausibility Metric (UPM) will never change. The Universal Plausibility Principle (UPP) inequality presented below is also immutable and worthy of law-like status. It affords the ability to objectively, once and for all, falsify not just highly improbable but ridiculously implausible scenarios. Slight adjustments to the factors that contribute to the value of each LΩA are straightforward and easy for the scientific community to update through time.

Most chemical reactions take longer by many orders of magnitude than what these exceedingly liberal maximum probabilistic resources allow. Biochemical reactions can take years to occur in the absence of the highly sophisticated protein enzymes not present in a prebiotic environment. Even humanly engineered ribozymes rarely catalyze reactions with a rate enhancement of more than 10^5 [47-51]. Thus the use of the fastest rate known for any complete chemical reaction (100 femtoseconds) seems to be the most liberal/forgiving probability bound that could possibly be incorporated into the classical chemical probabilistic resource perspective cΩA. For this reason, we should be all the more ruthless in applying the UPP test of falsification presented below to seemingly "far-out" metaphysical hypotheses that have no place in responsible science.

Falsification using The Universal Plausibility Principle (UPP)

The Universal Plausibility Principle (UPP) states that definitive operational falsification of any chance hypothesis is provided by the inequality:

ξ < 1
This definitive operational falsification holds for hypotheses, theories, models, or scenarios at any level of perspective (q or c) and for any astronomical subset (u, g, s, and e). The UPP inequality's falsification is valid whether the hypothesized event is singular or compound, independent or conditional. Great care must be taken, however, to eliminate errors in the calculation of complex probabilities. Every aspect of the hypothesized scenario must have its probabilistic components factored into the one probability (p) that is used in the UPM (see Equation 2 below). Many such combinatorial possibilities are joint or conditional. It is not sufficient to factor only the probabilities of each reactant's formation, for example, while omitting the probabilistic aspects of each reactant being present at the same place and time, becoming available in the required reaction order, or being able to react at all (activated vs. not activated). Other factors must also be included in the calculation of probabilities: optical isomers, non-peptide-bond formation, and the many non-biological amino acids that also react [8]. The exact calculation of such probabilities is often not straightforward. But in many cases it becomes readily apparent that whatever the exact multi-factored calculation, the probability "p" of the entire scenario easily crosses the plausibility bound provided by the UPP inequality. This provides a definitive objective standard of falsification. When ξ < 1, the notion should immediately be considered "not a scientifically plausible possibility." A ξ value < 1 should serve as an unequivocal operational falsification of that hypothesis. The hypothetical scenario or theory generating that ξ metric should be excluded from the differential list of possible causes. The hypothetical notion should be declared to be outside the bounds of scientific respectability. It should be flatly rejected as the equivalent of superstition.
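Because the compound probabilities described above quickly underflow ordinary floating point, a practical way to apply the UPP inequality is to work in log10 space. This is a hedged sketch (the function and its argument names are my own, not the paper's):

```python
def upp_falsified(log10_factors, log10_resources):
    """Apply the UPP inequality xi < 1 in log10 space.

    log10_factors   -- log10 of each probabilistic factor of the scenario
                       (for independent factors the joint probability is
                       their sum in log space; conditional factors must be
                       conditioned correctly by the caller before summing)
    log10_resources -- log10 of the chosen probabilistic resources LΩA
    Returns True when xi = p * Omega < 1 (operational falsification).
    """
    log10_xi = sum(log10_factors) + log10_resources
    return log10_xi < 0

# A compound scenario with factors 10^-100 and 10^-60 against qΩu = 10^140:
print(upp_falsified([-100, -60], 140))   # True: xi = 10^-20 < 1, falsified
print(upp_falsified([-100], 140))        # False: xi = 10^40, not falsified
```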

f/ω in Equation 1 is in effect the probability of a particular functional event or object occurring out of all possible combinations. Take for example an RNA-World model. 23 different functional ribozymes in the same family might arise out of 10^15 stochastic ensembles of 50-mer RNAs. This would reduce to a probability p of roughly 10^-14 of getting a stochastic ensemble that manifested some degree of that ribozyme family's function.

Thus f/ω in Equation 1 reduces to the equivalent of a probability p:

ξ = p × LΩA     (Equation 2)
where "p" represents an extremely low probability of any chance hypothesis that is asserted to be plausible given LΩA probabilistic resources, in this particular case cΩe probabilistic resources.

As examples of attempts to falsify, suppose we have three different chance hypotheses, each with its own low probability (p), all being evaluated from the quantum perspective at the astronomical level of the entire universe (qΩu). Given the three different probabilities (p) provided below, the applied UPP inequality for each ξ = p × qΩu of each hypothetical scenario would establish definitive operational falsification for one of these three hypothetical scenarios, and fail to falsify the two others:
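The three probabilities of the original example are not reproduced in this copy, so the values below are hypothetical stand-ins chosen only to illustrate the pattern; the comparison ξ = p × qΩu is exactly as described above:

```python
Q_OMEGA_U = 1e140   # quantum-level, whole-universe probabilistic resources

# Hypothetical stand-in probabilities (the paper's own three values are elided)
for p in (1e-30, 1e-120, 1e-150):
    xi = p * Q_OMEGA_U
    verdict = "falsified (xi < 1)" if xi < 1 else "not falsified"
    print(f"p = {p:.0e}: xi = {xi:.0e} -> {verdict}")
```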

Let us quantify an example of the use of the UPM and UPP to attempt falsification of a chance hypothetical scenario:

Suppose 10^3 biofunctional polymeric sequences of monomers (f) exist out of 10^17 possible sequences in sequence space (ω), all of the same number (N) of monomers. That would correspond to one chance in 10^14 of getting a functional sequence by chance (p = 10^3/10^17 = 10^-14). If we were measuring the UPM from the perspective of a classical chemical view on earth over the last 5 billion years (cΩe = 10^70), we would use the UPM equation (Equation 1 above) with substituted values:

ξ = p × cΩe = 10^-14 × 10^70 = 10^56
Since ξ > 1, this particular chance hypothesis is not falsified; it remains plausible and worthy of further scientific investigation.
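The worked example above reduces to simple exponent arithmetic:

```python
f_exp, omega_exp = 3, 17        # 10^3 functional sequences out of 10^17
c_omega_e_exp = 70              # classical-level, earth-only resources: 10^70

p_exp = f_exp - omega_exp       # p  = 10^-14
xi_exp = p_exp + c_omega_e_exp  # xi = 10^56

print(f"xi = 10^{xi_exp}")      # xi = 10^56 > 1 -> not falsified by the UPP
```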

As one of the reviewers of this manuscript has pointed out, however, we might find the sequence space ω, and therefore the probability space f/ω, to be radically different for abiogenesis than for general physico-chemical reactions. The sequence space ω must include factors such as heterochirality, unwanted non-peptide-bond formation, and the large number of non-biological amino acids present in any prebiotic environment [8, 12]. This greatly increases ω, and would tend to substantially reduce the probability p of naturalistic abiogenesis. Spontaneously biofunctional stochastic-ensemble formation was found to occur only 1 time in 10^64 when TEM-1 β-lactamase's working domain of around 150 amino acids was used as a model [52]. Function was related to the hydropathic signature necessary for proper folding (tertiary structure). The ability to confer any relative degree of beta-lactam (penicillin-like) antibiotic resistance to bacteria was considered to define "biofunctional" in this study. Axe further measured the probability of a random 150-residue primary structure producing any short functional protein, despite many allowable monomeric substitutions, to be 10^-74. This probability is an example of a scientifically determined p that should be incorporated into any determination of the UPM in abiogenesis models.

Don't multiverse models undermine the UPP?

Multiverse models imagine that our universe is only one of perhaps countless parallel universes [53-55]. Appeals to the Multiverse worldview are becoming more popular in life-origin research as the statistical prohibitiveness of spontaneous generation becomes more incontrovertible in a finite universe [56-58]. The term "notion," however, is more appropriate for multiverse speculation than "theory." The idea of multiple parallel universes cannot legitimately qualify as a testable scientific hypothesis, let alone a mature theory. Entertaining multiverse "thought experiments" almost immediately takes us beyond the domain of responsible science into the realm of pure metaphysical belief and conjecture. The dogma is literally "beyond physics and astronomy," the very meaning of the word "metaphysical."

The notion of multiverse has no observational support, let alone repeated observations. Empirical justification is completely lacking. It has no testability: no falsification potential exists. It provides no prediction fulfillments. The non-parsimonious construct of multiverse grossly violates the principle of Ockham's (Occam's) Razor [40]. No logical inference seems apparent to support the strained belief other than a perceived need to rationalize what we know is statistically prohibitive in the only universe that we do experience. Multiverse fantasies tend to constitute a back-door fire escape for when our models hit insurmountable roadblocks in the observable cosmos. When none of the facts fit our favorite model, we conveniently create imaginary extra universes that are more accommodating. This is not science. Science is interested in falsification within the only universe that science can address. Science cannot operate within mysticism, blind belief, or superstition. A multiverse may be fine for theoretical metaphysical models. But no justification exists for inclusion of this "dream world" in the sciences of physics and astronomy.

It could be argued that multiverse notions arose only in response to the severe time and space constraints arising out of Hawking, Ellis and Penrose's singularity theorems [59-61]. Solutions in general relativity involve singularities wherein matter is compressed to a point in space and light rays originate from a region of infinite curvature. These theorems place severe limits on time and space since the Big Bang. Many of the prior assumptions of limitless time and sample space in naturalistic models were eliminated by the demonstration that time and space in the cosmos are quite finite, not infinite. For instance, we have at most only 10^17-10^18 seconds to work with since the Big Bang in any responsible cosmological model. Glansdorff makes the point, "Conjectures about emergence of life in an infinite multiverse should not confuse probability with possibility." [62]
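The order of magnitude of that time bound follows from simple arithmetic. A minimal sketch, assuming a cosmic age of roughly 13.7 billion years (an illustrative consensus figure; exact values vary by model):

```python
# Back-of-envelope check of the 10^17-10^18 second upper bound:
# seconds elapsed since the Big Bang, assuming a cosmic age of
# roughly 13.7 billion years (an illustrative assumption).
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.16e7 seconds
COSMIC_AGE_YEARS = 13.7e9

seconds_since_big_bang = COSMIC_AGE_YEARS * SECONDS_PER_YEAR
print(f"{seconds_since_big_bang:.2e} s")   # ~4.3e17 s
```

The result falls squarely inside the 10^17-10^18 second window cited above.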

Even if multiple physical cosmoses existed, it is a logically sound deduction that linear digital genetic instructions using a representational material symbol system (MSS) [63] cannot be programmed by the chance and/or fixed laws of physicodynamics [27-29, 32, 33, 36-39, 64, 65]. This is not only true of the physical universe, but would be just as true in any imagined physical multiverse. Physicality cannot generate non-physical Prescriptive Information (PI) [29]. Physicodynamics cannot practice formalisms (The Cybernetic Cut) [27, 34]. Constraints cannot exercise formal control unless those constraints are themselves chosen to achieve formal function [28] ("Constraints vs. Controls," currently in peer review). Environmental selection cannot select at the genetic level of arbitrary symbol sequencing, e.g., the polymerization of nucleotides and codons (The GS [Genetic Selection] Principle) [36, 64]. Polymeric syntax (sequencing; primary structure) prescribes the future (potential; not-yet-existent) folding and formal function of small RNAs and DNA. Symbol systems and configurable switch-settings can only be programmed with choice contingency, not chance contingency or fixed law, if non-trivial coordination and formal organization are expected [29, 38]. The all-important determinative sequencing of monomers is completed with rigid covalent bonds before any transcription, translation, or three-dimensional folding begins. Thus, imagining multiple physical universes or infinite time does not solve the problem of the origin of formal (non-physical) biocybernetics and biosemiosis using a linear digital representational symbol system. The source of Prescriptive Information (PI) [29, 35] in a metaphysically presupposed material-only world is closely related to the problem of gene emergence from physicodynamics alone. These hurdles remain the number-one enigmas of life-origin research [66].

The main subconscious motivation behind multiverse conjecture seems to be, "Multiverse models can do anything we want them to do to make our models work for us." We can argue multiverse models ad infinitum because their potential is limitless. The notion of multiverse has great appeal because it can explain everything (and therefore nothing). Multiverse models are beyond scientific critique, falsification, and prediction-fulfillment verification. They are purely metaphysical.

Multiverse imaginings, therefore, offer no scientific threat whatever to the universality of the UPM and UPP in the only cosmic reality that science knows and investigates.


Conclusion

Mere possibility is not an adequate basis for asserting scientific plausibility. Indeed, the practical need exists in science to narrow down lists of possibilities on the basis of objectively quantifiable plausibility.

A numerically defined Universal Plausibility Metric (UPM = ξ) has been provided in this paper. A numerical inequality of ξ < 1 establishes definitive operational falsification of any chance hypothesis (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set. No low-probability plausibility assertion should survive peer-review without subjection to the UPP inequality standard of formal falsification (ξ < 1).
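The UPP falsification test itself reduces to a one-line comparison. The sketch below assumes ξ is computed as the product of the chance hypothesis's probability p and the available probabilistic resources Ω (the total number of opportunities for the event); the precise definition of Ω is given in the paper's main text. The numbers in the example are hypothetical:

```python
# Sketch of the UPP falsification test. ASSUMPTION: xi is treated here
# as the product of the chance hypothesis's probability p and the
# probabilistic resources Omega (total opportunities for the event);
# see the paper's main text for the precise definition of Omega.

def upm(p: float, omega: float) -> float:
    """Universal Plausibility Metric xi under the assumed form p * Omega."""
    return p * omega

def upp_falsified(xi: float) -> bool:
    """UPP inequality: xi < 1 operationally falsifies the hypothesis."""
    return xi < 1

# Hypothetical numbers: an event of probability 10^-100 given 10^85
# opportunities yields xi far below 1, hence operational falsification.
xi = upm(1e-100, 1e85)
print(xi, upp_falsified(xi))
```

The same comparison with a larger p (say 10^-10 over 10^85 opportunities) yields ξ > 1, and the hypothesis survives the test.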

The use of the UPM and application of the UPP inequality to each specific UPM will promote clarity, efficiency and decisiveness in all fields of scientific methodology by allowing operational falsification of ridiculously implausible plausibility assertions. The UPP is especially important in astrobiology and all areas of life-origin research where mere theoretical possibility is often equated erroneously with plausibility. The application of The Universal Plausibility Principle (UPP) precludes the inclusion in scientific literature of wild metaphysical conjectures that conveniently ignore or illegitimately inflate probabilistic resources to beyond the limits of observational science. The UPM and UPP together prevent rapidly shrinking funding and labor resources from being wasted on preposterous notions that have no legitimate place in science. At best, notions with ξ < 1 should be considered not only operationally falsified hypotheses, but bad metaphysics on a plane equivalent to blind faith and superstition.

Appendix 1

2^(NH) is the "practical" number (the high-probability group) of all possible biopolymeric sequences that could form, rather than the erroneous theoretical n^N as is usually published, where

N = the number of loci in the string (or monomers in the polymer)

n = the number of possible alphabetical symbols that could be used at each locus (4 nucleotides, 64 codons, or 20 amino acids)

H = the Shannon uncertainty at each locus, measured in bits

For a 100-mer biopolymeric primary structure, the number of sequence combinations is actually only 2.69 × 10^-6 of the theoretically possible and more intuitive measure of n^N sequences. The reason derives from the Shannon-McMillan-Breiman Theorem [67-70], which is explained in detail by Yockey [[71], pp. 73-76].
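The shrinking of the sequence space from n^N to 2^(NH) is easy to check numerically. The sketch below uses a deliberately skewed, purely illustrative frequency distribution (not Yockey's actual amino acid frequencies), so it will not reproduce the 2.69 × 10^-6 figure; it merely shows that any non-uniform distribution makes 2^(NH) a vanishingly small fraction of n^N:

```python
import math

# Numerical illustration of why the high-probability group 2^(N*H) is a
# tiny fraction of the naive n^N sequence count whenever monomer
# frequencies are non-uniform. The distribution below is an ILLUSTRATIVE
# toy, not Yockey's actual amino acid frequencies.

def shannon_H(probs):
    """Shannon uncertainty per locus, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n, N = 20, 100                                  # 20 amino acids, 100-mer
probs = [0.2] * 2 + [0.05] * 8 + [0.02] * 10    # skewed toy frequencies
H = shannon_H(probs)                            # < log2(20) = 4.32 bits

ratio = 2 ** (N * H) / n ** N                   # practical / theoretical
print(f"H = {H:.3f} bits, ratio = {ratio:.2e}")
```

With a uniform distribution H = log2(n) and the ratio is exactly 1; any skew drives the ratio toward zero exponentially in N.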

Appendix 2

For best estimates of the number of atoms, protons, neutrons and electrons in the universe and its astronomical subsets, see [72].

Simple arithmetic is needed for many of these calculations. For example, the mass of our galaxy is estimated to be around 10^12 solar masses. The mass of "normal matter" in our galaxy is around 10^11 solar masses. The mass of the Sun is about 2 × 10^30 kg. The mass of our solar system is surprisingly not much more than the mass of the Sun, still about 2 × 10^30 kg. (The Sun contains 99.85% of all the matter in the Solar System, and the planets contain only 0.136% of the mass of the solar system.) The mass of a proton or neutron is 1.7 × 10^-27 kg. Thus the number of protons and neutrons in our solar system is around 2 × 10^30 / 1.7 × 10^-27 ≈ 1.2 × 10^57. The number of electrons is about half of that, or 0.6 × 10^57. The number of protons, neutrons and electrons in our solar system is therefore around 1.8 × 10^57. The number of protons, neutrons and electrons in our galaxy is around 1.8 × 10^68. We have crudely estimated a total of 100 protons, neutrons and electrons on average per atom. All of these estimates will of course vary somewhat through time as consensus evolves. But adjustments to LΩA are easily updated with absolutely no change in the Universal Plausibility Metric (UPM) equation or the Universal Plausibility Principle (UPP) inequality. Definitive operational falsification still holds when ξ < 1.
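The arithmetic above can be reproduced directly. A minimal sketch using only the mass figures quoted in this appendix:

```python
# Reproduces the appendix's back-of-envelope particle counts from the
# mass figures quoted above.
SOLAR_SYSTEM_MASS_KG = 2e30          # roughly the mass of the Sun
NUCLEON_MASS_KG = 1.7e-27            # mass of a proton or neutron
GALAXY_NORMAL_MATTER_SOLAR_MASSES = 1e11

nucleons = SOLAR_SYSTEM_MASS_KG / NUCLEON_MASS_KG   # ~1.2e57
electrons = nucleons / 2                            # ~0.6e57
solar_system_particles = nucleons + electrons       # ~1.8e57

# Scale by the ~1e11 solar masses of "normal matter" in the galaxy.
galaxy_particles = solar_system_particles * GALAXY_NORMAL_MATTER_SOLAR_MASSES
print(f"{solar_system_particles:.1e} {galaxy_particles:.1e}")
```

Updated consensus masses slot into the same two constants without touching the UPM equation or the UPP inequality.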


  1. Emmeche C: Closure, function, emergence, semiosis, and life: the same idea? Reflections on the concrete and the abstract in theoretical biology. Ann N Y Acad Sci. 2000, 901: 187-197.

  2. Baghramian M: Relativism. 2004, London: Routledge.

  3. Balasubramanian P: The concept of presupposition: a study. 1984, Madras: Radhakrishnan Institute for Advanced Study in Philosophy, University of Madras.

  4. Beaver DI: Presupposition and assertion in dynamic semantics. 2001, Stanford, Calif.: CSLI Publications; FoLLI.

  5. Bohr N: Discussion with Einstein on epistemological problems in atomic physics. Albert Einstein: Philosopher-Scientist. Edited by: Schilpp PA. 1949, Evanston, IL: Library of Living Philosophers.

  6. Dembski W: The Design Inference: Eliminating Chance Through Small Probabilities. 1998, Cambridge: Cambridge University Press.

  7. Dembski WA: No Free Lunch. 2002, New York: Rowman and Littlefield.

  8. Meyer SC: Signature in the Cell. 2009, New York: Harper Collins.

  9. Kuhn TS: The Structure of Scientific Revolutions. 2nd edn. 1970, Chicago: The University of Chicago Press.

  10. Sokal A, Bricmont J: Fashionable Nonsense. 1998, New York, NY: Picador.

  11. Popper KR: The Logic of Scientific Discovery. 6th impression, revised edn. 1972, London: Hutchinson.

  12. Johnson DE: Probability's Nature and Nature's Probability (A Call to Scientific Integrity). 2009, Charleston, S.C.: Booksurge Publishing.

  13. Slife B, Williams R: Science and Human Behavior. What's Behind the Research? Discovering Hidden Assumptions in the Behavioral Sciences. Edited by: Slife B, Williams R. 1995, Thousand Oaks, CA: SAGE Publications, 167-204.

  14. Lipton P: Inference to the Best Explanation. 1991, New York: Routledge.

  15. Press SJ, Tanur JM: The Subjectivity of Scientists and the Bayesian Approach. 2001, New York: John Wiley & Sons.

  16. Congdon P: Bayesian Statistical Modeling. 2001, New York: John Wiley and Sons.

  17. Bandemer H: Modeling Uncertain Data. 1992, Berlin: Akademie Verlag.

  18. Corfield D, Williamson J, Eds: Foundations of Bayesianism. 2001, Dordrecht: Kluwer Academic Publishers.

  19. Slonim N, Friedman N, Tishby N: Multivariate information bottleneck. Neural Comput. 2006, 18: 1739-1789. 10.1162/neco.2006.18.8.1739.

  20. Fisher RA: The Design of Experiments. 1935, New York: Hafner.

  21. Fisher RA: Statistical Methods and Statistical Inference. 1956, Edinburgh: Oliver and Boyd.

  22. Eddington A: The Nature of the Physical World. 1928, New York: Macmillan.

  23. Borel E: Probabilities and Life. 1962, New York: Dover.

  24. van de Sande B: Measuring complexity in dynamical systems. RAPID II. 2006, Biola University.

  25. Dam KW, Lin HS, Eds: Cryptography's Role in Securing the Information Society. 1996, Washington, D.C.: National Academy Press.

  26. Lloyd S: Computational capacity of the universe. Phys Rev Lett. 2002, 88: 237901. 10.1103/PhysRevLett.88.237901.

  27. Abel DL: 'The Cybernetic Cut': progressing from description to prescription in systems theory. The Open Cybernetics and Systemics Journal. 2008, 2: 234-244. 10.2174/1874110X00802010234.

  28. Abel DL: The capabilities of chaos and complexity. Int J Mol Sci. 2009, 10: 247-291. 10.3390/ijms10010247. Open access.

  29. Abel DL: The biosemiosis of prescriptive information. Semiotica. 2009, 1-19. 10.1515/semi.2009.026.

  30. Adami C: Introduction to Artificial Life. 1998, New York: Springer/Telos.

  31. Abel DL: Is life reducible to complexity? Fundamentals of Life. Edited by: Palyi G, Zucchi C, Caglioti L. 2002, Paris: Elsevier, 57-72.

  32. Abel DL: Life origin: the role of complexity at the edge of chaos. Washington Science. Edited by: Chandler J, Kay P. 2006, Headquarters of the National Science Foundation, Arlington, VA.

  33. Abel DL: Complexity, self-organization, and emergence at the edge of chaos in life-origin models. Journal of the Washington Academy of Sciences. 2007, 93: 1-20.

  34. Abel DL: The Cybernetic Cut (Scirus Topic Page).

  35. Abel DL: Prescriptive Information (PI) (Scirus Topic Page).

  36. Abel DL: The GS (Genetic Selection) Principle. Frontiers in Bioscience. 2009, 14: 2959-2969. 10.2741/3426. Open access.

  37. Abel DL, Trevors JT: Three subsets of sequence complexity and their relevance to biopolymeric information. Theoretical Biology and Medical Modelling. 2005, 2: 29. 10.1186/1742-4682-2-29. Open access.

  38. Abel DL, Trevors JT: Self-organization vs. self-ordering events in life-origin models. Physics of Life Reviews. 2006, 3: 211-228. 10.1016/j.plrev.2006.07.003.

  39. Abel DL, Trevors JT: More than metaphor: genomes are objective sign systems. BioSemiotic Research Trends. Edited by: Barbieri M. 2007, New York: Nova Science Publishers, 1-15.

  40. Vitányi PMB, Li M: Minimum description length induction, Bayesianism and Kolmogorov complexity. IEEE Transactions on Information Theory. 2000, 46: 446-464. 10.1109/18.825807.

  41. Zewail AH: The birth of molecules. Scientific American. 1990, December: 40-46.

  42. Zewail AH: The Nobel Prize in Chemistry, for his studies of the transition states of chemical reactions using femtosecond spectroscopy: press release.

  43. Xia T, Becker H-C, Wan C, Frankel A, Roberts RW, Zewail AH: The RNA-protein complex: direct probing of the interfacial recognition dynamics and its correlation with biological functions. PNAS. 2003, 10.1073/pnas.1433099100.

  44. Sundstrom V: Femtobiology. Annual Review of Physical Chemistry. 2008, 59: 53-77. 10.1146/annurev.physchem.59.032607.093615.

  45. Schwartz SD, Schramm VL: Enzymatic transition states and dynamic motion in barrier crossing. Nat Chem Biol. 2009, 5: 551-558. 10.1038/nchembio.202.

  46. Pedersen S, Herek JL, Zewail AH: The validity of the "diradical" hypothesis: direct femtosecond studies of the transition-state structures. Science. 1994, 266: 1359-1364. 10.1126/science.266.5189.1359.

  47. Wiegand TW, Janssen RC, Eaton BE: Selection of RNA amide synthases. Chem Biol. 1997, 4: 675-683. 10.1016/S1074-5521(97)90223-4.

  48. Emilsson GM, Nakamura S, Roth A, Breaker RR: Ribozyme speed limits. RNA. 2003, 9: 907-918. 10.1261/rna.5680603.

  49. Robertson MP, Ellington AD: Design and optimization of effector-activated ribozyme ligases. Nucleic Acids Res. 2000, 28: 1751-1759. 10.1093/nar/28.8.1751.

  50. Hammann C, Lilley DM: Folding and activity of the hammerhead ribozyme. Chembiochem. 2002, 3: 690-700. 10.1002/1439-7633(20020802)3:8<690::AID-CBIC690>3.0.CO;2-C.

  51. Breaker RR, Emilsson GM, Lazarev D, Nakamura S, Puskarz IJ, Roth A, Sudarsan N: A common speed limit for RNA-cleaving ribozymes and deoxyribozymes. RNA. 2003, 9: 949-957. 10.1261/rna.5670703.

  52. Axe DD: Estimating the prevalence of protein sequences adopting functional enzyme folds. J Mol Biol. 2004, 341: 1295-1315. 10.1016/j.jmb.2004.06.058.

  53. Barrau A: Physics in the multiverse. CERN Courier. See also the letter to the editor of CERN Courier critiquing this paper.

  54. Carr B, Ed: Universe or Multiverse? 2007, Cambridge: Cambridge University Press.

  55. Garriga J, Vilenkin A: Prediction and explanation in the multiverse. Phys Rev D. 2008, 77: 043526. arXiv:0711.2559 (accessed 11/7/2009).

  56. Axelsson S: Perspectives on handedness, life and physics. Med Hypotheses. 2003, 61: 267-274. 10.1016/S0306-9877(03)00168-3.

  57. Koonin EV: The Biological Big Bang model for the major transitions in evolution. Biol Direct. 2007, 2: 21. 10.1186/1745-6150-2-21.

  58. Koonin EV: The cosmological model of eternal inflation and the transition from chance to biological evolution in the history of life. Biol Direct. 2007, 2: 15. 10.1186/1745-6150-2-15.

  59. Hawking S, Ellis GFR: The Large Scale Structure of Space-Time. 1973, Cambridge: Cambridge University Press.

  60. Hawking S: A Brief History of Time. 1988, New York: Bantam Books.

  61. Hawking S, Penrose R: The Nature of Space and Time. 1996, Princeton, N.J.: Princeton University Press.

  62. Glansdorff N, Xu Y, Labedan B: The origin of life and the last universal common ancestor: do we need a change of perspective? Res Microbiol. 2009, 160: 522-528. 10.1016/j.resmic.2009.05.003.

  63. Rocha LM: Evolution with material symbol systems. Biosystems. 2001, 60: 95-121. 10.1016/S0303-2647(01)00110-1.

  64. Abel DL: The GS (Genetic Selection) Principle (Scirus Topic Page). Last accessed Nov 2009.

  65. Abel DL, Trevors JT: More than metaphor: genomes are objective sign systems. Journal of BioSemiotics. 2006, 1: 253-267.

  66. The Origin of Life Prize. http://www.lifeorigin.org

  67. Shannon C: A mathematical theory of communication. The Bell System Technical Journal. 1948, XXVII: 379-423.

  68. McMillan B: The basic theorems of information theory. Ann Math Stat. 1953, 24: 196-219. 10.1214/aoms/1177729028.

  69. Breiman L: The individual ergodic theorem of information theory. Ann Math Stat. 1957, 28: 808-811. Correction in 31: 809-810.

  70. Khinchin AI: The concept of entropy in probability theory; On the fundamental theorems of information theory. Mathematical Foundations of Information Theory. 1958, New York: Dover Publications, Inc.

  71. Yockey HP: Information Theory and Molecular Biology. 1992, Cambridge: Cambridge University Press.

  72. Allen AN: Astrophysical Quantities. 2000, New York: Springer-Verlag.



Acknowledgements

This author claims no originality or credit for some of the referenced technical probabilistic concepts incorporated into this paper. He is merely categorizing, adjusting, organizing, and mathematically formalizing ideas from previously published work [6-8, 12] into a badly needed general principle of scientific investigation.

Citing a few mathematical technical contributions found in prior peer-reviewed literature does not constitute an endorsement of the cited authors' personal metaphysical belief systems. Philosophic and especially religious perspectives have no place in scientific literature, and are irrelevant to the technical UPM calculation and UPP presented in this paper.

Author information


Corresponding author

Correspondence to David L Abel.

Additional information

Competing interests

The author declares that he has no competing interests.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Abel, D.L. The Universal Plausibility Metric (UPM) & Principle (UPP). Theor Biol Med Model 6, 27 (2009).
