In philosophy, verisimilitude (or truthlikeness) is the notion that some propositions are closer to being true than other propositions. The problem of verisimilitude is the problem of articulating what it takes for one false theory to be closer to the truth than another false theory.
This problem was central to the philosophy of Karl Popper, largely because Popper was among the first to affirm that truth is the aim of scientific inquiry while acknowledging that most of the greatest scientific theories in the history of science are, strictly speaking, false. If this long string of purportedly false theories is to constitute progress with respect to the goal of truth, then it must be at least possible for one false theory to be closer to the truth than others.
Popper assumed that scientists are interested in highly informative theories, in part for methodological reasons—the more informative a theory, the easier it is to test, and the greater its predictive power. But clearly informative power alone cannot suffice, since a highly informative theory may be wildly false. So Popper proposed that closeness to the truth is a function of two factors—truth and content. The more truths that a theory entails (other things being equal), the closer it is to the truth.
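Popper's qualitative comparison of truth and content can be stated formally. In one standard notation (the symbols here are illustrative, not a quotation of Popper's own text), write Cn_T(A) for the set of true consequences of a theory A and Cn_F(A) for its set of false consequences:

```latex
% Popper's qualitative definition of comparative verisimilitude:
% A is strictly closer to the truth than B iff B's true consequences
% are among A's, A's false consequences are among B's, and at least
% one of these inclusions is proper.
Vs(A) > Vs(B) \iff
  Cn_T(B) \subseteq Cn_T(A) \;\wedge\; Cn_F(A) \subseteq Cn_F(B)
  \;\wedge\; \bigl(Cn_T(B) \subset Cn_T(A) \;\vee\; Cn_F(A) \subset Cn_F(B)\bigr)
```

On this definition, increasing truth content while not increasing falsity content (or decreasing falsity content while not decreasing truth content) is what constitutes progress towards the truth.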
Intuitively at least, it seems that Newton's theory of motion entails a good many more truths than does, say, Aristotle's theory—despite the fact that both are known to have flaws. Even two true theories can have differing degrees of verisimilitude, depending on how much true information they deliver. For example, the claim "it will be raining on Thursday next week," if true, seems closer to the truth than the true yet logically weaker claim "it will either be raining next Thursday or it will be sunny".
Popper's formal definition of verisimilitude was challenged, beginning in 1974, by Pavel Tichý, John Henry Harris, and David Miller, who argued that Popper's definition has an unintended consequence: that no false theory can be closer to the truth than another. Popper himself stated: "I accepted the criticism of my definition within minutes of its presentation, wondering why I had not seen the mistake before." This result gave rise to a search for an account of verisimilitude that did not deem progress towards the truth an impossibility.
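The core of the Tichý–Miller argument can be sketched as follows (a reconstruction of the standard proof, not a quotation of either paper). Suppose A is false and, by Popper's definition, strictly closer to the truth than B; then either the truth-content inclusion or the falsity-content inclusion is proper, and each case yields a contradiction:

```latex
% A is false, so some false consequence f \in Cn_F(A) exists.
%
% Case 1: Cn_T(B) \subset Cn_T(A). Pick t \in Cn_T(A) \setminus Cn_T(B).
%   Then t \wedge f is false and follows from A. If B entailed t \wedge f,
%   B would entail t, contradicting the choice of t. Hence
%   t \wedge f \in Cn_F(A) \setminus Cn_F(B), violating Cn_F(A) \subseteq Cn_F(B).
%
% Case 2: Cn_F(A) \subset Cn_F(B). Pick g \in Cn_F(B) \setminus Cn_F(A).
%   Then f \rightarrow g is true (since f is false) and follows from B.
%   If A entailed f \rightarrow g, then A, which entails f, would entail g,
%   contradicting the choice of g. Hence
%   f \rightarrow g \in Cn_T(B) \setminus Cn_T(A), violating Cn_T(B) \subseteq Cn_T(A).
%
% Either way the definition's conditions fail, so no false theory is
% strictly closer to the truth than any theory on Popper's definition.
```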
Some of the new theories (e.g. those proposed by David Miller and by Theo Kuipers) build on Popper's approach, guided by the notion that truthlikeness is a function of a truth factor and a content factor. Others (e.g. those advanced by Gerhard Schurz in collaboration with Paul Weingartner, by Mortensen, and by Ken Gemes) are also inspired by Popper's approach but locate what they believe to be the error of Popper's proposal in his overly generous notion of content, or consequence, proposing instead that the consequences that contribute to closeness to truth must be, in a technical sense, "relevant". A different approach (already proposed by Tichý and Risto Hilpinen and developed especially by Ilkka Niiniluoto and Graham Oddie) takes the "likeness" in truthlikeness literally, holding that a proposition's likeness to the truth is a function of the overall likeness to the actual world of the possible worlds in which the proposition would be true. Giangiacomo Gerla has proposed an approach that uses the notion of a point-free metric space. There is currently a debate about whether or to what extent these different approaches to the concept are compatible.
Another problem in Popper's theory of verisimilitude concerns the connection between truthlikeness as the goal of scientific progress, on the one hand, and methodology, on the other, as the means by which scientific research can to some extent be steered towards that goal. Popper conceived of his definition as a justification of his own preferred methodology, falsificationism, in the following sense. Suppose theory A is closer to the truth than theory B according to Popper's qualitative definition of verisimilitude. Then we will have (or should have had, if the definition were logically sound) that all true consequences of B are also consequences of A (informally, B ≤ A), and that all false consequences of A are also consequences of B (informally, ¬A ≤ ¬B: the events deemed impossible by A are a subset of those deemed impossible by B, under the same initial conditions). In particular, all known false empirical consequences of A also follow from B, and all known true empirical consequences of B also follow from A. So, if A were closer to the truth than B, then A should be better corroborated than B by any possible body of empirical evidence. Lastly, this simple theorem permits the fact that A is actually better corroborated than B to be interpreted as a corroboration of the hypothesis (or 'meta-hypothesis') that A is more verisimilar than B.
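The inclusion conditions above can be illustrated with a toy model in which theories are represented by finite sets of their true and false consequences. (Real consequence sets are deductively closed and infinite—which is precisely what the Tichý–Miller argument exploits—so this sketch illustrates only the comparison itself, not the negative result. All labels are placeholders, not actual scientific claims.)

```python
# Toy model of Popper's qualitative comparison of verisimilitude.
# A theory is represented by two finite sets: its true consequences
# and its false consequences. Deductive closure is ignored.

def at_least_as_close(truths_a, falsities_a, truths_b, falsities_b):
    """A is at least as truthlike as B iff B's true consequences are
    among A's, and A's false consequences are among B's."""
    return truths_b <= truths_a and falsities_a <= falsities_b

def strictly_closer(truths_a, falsities_a, truths_b, falsities_b):
    """Strictly closer: the above, with at least one inclusion proper."""
    return (at_least_as_close(truths_a, falsities_a, truths_b, falsities_b)
            and (truths_b < truths_a or falsities_a < falsities_b))

# Hypothetical example: theory A entails more truths and fewer
# falsehoods than theory B, so A comes out strictly closer to the truth.
truths_a = {"t1", "t2", "t3"}   # true consequences of A
falsities_a = {"f1"}            # false consequences of A
truths_b = {"t1"}               # true consequences of B
falsities_b = {"f1", "f2"}      # false consequences of B

print(strictly_closer(truths_a, falsities_a, truths_b, falsities_b))  # True
print(strictly_closer(truths_b, falsities_b, truths_a, falsities_a))  # False
```

Note how the comparison mirrors the corroboration claim in the text: every false consequence of A ("f1") is also a consequence of B, and every true consequence of B ("t1") is also a consequence of A, so any evidence refuting A would also refute B.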