Inductive Logic

A Thematic Compilation by Avi Sion

17. An Inductive Logic Primer

 

1.    Introduction

The reader of the present volume does not need to have previously studied logic in depth to be able to follow the discussion fully, but will still need to grasp certain concepts and terminologies. We will try to fulfill this specific task here, while reminding the reader that the subject is much, much wider than that.

Broadly speaking, we refer to any thought process which tends to convince people as ‘logical’. If such a process continues to be convincing under perspicacious scrutiny, it is regarded as good logic; otherwise, as bad. More strictly speaking, only ‘good’ logic counts as logic at all; ‘bad’ logic is then simply illogical. The loose definition of logic allows us to speak of stupid forms of thought as ‘logics’ (e.g. ‘racist logic’), debasing the term; the stricter definition is more demanding.[1]

Logic, properly speaking, is both an art and a science. As an art, its purpose is the acquisition of knowledge; as a science, it is the validation of knowledge. Many people are quite strong in the art of logic, without being at all acquainted with the science of logic. Some people are rather weak in practice, though well-informed theoretically. In any case, study of the subject is bound to improve one’s skills.

Logic is traditionally divided into two - induction and deduction. Induction is taken to refer to inference from particular data to general principles (often through the medium of prior generalities); whereas deduction is taken to refer to inference from general principles to special applications (or to other generalities). The processes ‘from the particular to general’ and ‘from the general to the particular’ are rarely if ever purely one way or the other. Knowledge does not grow linearly, up from raw data, down from generalities, but in a complex interplay of the two; the result at any given time being a thick web of mutual dependencies between the various items of one’s knowledge.

Logic theory has succeeded in capturing and expressing in formal terms many of the specific logical processes we use in practice. Once properly validated, these processes, whether inductive or deductive in description, become formally certain. But it must always be kept in mind that, however impeccably these formalities have been adhered to - the result obtained is only as reliable as the data on which it is ultimately based. In a sense, the role of logic is to ponder information and assign it some probability rating between zero and one hundred.

Advanced logic theory has shown that what ultimately distinguishes induction from deduction is simply the number of alternative results offered as possible by given information: if there is a choice, the result is inductive; if there is no choice, the result is deductive. Deductive logic may seem to give more certain results, but only because it conceals its assumptions more; in truth, it is merely passing on probability, its outputs being no more probable than the least probable of its inputs. When inductive logic suggests some idea as the most likely to be true, compared to any other idea, it is not really leaving us with much choice; it is telling us that in the present context of knowledge, we decisively have to follow its suggestion. These are the reasons why the word “proof” is often ambiguous; do we mean deductive proof or inductive proof, and does it matter which we mean?
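The claim that deduction merely passes on probability can be made concrete with a minimal sketch (the function name and numbers are mine, for illustration only): a validly deduced conclusion can be no more probable than the least probable premise it rests on.

```python
# Sketch of the principle that deduction passes on probability:
# a conclusion validly deduced from several premises is no more
# probable than the least probable of those premises.

def deductive_ceiling(premise_probabilities):
    """Upper bound on the probability of a conclusion validly
    deduced from premises with the given probabilities."""
    return min(premise_probabilities)

# Example: two premises held with 90% and 60% confidence.
print(deductive_ceiling([0.90, 0.60]))  # → 0.6
```

However impeccable the inference, the weakest premise sets the ceiling; deduction adds no probability of its own.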

[…]

 

2.    Induction

How do propositions, such as [the categorical or the conditional], come to be known? This is the question inductive logic tries to answer. The way we commonly acquire knowledge of nature, as ordinary individuals or as scientists, is by a gradual progression, involving both experience or perception, whether of external phenomena (through the sense organs somehow) or of mental phenomena (with what we often call the “mind’s eye,” whatever that is), and reason or conceptual insight (which determines our evaluation and ordering of experience).

At the simplest level, we observe phenomena, and take note, say, that: “there are Xs which are Y” (which means, “some X are Y” = I), leaving open at first the issue of whether these X are representative of all X (so that A is true), or just special cases (so that IO is true). The particular form I is needed by us as a temporary station, to allow us to express where we stand empirically thus far, without having to be more definite than we can truthfully be, without being forced to rush to judgment.

If after thorough examination of the phenomena at hand, a continued scanning of our environment or the performance of appropriate experiments, we do not find “Xs which are not Y,” we take a leap and presume that “all X are Y” (A). This is a generalization, an inductive act which upgrades an indefinite particular I to a universal of the same polarity A, until if ever evidence is found to the contrary. The justification of such a leap is that A is more uniform with I than O, and therefore involves less assumption: given I, a move to A requires no change of polarity, unlike a move to O, whereas with regard to quantity, the degree of assumption is the same either way.

If, however, we do find “Xs which are not Y” (i.e. that “some X are not Y” = O), we simply conclude with a definite contingent IO. If the discovery of O preceded any assumption of A, so well and good, the induction of IO proceeded in an orderly fashion. If on the other hand, we had assumed A, and then discover O, an inconsistency has effectively occurred in our belief system, and we are forced to reverse a previously adopted position and effect a particularization of A back to I, to inductively conclude IO. Needless to say - and we need not keep pointing out such parallels between positive and negative polarities - the sequence of such harmonization might equally have been O followed by E, and then I followed by IO.
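The cycle just described can be sketched as a simple decision procedure (the function and its labels are my own shorthand, not the author's formalism): observation of positive instances with no counter-instances warrants generalization to A; mixed instances force the definite contingent IO; purely negative instances warrant, symmetrically, the universal negative E.

```python
# Illustrative sketch of generalization and particularization:
# from particular I to universal A when no counter-instances are
# found, with retreat to contingent IO when both kinds appear.

def induce(observed_xs, is_y):
    """Given observed instances of X and a test for Y, return the
    strongest proposition about 'X are Y' warranted so far."""
    positives = [x for x in observed_xs if is_y(x)]
    negatives = [x for x in observed_xs if not is_y(x)]
    if positives and not negatives:
        return "A"   # generalize: 'all X are Y', until contrary evidence
    if positives and negatives:
        return "IO"  # definite contingent: some X are Y, some are not
    if negatives:
        return "E"   # symmetric generalization: 'no X is Y'
    return None      # no observations yet; nothing to induce

print(induce([2, 4, 6], lambda n: n % 2 == 0))  # → "A"
print(induce([2, 3, 4], lambda n: n % 2 == 0))  # → "IO"
```

A real inductive agent would of course revise these verdicts as new observations arrive; the sketch only captures a single snapshot of the harmonization described above.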

Note that the particulars involved, I or O, may be arrived at directly, by observation, as suggested above, or, in some cases, indirectly, by deduction from previously induced data. The inductive processes we have so far described, of observation followed by generalization and particularization, are only a beginning. Once a number of propositions have been developed in this way, they serve as premises in deductive operations, whose conclusions may in turn be subjected to deductive scrutiny and additional inductive advances and retreats.

But we are not limited to the pursuit of such “laws” of nature; we have a broader inductive method, known as the process of adduction[2].

This consists in postulating propositions which are not arrived at by mere generalization and particularization, but involve novel terms. These novel terms are put forward by the creative faculty, as tentative constructs (built out of more easily accessible concepts[3]) which might conceivably serve to explain the generalities and particularities (the “laws”) developed more directly out of empirical evidence, and hopefully to make logical predictions and point the way to yet other empirical phenomena. The imagination, here, is not however given free rein; it is disciplined by the logical connections its postulates must have with already available data and with data which might eventually arise.

Scientific theories (complexes of postulates and predictions) differ from wild speculations in that (or to the extent that) they are grounded in experience through rational processes. They must deductively encompass accepted laws, and they stand only so long as they retain such a dominant position in relation to newly discovered phenomena. If logical predictions are made which turn out to be empirically true, the postulates are regarded as further confirmed - that is, their own probability of being true is increased. If, however, any logical predictions are found to be clearly belied by observation, the postulates lose all credibility and must be rejected, or at least somehow modified. Theories always remain subject to such empirical testing, however often confirmed.

Thus, knowledge of nature proceeds by examining existing data, making intelligent hypotheses as to what might underlie the given phenomena, showing that the phenomena at hand are indeed deductively implied by the suggested postulates, and testing our assumptions with reference to further empirical investigations. However, there is one more component to the scientific method, which is often ignored. It is not enough to adduce evidence in support of our pet theory; and the fact that we have not yet found any grounds for rejecting it does not suffice to maintain it....

We must also consider all conceivable alternative theories, and if we cannot find grounds for their rejection, we should at least show that our preferred theory has the most credibility. This comparative and critical process is as important as the constructive aspect of adduction. To the extent that there are possible challenges to our chosen theory, it is undermined - that is, its probability of being true is decreased. Evidence adduced in favor of one set of postulates may thus constitute counter-evidence adduced against other hypotheses. We may regard a thesis as inductively “proved,” only if we have managed to eliminate all its conceivable competitors one by one. Very rarely - though it happens - does a theory at the outset appear unchallenged, the exclusive explanation of available information, and so immediately “proved.” Also note, at the opposite extreme, we are sometimes stumped, unable to suggest any explanation whatsoever.

 

3.    The Art of Knowing

Induction, as an epistemological concept, refers to the logical processes through which all propositions, and their various constituents, are gradually developed. Some philosophers have tended to define induction as the pursuit of general principles from particular ones, but such a formula is too limited and only reflects the greater difficulty of and interest in that specific issue. In the largest sense, induction includes all the following factors of cognition:

  • perception (direct consciousness of concrete phenomena, whether material/sensory or mental/intimate) and conception (direct consciousness of abstract phenomena[4] or indirect consciousness of anything), as well as recognition (memory of percepts and concepts) and imagination (perceptual or conceptual projection);
  • identification (awareness of similarities between phenomena) and differentiation (awareness of differences between phenomena), which make possible classification (grouping), often accompanied by verbalization (naming);
  • formulating propositions, with varying degrees of awareness, sometimes but not always verbally, which relate together various percepts and concepts in various ways (first as possible potential particulars);
  • generalization and particularization (including the techniques of factorization, factor selection, and formula revision - see my work Future Logic for details), which are the processes through which one discovers how far one may extend or one must narrow the applicability of propositions;
  • deduction, the inference of some new proposition(s) from one or more given proposition(s) of any kind, through a host of processes like opposition, eduction, syllogism, a-fortiori, apodosis, paradox, and others;
  • adduction, the formation and tailoring of postulates, as well as their testing and confirmation or elimination, with reference to rational-empirical considerations (more on this topic below).

All the above depend on reference to the main Laws of Logic, which ensure the ultimate fullness and harmony of knowledge, namely:

  1. Identity - acknowledging all phenomena cognized, as being at least appearances, and so problemacies with varying credibilities, whether ultimately found to be realities or illusions; never ignoring data or issues. (This is what we mean by “facts are facts.”)
  2. Non-Contradiction - not acknowledging as real, but insisting as partly or wholly illusory, any set of propositions cognized as incompatible, whatever their levels of abstraction and cognitive roots; always pursuing consistency in one’s knowledge. (Contradictions are impossible in reality.)
  3. Exclusion of the Middle - not rejecting all possible alternatives, but seeking resolution of conflicts, through some new alternative or some commonality; seeking solutions to all problems. (There is no nebulous middle ground between being and not-being.)

Now, these various factors of cognition play a joint role in the acquisition of knowledge, and although here listed in a ‘logical’ manner, with some subdivisions and in a semblance of chronological order, they in actual practice function very interdependently, feeding off each other’s results in every which way and in no set order. Furthermore, they are here described very succinctly, so much so that their individual, relative and collective significances may be missed if one does not take time to reflect.

This brief overview of the theory of knowledge should be understood as both descriptive and prescriptive. That is to say, there is no essential difference between the palette of cognitive processes used by different human beings, be they common folk or expert scientists, trained in logic or purely instinctive, male or female, young or old, of whatever class or people, healthy or sick. This must be stressed: everyone has more or less the same cognitive tools; some people are, there is no denying it, better endowed, others somewhat handicapped, but their overall arsenal is roughly the same, as above listed.

What distinguishes individuals is perhaps rather the effort and skill they exercise with these same instruments, in each given context. Knowing is an art, and artists may vary in style and quality. Some people lay more stress on experience, others on reasoning, others on their emotions. Some people are more visual, some more auditory, some more touch-sensitive. Some people are excessively categorical or class-conscious, too verbal in their thinking, to the detriment of intuition; some people are slaves to their passions, exercising insufficient control on the objective quality of their thought processes. And so forth - but in any case, the range of faculties available to human beings is roughly the same. The art, as with music, as with painting, is to find a balance - the right measure, the right time and place, for each instrument.

It must be added that two people equally skilled in the art of knowing (or one person at different times) may arrive at different specific conclusions, due to different contexts of knowledge. The content and volume of one’s experience - in the largest sense of the term experience, including material and mental perceptions and conceptual insights - has a direct influence on one’s logic, affecting one’s every rational process.

 

4.    Adduction in Western Philosophy

Logic, since Antiquity and throughout the Middle Ages, in Europe at least, has been associated more specifically with deduction, because that was the field in which the most impressive theoretical work had been done, mainly by Aristotle. Only in recent centuries was a greater stress laid, thanks in large part to practitioners like Newton, on the experiential aspects of knowing (by philosophers like Locke and Hume) and on its adductive aspects (by philosophers like Bacon and Mill); and in more recent times on the crucial role of imagination in theory formation (by Einstein, for instance).

This is not to say that induction, or more specifically adduction, is a novel concept as such. People certainly always used all the factors of induction in their everyday efforts at knowing - they used their senses and their heads, to try and make sense of the world around them, sometimes more wildly than we do, sometimes more rigidly, sometimes more sensibly perhaps. Also, we have to admit that Aristotle, after some four or five centuries of development in Greek philosophy including his predecessors Socrates and Plato, was well aware of the primary issue of induction, the so-called ‘problem of universals’ (namely, how concepts are known).

Indeed, his formal work in logic, including on opposition, on immediate inference and on the syllogism, was a lucid attempt, however incomplete, to solve just that problem. Deduction, in Aristotle’s view, was not apart from induction, or against it, but rather a major aspect of induction. For him, it seems, certain generalities were known directly and indubitably (like the axioms of logic), others had to be developed empirically (seemingly, by complete enumeration); thereafter, one could arrive by inference at all other general principles. The grey areas in that view were, no doubt, the source and validity, and the number, of the initially given top principles, as well as the scope of empiricism in the light of the practical difficulties in complete enumeration.

Today, we would certainly agree that deduction is one of the instruments of induction - needed to infer predictions from postulates for testing purposes, and more broadly, to pursue consistency. The grounds of knowledge, in our view, are primarily experiential data, whether concrete or abstract, and to a lesser extent self-evident propositions whose contradictories are self-contradictory. We are more aware of the hypothetical and tentative nature of much of knowledge; and instead of complete enumeration, we refer to processes like generalization and particularization.

But if we regard the perceptual and conceptual phenomena which are the starting-points of knowledge as being effectively ‘axioms’ (in an enlarged sense of the term), then our view is seen as not much different from Aristotle’s in essence, though varying in detail and emphasis. The historical point I am trying to make is certainly not that Aristotle was omniscient and as fully aware of epistemological questions and answers as we are today. Rather, it is that in his time and earlier still, a search for such questions and answers was already in motion, and a spirit of intelligence, honesty and objectivity was already at work, so that to make a fair assessment we must focus on his contributions instead of his blanks.

I think it is important for historians to keep in mind that philosophers are human. They do not have time to put everything they know or think into words, down on paper. Often, too, they intuit a larger horizon than they have the time to actually tread in detailed thought. No one philosopher can therefore be expected to point out and clarify every aspect of induction, or to develop a truly full spectrum of logical techniques. Not saying something is not necessarily not knowing it, or at least being on the way to knowing it. Some unimaginative disciples, as well as historians, tend to ossify philosophies, and make them seem more rigid and limited than they were to their living wellsprings.

Thus, the suggestion that general propositions are arrived at by ‘complete enumeration’, attributed by some historians to Aristotle, contains within it the seeds of empiricism. We today certainly acknowledge the major role played by partial enumeration - this is how particular propositions are known: one experiences one or more cases of a kind to have a certain attribute or behavior, and one expresses that observation verbally, without thereby presuming to comment on the unobserved cases or to claim that they have the same attribute or behavior.

This is the common ground, between us and Aristotle; the issue is only, how one moves up from there to generalities. Complete enumeration may have been, for Western philosophy, a first and tentative suggestion; but upon reflection it was soon enough seen to be an impractical ideal, because most classes we deal with are open-ended. Today, we realize that the answer is to be found in the trial and error processes of generalization and particularization, or more broadly speaking in adduction.

Nevertheless, in spite of their manifest deep roots in the past, it is evident that until the Enlightenment the concept and laws of adduction were relatively little discussed and little understood, in Western philosophy at least. Historians tend to attribute to Francis Bacon (1561-1626, London) the clear formulation of these laws. As Anthony Quinton points out, the crucial innovation in Bacon’s ‘new method’ was that it was eliminative (“major est vis instantiae negativae”[5]). Bacon also gave due credit to the positive aspects of induction (i.e. observation and confirmation), and he made explicit many of the pitfalls possible in the course of such processes (which he referred to as “idols”).

Needless to say, Bacon’s words were not the last on the subject; many further contributions have happily been made since then. Whatever their precise history, the Laws of Adduction may be expressed as follows. By ‘postulate’ is meant a set of imagined propositions of yet unsettled truth. By ‘experience’ is meant any appearance, preferably concrete rather than abstract, taken as is, as it appears, as a mere configuration of phenomena, without classificatory work of comparison and contrast to other, remembered phenomena. By ‘confirmation’ or ‘weakening’ of a thesis is meant adding or subtracting some credibility from it; whereas by ‘proof’ or ‘disproof’ is meant extreme credibility or incredibility.

 

  1. If some postulate has certain necessary logical implications, and these implications are found to be in accord with experience, the postulate is thus far confirmed, though not necessarily proved (Positive Law).

 

  2. If some postulate has certain necessary logical implications, and these implications are found to be in discord with experience, the postulate is disproved, and not merely weakened (Negative Law).

 

These laws may be explained, and unified, with reference to the concept of probability, and on the same basis many corollaries can be derived from them. The corollaries emerge from the consideration of competing postulates - a couple of examples: every time a postulate is confirmed, while a competitor is not confirmed, then the latter is weakened; when a postulate is disproved, then all its remaining competitors (whether known or unknown alternatives) are strengthened (though all equally so, unless some of them predicted the disproving experience, rather than merely accepted it). However, these issues and details are too voluminous for the present study (see my work Future Logic).

 

Drawn from Judaic Logic (1995), Chapter 1:2 (part), 2:1-2.

 
 

[1]             We may also speak of 'a logic' in a non-pejorative way, when referring to intelligent forms of thought which are found especially in certain areas of knowledge or scientific fields; e.g. logistics is the logic of willed deployment of (material or mental) objects in space and time, mathematics is the logic of numbers and spatio-temporal relations. Similarly, historians of logic may objectively refer to the logic of (used by or known to) different geographical or cultural groups or periods of history. All specific logics, good or bad, may be subjected to objective study, of course.

[2]             This is also called the hypothetico-deductive method or the scientific method.

[3]             A good example of this, is the Newtonian concept of 'force'. At the root of this scientific concept are the notions obtained through our intimate experience of push and pull, speeding and slowing. These intuitions give meaning to the idea of invisible attractions and repulsions between physical bodies, which cause them to accelerate or decelerate as they visibly do. The invisible factor of force is then quantified with reference to measurable changes of velocity. (Positivistic philosophy regards the invisible factor as superfluous; but it is convenient, and we do use it, and furthermore, positivism itself makes use of such abstracts.) The 'novel terms' used in adduction are always based on notions recycled from experience, through the imagination, by analogy, into a new context. What gives the process scientific legitimacy is the check-and-balance provided by adduction.

[4]             The process of abstraction consists in ignoring (excluding from consciousness) all but certain aspects of something perceived in whatever way; this process precedes the comparisons, contrasts and mental manipulations through which we conceptualize.

[5]             This statement can be found in Bacon’s Novum Organum, Book I, aphorism 46. The whole book is available online at: https://www.gutenberg.org/files/45988/45988-h/45988-h.htm.
