So, just how plausible is it that adding more truth values to logic would increase our reasoning powers significantly (or at all)? I'll get to that after an introduction to logic and then some examples of real logics that have more than two truth values.

Logic 101a: the Syllogism

Logic is the study of forms of argument. It looks at how an argument is structured rather than at the content of the argument. For example, the following two arguments are structured in the same way:

Argument 1:

All men are mortal.

No mortal is perfect.

Therefore no man is perfect.

Argument 2:

All trees are blue.

Nothing blue is wet.

Therefore no tree is wet.

Now a person reading these arguments may notice that the first one seems reasonable and the second one seems silly, but that is because you are looking at the content. If you look at the form, you will see that they both follow the pattern

every X is a Y

no Y is a Z

therefore no X is a Z

This argument form is valid. What that means is that *if* the first two sentences (the premises) are true, then the last sentence (the conclusion) must be true also. But this is logic, so it doesn't say anything about whether the premises are true in the first place. Other argument forms similar to those above are not valid. For example

All men are mortal.

Some mortals are fools.

Therefore some men are fools.

Regardless of what you may think of the conclusion, the conclusion does not follow from the premises. Often the non-validity of a form is shown by way of a counterexample whose conclusion can easily be seen to be false. For example, the following argument has the same form as the one above:

All squares have corners.

Some things with corners are triangles.

Therefore, some squares are triangles.

The examples above are all syllogisms. The logic of Aristotle dealt entirely with syllogisms and their various patterns.

The logic of syllogism is a sort of class theory (except that it is not extensional like most class theories are). The descriptive terms of the sentences are viewed as classes and the rest of the sentence expresses some relationship of the classes. The relationships can all be expressed with the following relations:

A is contained in B

A is not contained in B

A overlaps with B

A does not overlap with B

The logic of syllogisms is essentially a list of the mathematical properties of these four relations on classes.
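As a rough sketch (my own illustration, not from the original), the four relations can be modeled with Python sets. Keep in mind the caveat above: syllogistic logic is not extensional, so sets only approximate it.

```python
# Model the four syllogistic relations with Python sets.
# (Illustration only: syllogistic logic is not extensional,
# so sets are just an approximation.)

def contained_in(a, b):       # "All A are B"
    return a <= b

def not_contained_in(a, b):   # "Not all A are B"
    return not (a <= b)

def overlaps(a, b):           # "Some A are B"
    return bool(a & b)

def does_not_overlap(a, b):   # "No A is B"
    return not (a & b)

men = {"Socrates", "Plato"}
mortals = {"Socrates", "Plato", "Fido"}
perfect = set()

# Argument 1: all men are mortal; no mortal is perfect; so no man is perfect.
assert contained_in(men, mortals)
assert does_not_overlap(mortals, perfect)
assert does_not_overlap(men, perfect)
```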

Logic 101b: the Proposition

Modern logic does not deal with syllogisms as such (although the sentences of syllogisms are included in propositions). Instead modern logic deals with abstract propositions in various combinations. You can think of a proposition as a declarative sentence. Each of the sentences in the examples above is a proposition, but so are

John went to the store.

John is married to Mary.

John and Mary went to the store.

None of these sentences could be part of a classical syllogism, but to modern logic they are just fine. Propositional logic has rules about combining propositions. For example, let P, Q, and R be propositions. Propositional logic has rules like

if "P or Q" is true, and "P" is false, then "Q" is true.

if "P" is false, then "P and Q" is false.

Propositional logic has logical operators. The three most common are "and", "or", and "not". The rules that go with the operators can be expressed in a truth table like this:

| P | Q | P and Q | P or Q | not P |
|---|---|---------|--------|-------|
| T | T | T | T | F |
| T | F | F | T | F |
| F | T | F | T | T |
| F | F | F | F | T |
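These rules are exactly how Python's built-in boolean operators behave. A quick sketch (my own, not from the text) that reprints the table:

```python
from itertools import product

# Reprint the two-valued truth table using Python's boolean operators.
print("P     Q     | and   | or    | not P")
for p, q in product([True, False], repeat=2):
    print(f"{p!s:5} {q!s:5} | {(p and q)!s:5} | {(p or q)!s:5} | {(not p)!s:5}")
```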

This is just a taste of how different modern logic is from Aristotle's logic. So the term "non-Aristotelian logic" to describe the special logic of null-A is a little misleading.

Logic 201: Multi-valued Logic

What Van Vogt says distinguishes null-A from Aristotelian logic is that null-A recognizes more than true and false. Today, such logics are called multi-valued logics, and several of them are studied.

So how can there be something more than true and false? Well, sometimes you just don't know the answer. Suppose someone tells you the two sentences

B. John is at the store.

and

C. Mary is at home.

Suppose you don't know where John is, so you cannot assign a truth value to B. But you were just over at Mary's house, so you know that she is not at home. You can reliably say that the combination "B and C" is false. How do you know that? Well, by looking at the truth table above, you know that for the combination "P and Q", if "Q" is F, then it doesn't matter what "P" is; the combination is always false.

We can formalize this reasoning process by adding a third value to logic, let us call it "N" for Null (the name doesn't come from null-A, but from the programming language SQL).

The typical truth table for a three-valued logic is like this:

| P | Q | P and Q | P or Q | not P |
|---|---|---------|--------|-------|
| T | T | T | T | F |
| T | F | F | T | F |
| T | N | N | T | F |
| F | T | F | T | T |
| F | F | F | F | T |
| F | N | F | N | T |
| N | T | N | T | N |
| N | F | F | N | N |
| N | N | N | N | N |
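These connectives (usually credited to Kleene) can be sketched in Python with `None` standing in for N. This is my own illustration, not code from the original:

```python
# Three-valued connectives matching the table above; None plays the role of N.

def and3(p, q):
    if p is False or q is False:   # a known-false side makes "and" false
        return False
    if p is None or q is None:     # otherwise any unknown makes it unknown
        return None
    return True

def or3(p, q):
    if p is True or q is True:     # a known-true side makes "or" true
        return True
    if p is None or q is None:
        return None
    return False

def not3(p):
    return None if p is None else not p

# The John/Mary example: B is unknown, C is false, yet "B and C" is false.
assert and3(None, False) is False
```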

Now let's consider B and C again. Suppose you know that there is a 50% chance that B is true and a 100% chance that C is true. What can you say about the truth of "B and C"? Questions like this are answered by probability. Most mathematicians and other people who work with probability think of it as a sort of measure on "outcomes". But you can just as well think of it as a multi-valued logic where true is replaced by a probability of 1, false is replaced by a probability of 0, and the values in between are intermediate values of certainty or uncertainty. Then you can make a logic based on probability. Obviously you can't make truth tables for an infinite number of truth values, but you can write formulas such as

p(P and Q) = p(P) * p(Q) (P and Q independent)

p(P or Q) = p(P) + p(Q) - p(P and Q)

p(P) = 1 - p(not P)

But there are other ways to have infinite logical values. Some propositions don't really have exact truth values, but the fuzziness is not from uncertainty but from inexactness of language. Are roses actually red? Well, kind of. Let's say roses are 90% red. Are violets actually blue? Well, not really. Let's say violets are 30% blue. Then what can we say about the proposition

roses are red and violets are blue?

Well, since the claim of the sentence is that both propositions are true, this can't be any more true than the least true part of it, or 30%. On the other hand, the sentence

roses are red or violets are blue

is as true as the most true part of it, or 90%.

Such issues are the topic of fuzzy logic. Like probability, fuzzy logic has an infinite number of truth values from 0 to 1, with 0 corresponding to false and 1 corresponding to true. But in between those two extremes, the answers are a little fuzzy. Some of the formulas for fuzzy logic are

f(P and Q) = min(f(P), f(Q))

f(P or Q) = max(f(P), f(Q))

f(not P) = 1 - f(P)
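As a sketch (my own, not from the original), here are the probability formulas and the fuzzy formulas side by side, with the numbers from the examples above:

```python
# Probability connectives (assuming P and Q independent) vs. fuzzy connectives.

def p_and(p, q):
    return p * q

def p_or(p, q):
    return p + q - p * q

def f_and(p, q):
    return min(p, q)

def f_or(p, q):
    return max(p, q)

def f_not(p):
    return 1 - p

# The John/Mary example: 50% chance of B, 100% chance of C.
assert p_and(0.5, 1.0) == 0.5

# Roses are 90% red, violets are 30% blue.
assert f_and(0.9, 0.3) == 0.3   # "roses are red and violets are blue"
assert f_or(0.9, 0.3) == 0.9    # "roses are red or violets are blue"
```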

Logic 401: Theory of Computation

Well, all of these multi-valued logics and more have been around for quite a while, and they don't seem to have generated any great advances in human understanding such as are postulated in "The World of Null-A". Why not? It should be pretty clear that while multi-valued logics may be convenient formalisms, and may be interesting in their own right, none of them is a revolutionary advance in human thought. In fact, they don't add any additional reasoning power at all to the original two-valued logic. The most obvious way to see this is to notice that probability is almost always used as a mathematical theory rather than a logic, and that when people use probability theory they usually do so within a two-valued logic framework, saying things like

p(A or B) = p(A) + p(B) - p(A and B)

which uses the old-fashioned true-or-false relation of equality. So probability can be viewed as a multi-valued logic, but you can do the same reasoning by putting a theory over two-valued logic. The same is true of three-valued logic and of fuzzy logic. Any reasoning that you can do with those systems, you can also do with normal two-valued logic and a bit of extra machinery.
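For instance, here is a sketch (my own) of that "bit of extra machinery": the three-valued "and" can be simulated entirely with two-valued booleans by representing each value as a pair `(known, value)`, with N represented as unknown:

```python
# Simulate a three-valued "and" using only ordinary booleans:
# each value is a pair (known, value); N is represented as (False, False).

T, F, N = (True, True), (True, False), (False, False)

def and3_pairs(p, q):
    pk, pv = p
    qk, qv = q
    if (pk and not pv) or (qk and not qv):  # either side known false
        return F
    if pk and pv and qk and qv:             # both sides known true
        return T
    return N                                # otherwise unknown

assert and3_pairs(N, F) == F   # "B and C" is false even though B is unknown
assert and3_pairs(T, N) == N
```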

There are some pretty general results about this sort of thing. Two distinct symbols are sufficient to represent anything that can be represented in any number of symbols, even an infinite (countable) number. Above a certain level, all systems of computation (or formal reasoning) are equivalent in power. And any system that is powerful enough to do general arithmetic is incomplete, in the sense that there are true things that the system can say but cannot prove.
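The first of these results, that two symbols suffice, is easy to illustrate (my own sketch): encode each of the three truth values T, F, N as a fixed-width string over just two symbols.

```python
# Encode the three truth values as fixed-width strings over two symbols.
ENC = {"T": "00", "F": "01", "N": "10"}
DEC = {bits: value for value, bits in ENC.items()}

def encode(values):
    return "".join(ENC[v] for v in values)

def decode(bits):
    return [DEC[bits[i:i + 2]] for i in range(0, len(bits), 2)]

assert encode(["T", "N", "F"]) == "001001"
assert decode("001001") == ["T", "N", "F"]
```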

All of the above are results about computation, not about reasoning. They do not rule out the possibility that there may be ways of reasoning that get around the limitations. But whatever form of reasoning this is, it will not be a logic. Logic is about the form of reasoning; it is essentially computational and therefore subject to the limits of computation.

On a more general level, the purpose of logic is not to create insights or to produce knowledge; the purpose of logic is to check your reasoning. Logic cannot guide you to new thoughts except in the most mechanical sense; it can only tell you if your thought processes are well-founded after the fact, and then only within a fairly limited domain.
