
Friday, January 21

reading:  sections 4.1 and 4.2.

It is intuitively obvious that & and | (conjunction and disjunction)
are commutative and associative.  It is true but much less obvious
that <-> and <+> (biconditional and xor) are commutative and
associative.  Moreover, it is far from intuitively obvious what a
formula like

P <-> Q <-> R <-> S 

means.  Hint:  it does NOT mean that P, Q, R and S all have the same
truth value!!!
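
A small Python sketch (an illustration only) makes the hint concrete: it
compares the left-grouped chain P <-> Q <-> R <-> S with the condition
that all four letters have the same value, and prints every row of the
truth table where the two disagree.

from itertools import product

def iff(a, b):
    # the biconditional: true exactly when a and b agree
    return a == b

for P, Q, R, S in product([True, False], repeat=4):
    chained = iff(iff(iff(P, Q), R), S)   # P <-> Q <-> R <-> S, grouped to the left
    all_same = (P == Q == R == S)
    if chained != all_same:
        print(P, Q, R, S, "chain:", chained, "all the same:", all_same)

For instance, when P and Q are false and R and S are true, the chain
comes out true even though the four letters certainly do not all have
the same value.  (Because <-> is associative, the grouping chosen in the
sketch does not matter.)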

Point 5:  

The adequacy of negation, conjunction, and disjunction to express all
truth functions is seen from the method of truth tables: for any truth
functional expression in any (finite) number of variables, we can read
off an equivalent form written in terms of negation, conjunction, and
disjunction.  It will be a disjunction of conjunctions, each of which
contains a single instance of each propositional letter involved, with
some letters negated.  (In the limiting case one may have a disjunction
with no elements, which can be written ``false'', expressible also as
P & ~P.)

In the book this is called full disjunctive normal form.
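
The reading-off process is entirely mechanical.  The Python sketch below
(the function name full_dnf is just an illustration, not anything from
the book) builds the full disjunctive normal form of a truth function
directly from its truth table:

from itertools import product

def full_dnf(letters, truth_function):
    # read the full disjunctive normal form off the truth table of
    # truth_function, which takes a dict {letter: truth value};
    # the answer is "false" when no row of the table comes out true
    disjuncts = []
    for values in product([True, False], repeat=len(letters)):
        row = dict(zip(letters, values))
        if truth_function(row):
            # one conjunction per true row, one literal per letter
            literals = [p if row[p] else "~" + p for p in letters]
            disjuncts.append("(" + " & ".join(literals) + ")")
    return " | ".join(disjuncts) if disjuncts else "false"

# example: the truth function of P -> Q
print(full_dnf(["P", "Q"], lambda row: (not row["P"]) or row["Q"]))
# prints (P & Q) | (~P & Q) | (~P & ~Q)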

A footnote: since P & Q == ~(~P | ~Q) and P | Q == ~(~P & ~Q) (the
De Morgan laws -- you can easily verify these using truth tables), it is
possible to use either negation and conjunction or negation and
disjunction as a two-operation basis for all truth functions.  Further,
there are two one-operation bases.  Define X ~& Y as ~(X & Y) (this
operation is called NAND): X ~& X == ~X and
(X ~& Y) ~& (X ~& Y) == X & Y, so ~& can be used to define negation and
conjunction, and so to define disjunction via De Morgan and all truth
functions via the method of truth tables.  One can show in the same way
that the operation X ~| Y, defined as ~(X | Y) (NOR), is also a
one-operation basis for all truth functions.
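
All of these little identities can be checked by brute force over the
relevant truth tables; the Python sketch below does exactly that.

from itertools import product

def nand(x, y): return not (x and y)    # X ~& Y
def nor(x, y):  return not (x or y)     # X ~| Y

for X, Y in product([True, False], repeat=2):
    assert nand(X, X) == (not X)                       # X ~& X == ~X
    assert nand(nand(X, Y), nand(X, Y)) == (X and Y)   # (X ~& Y) ~& (X ~& Y) == X & Y
    assert nor(X, X) == (not X)                        # X ~| X == ~X
    assert nor(nor(X, Y), nor(X, Y)) == (X or Y)       # (X ~| Y) ~| (X ~| Y) == X | Y
print("the NAND and NOR identities hold in every row")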

We develop a set of algebraic laws adequate to convert all expressions
written in terms of our propositional connectives to the form which would
be read off a truth table (and so necessarily adequate to demonstrate all
equivalences between propositional formulas).

We want to be able to eliminate any propositional connective other
than negation, conjunction or disjunction:

X -> Y == ~X | Y
X <- Y == Y -> X
X <-> Y == (X -> Y) & (Y -> X)
X <+> Y == ~(X <-> Y)

These, together with the defining equations for NAND and NOR (if those
operations are wanted), will do the trick.
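
Each of these eliminations can be verified row by row.  In the sketch
below the truth table of -> is written out directly, so the check is not
circular; the biconditional and exclusive or of truth values are just
equality and inequality.

from itertools import product

IMPLIES = {(True, True): True, (True, False): False,
           (False, True): True, (False, False): True}

for X, Y in product([True, False], repeat=2):
    assert IMPLIES[(X, Y)] == ((not X) or Y)                   # X -> Y  == ~X | Y
    assert (X == Y) == (IMPLIES[(X, Y)] and IMPLIES[(Y, X)])   # X <-> Y == (X -> Y) & (Y -> X)
    assert (X != Y) == (not (X == Y))                          # X <+> Y == ~(X <-> Y)
print("the elimination equivalences hold in every row")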

We want to be able to convert any expression to a disjunction of
conjunctions of propositional letters and negations of propositional
letters.  These equivalences will suffice:

~~X == X
~(X & Y) == ~X | ~Y
~(X | Y) == ~X & ~Y
X & (Y | Z) == (X & Y) | (X & Z)
(X | Y) & Z == (X & Z) | (Y & Z)

The first three can be used to make sure that any negations are
negations of propositional letters.  The last two can be used to
eliminate conjunctions of disjunctions (we could of course take either
of these along with the commutative laws of conjunction and
disjunction).  Applying these rules puts any propositional formula
into ``disjunctive normal form'' (possibly with ``degeneracies'': for
example, a single letter or negation of a letter may appear in place
of a full conjunction as a ``disjunct'').
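
As a sketch of how these rules are applied in practice (an illustration
only, with formulas represented as nested Python tuples rather than in
any official notation), the two functions below first drive negations
down to the propositional letters and then distribute & over |:

# a letter is a string; otherwise a formula is ("~", A), ("&", A, B) or ("|", A, B)

def push_neg(f, negate=False):
    # drive negations to the letters using ~~X == X and the De Morgan laws
    if isinstance(f, str):
        return ("~", f) if negate else f
    if f[0] == "~":
        return push_neg(f[1], not negate)
    if f[0] == "&":
        op = "|" if negate else "&"     # ~(X & Y) == ~X | ~Y
    else:
        op = "&" if negate else "|"     # ~(X | Y) == ~X & ~Y
    return (op, push_neg(f[1], negate), push_neg(f[2], negate))

def distribute(f):
    # distribute & over | until the result is a disjunction of conjunctions of literals
    if isinstance(f, str) or f[0] == "~":
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == "|":
        return ("|", a, b)
    if not isinstance(a, str) and a[0] == "|":          # (X | Y) & Z == (X & Z) | (Y & Z)
        return ("|", distribute(("&", a[1], b)), distribute(("&", a[2], b)))
    if not isinstance(b, str) and b[0] == "|":          # X & (Y | Z) == (X & Y) | (X & Z)
        return ("|", distribute(("&", a, b[1])), distribute(("&", a, b[2])))
    return ("&", a, b)

# example: ~(P & (Q | ~R)); the printed tuple encodes ~P | (~Q & R)
print(distribute(push_neg(("~", ("&", "P", ("|", "Q", ("~", "R")))))))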

We want to be able to eliminate multiple occurrences of the same
propositional letter in a conjunction:

X & Y == Y & X
(X & Y) & Z == X & (Y & Z)
X & X == X
X & ~X == false
false & X == false

Using these, any conjunction of propositional letters and negations
can be converted either to a conjunction in which no letter occurs
more than once or to the expression ``false''.  Commutativity and
associativity of conjunction are needed to make this work, as well as
to put the final form of each conjunction into a standard order and
grouping.
In class I failed to notice that I needed commutativity and
associativity of conjunction at this point.
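
In programming terms this step amounts to de-duplicating the literals of
each conjunction, sorting them into a standard order, and watching for a
clash between a letter and its negation; a minimal Python sketch
(simplify_conjunction is an illustrative name, nothing official):

def simplify_conjunction(literals):
    # literals is a list such as ["Q", "~P", "Q"]; the result is either a
    # duplicate-free list in a standard order or the string "false"
    distinct = set(literals)                       # X & X == X
    letters = {lit.lstrip("~") for lit in distinct}
    if any(p in distinct and "~" + p in distinct for p in letters):
        return "false"                             # X & ~X == false, false & X == false
    return sorted(distinct, key=lambda lit: lit.lstrip("~"))

print(simplify_conjunction(["Q", "~P", "Q"]))      # ['~P', 'Q']
print(simplify_conjunction(["P", "~Q", "~P"]))     # false

Using a set is the programmatic counterpart of the idempotent,
commutative, and associative laws: the order and grouping of the
literals is forgotten and only the distinct literals remain.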

(If one does not like the expression ``false'' in one's language,
one can replace the axioms involving ``false'' with the axiom

X & Y & ~Y == Y & ~Y

and use an arbitrary contradiction Y & ~Y instead of ``false''.
Notice that Y & ~Y == (Y & ~Y) & (Z & ~Z) == Z & ~Z by two applications
of this equivalence; the choice of Y is unimportant, as all
contradictions are equivalent.)

We would like to be able to arrange for every letter to occur
in each conjunction:

X == (X & Y) | (X & ~Y)

(this allows us to expand a conjunction mentioning only some of the
letters into a disjunction of full conjunctions in which every letter
that occurs in the formula appears)

We would like to be able to eliminate redundant conjunctions and
rewrite the remaining ones in standard order:

X | Y == Y | X
(X | Y) | Z == X | (Y | Z)
X | X == X
X | false == X

(The already given equivalence X & ~X == false allows us to rewrite
false in terms of negation, conjunction, and disjunction if the whole
expression simplifies to false.)
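
The corresponding housekeeping at the level of the disjunction can be
sketched the same way, dropping ``false'' disjuncts and duplicates and
sorting what remains (again, simplify_disjunction is only an
illustrative name):

def simplify_disjunction(disjuncts):
    # disjuncts is a list whose members are simplified conjunctions
    # (lists of literals) or the string "false"
    kept = sorted({tuple(d) for d in disjuncts if d != "false"})   # X | X == X, X | false == X
    if not kept:
        return "false"          # every disjunct vanished, so the whole formula is false
    return [list(d) for d in kept]

print(simplify_disjunction([["~P", "Q"], "false", ["~P", "Q"], ["P", "Q"]]))
# [['P', 'Q'], ['~P', 'Q']]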
  
The previously given identity 

X == (X & Y) | (X & ~Y)

can be used in reverse to simplify tautologies down to

X | ~X

(for any letter X); for completeness we add the identity

X | ~X == true.

This set of algebraic axioms has to be adequate to prove any tautology
valid, because it amounts to the ability to compute the truth table of
any propositional formula.

We present essentially the same axioms in a different format:

double negation:

~~X == X

commutative laws:

X & Y == Y & X                      X | Y == Y | X

associative laws:

(X & Y) & Z == X & (Y & Z)          (X | Y) | Z == X | (Y | Z)

distributive laws:

X & (Y | Z) == (X & Y) | (X & Z)    X | (Y & Z) == (X | Y) & (X | Z) 

idempotent laws:

X & X == X                          X | X == X

cancellation laws:

X & ~X == false                     X | ~X == true

identity laws:

X & true == X                       X | false == X

zero laws:

X & false == false                  X | true == true

De Morgan laws:

~(X & Y) == ~X | ~Y                 ~(X | Y) == ~X & ~Y

Notice the ``duality'' between the two columns.  Some axioms appear
here which are not used in the development above.  The equation

X == (X & Y) | (X & ~Y)

from above is proved as follows:

X == X & true == X & (Y | ~Y) == (X & Y) | (X & ~Y)
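
Every law in the table can be verified mechanically by running through
the eight rows of a three-variable truth table.  The Python sketch below
does so, with True and False playing the roles of ``true'' and ``false'':

from itertools import product

laws = [
    ("double negation",   lambda X, Y, Z: (not (not X)) == X),
    ("commutative &",     lambda X, Y, Z: (X and Y) == (Y and X)),
    ("commutative |",     lambda X, Y, Z: (X or Y) == (Y or X)),
    ("associative &",     lambda X, Y, Z: ((X and Y) and Z) == (X and (Y and Z))),
    ("associative |",     lambda X, Y, Z: ((X or Y) or Z) == (X or (Y or Z))),
    ("distributive &/|",  lambda X, Y, Z: (X and (Y or Z)) == ((X and Y) or (X and Z))),
    ("distributive |/&",  lambda X, Y, Z: (X or (Y and Z)) == ((X or Y) and (X or Z))),
    ("idempotent &",      lambda X, Y, Z: (X and X) == X),
    ("idempotent |",      lambda X, Y, Z: (X or X) == X),
    ("cancellation &",    lambda X, Y, Z: (X and (not X)) == False),
    ("cancellation |",    lambda X, Y, Z: (X or (not X)) == True),
    ("identity &",        lambda X, Y, Z: (X and True) == X),
    ("identity |",        lambda X, Y, Z: (X or False) == X),
    ("zero &",            lambda X, Y, Z: (X and False) == False),
    ("zero |",            lambda X, Y, Z: (X or True) == True),
    ("De Morgan &",       lambda X, Y, Z: (not (X and Y)) == ((not X) or (not Y))),
    ("De Morgan |",       lambda X, Y, Z: (not (X or Y)) == ((not X) and (not Y))),
]

for name, law in laws:
    assert all(law(X, Y, Z) for X, Y, Z in product([True, False], repeat=3)), name
print("all", len(laws), "laws hold in every row")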

It is equally possible to convert any propositional form to a
conjunction of disjunctions of letters and negations of letters (where
a conjunction or disjunction may have one or no elements).

[We allow the convention of reading X == Y == Z as ``X == Y'' and
``Y == Z'', by analogy with the usual convention for writing chains of
equations.]

This is a presentation of the field called Boolean algebra.  It is common
to write disjunction as addition, conjunction as multiplication, true as
1 and false as 0 in this context.

This set of axioms is not minimal; for example, it is possible to
prove any axiom in the right column using the analogous axiom in the
left column, the deMorgan laws and double negation.

We prove commutativity of disjunction in this way:

X | Y == ~~(X | Y) == ~(~X & ~Y) == ~(~Y & ~X) == ~~(Y | X) == Y | X

using double negation, De Morgan's law, and commutativity of conjunction.

This is where I stopped on January 21.



Randall Holmes
2000-05-05