
Category Archives: Logic

Yesterday I finished examining eight examples from How to Prove It.  I have reviewed the first two chapters of the book, worked the examples through Section 3.5, and started Section 3.6 on existence and uniqueness proofs.  The first example was an existence/uniqueness proof involving three equivalent statements, and it took a while to thoroughly understand the equivalences.  The scratch work shows how the proof is developed.  The book provides the best explanations I have found for creating and understanding proofs, because the author guides the reader through a structured approach.

Compared to other materials that I have studied on proofs, this book delays mathematical induction until near the end of the book.  In mathematical analysis, the well-ordering principle is crucial for mathematical induction.  In one of my materials, the author derives the principle of mathematical induction from the well-ordering principle.
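A sketch of that derivation (my paraphrase, not a quotation from any of the materials): assume the well-ordering principle, suppose $P(1)$ holds and $P(n) \Rightarrow P(n+1)$ for all $n$, and suppose $P$ fails somewhere.

```latex
% Induction from the well-ordering principle (sketch):
\text{Let } S = \{\, n \in \mathbb{N} : \neg P(n) \,\} \text{ and suppose } S \neq \emptyset.
\text{By well-ordering, } S \text{ has a least element } m.
\text{Since } P(1) \text{ holds, } m > 1, \text{ so } m-1 \in \mathbb{N} \text{ and } m-1 \notin S,
\text{i.e., } P(m-1) \text{ holds.  But then } P((m-1)+1) = P(m) \text{ holds, contradicting } m \in S.
\text{Hence } S = \emptyset, \text{ and } P(n) \text{ holds for all } n.
```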

A common theorem that I have seen is that m squared is even if and only if m is even.  This particular theorem is useful if you are trying to prove that the square root of 2 is irrational.  That proof (a proof by contradiction) starts by assuming that the square root of 2 equals a rational number m/n, where m and n have no common factors.  As you work through the proof, the aforementioned theorem shows that m and n do share a common factor, which contradicts the original assumption.
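For reference, a condensed version of that argument:

```latex
\text{Suppose } \sqrt{2} = m/n \text{ with } \gcd(m, n) = 1. \text{ Then } m^2 = 2n^2,
\text{so } m^2 \text{ is even, and by the theorem } m \text{ is even: } m = 2k.
\text{Substituting, } 4k^2 = 2n^2 \Rightarrow n^2 = 2k^2, \text{ so } n \text{ is even as well.}
\text{Thus } 2 \mid \gcd(m, n), \text{ contradicting } \gcd(m, n) = 1.
```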

In conclusion, understanding mathematical proofs is the key to understanding advanced mathematics.

A few weeks ago I started reading a math book called Mathematical Analysis.  I read the first section on sets and then read a proof.  After reading the proof, I said, huh?  It did not make sense to me.  So began my journey into discovering what a proof is, how to read and analyze one, and how to write one.  I have not dealt with this topic since high school geometry.  I have a copy of Prove It: The Art of Mathematical Argument from the Great Courses to start my journey toward understanding proofs.

The following story is quoted from N. Wirth’s Algorithms + Data Structures = Programs (Wirth 1976).

I married a widow (let’s call her W) who has a grown up daughter (call her D).  My father (F), who visited us quite often, fell in love with my step daughter and married her.  Hence, my father became my son-in-law and my step daughter became my mother.  Some months later, my wife gave birth to a son (S1), who became the brother-in-law of my father, as well as my uncle.  The wife of my father, that is, my step daughter, also had a son (S2).

Using predicate calculus, create a set of expressions that represent the situation in the above story.  Add expressions defining basic family relationships, such as the definition of father-in-law, and use modus ponens on this system to prove the conclusion that “I am my own grandfather.”
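A partial sketch of what such a formalization might look like (the predicate names are my own choices, and step-relations are collapsed into Father/Mother for brevity – the full exercise requires more relations than this):

```latex
% Facts (i = I, w = the widow, d = her daughter, f = my father):
\mathit{Married}(i, w) \quad \mathit{Daughter}(d, w) \quad \mathit{Father}(f, i) \quad \mathit{Married}(f, d)
% Rules:
\forall x\,\forall y\,\forall z\; (\mathit{Married}(x, y) \wedge \mathit{Daughter}(z, y) \rightarrow \mathit{Father}(x, z))
\forall x\,\forall y\,\forall z\; (\mathit{Father}(x, y) \wedge \mathit{Married}(x, z) \rightarrow \mathit{Mother}(z, y))
\forall x\,\forall y\,\forall z\; (\mathit{Father}(x, y) \wedge \mathit{Mother}(y, z) \rightarrow \mathit{Grandfather}(x, z))
% Modus ponens then yields Father(i, d), Mother(d, i), and finally Grandfather(i, i).
```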


In the Google group Artificial Intelligence in Action, I found the following lecture in Knowledge Representation.  Enjoy.

I am reading the book Logic, Programming and Prolog (2nd ed).  After completing the preliminaries chapter, much of the material was familiar, since it defines predicate calculus and logical inference.  Authors Ulf Nilsson and Jan Maluszynski present the material with mathematical rigor, developing a series of definitions to build up the language of predicate calculus and inference.  The familiarity was due to my earlier studies of the material presented in Luger and Stubblefield (1992) and Russell and Norvig (1995).

The authors continue with SLD-resolution along with unification.  They illustrate a unification algorithm as well as proof trees.  Afterwards, negation is discussed, and from it they develop SLDNF-resolution.  Cut and arithmetic are also covered; if used improperly, cuts can make programs unsound and yield false answers.  All of this is tied into logic programming theory, with theorems and propositions drawn from various sources.  The goal of the book is to teach programmers how to write logic programs that are sound, correct, and finite.

I have found this book interesting thus far and a good introduction to logic programming and Prolog.

I was reading Chapters 6 and 7 of the AIMA book (Russell and Norvig 1995).  They describe propositional calculus, which from their perspective is a weak form of logic.  The authors start with the basics and build on the fundamentals by presenting the language in Backus–Naur Form (BNF).  They use the Wumpus World as an example and demonstrate that it would take at least 6,400 propositional logic statements to adequately describe the 4×4 Wumpus World.

In the next chapter, the authors discuss predicate calculus and show its BNF.  The main focus is first-order predicate calculus, which they then apply to the Wumpus World example.  They discuss the frame problem and situation calculus, using predicate calculus to describe both what changes and what stays the same between two situations; the latter is captured by the frame axioms.
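As a generic illustration of a frame axiom in situation calculus (my own formula in the spirit of the Wumpus World, not one quoted from AIMA): a property unaffected by an action simply persists into the resulting situation.

```latex
% If the agent has the arrow and the action taken is not Shoot,
% the agent still has the arrow in the resulting situation:
\forall a\,\forall s\; \big(\mathit{HaveArrow}(s) \wedge a \neq \mathit{Shoot} \rightarrow \mathit{HaveArrow}(\mathit{Result}(a, s))\big)
```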

The discussion tied the problem and the predicate calculus together neatly, unlike the Luger book that I read earlier.  In Luger, the frame problem was used to illustrate robotic planning in the blocks world.

My goals are to run the AIMA Wumpus World Lisp code and to complete the predicate calculus description of the Wumpus World problem.

Reading Chapter 11 of the Luger book, I originally skimmed over the topic of the Logic Theorist (LT) and refutation resolution. The chapter is devoted to automated reasoning and theorem proving. Logic Theorist was a symbolic pattern-matching program with basic theorem-proving capability.  I was curious about LT and found it in Wikipedia, which gave a brief overview of the program, its inventors (Newell, Shaw, and Simon), and a hyperlink to the inventors' original technical report, dating back over 50 years. After reading the report, I found it very interesting that the creators of LT had written a production system for proving mathematical theorems based on five axioms. The work was ahead of its time, since the only programming language available was IBM assembly language.

So, can LT be recreated using modern-day techniques and programming languages? Yes, since it is basically a production system with symbolic pattern matching.
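As a toy illustration of that claim – not a reconstruction of LT itself – here is a minimal production system in Python that forward-chains over symbolic facts via modus ponens (the propositions and rules are my own examples):

```python
# A toy production system: facts are strings, rules are (premises, conclusion)
# pairs, and the engine forward-chains via modus ponens until nothing new fires.

def forward_chain(facts, rules):
    """If every premise of a rule is a known fact, assert its conclusion.
    Repeat until no rule adds anything new, then return all derived facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (("p", "p -> q"), "q"),   # a modus ponens instance
    (("q", "q -> r"), "r"),
]
derived = forward_chain({"p", "p -> q", "q -> r"}, rules)
print(sorted(derived))  # ['p', 'p -> q', 'q', 'q -> r', 'r']
```

LT itself worked backward from the goal and used heuristics to prune the search, but the core machinery – matching symbolic patterns and applying inference rules – is the same idea.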

Recently, while reviewing a blog on knowledge representation and predicate calculus, I noticed that the writer handled the topic of unification and resolution theorem proving poorly.  The writer stated that unification has to do with matching Prolog data structures – that is true for the Prolog programming language, since unification and resolution are built into the language, but it is not the whole story.  At first I thought I understood the unification algorithm provided by Luger (1993).  Nevertheless, upon reviewing the unification section, I realized that I had missed some of the critical points, so I updated my OneNote notebook by adding a new page discussing unification and extracting the important concepts.  In addition, I added details to the unification algorithm.

Before we begin the unification topic, please review the recommended Wikipedia articles on propositional calculus and predicate calculus.  The links are excellent.

Unification and the resolution principle were developed and published in 1965 by John Alan Robinson while he was a visiting researcher at the Argonne National Laboratory’s Mathematics Division.  His paper A Machine-Oriented Logic Based on the Resolution Principle is the basis for all resolution theorem provers and eventually led to the development of the Prolog programming language.

Basically, unification is the ability of an inference system to match two expressions.  In propositional calculus this task is trivial.  However, in predicate calculus the task is more involved, since a number of issues must be resolved.  First, existentially quantified variables need to be replaced by constants in the logical database.  Next, bindings for the universally quantified variables can be computed, if possible.  Another issue arises if a variable occurs within a function of itself, which could lead to an infinite assignment – thus the occurs check is required to test for this condition.  Once a variable is bound, that binding must be applied consistently; otherwise the result is less general than it could be.  There is also the issue of composing substitutions.  Finally, there is the concept of the most general unifier (mgu).
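A few small examples of these ideas (my own, using uppercase letters for variables):

```latex
\mathrm{unify}\big(p(X, f(Y)),\ p(a, f(b))\big) = \{a/X,\ b/Y\}
\mathrm{unify}\big(X,\ f(X)\big) = \mathit{fail} \quad (\text{occurs check})
\mathrm{unify}\big(p(X),\ p(Y)\big) = \{Y/X\} \ \text{is the mgu; } \{a/X,\ a/Y\} \text{ also unifies the pair but is less general.}
```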

At last, the unification algorithm in pseudocode.  The unify function requires the two expressions to be in the form of s-expressions.  For example,

p(X) becomes (p X), p(a, f(b, c)) becomes (p a (f b c)).

The above format makes it easier to work with a functional language such as LISP.  Here is my version of the unify pseudocode:

function unify(s1, s2) returns fail, {}, or a set of substitutions
     if both s1 and s2 are constants or the empty list,
               then if s1 = s2 return {} else return fail
     if s1 is a variable,
               then if s1 occurs in s2 return fail else return {s2/s1}
     if s2 is a variable,
               then if s2 occurs in s1 return fail else return {s1/s2}
     if either s1 or s2 is empty, then return fail
     he1 <- first element of s1
     he2 <- first element of s2
     result1 <- unify(he1, he2)
     if result1 is fail, then return fail
     apply the substitutions in result1 to the rest of s1 and the rest of s2
     result2 <- unify(rest of s1, rest of s2)
     if result2 is fail, then return fail
     return the composition of result1 and result2

I have presented the pseudocode for a version of the unify function, which is a generalized symbolic matching algorithm; various versions of the unify function appear on the Internet, in Luger (1993), and in Winston and Horn (1989).  Finally, much research has been done to make the unification algorithm run in as close to linear time as possible.
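A runnable sketch of the pseudocode above, in Python rather than LISP.  The conventions are my own, not Luger's: variables are strings starting with an uppercase letter, constants are other strings, compound terms are Python lists (the s-expression format shown earlier), and a substitution set is a dict mapping variable to term.

```python
# A Python rendering of the unify pseudocode above.
# Variables: uppercase strings.  Constants: other strings.
# S-expressions: Python lists, e.g. p(a, f(b, c)) -> ["p", "a", ["f", "b", "c"]].

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def occurs_in(var, term):
    """Occurs check: does var appear anywhere inside term?"""
    if term == var:
        return True
    if isinstance(term, list):
        return any(occurs_in(var, t) for t in term)
    return False

def substitute(subst, term):
    """Apply a substitution to a term."""
    if is_var(term):
        return subst.get(term, term)
    if isinstance(term, list):
        return [substitute(subst, t) for t in term]
    return term

def unify(s1, s2):
    """Return a substitution dict unifying s1 and s2, or None for fail."""
    if not isinstance(s1, list) and not isinstance(s2, list):
        if s1 == s2:                      # identical constants or variables
            return {}
    if is_var(s1):
        return None if occurs_in(s1, s2) else {s1: s2}
    if is_var(s2):
        return None if occurs_in(s2, s1) else {s2: s1}
    if not isinstance(s1, list) or not isinstance(s2, list):
        return None                       # distinct constants, or constant vs list
    if s1 == [] or s2 == []:
        return {} if s1 == s2 else None
    result1 = unify(s1[0], s2[0])         # unify the heads
    if result1 is None:
        return None
    rest1 = [substitute(result1, t) for t in s1[1:]]
    rest2 = [substitute(result1, t) for t in s2[1:]]
    result2 = unify(rest1, rest2)         # unify the substituted rests
    if result2 is None:
        return None
    # Compose: apply result2 to the bindings of result1, then merge.
    composed = {v: substitute(result2, t) for v, t in result1.items()}
    composed.update(result2)
    return composed

# p(X, f(b, c)) unified with p(a, f(b, Y)):
print(unify(["p", "X", ["f", "b", "c"]], ["p", "a", ["f", "b", "Y"]]))
# -> {'X': 'a', 'Y': 'c'}
```

Note how the occurs check makes `unify("X", ["f", "X"])` fail rather than loop forever, and how the composition step keeps earlier bindings consistent with later ones.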


