My passion during college was the history of mathematics and logic.
Although we tend to think of computer science as something exciting and new, it is actually part of an intellectual tradition that stretches back millennia. I believe it is best to understand programming as part of that great tradition, not as something that stands apart from history.
That said, I must caution that the history of ideas is a mixed bag. There are periods of enlightenment and long periods of stagnation and corruption. There are occasional periods of extreme destruction.
I believe that, for our society to move forward, we should study the past and encourage the trends that lead to enlightenment while finding a way to avoid trends that lead to stagnation, corruption or destruction.
The story of Socrates (469-399 BC) provides a cautionary tale. Socrates was known to engage in open inquiry. His dialogues would often delve into ethical questions such as the nature of justice. Legend tells that the intellectual climate of the day was dominated by Sophists who used sophisticated arguments to befuddle the people, and that the Socratic dialogues often exposed the tricks and fallacies of the Sophists.
Socrates was eventually put on trial by his enemies and forced to drink hemlock.
Socrates did not write down his arguments. His student Plato (circa 427-347 BC) built a reputation by writing down what were supposedly the Dialogues of Socrates. Plato was highly critical of democracy and held a notably mystical view of mathematics.
Many consider Plato's student Aristotle (384-322 BC) to be the father of logic. Aristotle was prone to studying nature and classifying natural phenomena. His studies included efforts to classify formal arguments and common fallacies.
Aristotle was particularly fond of a structure called the "syllogism." A formal syllogism involves a major premise, a minor premise and a conclusion. A major premise might be something like "Man is mortal." A minor premise might be "Socrates is a man." The conclusion is "Therefore, Socrates is mortal."
An "if then" statement is a form of the syllogism. The binary logic used in software is a continuation of a discussion that took place in Ancient Greece.
A fundamental premise of Aristotelian Logic is the "Law of the Excluded Middle." For a premise to be valid, it must resolve to a "yes" or "no."
The "Law of the Excluded Middle" is fundamental to programming. The condition used in an if/then block must resolve to "true" or "false"; otherwise the program could not proceed.
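The classic syllogism above can be written directly as a program. Here is a minimal sketch in Python (the language choice and all names are mine, not from the original essay); the point is that the condition in the if/else resolves to exactly True or False, with no middle ground:

```python
def is_mortal(name: str, men: set[str]) -> bool:
    """The syllogism as code: if name is a man, then name is mortal.

    The membership test obeys the Law of the Excluded Middle:
    it evaluates to True or False, never anything in between.
    """
    return name in men

men = {"Socrates", "Plato", "Aristotle"}

# Major premise: all men are mortal (encoded in is_mortal).
# Minor premise: Socrates is a man (encoded in the set).
# Conclusion follows from the if/then:
if is_mortal("Socrates", men):
    print("Socrates is mortal")
else:
    print("Socrates is not known to be a man")
```

The if/else here is exhaustive: because the condition can only be True or False, exactly one branch always runs, which is precisely what lets the machine proceed deterministically.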
Aristotle had a strong dislike for paradoxes and absolutes. Common paradoxes found in computer programming include infinite loops and division by zero errors. Infinite loops tie up resources and division by zero errors crash programs.
From Aristotle to the mid-1900s, education in the Western world was founded on the Trivium. The three legs of the Trivium are grammar, logic, and rhetoric.
- Grammar refers to the structure of the language.
- Logic refers to the structure of ideas.
- Rhetoric refers to the way that we use language and logic in discourse.
I find the basic structure of the Trivium to be applicable in programming. This is especially true of the division between grammar and rhetoric.
For example, I learned to program with a language called Fortran. Although the syntax of Fortran differs from that of PHP, the basic logical structures are the same. Likewise, the coding style that I used in Fortran is much the same as the coding style I use today.
I see the syntax of a program (grammar), the logic of a program (logic) and the discussion of how we use programs (rhetoric) as three different things that correspond to the same ideas in the Trivium.
If this were a class, it would be worthwhile to spend a week on the history of ideas to see how they apply to computer science. For example, the term "algorithm" comes from a 9th-century Persian mathematician named Abu Abdullah Muhammad ibn Musa Al-Khwarizmi, who built on ideas from the 7th-century Indian mathematician Brahmagupta.
The scientific method, of course, is a direct product of Aristotelian Logic. Many of the first scientists took the arguments developed by the Scholastics and applied them to the study of nature.
Other great innovations are important as well. Descartes's (1596-1650) work on Analytic Geometry provides the mathematical foundation for manipulating images on a computer screen. Newton's calculus and ideas on physics are fundamental to the physics and mathematics used in computers.
Computers did not just happen. The logic and science that we see in the information age built up over time, and we should appreciate the hard work and dedication of the people who created the foundations for this technology; those who developed these innovations often faced daunting challenges.
The Politics of Logic
If you look at the world today, you will see that people are far more interested in politics, finances and power than they are in science or reasoning.
The primary concerns of the ancient logicians tended to be personal or political. In the history of ideas one sees a great deal of talk about ethics, etiquette, social structure, religion and, above all else, governance.
Rulers who learned ethics and logic tended to be effective leaders, and their societies would thrive.
Problems occurred when the people at large learned logic. People who learn logic are prone to question their leaders and become uppity.
A common pattern in history is that a group of people will discover logic and start applying logic to their personal lives. This group will thrive.
The leaders of the new movement who rose to power would then try to kick the ladder down behind them to preserve their power.
Unfortunately, for this article I am going to skip over the many important ideas that influenced the development of our modern society in order to address an extremely dark trend that appeared in modern history.
The term "modern" is actually quite old. In philosophy, "modern" refers to ideas influenced by Immanuel Kant (1724-1804). "Modern" refers to ideas that are over 200 years old!
Kant is famous for never having left his hometown of Konigsberg to see the world, and for spending a great deal of time contemplating "pure reason." Kant wrote some extremely abstruse books which he believed created a "Copernican Revolution" in philosophy. The dense nature of his work earned him a reputation as an intellectual's intellectual, and he continues to be adored in many academic circles.
The American Revolution created a crisis for the monarchy and ruling class of Europe.
King George III, who was German and who directly funded the University of Gottingen, and his in-law Frederick the Great of Prussia tasked the German universities with creating a counter to the ideas being expressed in the American and French Revolutions.
The best example of the new ideology created by this reactionary effort is a philosopher named Georg Hegel. Now, to a large extent, Hegel was simply rewording the "Divine Right of Kings" into philosophical speak. Hegel's new philosophy has two main components.
Hegel's philosophy of history presented a world view in which the state was the driving force of history and society evolved through a predictable series of thesis/antithesis conflicts.
Hegel's second great innovation was to create a "modern logic" that lays the foundation for his conflict-driven worldview by attacking the foundations of classical logic. Hegel did pedantic things like framing "The Law of the Excluded Middle" as absolutist.
The "Law of the Excluded Middle" is not absolutism. Its purpose is to put premises into forms that we can use for reasoning.
Using an if/then statement in a computer program does not make you an absolutist. Programmers use conditions that resolve to clean true/false states for reliability and speed.
Aristotelian logic sought to avoid paradoxes. Hegelian logic makes paradoxes fundamental. The most common paradox is the reflexive paradox, as in the self-negating statement: "This sentence is false."
Aren't I clever? I stated a sentence that negates itself. (This paradox has been known since antiquity. It even appears in the Bible.)
Recursion (functions that call themselves) can be a powerful tool, but experienced programmers know that, if used improperly or unknowingly, it can lead to infinite loops that tie up computer hardware or even crash a system!
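To make the point concrete, here is a small sketch in Python (the example and its names are mine, not from the original text) contrasting recursion with a base case against recursion without one. Python happens to catch the runaway case and raise RecursionError instead of letting it crash the process:

```python
def countdown(n: int) -> int:
    """Well-behaved recursion: the base case guarantees termination."""
    if n <= 0:              # base case: stops the chain of self-calls
        return 0
    return countdown(n - 1)  # each call moves closer to the base case

def runaway(n: int) -> int:
    # No base case: every call consumes another stack frame until the
    # runtime gives up and raises RecursionError.
    return runaway(n + 1)

print(countdown(100))   # 0

try:
    runaway(0)
except RecursionError:
    print("recursion limit reached: the runtime caught the 'paradox'")
```

The self-referencing call is the programmer's version of the reflexive paradox: perfectly useful when it eventually resolves to a definite answer, and a resource-devouring trap when it does not.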