Mute Musings

mute
(myt)
  1. Expressed without speech or aural perception.
  2. The dung of birds.
mus·ing
(myzng)
  1. A product of contemplation; a thought.
  2. Persistently or morbidly thoughtful.

Saturday, November 06, 2004

Knowledge and Belief

One long-standing theory of human knowledge is that it must meet the definition of a justified true belief to be knowledge. Huh? What? Why?

For me to say "I know" something I must first believe what I say; hence knowledge must be believed.

For you to accept that I do indeed know of that of which I speak, what I say must be true; that is, if what I claim as knowledge is flat out wrong (e.g., “I know if I release an apple it will fly upward”) it cannot be knowledge. Hence knowledge must be true.

Finally, if I claim knowledge on a subject, and it turns out to be true, one must eliminate the possibility that it was a “lucky guess”. So if you toss a coin and hide the result from me, and I say “I know it is heads up”, and it is indeed heads, then did I really “know” it was heads, or just guessed right? Hence there must be some justification for my knowing before you can say it is knowledge (e.g., I gave you the coin and I knew it was a two-headed coin when I gave it to you, then indeed I “knew” the result was heads).

Now there is a tremendous amount of discussion around the web on this concept of what is "justified", and whether justification needs causality and/or absence of false premise and/or countering evidence, etc. But I don't see much discussion on the “true” or the “belief”. I won't go into the matter of "true" at this time (the argument is fairly predictable in a philosophical discussion anyway), but allow me to address the "belief".

On what is the premise based that knowledge must be believed? I have a plethora of knowledge I do not believe. The ancient phrase "take it with a grain of salt" is based on the premise that some knowledge is not to be believed.

I am justified to say that the earth goes around the sun; it is true that the earth goes around the sun; I don't happen to believe that the earth goes around the sun (that's just silly, look outside sometime and see for yourself). My lack of belief does not eliminate the fact that I have the knowledge that the earth goes around the sun.

I am justified to say that Elvis died in 1977; it is true that Elvis died in 1977; I do not believe Elvis died in 1977 (he was killed in Korea and was replaced with a double for monetary and patriotic-moral reasons).

And this is not just limited to me, Carl Sagan had a tremendous amount of true knowledge about how the universe was created (in 7 days) and how the earth was destroyed by a flood killing all land creatures (save for Noah and his menagerie), and yet I understand (know?) that he did not believe any of this (I personally don't believe this latter point; he had to know the truth, even if he had to deny it--even to himself--to hold his position in society).

I need to stop now, the logic is getting too circular and a vortex is threatening to swallow us all (of course I don't know that for certain).

Friday, November 05, 2004

Lighting a Candle for Mathematicians

In the spirit of lighting a candle rather than cursing the darkness, I offer the following bit of incendiary tallow for our mathematical friends. I have stated before that something is wrong with mathematics as it exists today. One possible genesis for this is that most of our concept of numbers comes from the "number line". This is thinking of numbers as if they live on a ruler lying before you. The number one is to the left of the number two; number four is further down to the right, past number three, etc. When the concept of negative numbers and zero came along, places were found for them still further off to the left.

Concepts of adding and subtracting numbers evolved while staring at the number line. Multiplication and division followed in the same vein. When irrational numbers came along, the number line showed its first cracks, since π and e were not definable points on the line. They could only be considered to be "between this number and that number, just about here". When they looked for the square root of negative one, it wasn't there at all, so it was dubbed an "imaginary number". Rather than go back and reconsider the number line metaphor for such inadequacies, they just accepted that the number line was right. It was irrational and imaginary numbers that had problems (hence their names; born of frustration).

What would have happened if the number line had never been conceived? Where would we have gone when negative numbers were invented? (Some would say "discovered", but we won't get into that here.) First let's think about this: positive numbers and negative numbers. We have positives and negatives throughout the universe around us. For positive electric charge there is negative electric charge. There are positive reactions, there are negative reactions. Positive numbers - negative numbers. But nature also shows us that besides positive and negative, there is also neutral. Protons are positive, electrons are negative, neutrons are neutral.

Where are the neutral numbers?

The number line never revealed them because it is a flawed metaphor. But describing nature with mathematics demanded more than the number line could accommodate. Maybe it is time to consider the neutral numbers. In fact, in some ways we already do. Often when a neutral number is needed from an equation we will instruct the mathematician to take the absolute value of the answer. Basically we are forcing an unsigned, or neutral, result where an otherwise flawed equation can only produce positives and negatives.

Let's take a simpler example. If we look at the interaction of positive and negative numbers a curious pattern unfolds:

(+1)×(+1)=(+1)
(+1)×(-1)=(-1)
(-1)×(-1)=(+1)

The series is asymmetrical. More positives shake out than negatives. This shows an incongruity with the real world right there. Look closer at the second equation: here a positive and a negative yield a negative, as if the positive one were actually a neutral one. Now look at the first equation: the two positives yield a positive. Again, the polarity of the number has no effect on the outcome; it is "neutral". So you see, we have endowed positive numbers with the attributes of neutral numbers somewhat haphazardly. Let's see if we can describe a series of equations with positive, negative, and neutral numbers that has more symmetry:

(+1)×(+1)=(-1)
(~1)×(-1)=(-1)
(~1)×(+1)=(+1)
(-1)×(-1)=(+1)
(~1)×(~1)=(~1)
(-1)×(+1)=(~1)

The tilde (~) symbol is used here as the designator for neutral numbers.

Now see how pleasing this looks. Equal quantities of positive, neutral, and negative products. Now look at the rules that appear: neutrals never affect the sign of the product; otherwise, two like signs yield the opposite sign, and opposite signs cancel each other. Perfect harmony with our understanding of nature. And the big bonus? Look! The square root of negative one! Right there! That's why simply putting the letter "i" in front of a positive number always worked to show imaginary numbers. But they're not imaginary anymore! This may not be the key to unlocking all of the shackles on modern mathematics, but doesn't it warrant a little consideration and exploration?
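
For the curious, the proposed rules can be written down and checked mechanically. The sketch below is purely the hypothetical system from the table above (the `~` neutral sign and the sign-combination rules are this blog's invention, not standard arithmetic); it only verifies the symmetry claim:

```python
from collections import Counter

# Hypothetical sign rules from the table above (NOT standard arithmetic):
# like signs yield the opposite sign, opposite signs yield neutral,
# and a neutral factor never changes the other factor's sign.
SIGN_MUL = {
    ('+', '+'): '-',
    ('-', '-'): '+',
    ('+', '-'): '~',
    ('-', '+'): '~',
    ('~', '+'): '+',
    ('+', '~'): '+',
    ('~', '-'): '-',
    ('-', '~'): '-',
    ('~', '~'): '~',
}

signs = '+-~'
products = [SIGN_MUL[(a, b)] for a in signs for b in signs]

# The claimed symmetry: each sign appears equally often among all products.
print(Counter(products))  # three of each sign

# The claimed "square root of negative one": (+1) x (+1) = (-1).
print(SIGN_MUL[('+', '+')])  # '-'
```

Whether such a system could support consistent addition and distribution is left as an exercise; the table alone only shows the symmetry.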

The Shaky Foundation of Mathematics

When we watch the Space Shuttle lift itself into the remote extremes of our atmosphere we are conscious of the triumph of technology, but we are also witnessing the triumph of mathematics. For it is mathematics that serves as the foundation and the language of science and technology. But what is "mathematics"?

The math we know today is as different from the math of the Ancient Greeks as our spoken language is from theirs. Like our spoken language, our mathematics is a tapestry of concepts and rules accumulated from around the world since prehistory. And while it is common to think that mathematics is a self-evident truth of nature which we cleverly discovered and put to use, this is far from reality.

Mathematics has been evolving ever since the first human recognized that two apples is more than one apple. And this evolution has been surprisingly slow. It took many centuries for mathematicians to accept concepts such as zero and infinity. The simple geometry of the Ancient Greeks was inadequate to solve a great many problems. It took the introduction of calculus before many problems could be reduced to a solution. And while our current collection of mathematical tricks allows us to create wonders never dreamed of before, it is far from perfect.

Modern mathematics has some major problems which have long been recognized but never properly addressed. Without venturing into too much detail, let's look at some of the obvious examples and their implications to further advancement.

Irrational Numbers - There is a class of numbers which behave very strangely. So unlike whole numbers are these that they were labeled irrational because, when they were first discovered, their primary characteristic was that they could not be represented simply as a ratio of two whole numbers. Irrational came to take its modern meaning for the very reason that mathematicians could not help but feel that something was wrong with irrational numbers. This would not be an issue except that many of these numbers are vitally important, such as π (Pi), which is needed when calculating various attributes of a circle. Unfortunately π has no exact value. While we can calculate the value of π to a degree where there is no practical impact to everyday life, there is currently no hope that an exact value exists. So it is for all irrational numbers.

The most disturbing thing about irrational numbers is the frequency in which they appear in nature. π, e, the Golden Ratio, √2, etc. All of these numbers appear over and over again in nature, and yet our system of mathematics has no exact value for them; only approximations. It is almost as if we purposely created a system of mathematics that could never describe the world around us. The existence of irrational numbers alone should force the abandonment of current mathematics as hopelessly flawed.
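
The approximation problem is easy to see concretely: any finite computation of π stops at some digit. A minimal sketch using the Leibniz series (one of many such formulas; the choice of series here is just for illustration):

```python
import math

def leibniz_pi(terms):
    """Partial sum of the Leibniz series: 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

# The error shrinks as we add terms, but never reaches zero at any
# finite number of terms -- we only ever hold an approximation.
for n in (10, 1_000, 100_000):
    approx = leibniz_pi(n)
    print(n, approx, abs(approx - math.pi))
```

Faster-converging series exist, but every one of them shares the same property: the exact value is only ever approached, never reached.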

The Square Root of a Negative Number - As we were taught in elementary school, any two like-signed numbers, when multiplied together, yield a positive number. So while we know that the square root of nine is three (3 × 3 = 9, therefore √9 = 3), there is no one number which will yield a negative nine (-9) when multiplied by itself (-3 × -3 = 9; 3 × 3 = 9). This is true for all even roots, but not for the odd roots (-3 × -3 × -3 = -27). And since there are an infinite number of negative numbers and even roots, there are an infinite number of "unexplainable" or "missing" roots. While this may seem trivial in the grand scheme of things, in mathematics it has always been a real nuisance.

To simplify the problem, mathematicians used some of the accepted rules of manipulation to reduce the number of missing roots from infinity down to one; no small feat in itself. They did this by factoring out a negative one from every number, or in other words, by changing negative nine (-9) to nine times negative one (9 × -1 = -9) they could take the square root of any negative number and express it as the square root of the number (as a positive) multiplied by the "square root of negative one" (√-1). In our example this would be √-9 = √(9 × -1) = √9 × √-1 = 3 × √-1. So now there was only one "unexplained" number, the square root of negative one. To make things easier to write, the mathematicians called this the "imaginary number" and wrote it as a lower case i or sometimes j. So now they could write √-9 = 3 × √-1 = 3 × i, or by dropping the multiplication sign as is common practice √-9 = 3i.

So a major problem–an infinite set of imaginary numbers–was reduced to a small problem–one imaginary number. Now mathematicians were free to calculate, but they weren't always sure what to do with the imaginary numbers, so they regularly discarded any answer that had an imaginary component. Other mathematicians found that they could not afford this convenience and were forced to hang onto numbers that had both real and imaginary components (now called "complex numbers" because it is not palatable to present "imaginary" answers to your peers). While this "work-around" has survived and allowed mathematics to progress, there still is no answer to the question: what is the square root of negative one?

Zero and Infinity - These "numbers" were late to the party for the simple reason that they don't follow the same rules as the other numbers; and mathematicians live and die by rules. First, while zero and infinity seem similar to whole numbers, all whole numbers are either odd or even; zero and infinity are neither (or perhaps they are both simultaneously). Whole numbers are either positive or negative; zero is neither. Whole numbers divide into themselves one time; zero and infinity do not yield this result.

Zero multiplied by any number yields zero, therefore any number divided by zero can yield any number as a result. This flies in the face of common sense and led early adopters of zero to state that "division by zero is undefined". In real-world experiments, however, scientists found that as the denominator of a function approaches zero, the quotient approaches infinity. So it was accepted that allowing the denominator to reach zero would yield an answer of infinity. This forced the development of calculus to help explain how equations behave as certain numbers in the equation move towards limits like zero and infinity.

But these are not firm answers like 2 × 3 = 6 or 9 ÷ 2 = 4½; they are more like convenient "tricks" to yield a useful answer where none would otherwise exist. So mathematicians will say that the answer to 5 divided by zero is infinity (or more precisely they will say that the answer approaches infinity when five is divided by a number that approaches zero; no need to commit too strongly when using a "trick"). But since 5 × 0 = 0 and 19,847 × 0 = 0, then by the same logic 0 ÷ 0 = 5 and 19,847 and 2 and 0 and every other number there is; hardly useful to a serious mathematician.
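
Both halves of this behavior, quotients growing without bound as the denominator shrinks, while division by exactly zero stays undefined, are directly observable; a minimal sketch:

```python
# As the denominator approaches zero, the quotient grows without bound.
for x in (0.1, 0.001, 0.00001):
    print(5 / x)  # each result is 100x larger than the last

# But at exactly zero, the operation is undefined; Python refuses outright
# rather than commit to an answer.
try:
    5 / 0
except ZeroDivisionError as err:
    print(err)
```

Programming languages thus encode the "undefined" convention literally: the limit behavior is yours to explore, but the endpoint itself is an error.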

So as you can see, there are some serious problems with mathematics as we have defined it today. And while mathematicians have created clever tricks to work around these problems, we may be facing some serious limitations and outright errors as a result.

The reason that these tricks survive is because they were developed against the problems faced by human beings on planet earth. The scale of our bodies and the things we interact with on a daily basis–baseballs, automobiles, airplanes, etc.–all allow these tricks to yield acceptable answers on a consistent basis. But our scientists have left our familiar scale and are working with galaxies and atoms, and here our tricks are catching up with us.

Our flawed mathematics has our scientists fumbling to understand important fundamental concepts whose explanations are approaching the ludicrous as a result. We can't determine if light is a wave or a particle. Galaxies seem to be missing more matter than they contain. The existence of gravity depends upon the existence of imaginary particles having no size or mass. We are at a point where the mathematics allows situations to exist where cats are dead and alive simultaneously as long as no one looks at them. Where we all exist in an ever-increasing number of parallel universes, with that number approaching infinity by adding an infinite number of universes every day. Where people flying to the edge of the solar system and back at a velocity near the speed of light will take generations to return to earth, while going at a slower rate of speed would result in a quicker trip.

All of these problems stem from our unwillingness to throw away our current system of mathematics and start fresh. And while no one knows what the new mathematics will be, we can state a few of the properties it will have:
  1. No imaginary numbers - Absolute truths must exist and be applicable to real situations.
  2. No irrational numbers - A structure of fixed dimensions must have absolute values associated with it.
  3. No ambiguity between whole numbers, zero and infinity. All must follow the same rules if they are to be considered.
  4. No infinite sets of infinite sets - Any set must be less than or equal to infinity, not less than and equal to infinity as today's flawed system allows.
  5. No need to discard an answer - Where today a calculation can yield an answer like "the cat weighed 3 kg and -12 kg" leaving it to us to recognize that the former is the only correct answer, a less flawed system would yield only the correct answer.

This is in no way meant to be an exhaustive commentary on this subject, rather it is hoped that you will appreciate that much more needs to be done toward this end.


Ignorance of Uncertainty

There is a branch of science which concerns itself with understanding how the universe works. It is called physics and it has been around since man achieved the ability to ask "how?" Along the trail of physics from that time have come the likes of Sir Isaac Newton and his famous apple on the head. He did much to create "rules" that said "this is how things work, and you can count on it". This allowed other sciences to create great works from artillery cannons to airplanes. His rules worked fine for a long time and are still relevant for most things in our everyday lives.

A problem arose when physicists went from looking at macro things like planets and baseballs, and started looking at micro things like atoms and electrons. When we watch a baseball fly through the air, we can see it because light pours from the sun or stadium lights, hits the ball and bounces off to be absorbed by the backs of our eyes. We are sure of the position, direction, and speed of the baseball because the light bouncing off travels much faster than the ball and yet the light does not change how the ball moves no matter how bright or dim the light may be.

Or so we thought.

Once the baseball was reduced to the size of an electron, all of a sudden the light started pushing it around. The very act of looking to see where it is - changes where it is. All we can hope to know is where it was. A very smart physicist, Werner Heisenberg (1901 - 1976), studied this problem very closely. Unfortunately he used twentieth-century mathematics rather than electrons and light. This led him to conclude that the more certain one was about the position of our tiny baseball, the less certain one would be about which way it was going. This became known as the "Uncertainty Principle" and is all the rage in mathematical physics. It basically excuses anyone from providing a correct answer. Can you imagine if home plate umpires were given this luxury? An umpire might say that the ball moved through the strike zone, but couldn't say in which ballpark. Thank goodness umpires do not rely on mathematics.
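
For reference, the standard textbook form of the principle being mocked here is Δx · Δp ≥ ħ/2. A sketch of the numbers it produces (the constant is the standard CODATA value; the two scenarios are illustrative choices, not from the original text):

```python
# Standard Heisenberg bound: delta_x * delta_p >= hbar / 2.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s (CODATA)

def min_momentum_spread(delta_x):
    """Smallest momentum uncertainty allowed for a given position uncertainty."""
    return HBAR / (2 * delta_x)

# An electron pinned down to about one atomic diameter (1e-10 m):
print(min_momentum_spread(1e-10))  # roughly 5.3e-25 kg*m/s

# A baseball located to within a millimeter: the bound is so tiny that
# the effect is utterly invisible at everyday scales.
print(min_momentum_spread(1e-3))   # roughly 5.3e-32 kg*m/s
```

Which is exactly why the umpire never has this problem: at baseball scales the bound is fantastically smaller than anything measurable.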

This pitfall of physics is rooted in the concept of probability statistics, a terribly flawed branch of mathematics that continuously states the absurd with a straight face. We often hear probability statistics quoted in connection with gambling and the weather. If we took statisticians at their word, we would expect to be hit by lightning three times before winning the lottery. However, an informal poll of lottery winners finds that very few have ever been struck by lightning even once. At the heart of this dilemma is a subtle but critical misunderstanding.

When one flips a quarter, probability statisticians say that there is a 50% chance it will land heads up, and a 50% chance it will land tails up. This is absolutely wrong! There is a 100% chance it will land the way it lands, and a 0% chance it will land any other way. Just like when you buy a lottery ticket there is a 100% chance you will win or a 100% chance you will lose. All probability can do is suggest the likelihood that you can guess the outcome correctly prior to learning the truth. As another example consider the drawing of lots. You and four friends write your names on slips of paper and place them in a hat. One name will be drawn. Probability statisticians would say that each individual has a 20% chance of having their name drawn.

Actually one person has a 100% chance that their name will be drawn and four people have a 0% chance of the same. All probability calculations can offer is to predict the likelihood that you can guess the correct person's name prior to learning the results of the event. The calculations have no effect on the event itself.
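
This reading, that the 20% figure describes only how often an advance guess turns out right, can itself be simulated. A sketch of the lots-drawing example (the names, seed, and trial count are arbitrary choices for illustration):

```python
import random

random.seed(42)  # arbitrary seed, for repeatable output
names = ["Alice", "Bob", "Carol", "Dave", "Eve"]  # hypothetical friends

trials = 100_000
correct = 0
for _ in range(trials):
    my_guess = "Alice"            # guess committed before the drawing
    drawn = random.choice(names)  # the drawing itself
    correct += (drawn == my_guess)

# Every individual drawing produced exactly one winner with certainty;
# the 20% only describes how often the advance guess matched the outcome.
print(correct / trials)  # close to 0.2
```

Note the simulation is neutral on the philosophical point: it shows the 20% emerging from guess-versus-outcome bookkeeping, which is precisely the role the text above assigns to probability.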

So if we go back to our misguided physicists we can look again at how they got into this mess. By misinterpreting statistics, they accepted the idea that small things like electrons behave according to probabilities where large things do not. Because we cannot see an electron they say that it can never (and they mean NEVER) have a specific position and direction (they call the direction momentum). When really they mean we just haven't found a way to guess its position and momentum without looking at it. The fact remains that the electron has an exact position and momentum even if we are ignorant of their values.

But since when is science about living with guesses? It may start with guesses, but it needs to end in realities. Physics has gone so far off track with mathematical probabilities taken as reality that some purport that every time a speck of light shines from something, an entire parallel universe spawns identical to our own, right down to the broken lace on my shoe. Even worse, they say (with a straight face) that if you flip a coin and clamp your hand over it without seeing which side is up, the coin is BOTH heads up and tails up at the same time, until we peek under our hand (a slight simplification of the "theories" but accurate enough for the point made).

So how do we rein in these well-meaning and hard-working physicists? (This math stuff they do is neither fun nor easy.) I can't guarantee this as a foolproof solution, but I think it's worth a try. The problem lies in the subtleness of the language. "Uncertainty Principle"--it sounds so cool. Some of the greatest minds of the twentieth century never hesitated to stand up and challenge their peers with the cry "but that idea violates the 'uncertainty principle'!".

Let's think about uncertainty. When are we uncertain? We are uncertain when we are ignorant of some number of facts. We are uncertain of the safety of diving across a puddle when we are ignorant of its depth. We are uncertain of the spelling of a word if we are ignorant of the word's origin, use, and the rules of spelling.

Our degree of uncertainty is directly proportional to our ignorance. That is, the greater our ignorance the greater our uncertainty. If asked how to greet someone in Hawaii, you may think "my ignorance of such things is small and therefore my uncertainty is small, I would not hesitate to say Aloha". If asked how to greet someone in Cameroon your ignorance--and thus uncertainty--might be larger.

I suggest that if we rename the "Uncertainty Principle" to the equivalent "Fundamental Ignorance", and always use the term ignorance in place of uncertainty, the physicists themselves will resolve the matter in due time. After all, who would stand up and proudly challenge "That idea violates my fundamental ignorance"?

Now let's look at the concept of statistical "probability". Probability is a useful tool when looking at general trends in large groups. Insurance companies and gambling casinos make their living by probability calculations, but only for general trends across large groups. You will never find anyone from either industry claiming that a probability calculation would be valuable if they had only one customer for one day. They need large groups of people over long periods of time.

So why do physicists insist that probabilities are applicable to individual entities and events? Because their mathematics tells them so, or rather it offers no "real" answers, so they have no choice but to take what they get. The mathematics is incapable of the accuracies required and thus leaves them ignorant of the truth. So how do we correct this? Let's try a variation of our treatment of uncertainty. While uncertainty was directly proportional to ignorance (as one grows larger, the other grows larger), probability is inversely proportional to ignorance.

Consider meteorologists tracking a hurricane. When predicting the path of the storm, they employ statistical probabilities. They know that the storm exists somewhere specific, and they know that it will end up some place specific, but they are ignorant of the details before they occur. Now the science of meteorology has spent decades educating itself by observing nature. In doing so, the level of ignorance is smaller now than in the past. As a result, a meteorologist is capable of more accurate predictions about the path of the storm and where it is likely to come ashore.

They convey these predictions by mapping the probability that the storm will cross a certain point. The point they feel is the most likely landing site will have the highest probability. Points off of that course will have lower probability. For example, traveling indefinitely in a figure eight would have a very low probability. They can state this because they have a lot of knowledge that hurricanes usually travel in an east to west direction. To state it in terms of ignorance, while meteorologists are not so ignorant as to think that hurricanes travel in figure eights, they are ignorant enough to acknowledge that it could hit both Maine and Florida while they are most confident that it will hit North Carolina. They do not say that the storm can't hit Maine because they are still somewhat ignorant of why storms do not follow a specific path. But they are less ignorant of the reasons it would keep a smooth course under the given conditions, and thus they state that it is more probable the storm will hit NC. As ignorance goes down, probability goes up.

So to help our physicists, let's describe probability in terms of ignorance. Instead of saying "there is a low probability of this event happening" we would say "I am highly ignorant of the reasons such an event would occur". This subtle change would again leverage their human pride to come up with precise answers such as "I am 100% sure of what will happen because I am 0% ignorant".

"Those who look up at the night sky and see a moon instead of a cloud of matter waves, just do not understand quantum mechanics." - The Muser