Wednesday, August 03, 2005

Dawkins' Weasel Algorithm, Revisited

In a previous article, "Information From Randomness?", I discussed Dawkins' "Weasel" Algorithm, which Richard Dawkins claimed was a demonstration or proof that evolution was inevitable. Here, we discuss the algorithm further.

Each character position (or column in the list of states) of the "Weasel" Algorithm can be described as an independent Markov process.

A Markov process has a set of states, and there are fixed probabilities for all possible transitions from one state to another. Take, for example, the process of tossing a die until it comes up "5".
The diagram at left describes this as a Markov process. The states are represented by the numbered circles. The interconnecting lines with arrow-heads represent one-way transitions from one state to the next. Lines without arrow-heads represent a pair of transitions in opposite directions. Note that the looping arrow-lines indicate that a state may sometimes transition to itself -- for example, when the die is "1" and on the next toss is "1" again. Generally, probability values are written next to the transition lines in a Markov graph, but here we will simply state that all the transitions exiting any state are equally probable.

In this case, state "5" has no exits except the transition to itself, because we stop tossing the die when it comes up "5". This is called an absorbing state.

Markov processes can be analyzed by means of matrix algebra and graph theory, and in the case (as here) where there is one absorbing state which is reachable (there is a path to it) from all other states, it can be proved that the process will always end in that state.
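This absorption behavior is easy to check numerically. Here is a minimal Python simulation of the die-tossing chain; for a fair die, the expected number of tosses until absorption works out to 6, and the simulated average comes out close to that:

```python
import random

def toss_until_five(rng):
    """Walk the Markov chain: toss a fair die until it shows 5 (the absorbing state)."""
    tosses = 1
    while rng.randint(1, 6) != 5:
        tosses += 1
    return tosses

rng = random.Random(42)
mean = sum(toss_until_five(rng) for _ in range(20000)) / 20000
print(round(mean, 2))   # close to 6, the expected absorption time
```

Every simulated run ends in state "5", illustrating the theorem: when one absorbing state is reachable from every other state, the process always ends there.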

Let us return to our statement that each character position of the "Weasel" Algorithm can be described as an independent Markov process. Each of these Markov processes is similar in form to the one illustrated in the transition graph above, except that it has 53 states instead of six. The template (the target phrase) determines, for each character position (each independent Markov process), which state will be the absorbing state. So, as we said earlier, the mathematics of Markov processes can be used to prove that each independent process will stop in its absorbing state. Since the template determines the absorbing states, it also determines the results before the independent processes even start.
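A minimal Python sketch of this per-character, "locking" reading of the algorithm may help. (This is a simplification: it uses a 27-character alphabet for brevity, and Dawkins' published version mutates copies of a whole phrase rather than locking characters one by one; the point here is only that each position behaves as an independent chain with a predetermined absorbing state.)

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "   # 27 symbols, for simplicity
TARGET = "METHINKS IT IS LIKE A WEASEL"

def weasel(rng):
    """Per-character 'locking' variant: each position re-rolls until it hits
    the target character, which then acts as that chain's absorbing state."""
    phrase = [rng.choice(ALPHABET) for _ in TARGET]
    generations = 0
    while "".join(phrase) != TARGET:
        generations += 1
        phrase = [c if c == t else rng.choice(ALPHABET)
                  for c, t in zip(phrase, TARGET)]
    return generations

g = weasel(random.Random(1))
print(g)   # the process always terminates at the target phrase
```

Note that `TARGET` appears in the code before any random number is drawn: the outcome is fixed by the definition of the process, exactly as with the die that stops at "5".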

Let us return for a moment to the process of tossing a die until it comes up "5". It should be obvious that since we mentioned "5" in the definition of the process, there is no need to do anything random to get the result "5", because "5" is selected before we even start. Likewise for Dawkins' "Weasel" Algorithm, the random events do not select the result, because we selected it when we defined the template, or target phrase.

Thus we have shown that the information of the result of Dawkins' "Weasel" Algorithm does not come from the randomness, but from the definition of the explicit form of the algorithm. That is, the information is present before the process even starts.

It is also easy to show that Dawkins' "Weasel" Algorithm does not come close to approximating an evolutionary process. This is because the processes at each character position are independent. This is equivalent to saying that each bone, each muscle, etc. of an animal evolves independently, which of course is preposterous.



In the future, I will discuss more serious algorithms that seek to simulate or model evolution. However, there will be a delay, because I will shortly be off on a one-week trip, and when I return, my computer will be off for some serious repair. (Right now, it doesn't do much more than a WebTV.)

Tuesday, August 02, 2005

Why Music Sounds Musical

I wrote a piece called "Some Musical Theory, from a Christian Perspective" that discussed some of the mathematical aspects of musical theory, and I related that theory to some of the physics of how musical sound is made and how our ears and brains are constructed to hear it. It demonstrates how God designed us to recognize and appreciate music. Much of this I learned from a wonderful book called The Science of Musical Sound by John R. Pierce.

I regret that the wonderful message of that piece is not accessible to those who are not inclined to mathematics. I sympathize with them because in my career as an engineer, I have sometimes been 'dragged kicking and screaming,' as I have often said, into the intricacies of mathematics because it was necessary to get my work done.

So here I will try to extract the basic ideas of that piece, leaving behind the equations and as much math as I can, and present them in a more palatable form. Here goes...

Making Musical Sound

Music is made by vibrations: sometimes the shaking of strings, or sometimes of surfaces, but always involving the shaking of air, because sound is a moving variation of air pressure. The notes with higher pitch involve faster shaking, so that we can measure the pitch by measuring the number of shakes per second, called the frequency.

A string pulled tightly between two anchor-points can vibrate, as in many musical instruments. The frequency (pitch) will increase with increased tension, with a lighter string, or with decreased length between the anchor-points. (The bass strings of a piano are wrapped with heavy wire to save making the piano larger.) For the same weight and tension, a string half as long will have twice the frequency.
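All three effects are captured by the textbook formula for an ideal string, f = (1/2L)·sqrt(T/μ), where L is the length, T the tension, and μ the weight per unit length. A small Python illustration (the string parameters are made up, chosen only to be guitar-like):

```python
import math

def string_frequency(length_m, tension_N, mu_kg_per_m):
    """Fundamental frequency of an ideal stretched string: f = (1/2L) * sqrt(T/mu)."""
    return math.sqrt(tension_N / mu_kg_per_m) / (2 * length_m)

f1 = string_frequency(0.650, 60.0, 0.001)   # hypothetical full-length string
f2 = string_frequency(0.325, 60.0, 0.001)   # same string, half as long
print(f2 / f1)   # 2.0: half the length gives twice the frequency
```

Raising the tension or choosing a lighter string raises the frequency in the same way, through the square root in the formula.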

Air vibrates inside a long tube or pipe in other instruments. A pipe half as long will have twice the frequency, as for the strings.

When a string is plucked or struck or bowed near one end, the energy given to the string travels down the length of the string and bounces back and forth between the two anchor-points. When the air in a tube or pipe is made to vibrate by blowing over a hole or by a vibrating reed near one end, the energy given to the air travels down the length of the tube and bounces back and forth between the two ends of the tube. So for a longer string or pipe, the energy must travel further, slowing the vibration and making a lower frequency.

Modes of Vibration

In its simplest mode of vibration, a string bends back and forth between two positions represented by the solid and dotted lines in the next diagram:
But the string can also bend back and forth between these two positions:
... Or bend back and forth between these two positions:
... And so forth.

The same modes occur in the case of air vibrating in a pipe. The first mode is called the fundamental, and the others are called overtones. The first overtone, where the string vibrates in two sections, has twice the frequency of the fundamental. The second overtone has three times the frequency, etc.

An example (for musicians): If the fundamental were middle C, the first overtone is C one octave higher (twice the frequency). The second overtone is the G a fifth interval higher (3/2 the frequency) than the first overtone. The third overtone is the next C a fourth interval higher (4/3 the frequency) than the second overtone. Notice that we get a lot of frequency ratios: 2/1, 3/2, 4/3, etc.
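The ratios in this example can be checked with a few lines of Python (taking middle C as approximately 261.63 Hz):

```python
# Fundamental and first three overtones of middle C: the overtone
# frequencies are whole-number multiples of the fundamental.
middle_c = 261.63                                   # Hz, approximate
partials = [middle_c * n for n in range(1, 5)]      # C4, C5, G5, C6
ratios = [b / a for a, b in zip(partials, partials[1:])]
print(ratios)   # [2.0, 1.5, 1.333...]: octave, fifth, fourth
```

The successive ratios 2/1, 3/2, 4/3 are exactly the octave, fifth, and fourth intervals described above.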

In most cases, especially when the string is plucked, struck, or bowed near the end of the string, the string actually vibrates in all these modes at the same time. That is, the actual motion is the sum of all these simple motions, with the overtones generally weaker as they go higher.

Hearing Tones

If the tones of the fundamental and overtones were sounded by separate strings, our ears would hear a chord (a set of related notes), but when these tones come from a single vibrating string, we hear one note with a rich sound that is more interesting than hearing only the fundamental. Another way of describing it is that we hear one note having the fundamental tone, but enriched by the overtones.

We hear the chord because the separate strings normally are not perfectly tuned, and we can hear that the frequency ratios are not exact. We hear the single rich-sounding note because the tones are 'locked' by perfect ratios, and our ears can hear the difference. How can our ears do this?

Sound entering one of our ears shakes our ear-drum, which shakes the three tiniest bones in our body (all three fit on a dime), which act as an adjustable lever, or volume control. These tiny bones transfer the sound vibrations to the cochlea, a snail-shaped organ in the inner ear. The cochlea is a coiled and tapered tube, with the fattest end connected to the bones.

If the tube of the cochlea were sliced, it would look like the diagram on the left, which shows the interior of the tube divided into three regions (shown as gray) filled with fluid: the vestibular canal (top), the tympanic canal (bottom), and the cochlear duct (middle). The inner-most bone (the stapes) is attached to the oval window at the fat end of the vestibular canal. The sound pressure travels down the vestibular canal to the small end of the cochlea, where it connects to the tympanic canal. Then the sound pressure travels back up the tympanic canal to the fat end of the cochlea, ending at the round window below the oval window.

The pressure wave travelling down the top side reacts with the pressure wave travelling up the bottom side through the vestibular membrane and basilar membrane that separate them. The result is that high frequency tones shake the basilar membrane near the fat end of the cochlea, and low frequency tones shake the basilar membrane near the small end. Hair cells connected to nerve fibers leading to the brain detect the vibration of the basilar membrane. When we hear a complex sound with many frequencies, we hear each frequency component at a different place along the length of the cochlea. The precision of this detection is so good that scientists have yet to fully explain it.

Hearing Chords

Another question that needs to be answered is: Why do the combinations of notes that we call chords appeal to us as sounding 'musical'? The key to the answer is the overtones that we described earlier. Take for example, the C major 7th chord CEGB. In the chart below, we list the positions of the fundamental tones (1) and overtones (2,3,4..) of each note of the chord (row labels at left) in the musical scale (column labels at top):
The letters X, Y, Z, etc. on the bottom of the chart mark positions where overtones of different notes of the chord nearly match in frequency. Because the notes are never perfectly tuned, the overtones match, but not perfectly, and the small difference causes an interaction called the 'beat' effect that signals our ears that these notes are 'connected' to each other. Because the lower overtones are generally louder, this kind of 'connection' is stronger for pairs of notes related by a frequency ratio that is expressed by smaller integers -- what we call 'simple ratios'.
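The near-matches can be found computationally. The sketch below (not a reproduction of the chart above) uses approximate equal-temperament frequencies for the four notes and flags harmonic pairs that land within about one percent of each other, close enough for the slow 'beat' effect:

```python
# Find near-matching overtones within the C major 7th chord (C E G B),
# using approximate equal-temperament frequencies built from middle C.
C4 = 261.63                                   # Hz, middle C (approximate)
chord = {"C": 0, "E": 4, "G": 7, "B": 11}     # semitone offsets above C4
freqs = {name: C4 * 2 ** (s / 12) for name, s in chord.items()}

matches = []
names = list(chord)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        for m in range(1, 7):                 # harmonics 1..6 of each note
            for n in range(1, 7):
                fa, fb = m * freqs[a], n * freqs[b]
                if abs(fa - fb) / fa < 0.01:  # within ~1%: slow, audible beats
                    matches.append((a, m, b, n))
print(matches)
```

For instance, the 3rd harmonic of C nearly coincides with the 2nd harmonic of G (the 3/2 'fifth' relationship), one of the simple-ratio connections the chart marks.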

For musicians: The fact that our musical sensibilities favor simple frequency ratios leads to the spacing of the notes in a major scale, and is the basis of chord structure and melody. Using just a few mathematical rules about these simple ratios, the frequency of all the notes of any scale, and the sharps and flats for all of the keys can be calculated.

Summary

When we examine in detail how our hearing is designed, we can see that its capabilities go beyond what is needed for recognition and interpretation of speech -- it has 'bonus' features designed for the appreciation of music.

Monday, August 01, 2005

ALL Things

All things were made through Him, and without Him nothing was made that was made.

John 1:3 (NKJV)

Modern science provides a remarkable perspective and insight to the "all things" of this verse. As summarized by the famous formula E=mc², energy and mass (matter) are interchangeable; thus the creation must include not only material things (matter), but also all forms of energy.

matter — energy

Also, Einstein's theory of relativity, confirmed by experiments and measurements, shows that time and space are different sides of the same fabric.

space — time

More recent physics theories describe matter and energy as wrinkles or knots in the fabric of space-time. Space cannot exist empty, that is, without matter and energy.

matter — energy — space — time

So everything is inexorably, inextricably joined. Time cannot pass without space also existing, and space cannot exist empty, without matter and energy. So when "God created the heavens and the earth" (Gen. 1:1), the eternal God created a wondrous unity: matter, energy, space, and time, and all of the 'laws of physics' that make these a cohesive whole.

God must exist apart from time, because time is part of His creation. I used to think that the eternalness of God meant that He was infinitely old; but no, He can exist outside of time. And outside of space, and not made of matter or energy. And because He created all of these, that makes Him older and bigger and more powerful and massive than all that He has created.

Read it again:

All things were made through Him, and without Him nothing was made that was made.

But because of His great love and concern for His creation, He entered His own world and became one of His own creatures, that we might know Him better, and not be strangers to Him. Read verses 10 to 12 that follow soon after the above verse:

He was in the world, and the world was made by Him, and the world knew Him not.

He came unto His own, and His own received Him not.

But as many as received Him, to them gave He power to become the sons of God, even to them that believe on His name.

That name is Jesus -- which means Savior.

NOTE: Another blog considers whether information should be added to the unity of matter, energy, space, and time.

Friday, July 29, 2005

Praying for Strangers

When I am driving alone, as I often was when commuting to work, sometimes I think about work-related matters, sometimes I meditate on God's Word, the Bible, and sometimes I pray. But sometimes I will notice some stranger on the street, and feel moved to pray for them, although I know nothing about them.

I don't fully understand why I get these occasional quiet urges to pray for a stranger. Sometimes I wonder whether it's a look of worry or concern that I sense in their face, but then I doubt that I get that good a look as I pass by. Maybe it's because I am aware that God is watching over everyone, believers and unbelievers, and cares about them, and wants them to trust Him, so He can guide them. I know He wants to be involved, but doesn't force Himself upon anyone.

But I know that God does answer prayer, even prayers for strangers. Scientists have repeatedly done controlled, randomized, double-blind experiments that time and again have demonstrated the effectiveness of prayer. In most of these experiments, the praying person and the prayed-for person do not know each other.

Even in the face of the scientific evidence, some disbelieve and try to explain it away, which is probably why these kinds of experiments have been done so many times. Professor Leslie Francis of the University of Bangor has studied 31 experiments (conducted to the "highest professional standards") into the effectiveness of prayer. And this research report, although not entirely Biblical, makes some interesting observations and arguments about prayer.

Wednesday, July 27, 2005

Growing Up Shy

One of my earliest memories is when I was about three. I had been invited to a birthday party, and I was taken to the house where the party was, and dropped off. I had never visited other houses before or played with kids outside my family before. A lady in the house took me to a room full of loudly screaming kids. It intimidated me, so I began to cry. She tried to coax me into joining the party, but I would have none of it. Finally, she led me to the end of a hallway and showed me a toddler-sized desk-seat combination where the desk-top was a pegboard, and there was a supply of pegs of various colors. She demonstrated to me how the pegs could be put into the holes and invited me to sit down and try it. It was a quiet nook far from that noisy, scary room, and the novel toy kept me happily occupied until the party was over and it was time for me to go home.

I don't remember interacting much with other kids in kindergarten -- I avoided competition for the toys, and preferred to play by myself. But I do remember once talking to another boy. I was playing with some turtle-shaped metal containers on a window sill when the boy told me that he wished that the sunshine would shine on the other side of the room opposite from the windows. I told him that the sun was high in the sky and light travels in straight lines, so it couldn't reach that side of the room. (Many years later, I wondered how I knew at that age that light travels in straight lines, and figured that it might have been from clapping with chalky hands and seeing the sunlight from the window make straight beams in the cloud of dust.)

As I progressed through school, I didn't talk much, so most of the other kids ignored me. But I was watching them. I remember at an early age having an infatuation with a pretty girl that sat about four seats in front of me. One day I left a note in her desk before she arrived in the classroom, saying "I love you. Jimmy". I assumed that she would be as secretive about it as I was, but no -- when she found it, she blurted out loud to the girls around her "Oh, isn't this cute, Jimmy loves me!" I wished there was a trap-door in the floor that I could disappear through. I learned that communication with the opposite sex was hazardous.

Shy people are careful about talking, especially with strangers, because they are not sure what the reaction will be. They prefer to listen and observe, and I think that they learn more. But talking wasn't a problem at home -- I talked and talked -- they said I lectured. They called me "the professor" or sometimes "the absent-minded professor".

In gym class, however, communication was physical, and I felt I could get some respect. When playing dodge ball, most boys figured that the safest strategy was to hide behind someone else. But when the ball was thrown at the boy in front, you couldn't see the ball coming, and didn't have enough time to react to the direction that he dodged. I thought it was safer to stay in the open where you could see the ball coming, and in back where you had more reaction time. So I was often the last one left, and they would gang up on me, throwing two balls at once. I soon learned how to dodge two balls at once. The trickiest situation was when one ball was high and one low -- I jumped up and turned horizontal, putting my body between the two balls.

Another gym activity took place on a wrestling mat. Half a dozen boys started on the mat. Any one touching the floor off the mat would be out of the game, until only one was left. I had experience wrestling with my three brothers, so it was hard to get me off the mat. Again, they ganged up on me. Four boys went after me, each taking one leg or arm. But I could sense which of them had a solid stance on the mat, and which could be more easily pushed or pulled over. So I braced myself against the ones that were solid to push or pull the others. Another part of the strategy was confusing them as to whether it was a pulling struggle or pushing. And, since they surrounded me, I was in the middle, and less likely to be the first one over the edge of the mat. We went at it for quite a while before the instructor finally stopped the game.

When I was in college, an Israeli student, Marvin Haufmann, befriended me. I remember many times when he would be sitting at a table in the school cafeteria with his Israeli buddies, chatting in Hebrew. He would motion me to come join them, and tell his buddies to switch to English for my sake. Several of them had been aircraft mechanics in the Israeli Air Force, and it was interesting to hear their recollections of 24-cylinder aircraft engines, and other stories. But what was more interesting was how they shared stories and concerns without embarrassment, what a shy person would be afraid to discuss, and everyone was quite accepting. I thought I could learn to talk like that, too, and that's when I started to lose my shyness.

Tuesday, July 26, 2005

Communication

I've been a designer of communications systems for 43 years (now retired), and because these are so complex, it takes dozens of people to design something like a military radio, and hundreds of people for something like GPS. And my company has several divisions across the US, and deals with many different government agencies. So along the way, I've learned something of the art of personal communication while designing electronic communications systems.

In engineering work, there are a lot of specialties -- different people have different areas of expertise. For any project, a variety of specialties are needed, and they need to communicate and cooperate to fulfil all the needs of the project. There were situations where I had longer and broader experience than others on the project, but nonetheless, they had more expertise than I in certain important areas. So I showed respect for their expertise and they showed respect for mine.

Take for example the Phase Meter invention that I mentioned in my post "Invention or Discovery". The performance of the phase meter was predicted by simulations and 'paper' analysis -- no actual phase meter was built. So at some point, my boss asked me to build and test a prototype model, with the help of others. Part of the design was hardware, detailed by two engineers in Ft. Wayne, Indiana. Another part of the design was software, detailed by two programmers in San Diego, California. And I guided them, providing data from my simulations, in Clifton, New Jersey (all ITT locations). I didn't know any of the others beforehand, except John Petzinger (co-inventor), but communicated mostly by email, and occasionally by telephone. I saw some of them face to face when we were finally ready to put it all together and test it. But it was a success, proving the simulations to be correct.

Knowing that people tend to distrust strangers, I showed appreciation for their work at every opportunity, respect and thanks for their ideas, and honest praise (but not overdone, or it wouldn't sound sincere) when they were successful. Then whenever it became necessary for me to criticize or point out errors, it was not taken personally, but accepted as necessary to make the project a success. And I was careful to admit my own errors when that happened, and to thank them for finding them. After a while, I sensed a friendly tone in their e-mails, and sensed that they were not afraid to ask for help when needed, nor embarrassed to admit that they didn't understand something. Such barriers to communication can seriously hurt a project, because full cooperation and complete and accurate knowledge is important when a project is full of many complex details.

On another project, I first made the acquaintance of an engineer by email, and my initial impression was that he was careless or misinformed. However, it turned out that he was quite careful and knowledgeable, but awkward expressing himself in writing.

I recall two cases where another engineer did something dumb and had a bad attitude, although most of the time people were intelligent and civil. In the first case, the engineer connected some data paths so that sometimes the data was reversed. It was like making a dictionary where sometimes the words are spelled backwards. ('Provide' is listed near 'edition' because it is spelled 'edivorp'.) When the error was pointed out to him, he insisted that nothing was wrong, and refused to change the connections. Soon afterward, he was fired.

Several years later, I wrote a specification for a digital radio design, and another engineer working miles away decided to ignore the specification. The specified data sequence was not compatible with test equipment that he wanted to use, making it inconvenient for him to test the radio. So he changed the design to fit the test equipment, rather than adapt the test equipment to fit the design. Again, it was improper data reversal, and refusal to correct the design. The design needed to be as specified to be compatible with another radio.

He didn't work directly for me, so I couldn't make him change it. I had to explain the situation to my boss, who talked to his boss, who made him change it. But I still had to work with him (over the phone), and I knew he wasn't likely to cooperate if I called him a jerk (although he was), so I treated him like a gentleman, in spite of his grumblings, so the job could get done.

Sunday, July 24, 2005

Invention or Discovery?

I think most people think an invention is entirely a 'bright idea' or a 'stroke of genius'; but actually, inventions are generally partly an intuitive idea and partly discovery -- at least that is what I have observed with my inventions. That is, there is a part that is understood, and a part that is not understood, at least initially. The idea has to be tried and tested to discover what happens, and if the desired result is achieved. When success is achieved, the inventor can't say "I told you so", but only "I was hoping it would work." After some experimentation and analysis, the unknown part may be understood; but sometimes it remains mysterious, even to the inventor.

For example, in my first inventions, previously described on this blog, there was a square-root relationship that was measured, but never fully explained.

Sometimes the results go far beyond what is expected, so that the inventor is just as amazed as any one else. I want to tell you about an invention like that. This invention, called a Phase Meter, aims to improve the performance of the Global Positioning System (GPS), which allows GPS users to precisely locate themselves anywhere in the world.

The GPS satellites carry atomic clocks for very precise time-keeping. They are called 'atomic' because the timing is based on the vibration of atoms, free from friction and other flaws that spoil the precision of other clocks. Atomic clocks are so accurate that scientists have been able to observe the slowing of earth's rotation, so that one year is one second longer than another. (Official time standards now have leap seconds.) Each GPS satellite has three or four atomic clocks; some are based on the vibration of cesium atoms, and some use rubidium atoms.

The timing signals of a GPS satellite are based on a 10.23 MHz clock, that is, 10,230,000 'ticks' per second. The second, of course, is 1/60th of a minute, which is 1/60th of an hour, which is 1/24th of a day, which is based on the rotation of the earth. But the timing of an atomic clock, based on the vibration of atoms, has no natural relationship to the rotation of the earth. The output of a GPS rubidium atomic clock is about 13.401343936 MHz, and about 13.40033786 MHz for a cesium atomic clock.

The GPS electronics needs to continually adjust its 10.23 MHz clock, guided by the more accurate 13.4 MHz atomic clock output. The GPS circuits count how many cycles of the atomic clock output occur during 1.5 seconds as measured by the 10.23 MHz clock, but that doesn't measure any fraction of a cycle left after counting whole cycles. To get the accuracy needed, the fraction of a cycle needs to be measured. That's about as awkward as trying to adjust a yardstick, marked off in inches, by using a more accurate meter-stick, marked off in centimeters, with error less than the space between the ruler marks.
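The arithmetic makes the limitation concrete. Counting whole cycles of the rubidium clock's output over a 1.5-second gate (using the frequency quoted above) leaves a fractional cycle that the counter simply cannot see:

```python
# Whole-cycle counting of the atomic clock over a 1.5-second gate:
# the counter registers only complete cycles, discarding the fraction.
f_atomic = 13.401343936e6      # Hz, rubidium clock output (as quoted above)
gate = 1.5                     # seconds, as measured by the 10.23 MHz clock
cycles = f_atomic * gate       # total cycles elapsed, including the fraction
whole = int(cycles)            # what a simple counter can measure
fraction = cycles - whole      # what it cannot
print(whole, round(fraction, 3))
```

That leftover fraction of a cycle, roughly 0.9 of a cycle in this example, is exactly what the Phase Meter sets out to measure.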

The clock signals, when graphed, look like this:

Each signal snaps up and down, between 'one' (high) and 'zero' (low), but with different time-scales. There doesn't seem to be any easy way to compare one with the other, to see if one clock is too fast or too slow, as measured by the other. Previous attempts to do this used faster clocks, which was awkward and expensive, and not accurate enough.

One day, I wondered what would happen if one clock was 'sampled' by the other. That is, whenever the bottom clock goes 'up', we look at the top one to see if it is 1 (up) or 0 (down). Doing that for the graph above, we get the sequence 1 ? 0 1 1 0 ? 1, where ? indicates a 'close call'. So far, this sequence of 'clock samples' doesn't seem to make any sense. Is there something we can do to make some sense of this sequence of samples?

Suppose we approximate the ratio of the two clock rates (time scales) by a ratio of integers. For example, 23 cycles of the 10.23 MHz clock are nearly equal to 36 cycles of the 13.4 MHz clock. I thought that perhaps the following sequence might unravel the sequence of samples:

13 = the remainder when 36 is divided by 23
3 = the remainder when 36 x 2 is divided by 23
16 = the remainder when 36 x 3 is divided by 23
6 = the remainder when 36 x 4 is divided by 23
etc.

The sequence can be obtained by adding 36 to the previous number, then subtracting 23 as often as needed to reduce the value to less than 23. This process generates the following repeating sequence:

13, 3, 16, 6, 19, 9, 22, 12, 2, 15, 5, 18, 8, 21, 11, 1, 14, 4, 17, 7, 20, 10, 0...

The sequence is also a permutation, because all the integers from 0 to 22 appear exactly once each, but in a scrambled (permuted) order.
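The add-36, reduce-mod-23 rule and the permutation property are easy to verify in a few lines of Python:

```python
# Generate the repeating sequence by adding 36 and reducing below 23,
# then confirm it is a permutation of the integers 0..22.
seq, r = [], 0
for _ in range(23):
    r = (r + 36) % 23            # add 36, then subtract 23 as often as needed
    seq.append(r)
print(seq[:8])                   # [13, 3, 16, 6, 19, 9, 22, 12]
assert sorted(seq) == list(range(23))   # every value 0..22 appears exactly once
```

The permutation property holds because 36 and 23 share no common factor, so the multiples of 36 sweep through all 23 remainders before repeating.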

I tried the idea of using this permutation sequence to permute (scramble) the sequence of clock samples. Think of a circle labeled with the numbers 0 through 22, something like the way a wall clock is labeled with the numbers 1 through 12. We generate the permutation sequence at the same time as we generate the sequence of clock samples, and we use the permutation numbers to place the clock samples on the circle. When I first tried this, I saw a sequence of samples around the circle that looked something like this:

0000000?1111111111?0000

-- where the ? marks 'close call' samples. WOW! The sequence no longer looks random! The permutation has actually unscrambled the samples into a sensible sequence! It actually looks like one cycle of a clock signal, as illustrated here:

0000000?1111111111?0000
        __________
_______/          \____

Further experiments showed that this unscrambled sequence actually gives a picture of how one clock aligns with one cycle of the other at the beginning and end of the sampling process. If one of the clocks goes faster or slower, the 'picture' shifts to the left or right.
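The unscrambling step can be sketched in idealized form. In this sketch (no noise and no phase offset, so there are no 'close call' samples), the fast clock advances 36/23 of a cycle per slow-clock edge, so its fractional phase at sample k is ((36 × k) mod 23) / 23, and the sample reads 1 when that phase lies in the first half-cycle:

```python
# Idealized sketch of the permutation-unscrambling idea, 36:23 example ratio.
N, M = 23, 36
perm = [(M * k) % N for k in range(1, N + 1)]        # 13, 3, 16, 6, ..., 0
samples = [1 if p / N < 0.5 else 0 for p in perm]    # sampled clock level
circle = [None] * N
for k in range(N):
    circle[perm[k]] = samples[k]                     # place on the labeled circle
print("".join(map(str, circle)))                     # one clean clock cycle
```

The scrambled-looking sample stream falls into one clean run of 1s followed by a run of 0s around the circle: a picture of a single clock cycle, just as described above.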

The next step of the inventing process was to figure out a way to measure the position of the 'up' and 'down' in the 'picture' generated by the unscrambled sequence. I worked out two different methods of doing this, which led to two different patents. A fellow engineer and Christian brother, John Petzinger, helped me with the second method, so he is listed as co-inventor on the second patent.

I have illustrated the principles of the invention using the integers 23 and 36. But more accurate measurements are possible with larger integers that better approximate the clock ratio. I wrote a computer program to simulate the phase meter invention, to evaluate its performance when the clocks are compared for about one second, and this analysis predicted that the clocks could be compared with an error of only one picosecond.

"What's a picosecond?" you may ask. A picosecond is one-thousandth of a nanosecond, which is one-thousandth of a microsecond, which is one-thousandth of a millisecond, which is one-thousandth of a second. That is, a picosecond is one millionth of one millionth of a second. If a second were the distance from New York to Los Angelos, then a picosecond would be the thickness of a hair.

Going back to the analogy of comparing a yard-stick to a meter-stick, it would be like measuring the difference with an error of a hair's-breadth, even though the spacings of the 'tick'-marks on the rulers (one inch on the yard-stick and one centimeter on the meter-stick) are not nearly that small. Even the inventor is amazed.

Saturday, July 23, 2005

The Big Picture

Here's something I found by meditating on God's Word.

When we think about God, we usually think about mankind's relationship to God, and sometimes also we think about the environment around us that God created for us. But that's not The Big Picture. God has also created other thinking, living creatures -- the angels -- and environments for them to live in. Literally, 'angel' means 'messenger' -- those that God has used to convey messages to us -- but in general, it includes all the races of 'people' creatures that God has created other than the race of mankind.

I'm not implying that it is wrong to concentrate on our own relationship to God. Certainly the Bible speaks far more about us than of the angels. But I think we can understand our own relationship to God better if we see it as part of The Big Picture.

The explanation of sin and evil in our world is inextricably linked to one angel in particular, Satan, or the Devil. This issue is so fundamental, that it is addressed by the first book of the Bible to be written, the book of Job. Yes, although Job is placed in the middle of our Bible, it was the first book written. And the entrance of sin into our world is recorded in the first book of the Bible, Genesis.

Pretend that you are an angel, not one of those that joined Satan's rebellion and his fall. How would you see the history of mankind?

Before Satan's rebellion, the angels knew nothing of sin, and so also knew nothing of grace and forgiveness. How could they understand grace and forgiveness when there is no sin? They knew of God's love, of course, but not knowing about grace and forgiveness, I think they must have taken God's love for granted.

Now, when Satan rebelled and some other angels joined him, and they were forced to leave heaven (the environment originally created for them to live in), did the remaining angels learn about grace and forgiveness? No, because God has never offered, and never will offer, forgiveness to Satan and his followers (demons).

What did they learn by observing what Satan did, and God's reaction to Satan's rebellion? They learned about God's wrath, which they had not seen before. Now, if you were an angel, and you suddenly learned about God's wrath, what would you think about God's love? I think you might doubt God's love a little, or at least be confused. You would need to learn about God's grace and forgiveness.

When mankind (Adam and Eve) were created, did the remaining angels learn about grace and forgiveness? No, only after sin was allowed to contaminate the human race, and a Redeemer was foretold, did they begin to get a hint. And when that Redeemer came, and paid for our sins on the cross, and was resurrected, and some forgiven sinners were also resurrected, then they had enough to understand more.

Notice that it was necessary for God to allow sin to contaminate the human race so that the angels might have the opportunity to learn about God's grace and forgiveness. It may be hard to understand why God would allow this to happen; but it is certain that it did happen, so either God allowed it or God is not in total control. The Bible assures us that God is in total control, and that God has allowed sin to enter our world. It likens God to a potter who from one lump of clay makes a garbage container to be tossed out and a beautiful container to be admired and preserved. And does the clay have any right to tell the potter that he is wrong to do this?

We anticipate the time that we will live in heaven and have an opportunity to converse with the angels. But the angels would say that they anticipate the time that they will have an opportunity to converse with us. The relationship between us and the angels, I think, will be something like the relationship between the athletes of some sport and the fans of that sport: We have participated in this grace and forgiveness thing that the angels wonder about, and have watched from afar, but have never experienced. I think they are eager to hear our stories -- our personal testimonies -- and to listen to our songs of praise.

Friday, July 22, 2005

Hiding in Plain Sight

Donna had to get a 'stress test' as part of a routine physical exam. Her doctor gave her the address of an imaging center on Bloomfield Avenue in Glen Ridge. A couple of days before the appointment, we made a trial run to find this imaging center, but failed. We ran out of time, so we just had to do it 'cold' on the day of the appointment.

I should explain that you can drive through Glen Ridge while holding your breath. And you don't need to be an experienced pearl diver -- any amateur can do it, because the town is only 1/2 mile wide. And in case you live in some other part of the country, like the Midwest, I should also explain that here in suburban New Jersey, towns are joined seamlessly together. Donna grew up in Nebraska, where there are miles of corn and wheat between towns, and you can spot the next town at a distance by the tall grain elevator. When she was first in this suburbia, she was puzzled, and asked me, "How do you know when you are leaving one town and entering another?" I said, "When you see the fire hydrants change color."

Well, just knowing that the imaging center is on Bloomfield Avenue in Glen Ridge limits the location to a half-mile stretch of road. We had a house number, but as is typical of business streets here, hardly anyone displays house numbers. On this half-mile stretch of road, we found only one house number, and it was displayed in two-foot-high gilded numerals. I figured that the responsible businessman was flaunting his rebellion against the tradition of no house numbers. That told us which side of the street our destination was on, but not much more.

So what we needed was a sign saying "Imaging Center" or some such thing, but there was none. So we had to evaluate ALL the buildings on that side of the street to see what might plausibly house an imaging center. That narrowed it down to a building that looked like it could be a town hall, a church, a library, a museum... an imaging center? There was no sign viewable from the street, but it had a parking lot, so we turned in and parked. Donna said that her blood pressure was going up trying to find the imaging center -- would it skew the stress test?

As we approached the entrance, there still was no sign. There was no house number, no doorbell, even no mailbox. High above the door was a coat-of-arms, but without words. On the pavement in front of the door was a Latin inscription long enough to be some sort of motto. These only made the place more mysterious. Should we just walk in, uninvited? We had no choice. Only inside the inner door did we finally see signs indicating which floor to go to for the imaging center.

Hours later, when Donna was waiting for me to come and get her, she was busy answering people who showed up at the door, asking if this was the right place for the imaging center.

Most businesses cannot stay in business hiding in plain sight like this. But the imaging center gets its business from referrals from doctors, whose patients are forced to play hide-and-seek. But maybe it's really meant to be part of the stress test. In any case, she found that she doesn't have any blocked arteries.

Wednesday, July 20, 2005

PyroWorm Game

Most programmers write software for other people to use. But as an engineer, most of the programs I wrote were customized tools for me to use. (But some tools were more general-purpose, and I made these available to others to use.) I thought of the computer as a mental lever, or as a robot that would do all the hard work. But sometimes, at home, I used programming for fun and games.

Once I got the idea to combine two simple games that others had created. One of them, John Conway's Game of Life (See here and here) is not really a game, although it has rules, and pieces on a board that come and go according to the rules. The board is like a chess board, but with many more rows and columns, and the rules are much simpler than chess. If you program a computer to play this 'Game of Life', you can populate the board with pieces any way that you like, then watch the computer 'play the game' according to the rules, and see what happens. If you haven't seen it, that may sound boring -- but it really is fascinating, because in spite of the fact that the rules are so simple, the moving patterns are surprisingly life-like -- hence the name. Ragged asymmetrical patterns grow into beautifully symmetric objects. Patterns glide across the board, collide with other patterns, and are either destroyed, or morph into new patterns. It is more a spectator sport than a real game.
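For readers who want to try it, the standard rules are simple enough to fit in a dozen lines. Here is a minimal sketch (not my original program) using the usual rules: a dead cell with exactly 3 live neighbors comes to life, and a live cell survives with 2 or 3 live neighbors.

```python
from collections import Counter

# A minimal step of Conway's Game of Life (the standard rules) -- a sketch,
# not the author's original program.
def life_step(live):
    """live: a set of (row, col) cells; returns the next generation."""
    neighbor_counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0))
    # birth with exactly 3 neighbors; survival with 2 or 3
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A 'blinker' is one of those repeating patterns mentioned below: it
# oscillates between a horizontal and a vertical bar forever.
blinker = {(1, 0), (1, 1), (1, 2)}
print(life_step(life_step(blinker)) == blinker)  # True
```

Using a set of live cells (rather than a fixed array) lets patterns grow without running into board edges.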

I wrote a program to do this, and it was fun to watch, but always, after an interval of interesting activity, the 'game' would end with a mixture of static, unmoving patterns and moving patterns that would repeat the same motions over and over again. It seemed to me that it needed some kind of spark that could rekindle the interesting activity whenever it died down.

As I began to experiment with this rekindling idea, it seemed that the moving patterns were like a wild-fire. When the patterns expanded, it was like fire spreading; and when the patterns drifted sideways, it was like wind was moving the fire. I modified the program so that when a piece on the board was not there in the previous change-cycle, it would be colored bright red; but if it had been there in the previous change-cycle, it would be colored a dull red, nearly brown. Thus, an active 'fire' was flashing bright red, but a static pattern of 'ashes' was the brownish dull red.

I recalled a simple game that was available on the Unix computers, called 'worm' (similar to this). It, too, was played on a board with many rows and columns. The worm was a string of pieces, like a string of pearls, made of the letter 'o', with a capital 'O' for the head, like this: ooooooO. The computer could make the worm move by adding a piece at the 'head' end and removing a piece at the 'tail' end. What you had to do was to steer the worm's motion, using the arrow keys of the computer's keyboard. A numeral from 1 to 9 would appear at a random location on the board; and if you could steer the worm to 'eat' the numeral, then the worm would grow longer by that amount. The computer could make the worm grow by adding pieces at the 'head' end without removing pieces at the 'tail' end. So the length of the worm was your score.

Now, if the worm bumped into itself or the walls (the edges of the board), the game was over. And as the worm grew longer, that was harder to do -- you had to learn strategies of winding up in the crowded space without making any fatal collisions. But in a more forgiving version of the game, collisions were not fatal. Instead, the penalty for collisions was that the worm would shrink. The computer could make the worm shrink by not adding pieces at the 'head' end while removing pieces at the 'tail' end. The only way this form of the worm game could end was for the worm to shrink down to nothing.

So I thought, why not use the worm to rekindle the fire? And to make the game more dramatic, the worm would need to avoid collisions with the fire, that is, getting burned. The worm would shrink if it touched a bright-red 'fire' piece, but would grow if it touched a dull-red 'ashes' piece, igniting it. But since the worm's head would always touch a piece first, and then the rest of its body would be dragged over that spot, it had to be that the 'ashes' would be ignited when the tail touches (or moves off) the ashes, not the head. I wanted the worm to play with danger, not be suicidal. It's hard enough staying away from the fire, so I removed the penalties for the worm bumping into himself or the edges of the board. In a way, there are no edges, because if you try moving off the right edge, you 'wrap around' and show up at the left edge; and likewise, the top and bottom edges are joined.
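The grow/move/shrink mechanics above, including the wrap-around edges, can be sketched with a simple queue of board cells. This is my reconstruction of the idea, not the original game's code; the board size is arbitrary.

```python
from collections import deque

# A sketch of the worm mechanics (a reconstruction, not the original game):
# the body is a queue of board cells, tail first, head last.
ROWS, COLS = 20, 40                      # board size (arbitrary for the sketch)

worm = deque([(0, 0), (0, 1), (0, 2)])   # tail ... head

def move(worm, dr, dc):
    """Normal move: add a piece at the head end, remove one at the tail end."""
    r, c = worm[-1]
    # no walls: moving off an edge wraps around to the opposite edge
    worm.append(((r + dr) % ROWS, (c + dc) % COLS))
    worm.popleft()

def grow(worm, dr, dc):
    """Eating: add at the head end WITHOUT removing the tail -- score goes up."""
    r, c = worm[-1]
    worm.append(((r + dr) % ROWS, (c + dc) % COLS))

def shrink(worm):
    """Collision penalty: remove at the tail end without adding a head."""
    worm.popleft()

move(worm, 0, 1);  print(len(worm))   # 3  (length unchanged)
grow(worm, 0, 1);  print(len(worm))   # 4  (worm grew)
shrink(worm);      print(len(worm))   # 3  (worm shrank)
```

The modulo arithmetic in `move` and `grow` is all it takes to join the left edge to the right and the top to the bottom.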

I made the worm a light green against a darker green background (grass: wwww). His head was a sunny yellow smiley-face, but when burned, the burned part of the body would turn black and the face turned red.

It turns out that these simple, but crazy rules make a very exciting game. To get a high score, you have to make the worm grow, and to do that, your worm has to start fires -- then run away from them. If you get too greedy, and start too many fires, there may be too little safe ground left and your worm will be fatally trapped. The worm is obviously a pyromaniac, so I named the new game PyroWorm.

You can download my game for free. Just click on PyroWorm Game and read about it, and scroll down to the Download section.

Tuesday, July 19, 2005

Electronic Advances

I began my engineering career in 1959, and it is not only amazing how technology has advanced since then, but I feel privileged to have been able to see much of it first-hand.

The junction transistor was invented in 1948, but it needed development before it was practical to use it. Computers had used vacuum tubes until the first fully transistorized computer in 1954, just five years before I started my career. I remember trying different circuit configurations that could achieve the same function, then counting the transistors so that I could select the lowest-cost configuration. That was important, because each transistor cost about $5 to $10. (Other kinds of transistors cost $45 or more.) Now, about 10,000 transistors cost one cent.

You could see the transistors back then. Each one looked like a little tin can, 1/4-inch wide, with three wires sticking out. Each transistor in the computer, with the help of a resistor and a capacitor, served as a switch that could turn electrical current on and off. It took a configuration of 12 such switches just to add two 'bits' (binary digits), including the 'carry' from a nearby digit position. That would occupy about a 4-inch by 10-inch area on a circuit board. Now that transistors are so much cheaper, twice as many are used for the same function, and it's all smaller than the period at the end of this sentence.
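The function those 12 switches performed is what logic designers call a one-bit full adder: it adds two bits plus the carry from the neighboring digit position, producing a sum bit and a carry out. A sketch of it:

```python
# A one-bit full adder: adds two binary digits plus the incoming carry,
# and produces the sum bit and the outgoing carry.
def full_adder(a, b, carry_in):
    total = a + b + carry_in
    return total % 2, total // 2   # (sum bit, carry out)

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
print(full_adder(1, 1, 1))  # (1, 1): 1 + 1 + 1 = binary 11
```

Chain 28 of these together, carry to carry, and you can add two 28-bit numbers, which is why early machines needed whole boards full of those little tin cans.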

Back then, a computer was a room full of refrigerator-sized cabinets. And because everything was so costly, the computers were as simple as possible. Today's computers, although much smaller physically, are bigger in terms of the numbers of equivalent parts. I remember one of those refrigerator-sized cabinets was a memory storing just 256 words. Just a few days ago, I bought a memory card for a digital camera with one million times larger memory, and it is the size of a penny.

Some kinds of circuits need a 'matched pair' of transistors, for accurate balance. That was hard to achieve when transistors were made as individual devices. The solution was to make them as a pair. It was something like baking two cookies side-by-side on the same baking sheet: baked at the same time, they come out nearly identical. These were sold in the same little tin cans, but with six wires sticking out of each can. Later, when someone figured out how to make the resistors and capacitors on the same piece of silicon as the transistors, the 'integrated circuit' was born. At first, the integrated circuits were packaged in the same little six-wire cans as the matched transistors. With just two of those integrated circuits, we could make one 'flipflop' (a circuit for storing one bit), which previously occupied about a 4-inch by 6-inch area. Today, thousands of 'flipflops', or millions of transistors, fit on one integrated circuit. I remember that as companies like Motorola and Texas Instruments made more and more dense integrated circuits, they would brag about how many transistors were in each circuit. Now, nobody bothers to count.

The steady increase in integrated circuit density was described by Moore's Law (see here and here), which observed that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. As the rate slowed, this was later revised to 'double every 18 months' and then to 'double every 24 months'. A few years before I retired, I was asked to design a circuit that could not be built -- yet. I was asked to design the most powerful digital correlator that could be built on one 'chip' in 2010. So I had to use Moore's Law to estimate -- to predict -- how much addition logic could be put into one integrated circuit in 2010. It was tricky, because Moore's Law was sometimes also stated as a doubling of speed rather than density, and addition can not only be done twice as fast by using circuits that are twice as fast, but also by using twice as much circuitry.
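The extrapolation itself is just compound doubling. A back-of-the-envelope sketch, with hypothetical numbers (the actual starting density and target I worked from are not given here):

```python
# Moore's-Law extrapolation (hypothetical numbers, for illustration only):
# density doubles once every 'months_per_doubling' months.
def extrapolate(transistors_now, years_ahead, months_per_doubling=24):
    doublings = years_ahead * 12 / months_per_doubling
    return transistors_now * 2 ** doublings

# 'double every 24 months' over a 10-year horizon is 5 doublings, or 32x:
print(extrapolate(1_000_000, 10))  # 32000000.0
```

The tricky part mentioned above doesn't show in the arithmetic: whether those doublings should be spent on more adders or on faster adders is a design decision, not a formula.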

This wasn't the first time that I have designed for the future. Most electronic designs are rushed into production, to try to beat the competition. But when designing for GPS (the Global Positioning System) satellites, the design cycle is much slower-paced. The main reason is that if the circuits in a satellite fail, it is VERY expensive to send a repairman (a.k.a. space-walking astronaut) up to fix it. It is also very expensive to launch the satellite to begin with, and often a launch fails, and millions of dollars are suddenly lost. So the design cycle is deliberately slow and very careful, with lots of checking and testing. Then when a GPS satellite is built, it is not launched right away -- it is put into storage, and launched only when an older satellite fails so badly that it needs to be replaced. So, for example, my Phase Meter, which was invented in 1995 and patented in 2002, and destined for new GPS designs, is still not in space yet.

Sunday, July 17, 2005

My First Purchase

Here's one of my earliest memories -- when I was about four years old. I found a penny in a couch, and asked my oldest sisters, who were teenagers at the time, who I should give it to. They explained that there was no way of knowing who lost it, so now it belonged to me, because I had found it. Mom agreed. I was partly delighted, and partly puzzled. One sister said I could save it until I had enough to buy something I wanted. The other said it would be a long time before that could happen, because I got no allowance when I was four. Mom said I could buy a cookie with the penny. There was a store down town that sold sugar cookies for one cent each.

I could walk down the street to the store and buy the cookie all by myself, it was proposed. But I had never walked down the street by myself before -- I could get lost. But it was easy, they assured me. And next year, I would be going to school, Mom said, and this was on the way to school, so I would learn part of the route. All I had to do was go down the hill, then uphill a little, across the railroad tracks, then to a street with stores. The store with the cookies was the first one on the right, and the cookies were in a glass case near the front of the store.

They rehearsed the directions with me until I was convinced that I could do it. So, clutching my penny in my fist, I left on this new venture. It helped that the street was straight, because half-way down the hill, I could look ahead and see the railroad tracks, and could look back and see the street in front of our house. When I reached the street with the stores, I turned right and entered the first store. There, behind glass, I saw stacks of BIG sugar-coated cookies. A kind-looking lady leaned over and looked down at me and asked what I wanted. One of those cookies, I said, pointing, and holding up my penny.

Soon I had one of those big cookies in my hands. I had to admire it a bit before I ate it. And I was proud that I had walked to the store and bought it all by myself. Then it occurred to me that if I ate it before returning home, there would be no evidence that I had completed my quest successfully. So I held it very carefully and carried it home and showed it to everybody before I ate it.

Saturday, July 16, 2005

Death, From a Heavenly Perspective

Another poem of my youth -- I imagined what death must seem like AFTER arriving in heaven.

Death
written 3/11-12/56

I still remember when I died,
With joyous expectation sighed
While they, not understanding, cried.
My wife was kneeling by my side
Alone; the others stood outside.

I told her, "God is calling me
The second time. The first time He
Called me a worker here to be,
And now He calls me home. I'll see
Him there, and from sin's grasp be free.

Don't look so sad." She said, "But Dear,
Although to die I have no fear,
To live without you will be drear.
Of course you know I'll shed a tear."
I said, "Although you want me here,

God also wants me. I suppose
He has a reason, for He knows
It's better in that realm where woes
Are gone, and never blows
The storm of strife. The close

Of life on earth is nigh.
I'm going. Do not cry.
I'll see you soon. Good-bye."
And then I breathed the final sigh.
I'm waiting now for her to die.

Friday, July 15, 2005

In The Beginning Was Information

In Wednesday's blog post, Information From Randomness? and in Monday's, The Development of Information Processing, I compared information and energy: Both cannot be perfectly stored or transmitted.

There's another similarity: Both are essentially invisible, although we have ways of seeing them. We cannot see heat energy, for example, but we can generally feel and sometimes see its effects.

If I send a message by telephone, my words are first carried by sound (pressure variations in the air), then by voltage and current variations on wires, then by sound again. Then someone may write my message down, so that it is carried (and stored) by ink on paper. They could even rearrange Scrabble pieces to record my message.

Now, my information was transported from one place to another, and even recorded (transported from one time to another), but the information was not made of air, nor of electrons, nor of ink. Neither was the information made of the sound energy nor the electrical energy. It is not matter, not energy, but it needs matter and energy to be transmitted and to be stored.

The ancient Greeks had a word, logos, that comes close to our word, information. One definition that I found says that "it may refer to a word or a thought or a spoken phrase or an idea". Another authority says that logos meant a visible representation of an invisible thought. The Greek philosophers thought that there was a mysterious universal power in the logos. Heraclitus, a philosopher of ancient Greece preceding Socrates and Plato, thought the logos was the underlying order or reason for the activities of nature and the universe.

So when the apostle John wanted to introduce Jesus Christ to the Greek-speaking world as the invisible God in visible flesh, the one who spoke the world into existence and who holds it together by his power, he introduced him as The Logos: "In the beginning was the Word, and the Word was God." (John 1:1)

The scientist, professor, lecturer, and writer Werner Gitt took that phrase for the title of his book, In the Beginning was Information, which I reviewed at the "Chapel Summer Readers" night. Gitt saw his work as an extension of Claude Shannon's Information Theory. As such, Gitt's definition of information is broader than Shannon's definition. I cannot cover all that is in Prof. Gitt's book, but his theories provide great support for creationism. This, of course, has sparked much debate, and much of the objection to Gitt's ideas is confused by a failure to recognize that Gitt's definition of information is broader than Shannon's.

What I find especially significant is the fact that the only recorded information that we know of that is not recorded by mankind is the DNA that we find in all living things. The DNA has a four-letter alphabet, and these symbols are arranged in linear sequences that describe how to build the proteins of living things. Scientists say that theoretically, there can be other 'DNA languages' that work equally well for providing this information, but for reasons they cannot explain, only one 'DNA language' is observed in all living things, from viruses to humans. If living things somehow evolved from a 'primordial soup', we would expect many 'DNA languages', just as we have many human languages.

There are scientists who search for signals from outer space hoping to find signs of intelligence. There is radio energy reaching us from outer space, but they search for patterns that might convey information. I wonder what would happen if someone launched a spacecraft that radioed back to earth a signal that encoded a DNA sequence. Would they recognize it as a sign of intelligence? I think they might -- at first. They would first recognize that the radio signal was actually carrying information, and that the information must have come from some kind of intelligent being. But as soon as they found out that it was DNA and that the implication was that DNA has information from some kind of intelligent being, they would change their minds, because they wouldn't like where the logic was leading them.

Thursday, July 14, 2005

We're Not Afraid!

As he was evacuated from a King's Cross underground train after the recent bombing, Adam Stacey sent a photo by cell phone to his friend Alfie Dennen, a web designer in London. When Dennen posted the photo on his blog and began to get a flood of reactions from other friends, he was inspired to set up a new web site to send a defiant message to the terrorists: "We're not afraid."

Within two hours, the web site www.werenotafraid.com was operating. The site has over 2200 photos from all around the world, many with the text "We're not afraid" in the picture or added to it. And they have raised nearly $2000 for the Red Cross London Bomb Relief fund.

In the middle of Gallery 33 on the web site, one photo sent the message without need for words:

It reminded me of the Psalmist's words:
Yea, though I walk through the valley of the shadow of death, I will fear no evil: for thou art with me; thy rod and thy staff they comfort me.
(Psalm 23:4)
What a comfort! -- for those for whom The Lord is their shepherd.

Wednesday, July 13, 2005

Information From Randomness?

At the end of a previous post The Development of Information Processing, I segued into a discussion of DNA and stated:
Those who subscribe to the faith called Evolution have convinced themselves that all this complex machinery [DNA], which we have only begun to decipher, came into existence through random processes. They would like to believe that somehow information can arise out of randomness, but we who design computers know better.

Making information out of nothing is like the pseudoscience of perpetual-motion machines. These were proven to be impossible, because energy cannot be perfectly stored or transmitted. Always a little bit leaks out of the machine -- typically friction creating heat -- lost energy. In computer science and information theory, we know that likewise, information cannot be perfectly stored or transmitted. Always a little bit (or more) of error creeps in, and the data erodes.

One evolutionist, Richard Dawkins, wrote a book, The Blind Watchmaker, in which he describes a computer program and results that he claimed demonstrated that evolution was virtually inevitable. His program, sometimes called Dawkins' Weasel Algorithm, essentially claims to create information out of randomness. (According to information theory, pure randomness is zero information.) But like those putative perpetual-motion machines once made by quack inventors, we can show that the claim is fraudulent.

We will show that (1) the program does not create information out of randomness; and (2) the program does not simulate evolution, as claimed.

Suppose I put a stencil with the words STOP HERE in front of a piece of paper and sprayed paint in the general direction of the stencil. The drops of paint fly in random directions, and some drops go through the stencil, coloring the paper behind it, and other drops are blocked by the stencil. Eventually, the words STOP HERE appear on the paper behind the stencil. Now suppose that I argue that the random paint drops have evolved into a meaningful message -- that I have created information out of randomness. You wouldn't be convinced, would you? No, it is obvious that the information on the paper came from the stencil, not from the randomness of the paint drops.

Dawkins' Weasel Algorithm essentially uses a similar principle, but it is cloaked in evolutionary terminology and other descriptions that hide the deception. It's not quite as simple as the stencil illustration, but it also uses a template that forces the random actions to create the desired result. And it's simple enough that you don't have to be a computer programmer to understand it.

The object of the program is to create, using random selections, the following phrase taken from Shakespeare's Hamlet:

METHINKS IT IS LIKE A WEASEL

It has a length of 28 characters, including the spaces. (We need to say 'characters' instead of 'letters', because space isn't a letter.) So the random process begins with a random string of 28 characters (randomly chosen from the upper and lower case letters and space), such as:

trial 01: zYODhPvNZUhwMGOBzik LTqJipFB

The first step is to compare the random string with the 'target' string (the template), and make new random choices for characters that do not match the target in corresponding positions:

template: METHINKS IT IS LIKE A WEASEL
trial 01: zYODhPvNZUhwMGOBzik LTqJipFB
trial 02: OgoRTm dnNdCvPeLJmP aWQsgraU

Notice that because one of the spaces was matched on the first trial, that position is not changed. The process described above for the first step is repeated until all positions are matched. Notice that once a match is obtained in some position, that position is no longer changed.
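The whole procedure, exactly as described above (a matched position is 'locked' and never changed again), fits in a few lines. Here is a Python sketch; my own program was written in another language, as the `Target: string28` declaration shown later suggests, so this is a reimplementation, not the original.

```python
import random
import string

# The procedure as described in the text: re-randomize ONLY the positions
# that do not yet match the template; matched positions stay locked.
ALPHABET = string.ascii_uppercase + string.ascii_lowercase + ' '  # 53 symbols
TARGET = 'METHINKS IT IS LIKE A WEASEL'

def weasel(target=TARGET, seed=0):
    rng = random.Random(seed)           # seeded for a repeatable run
    trial = [rng.choice(ALPHABET) for _ in target]
    count = 1
    while ''.join(trial) != target:
        trial = [c if c == t else rng.choice(ALPHABET)
                 for c, t in zip(trial, target)]
        count += 1
    return count                        # number of trials to reach the target

print(weasel())  # typically converges in roughly 100 to 300 trials
```

Note that the loop is guaranteed (with probability 1) to terminate, for exactly the reason discussed below: each locked position never reopens, and each unlocked position keeps drawing until it hits its target character.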

We show a typical sequence below, where, to save space, we show only every 10th trial after the first 10 trials:

trial 01: zYODhPvNZUhwMGOBzik LTqJipFB
trial 02: OgoRTm dnNdCvPeLJmP aWQsgraU
trial 03: QhgYWG TAZLjAshLVhU OvSylTIH
trial 04: jHQszcLeGLXy kYLXmg kzZNynvO
trial 05: mkIlsHmMjdjjzRlLcWL KyEsSZSA
trial 06: ky WAEphIFBhFfiLCnz MIwQYCJC
trial 07: kebPbNxUhLBj yoLwXC sjAOJCbk
trial 08: wCjseNNsuNWqeqULJjS ZaVDbTil
trial 09: aEAqLNMXaAhlIjTLrwx euhfhgEf
trial 10: xEfIpNNHIfSXIZZLhGL TELvHdEx
trial 20: yEQNONKa sJdIspLTrJ HhudZREA
trial 30: PEUdWNKV plNIERLhbE enWPISEG
trial 40: MEOpyNKs BVIIcSLLOE IRWEASET
trial 50: METJgNKz hTkISRLNpE v WEASEU
trial 60: METHINKF YTYISALfJE O WEASEN
trial 70: METHINKo hTNISOLuKE A WEASEx
trial 80: METHINKX DTvISfLgKE A WEASEC
trial 90: METHINKb iTyISwLvKE A WEASEk
trial100: METHINKQ IToISbLTKE A WEASEC
trial110: METHINKp ITAISrLaKE A WEASEz
trial120: METHINKS ITjISrLIKE A WEASEC
trial130: METHINKS ITMISpLIKE A WEASEx
trial140: METHINKS ITGISeLIKE A WEASEn
trial150: METHINKS ITvISELIKE A WEASEB
trial160: METHINKS ITEISwLIKE A WEASEL
trial170: METHINKS IT ISjLIKE A WEASEL
trial177: METHINKS IT IS LIKE A WEASEL

Notice that there is nothing in the rules that involves any interaction between character positions (columns). Therefore, every column is an independent random process in which random choices are made until the target character is chosen. And of course the target character is eventually chosen (with probability 1); if it never were, the choices would not be truly random. (For example, if a die never fell with face 6 up, we would know that there was something wrong with the die.)
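Because the columns are independent, we can even predict how long a run should take. Each column matches with probability 1/53 on each trial, so the chance that all 28 columns have matched within n trials is (1 - (52/53)^n)^28. A quick calculation of the median:

```python
# Each column matches with probability 1/53 per trial, independently.
p = 1 / 53

def prob_done_by(n, columns=28):
    # probability that all 28 independent columns have matched within n trials
    return (1 - (1 - p) ** n) ** columns

# find the median run length: the smallest n with at least a 50% chance
n = 1
while prob_done_by(n) < 0.5:
    n += 1
print(n)  # 195
```

A median of about 195 trials is quite consistent with the example run above finishing at trial 177.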

It is obvious where the information is coming from. In the program that I wrote to generate the above data, there is the line:

Target: string28 = 'METHINKS IT IS LIKE A WEASEL';

The information is coming from this part of the program. If I change this program line to read --

Target: string28 = 'METHINKS IT IS LIKE A RABBIT';

-- then the result of the program will change accordingly.

In Dawkins' description of the program, each step is said to simulate one generation of an evolutionary process, and the random choices are said to simulate random mutations. But the program does not simulate evolution, and isn't even an evolutionary algorithm.

No evolutionist claims that evolution proceeds until a given animal is achieved. No, the basic theory is that the random choices (mutations) survive when they make the animal more 'fit' for its environment.

So let's say that grammatically correct strings with correctly spelled words simulate 'fit' animals. Spelling and grammatical errors will simulate flawed animal designs, or at least a need for improvement. So the line above labeled "trial170" might plausibly simulate such a case.

But what about trials 1 through 50? (Go look at them.) They have so many flaws -- they are so badly mangled -- that they don't even look like dead animals! So here's the key question: How did they reproduce to get to generation 50?

Furthermore, note that the data above is biased in favor of evolution because the mutation rate is way too high, and because the choices are guided by a grammatically correct template. In spite of this, the first 50 'generations' don't resemble anything that can plausibly represent an animal that can reproduce.



Sometimes it helps to look at a subject from multiple viewpoints. The discussion is continued in Dawkins' Weasel Algorithm, Revisited, which looks at it as a Markov process.

Tuesday, July 12, 2005

Bubbles

Another poem of my youth --

Bubbles

"And the world passeth away, and the lust thereof; but he that doeth the will of God abideth for ever." -- 1 John 2:17

Never trust a bubble,
Though it bobbles in the air,
Or drifting gently there,
Does allure.
Though it twinkles in the light
With colorful delight,
Don't be sure.
Though you very lightly grasp it,
Though you very gently clasp it
Like a dunce;
All at once,
Nothing first,
It will burst.

Monday, July 11, 2005

The Development of Information Processing

Mankind has always communicated, and did so by written language earlier than many admit. But the invention of the printing press launched a major change in the spread of knowledge, because it was so much more efficient than hand-copied books and traveling teachers and story-tellers. More recently, the Internet has accelerated the spread of knowledge more than ever.

But we have discovered how to do much more than simply reproduce and distribute information efficiently. Perhaps it began when clockmakers figured out how to put short and long notches on a wheel to control the chiming of a clock. Or when the player piano was invented, where holes on a roll of paper control the sequence and timing of the notes played. Other machinery was made to robotically play drums, violins, horns, and other instruments. All these machines translated recorded information into sound. Then the phonograph was invented, which translated sound into recorded information, and afterward translated it back to sound as often as desired. Then came the telephone, which translated sound to an electrical form that could be transported over long distances without recording and playback.

In some of these examples, you can say that the recorded information was translated into mechanical action. For example, the player piano roll controlled the striking of the piano keys. Perhaps this was the inspiration for machines that automated the weaving of tapestry designs -- punched holes controlled whether threads were lifted above or dropped below the path of the shuttle of the loom. Later, punched paper tape was used to control machines that could drill any set of holes in a part to be manufactured, or robotically apply any set of rotary tools to a manufacturing task. These all translate recorded information into a sequence of actions.

Other people were interested in just processing the information, that is, calculating. Astronomers and other scientists relied on long, tedious, and error-prone calculations. Many of the general-purpose calculations could be prepared beforehand and stockpiled (like prepared foods) -- for example, a table of square roots, or trigonometric functions. So people created adding (and subtracting) machines, multiplying (and dividing) machines, and 'difference engines' to generate and use these tables. These machines translated information (such as "30x31") into a useful equivalent of the information (such as "930"). The methodology was mechanical actions (for example, rotating digit wheels), but the overall function was information in and equivalent (derived) information out.

So far, all of the types of machines mentioned use a single sequence of information, except when the operator intervenes by choosing the sequence -- choosing the song to be played, or the hole pattern to be drilled, or the formula to be calculated.

Now, what if the machine could control its own sequence? For example, the music player could play the verse, then chorus, change key, play the verse and chorus again, increase the volume and repeat the chorus. The drilling machine could drill 30 boards with pattern A, then 50 boards with pattern B. Or the calculating machine could compute formula A, and if the result is positive, compute formula B, else formula C. And repeat this for another set of data, and another, until 50 sets of data have been processed.
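In today's terms, that last calculating example is just a conditional inside a loop. A minimal sketch in Python -- the formulas and the count of 50 are placeholders taken from the description above:

```python
def process(datasets):
    """Compute 'formula A' for each data set; if the result is positive,
    apply 'formula B', else 'formula C' -- repeating until all sets are done."""
    results = []
    for x in datasets:                # repeat for each set of data
        a = x * 2 - 10                # 'formula A' (illustrative)
        if a > 0:
            results.append(a + 1)     # 'formula B'
        else:
            results.append(a - 1)     # 'formula C'
    return results

out = process(range(50))              # e.g. 50 sets of data
```

The machine decides, by itself, which formula to apply next -- that is the self-control this paragraph is describing.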

The concept of sequence control, or self-control -- the machine talking to itself, so to speak -- thrust information processing into the computer age. The first attempts were mechanical, then came vacuum tubes, then transistors. That's the stage where I first got involved. You could see the transistors back then, because they weren't miniaturized yet. Today, millions of transistors are packed into one small package.

It was obvious that many of the things these machines were designed to do were similar to human activities, so words like read, write, memory, and decision were used to describe machine functions. We knew we were trying to emulate human thinking, difficult as it was, and still is.

When computers were developed, the machines became more general-purpose. That's because the machines now had two kinds of information: the information being processed (data), and the information that controlled the processing (software). The visible machine (the hardware) could do almost any processing, given suitable software. Give it word processing software, and the machine becomes a word processor. Give it accounting software, and it becomes an accounting processor. Give it telephone control software and hide it inside a telephone, and you have a 'smart' telephone. And don't tell the consumer that there's a computer in his telephone, lest he be afraid to use it.

At some point along the way, the designers realized that the information that they were putting into these machines was actually language. These were strange languages, designed to best fit the machines, but still, they were language. It was difficult and error-prone to write machine language, so more human-like languages were designed that could be translated (by computer, of course) into machine language. A simple example of 'programming language':

X:= 0; repeat X:= X+1 until X > 9;

Translation to real English: Set the data called 'X' to zero, then keep adding one to it until it is greater than nine.

This programming language gets translated into machine language for use by the computer hardware. I won't show it to you -- trust me, it just looks like gibberish.
___________________

Contemporary with the computer scientists, scientists of biology were discovering DNA, and RNA, and began unraveling the mysteries of the machinery of life. The DNA, they found, was another kind of machine language. Whereas our computers use an alphabet of 0 and 1 (zero and one), the DNA uses an alphabet of A, G, C, and T, which name the bases Adenine, Guanine, Cytosine, and Thymine which are the symbolic parts of a DNA molecule. These are arranged in a sequence, just as are the symbols of human and computer languages. Some parts of the sequence describe how to make proteins, and some parts function like punctuation. Some parts haven't been deciphered yet; some have assumed that these are useless junk, but others are beginning to understand uses for the presumed 'junk DNA'.
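To illustrate the analogy: the protein-coding parts are read three letters (one 'codon') at a time, with ATG marking a start and TAA, TAG, and TGA acting as stop 'punctuation'. A toy sketch in Python -- the sample string and the few table entries are illustrative, not real gene data:

```python
# A few real codon meanings; the full table has 64 entries.
CODON_TABLE = {
    "ATG": "Met",  # start codon
    "GGC": "Gly",
    "TGC": "Cys",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",  # stop 'punctuation'
}

def translate(dna):
    """Read a DNA string three letters at a time, stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        codon = dna[i:i + 3]
        amino = CODON_TABLE.get(codon, "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("ATGGGCTGCTAAGGC"))  # ['Met', 'Gly', 'Cys'] -- reading stops at TAA
```

A sequence of symbols, read by fixed rules, producing a specified product: that is what we computer designers call machine language.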

Those who subscribe to the faith called Evolution have convinced themselves that all this complex machinery, which we have only begun to decipher, came into existence through random processes. They would like to believe that somehow information can arise out of randomness, but we who design computers know better.

Making information out of nothing is like the pseudoscience of perpetual-motion machines. These were proven to be impossible, because energy cannot be perfectly stored or transmitted. Always a little bit leaks out of the machine -- typically friction creating heat -- lost energy. In computer science and information theory, we know that likewise, information cannot be perfectly stored or transmitted. Always a little bit (or more) of error creeps in, and the data erodes. That's why hard drives have CRC (Cyclic Redundancy Check) codes to detect errors, and why we back up our data and software with extra copies. That's why our bodies have redundant copies of the DNA.
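Here is a quick illustration of CRC error detection, using Python's standard library (a sketch; a real drive computes its CRCs in hardware, not with zlib):

```python
import zlib

data = b"METHINKS IT IS LIKE A WEASEL"
stored_crc = zlib.crc32(data)            # computed when the data is written

# Later, one bit 'erodes' in storage:
corrupted = bytearray(data)
corrupted[5] ^= 0x04                     # flip a single bit

print(zlib.crc32(bytes(corrupted)) == stored_crc)   # False: the check catches the error
```

A CRC cannot repair the data; it only warns you that the copy has eroded, which is why we also keep the extra copies.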

Yes, we see DNA errors (genetic defects), and we see adaptive adjustments to the 'gene pool', but nobody has ever observed information being created out of nothing. Like any other information, Somebody created it. That's a subject that we will pursue further, later.

Sunday, July 10, 2005

The Start of System Engineering

In my early years working for ITT, we designed computers and other related hardware, but not software.  The software for our computers was written by other companies.  Then the day came that ITT management decided that we needed our own programmers (software writers).  So they hired a bunch of programmers, built a bunch of new offices, and created a Software Department.

It seemed strange to me, but these new people kept to themselves -- they were fellow employees, working on the same project, but strangers.  They sat in one area of the lunchroom, and we sat in another area.  It seemed that the hardware engineers thought that the programmers were wizards of the mysterious realm of software, and the programmers thought that the engineers were wizards of the mysterious realm of hardware.  It was like we spoke two different languages.

It didn't seem right to me, so on one lunch hour, I introduced myself to one of the strangers, and started fishing for some common ground that we might be able to talk about.  I mentioned a 'register' (hardware holding a small piece of data) that I knew held data that the programmers used.  He told me that the programmers thought the sequence of the data was annoying, because it made their work more difficult.  But, he added, "I guess the engineers must have a good reason for doing it that way."  I told him, no, we didn't have any reason for arranging the data in that sequence.  One sequence was as good as any other to us, so we just chose an arbitrary sequence.  But if we only knew what the programmers preferred, we would happily arrange the data any way they wanted.

After lunch, I told my boss about my conversation.  My story made it clear that if the engineers and programmers had an opportunity to discuss common issues, we might be able to help each other do our jobs better.

A week later, my boss and his boss called all the engineers to a meeting.  A new department was going to be formed, it was announced.  The new System Engineering Department would oversee the technical issues common to both hardware and software, to ensure that both would work together smoothly.  And I and my boss would work in the new department.  That was the beginning of 'System Engineering' at ITT.

Saturday, July 09, 2005

Christ's Love For Me

Two more poems from my youth --

Christ's Love For Me

I cannot understand why He
Could love and die for such as me;
I wasn't worthy of His love,
But yet He came from Heaven above
And took my sin and died for me.
I cannot understand why He
Could love and die for such as me.

He carried all my sin away
And now I have no sin today.
I cannot understand why He
Could love and die for such as me.

He gave me faith to live by grace,
So I could see Him face to face;
I cannot understand why He
Could love and die for such as me.

Because Christ's love motivates us, it seems natural that the next poem would be --

Something For Him

I think of what He's done for Jim,
And like to do something for Him.
In my own strength, I can't do aught,
But in His strength -- see what He's wrought!

He'll knead the clay from every wrong,
And make of me a vessel strong
And filled with service to the brim;
I'd like to do something for Him.

Friday, July 08, 2005

Engineering Precursors

As I look back at my youth, I am amused at the little things I did that sparked my interest in engineering, and even gave me some insights into the workings of computers -- even though I hadn't the foggiest notion what a computer was back then. Each time I learned a new physical principle, it fascinated me, and I just had to explore how it could be used.

At one point, I learned how to make an electromagnet. You could wrap thinly-insulated wire many times around a big nail, connect the wire to a battery and the big nail turned into a magnet and could pick up little nails. Disconnect the battery, and the little nails would fall to the floor. I bought the wire and battery, and demonstrated the magic to my younger brothers.

Then I learned about the telegraph and the Morse code, and the history of how these were used to send messages over great distances. Now I was cutting up 'tin' cans to get strips of steel. I mounted an electromagnet and a strip of steel on a block of wood, so that when the electromagnet was energized, the strip would click down on the electromagnet. Another steel strip, wood block, and nails were used to make a switch to turn the electromagnet on and off.

.---- switch -------------------------------- electromagnet
'---- battery ------------------------------- and click-strip

The switch and battery would be 50 feet away from the electromagnet, connected by a pair of wires. When you tapped on the switch, the electromagnet would click 50 feet away. Now all I had to do was teach my younger brothers Morse code, and we could have loads of fun. Well, they didn't think that memorizing a code was fun, so I made a chart for them. That was a little easier, but still they resisted. It was hard to keep a consistent rhythm, and without one, a 'dit' and a 'dah' could be confused. So I modified the telegraph with a double switch, three connecting wires, and two electromagnets, so that a 'dit' and a 'dah' were signalled by separate electromagnets. Years later, I learned that some historic telegraphs were actually constructed in a similar manner.

Then I learned about 'relays' -- the metal strip pulled by the electromagnet could function as a switch. Now, turning on a switch here could turn on a switch over there. Or, you could make it so that turning on the first switch would turn off the second switch, and vice versa. That opened up a bunch of new possibilities.

What if you connected the relay so that when it was on, it would turn itself off, and when it was off, it would turn itself on? The cycle of cause-and-effect would repeat itself, wouldn't it? Well, I built one to see what would happen, and sure enough, I had a buzzer -- the relay couldn't decide whether it should be on or off, so it turned on and off, on and off, as fast as it could.

What if you connected two relays so that relay 1 would try to do the same as relay 2 (on if on, and off if off), but relay 2 would try to do the opposite of relay 1 (off if on, and on if off)?
Then the relays would go through a cycle like this:
relay1 .. relay2
off . . . . . off
off . . . . . on
on . . . . . on
on . . . . . off
off . . . . . off
off . . . . . on
on . . . . . on
on . . . . . off
... etc.

That made an even louder buzz! (With a slower cycle, the relays had more time to turn fully on and fully off.) I was having so much fun that I had to buy more batteries.
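The two-relay cycle is easy to verify in simulation. A sketch, with the assumption that both relays react to the state one step earlier (the mechanical delay):

```python
def step(r1, r2):
    """Relay 1 tries to copy relay 2; relay 2 tries to oppose relay 1.
    Both read the state one step earlier, mimicking the mechanical delay."""
    return r2, not r1

NAME = {False: "off", True: "on"}
state = (False, False)            # both relays start off
for _ in range(8):
    state = step(*state)
    print(NAME[state[0]], NAME[state[1]])
# Prints the same four-state cycle as the table above, twice over.
```

Four distinct states per cycle instead of two is exactly why this buzzer ran slower and louder than the single-relay one.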

Next, I learned about serial and parallel connection of switches. If switches were connected in series, like this --

======== switch1 ----- switch2 ----- switch3 ========

-- then the wire pathway was on if switch1 AND switch2 AND switch3 were on. And if switches were connected in parallel, like this --

=======,--- switch1 ---,
. . . . . . |--- switch2 ---|
. . . . . . '--- switch3 ---'=======

-- then the wire pathway was on if switch1 OR switch2 OR switch3 were on. Since relays could be substituted for the switches, endless possibilities lay before me. I struggled to construct interesting and useful machinery with these ideas, but I was overwhelmed. My trial-and-error methods didn't work because there were too many possibilities.
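In Boolean terms, the series pathway is AND and the parallel pathway is OR. A minimal sketch:

```python
def series(sw1, sw2, sw3):
    # Current flows only if every switch in the chain is closed.
    return sw1 and sw2 and sw3

def parallel(sw1, sw2, sw3):
    # Current flows if any one branch is closed.
    return sw1 or sw2 or sw3

assert series(True, True, True) is True
assert series(True, False, True) is False
assert parallel(False, False, True) is True
assert parallel(False, False, False) is False
```

With AND, OR, and the relay's ability to invert (NOT), you have everything needed to build computer logic -- which is where this story is headed.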

I didn't know it at the time, but I was learning some of the principles of computer logic. But I didn't even know what a computer was, and I didn't have all the tools. Later, in college, I learned about Boolean logic, Karnaugh maps, DeMorgan's theorem, Venn diagrams -- tools that a designer of computer logic needs. And when I got out of college and into ITT, the first thing I did was to design part of a computer. But the switches and relays were now replaced by transistors.

Thursday, July 07, 2005

China Fears the Internet

A recent CNN news article Beijing clinic ministers to online addicts caught my eye. The Chinese clinic treats youngsters that are addicted to the Internet. It was reported that "They are suffering from depression, nervousness, fear and unwillingness to interact with others, panic and agitation. They also have sleep disorders, the shakes and numbness in their hands."

I knew that the Chinese government is interested in more than the mental health of its young people. I read further to see if the news article would mention China's hostility toward the Internet, putting the news in perspective. No -- the Associated Press reporter, working in Beijing, was probably fed a press release from the Chinese authorities, putting a 'politically correct' Chinese spin on the report.

The Chinese, like all other Communist nations, have always tried to control the information available to their citizens. That is why it has been necessary to smuggle Bibles into China. And the Internet, although needed for China to catch up with modern technology, is seen by the Communists as a threat. In recent years, the Chinese authorities have been installing equipment to filter the Chinese connections to the Internet.

For example, see the report Empirical Analysis of Internet Filtering in China where "The authors are collecting data on the methods, scope, and depth of selective barriers to Internet access through Chinese networks." A sample of web sites blocked by the Chinese filtering equipment includes:
  • Asian American Baptist Church
  • Association of Christian Community Computer Centers
  • American Cancer Society - Northern California Chinese Unit
  • AltaVista - The Search Company
  • Amnesty International USA - Defending and Promoting Human Rights Worldwide
  • Center for Anti-Communism
  • Russian Christian Orthodox church in Boston USA
  • Christian Academy in Japan
  • Defend AMERICA - US Department of Defense News About The War on Terrorism
  • The Free Methodist Church in Canada - MAIN PAGE
  • Integrity Episcopal Church
  • The Truth in America Project
  • Voice of America
For more, see Sites Blocked in China - Highlights

The concern of the Chinese Communists is more than political -- it is ideologically hostile to Christianity. (See, for example, Chinese Christians Sentenced to Death and persecution.org.) Unable to stop the house church movement, they have attempted to control religion by government-controlled 'religious' organizations such as the Three Self Patriotic Movement (TSPM).

Wednesday, July 06, 2005

Our Potted Herb Garden in July

The herbs are doing well, and we have already harvested some. They grow better if up to a third of the growth is trimmed now and then.

From left to right, below, we have chives, oregano, and rosemary.

Below, we have sage and dill, with petunias for color.

Below, we have more dill, french tarragon, and peppermint.

Below, we have (the french tarragon and peppermint shown again) tarragon and lavender, with basil in front.

Below, we have (tarragon and lavender shown again) thyme, with parsley in front.

The small hoses that are seen in most of the photos are part of an automatic drip watering system that is controlled by a timer. Most herbs grow well in pots, in a sandy, not-too-wet, well-drained soil. The basil and parsley like it a little wetter.

Tuesday, July 05, 2005

A REALLY Personal Computer

Years before the PC (the so-called Personal Computer) became widely known to households across America, there were a few of us that had really personal computers. In those days, we predicted that someday, computers would be sold like radios and toasters. We called this dream the appliance computer, because it would be just another household appliance. Alas, when the appliance computer arrived, the marketeers called it a personal computer, but it wasn't nearly as personal as what we had before that.

When I began my engineering career in 1959, the first job I had was designing part of a computer. Computers were a roomful of refrigerator-sized cabinets back then. Later, I designed entire computers, and the software that was used to make software. As computers became smaller, I often yearned to have my own. I once designed one that was so small I might afford to build it, but it was really a toy that wouldn't be very practical. Finally, the technology advanced to the point where a few companies made kits that allowed people with the right skills to build a computer that they could afford.

I had already built a few radio receivers and audio amplifiers from kits, so I knew I could do it. The kits included the design drawings, and I also had all of the details for all of the software. So with full knowledge of every detail of the hardware and software, I could customize the design to my liking. For various reasons, I made modifications to both hardware and software, so it was as personal as you could get.

The picture on the left shows the main computer box and its contents: the power supply, one board for the computer chip and essentials, another board for memory (RAM) , and a small board to interface to the keyboard and monitor. There was room to add more memory boards and interface boards.


I also built the keyboard and monitor shown on the left here. All of those keys on the keyboard are actually switches mounted on a circuit board.

The monitor was built with a television tube, and the circuitry handled only text -- no graphics. It could display 25 lines of text 40 characters long. I modified the design to double the display memory. This didn't display twice as much text at once. Instead I put a switch in front that selected which memory to use.


There was no hard drive, and no floppies. The only permanent (power-off) memory was a pair of ordinary audio cassette recorders. The box shown on the left here, also built from a kit, interfaced the computer to the audio recorders. The data rate was only 300 bits per second, so when it was time to load or store a program or data, you started it, took a coffee break, and hoped that it went OK. I modified this design, too.

When I finally made the transition to a new appliance computer (a.k.a. "PC"), it seemed strange to be using a computer that held hardware and software secrets. Something like driving a car that you're not allowed to look under the hood.

And for a few years, the media didn't dare mention words like "floppy", "software", etc, assuming that this was some realm of specialized knowledge, like Markov Analysis, that most people would have no idea about. Then they suddenly realized that there were many households with PCs, and it was OK to mention them to the general public.

Monday, July 04, 2005

My First Patents

Some people ask me about my inventions. So here's the story of my first two inventions, at least the first two to be patented. I'm lumping the two together because the second invention was an improvement on the first, and because the second invention was actually the first of the two to be patented. First, a little historic background.

The U.S. Army started using digital communication long before the commercial world, because only digital communication could be safely encrypted. A voice signal was sampled 8000 times per second, and each sample converted into 8 bits, converting the voice into a stream of 64000 bits per second. To minimize the number of radios or cables, 12 (or more) voice signals would typically be multiplexed (merged) into one signal, so that one radio or cable could carry 12 voice signals at once. The company I worked for (ITT) made radios, cable modems, and multiplexers for the Army.

A 12-channel (12 voices) multiplexer would arrange the data in 'frames', at 8000 frames per second. Since the frame rate equaled the sampling rate, each frame contained one sample from each voice channel (signal) -- 12 samples in all, 8 bits per sample, or 96 bits per frame. So a received stream of bits could be divided into 96-bit frames, the frames divided into 8-bit samples, and the samples sent to separate circuits that ultimately reached 12 different soldiers, one of which was the communications operator.

If a radio or cable modem was turned on, or had recovered from an outage, it wouldn't generally be starting at the beginning of the frame. The circuits needed a way to discover where the frame began, else those 12 soldiers might all get the wrong bits, and that would be very confusing. So they 'stole' the last bit of the frame, which was the last bit of the sample for the last channel, for a marker (called a 'synch bit') to identify the 'edge' of the frame. The 'synch bit' was zero and one on alternate frames -- an easy pattern to recognize. That left only 7 bits for each sample used by the last channel, the one used by the communications operator, degrading his voice quality, so he had to say "What was that again?" more often than the other soldiers.

A 'frame synchronization' circuit was used to find the synch bits, correcting the multiplexer's timing so that it would start at the beginning of each frame. From an arbitrary start, it would count off every 96th bit and check if it looked like a synch bit, meaning that it matched a 10101010... pattern. If it matched, it would check one frame later to verify that it wasn't an 'accidental' match; but if it didn't match, it would slip the timing by counting 97 bits (instead of 96) to the next potential synch bit.
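The search procedure can be sketched in a few lines of Python (illustrative only -- the real circuit was built from flipflops, and checking the alternation across three frames here stands in for the 'verify one frame later' step):

```python
FRAME = 96   # bits per frame; the last bit of each frame is the synch bit

def find_sync(bits):
    """From an arbitrary start, test each bit position: a synch bit must
    alternate (1, 0, 1, ...) on successive frames. On a mismatch, slip
    one bit (the 'count 97' step) and try the next position."""
    pos = 0
    while True:
        if (bits[pos] != bits[pos + FRAME]              # alternates next frame...
                and bits[pos] == bits[pos + 2 * FRAME]):  # ...and back again
            return pos % FRAME          # offset of the synch bit in the frame
        pos += 1                        # the slip

# A test stream: all-zero voice data, with the alternating synch bit at offset 95.
stream = []
for frame_no in range(10):
    stream.extend([0] * 95 + [frame_no % 2])

print(find_sync(stream))   # -> 95
```

In the worst case this scan slips one position per frame, 96 times over, which is why speeding it up mattered.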

There was a need to make the synchronization procedure faster so that communication could get started faster, and restarted faster when there was an outage. This would also make the communication less vulnerable to enemy jammers.

My first invention made the frame synchronization twice as fast, at a cost of about one more 'flipflop' in the circuit. The second invention made it even faster, using more flipflops. After many experiments, I found that the second speed-up was proportional to the square root of the number of additional flipflops. So the cost/benefits were:

1 flipflop -- 2 times faster
1+4 flipflops -- 2x2 times faster
1+9 flipflops -- 2x3 times faster
1+16 flipflops -- 2x4 times faster
1+25 flipflops -- 2x5 times faster (5 = square root of 25)
etc.
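Reading the table as a formula: with 1 + n-squared flipflops, synchronization is 2 times n times faster, the lone first flipflop supplying the base factor of 2. A quick restatement (not a derivation) in Python:

```python
import math

def speedup(added_flipflops):
    """Total speed-up for 1 + added_flipflops: the first invention's
    factor of 2, times sqrt(added_flipflops) from the second invention."""
    n = math.isqrt(added_flipflops)
    return 2 * max(n, 1)

for added, expected in [(0, 2), (4, 4), (9, 6), (16, 8), (25, 10)]:
    assert speedup(added) == expected   # matches the table above
```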

I thought the square root relationship was strange and mysterious. It illustrates the fact that inventions are generally half bright-idea and half discovery.

The first invention allowed the next bit to be examined after a mismatch -- a delay of 1 bit rather than 97 bits. The second invention anticipated the timing slips, examining the next several bits before they become the current candidate for synch bit. Later inventions dealt with the problem of noise (bit errors). These inventions helped ITT get more contracts.

Patent 3,597,539 - issued 8-3-71
Patent 3,594,502 - issued 7-20-71 - links to USPTO

Later, I was asked to sign papers when rights to use these patents were sold to various countries: Brazil, Canada, Denmark, France, Netherlands, India, Italy, Mexico, Sweden, Russia, South Africa, and Belgium. It seemed strange to sign papers in languages that I couldn't read, although there were English copies. The ones for Russia had the most paper and the most signatures. They even double-notarized some of the documents -- they didn't trust us! Years later, they must have changed the procedures, because they stopped asking me to sign such documents. I didn't really have a choice, anyway.

I guess I should explain that when I joined ITT, I had to sign a document giving them full rights to any inventions arising from my work for them. So that's why I didn't have any choice about signing the papers. The only time ITT didn't get full rights was when the Contracts Department goofed, and one of my inventions became the possession of the U.S. Air Force.

The patent protection rights only last 17 years, so these patents have been in the public domain since 1988. And you can't get full-text copies from the US Patent Office web site, because their database only has patents issued since 1976.