Turing In-Complete (part 1)

finderding

Before man built machines that could calculate the same mathematical problems we now regard as computation, we humans were the “computers”, not the artificial machines. That is why we still speak of calculations being done “manually”: man did the calculating before man built the machines. Machines have held this role for only a relatively short period when compared to the timeline man has existed in his current evolutionary state.

This technology goes back much farther than our most popular desktop PCs, laptops, tablets, or smartphones. Major developments in the twentieth century progressed at a very rapid pace, not with the help of extraterrestrial beings, but through some very brilliant humans. Maybe you could make a case for “math from outer space” in ancient history, and you’d be technically close when you factor in how the orbits of planets and the positions of stars inspired the desire to figure out what was seen in the skies.

The abacus is the earliest number-crunching device we currently know of. The Sumerian abacus is thousands of years old and noted throughout ancient history. It isn’t what I would regard as an early computer, but it was, and still is, an impressive design.

The Analytical Engine, an improvement over the Difference Engine (both designed by Charles Babbage in the early 1800s), could be considered the foundation of modern computing. Ada King, Countess of Lovelace, created the first computer program, written for the Analytical Engine, which was never completed. The design was finished, but not a fully functional machine. So the idea, the design for the device, came before the actual machine, as did a program that could have run on it.

I always felt that this part of history was a bit murky, but within the fog, there was a spark. The point is that this was a starting point that others could build upon.

Could the Analytical Engine be categorized as the first Turing Complete machine?

If we consider all modern programming languages Turing-complete, then could it have run a program to solve any calculation previously performed manually? In theory, possibly; in practical application, I am skeptical.

To weigh the current concern about Artificial Intelligence taking over every aspect of man’s future, in both its positive and negative light, you should look back through the short history of computing advancements. Computers have come a long way, further than the early creators fully envisioned, but it is still a very short time compared to man’s intellectual development.

Turing completeness requires a system to make decisions based on data manipulated by rule sets. Remember those “IF”, “THEN”, and “GOTO” statements from BASIC (Beginner’s All-Purpose Symbolic Instruction Code)? Maybe you remember the ’90s version, QBasic. If you don’t, no problem. Just know that there was some amazing progress in computer development from the 1950s and 1960s that produced instructions which could be considered Turing-complete, theoretically, if not always in practice. This may not be the best way to explain this, but I think I’m in the ballpark.
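As a rough illustration (in Python rather than BASIC, and purely hypothetical), the ingredients mentioned above, conditional tests plus jumps to numbered lines, can be sketched as a toy interpreter. The line numbers and the multiply-by-repeated-addition program are made up for the example, but the shape mirrors BASIC’s IF and GOTO:

```python
# A toy interpreter for a BASIC-like program: numbered lines, variables,
# IF tests, and GOTO jumps. Conditional branching plus unbounded jumping
# is the core of what Turing completeness requires.

def run(program, env):
    """Step through instructions keyed by line number until falling off the end."""
    lines = sorted(program)
    pc = 0  # index into the sorted line numbers
    while pc < len(lines):
        op = program[lines[pc]]
        target = op(env)           # an instruction may return a GOTO target
        if target is None:
            pc += 1                # no jump: fall through to the next line
        else:
            pc = lines.index(target)
    return env

# Multiply 6 * 7 by repeated addition, using only IF- and GOTO-style control.
program = {
    10: lambda e: e.update(total=0, count=0) or None,  # LET total=0, count=0
    20: lambda e: 50 if e["count"] == 7 else None,     # IF count = 7 GOTO 50
    30: lambda e: e.update(total=e["total"] + 6,
                           count=e["count"] + 1) or None,
    40: lambda e: 20,                                  # GOTO 20
    50: lambda e: None,                                # END
}

result = run(program, {})
print(result["total"])  # prints 42
```

Strip away the syntax, and a test that branches on data plus a jump that can repeat work is all the machinery the loop needs; everything else is convenience.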

I’m not disregarding Turing’s calculating machine design from the ’30s, but things started to ramp up in the ’50s.

Consider the fact that we still use Fortran and Lisp, both from the 1950s. Yes, I should also mention assembly language, which dates back to the late ’40s.

You can look back at Remington Rand’s MATH-MATIC AT-3 from 1957, a compiler and programming language for the UNIVAC I. Charles Katz led the team tasked with developing the MATH-MATIC programming language under the direction of Grace Hopper, who was notable in the movement toward “machine-independent” programming languages, which helped lead to the development of high-level programming languages.

This was all done in the 1950s and 1960s, the era of big computers like the DATATRON 200 series, weighing in at over 3,000 lbs and working with a word size of 10 decimal digits. All this incredibly amazing computer development would later lead to the machines we now fear. It’s amazing to think we would later spin up the development of AI, which initially required the sophisticated computer code that grew out of these early systems.

The history of computers and programming languages is very interesting and usually not referenced enough when we look at our current state of affairs and how much we depend on them. Man built these machines with the intent to improve his condition, and in most cases they have. What may be getting lost through time is the appreciation of all those who contributed over the last two centuries to the existence and development of this amazing technology. It continues today, and it still requires some very brilliant minds to continue the advancement for the good of man. This is just the beginning. We are still in the early stages of computing, and we are still the computers.
