Turing In-Complete (part 2)

From the ABC (the first digital electronic computer), named after its creators John V. Atanasoff and Clifford Berry, to the fictional Multivac written about by the great science fiction author Isaac Asimov, a lot of development had to take place. We started the first computer revolution with vacuum tubes and switches built on the concepts of binary arithmetic and logic, progressed to integrated circuits and chips, and have pushed modern silicon to its limits beyond today’s 5 GHz peak. Here we approach the possible saturation of Moore’s law (the doubling of the number of transistors in a dense integrated circuit roughly every 18 months).
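That 18-month doubling compounds quickly, which a little arithmetic makes concrete: a count doubling every 1.5 years grows by a factor of 2^(years/1.5). A minimal sketch (the starting figure of ~2,300 transistors is the Intel 4004’s 1971 count, used here purely as a familiar reference point, not taken from this article):

```python
def transistor_count(start_count, years, doubling_period_years=1.5):
    """Project a transistor count under Moore's-law-style doubling."""
    return start_count * 2 ** (years / doubling_period_years)

# Starting from the Intel 4004's ~2,300 transistors in 1971,
# 18-month doubling projects growth like this:
for years in (0, 15, 30, 45):
    print(1971 + years, round(transistor_count(2300, years)))
```

Even if the real doubling period were two years instead of eighteen months, the curve would still be exponential; the argument in this article is about when, not whether, that curve flattens.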

If we reach the limits of modern manufacturing techniques, the rapid advancement of computer technology would theoretically slow, and we might never build the ultimate computer we seem destined to create. Today’s supercomputers, built from many processors running custom variations of Linux, are incredibly powerful compared to today’s consumer and enterprise systems, but they will be far more powerful in years to come because we are never satisfied. We look back at early vacuum tube technology and re-apply it with modern engineering techniques to push processing speeds toward 1,000 times those of today’s silicon-based designs. Development moves from GHz-measured processing speeds toward terahertz speeds thanks to air-channel transistors and graphene nanoribbon technology, which can use magnetic fields to control current flow, and on to light-induced super circuits, which potentially bring quantum computing into play.

If you’ve been out looking for a new laptop recently, you might suspect that Moore’s Law has already hit a wall. Dennard scaling was based on the theory that as transistors get smaller, their power density stays constant. That didn’t really pan out, so over the last decade we’ve seen attempts to increase the performance of modern computers through higher processing frequencies and multiple cores. You would think that more cores would equal faster, more efficient computers, but that doesn’t always work out, due to power dissipation (which causes heat) and workloads that don’t actually get divided among all cores. This is why you don’t see the 18-month doubling of computing power in the selection of available consumer products the way you may remember from just a decade ago.
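The point about workloads not dividing among cores is usually formalized as Amdahl’s law (my framing here, not the author’s): if a fraction p of a program can run in parallel, the best speedup on n cores is 1 / ((1 − p) + p/n). A quick sketch of the diminishing returns:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup per Amdahl's law: the serial fraction
    of the work never benefits from extra cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 90% of the work parallelizable, adding cores flattens out:
for n in (1, 2, 4, 8, 64):
    print(f"{n:>2} cores: {amdahl_speedup(0.9, n):.2f}x")
```

With a 90% parallel fraction, the speedup can never exceed 10x no matter how many cores you add, which is one reason doubling core counts stopped feeling like doubling performance.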

The idea that computers will continue to get smaller and faster at a constant rate indefinitely may need some revising. Some components will benefit from advancements in nanotechnology, allowing smaller intelligent devices to perform many functions that much larger systems once did. This will also allow larger assemblies of these tiny computer components to work together in forms that may resemble the first and second generations of digital computers. In effect, computing systems have already gotten much larger; an example is today’s server farms used for cloud computing. The combined computing power of multiple servers still has limitations. A glimpse of this will be seen as IoT (Internet of Things) devices and the 5G technology already impacting modern networking designs become more commonplace. This won’t necessarily result in smarter computers, but it could result in better services. It will also result in more noise and more power consumption.

All this amazing technology traces back to the “triode,” an improvement on the early diode valve vacuum tube designs. We can thank Lee De Forest for inventing the “Audion” in 1906. This was the vacuum tube design that enabled the development of the early digital computers. The transistor wasn’t invented until 1947, and even then the vacuum tube remained in wide use well into the late twentieth century. Now vacuum transistors may usher in the next generation of computers. The early designs of the past make it possible to build the improved components that could accelerate modern computer development. In some ways, what’s old is what may be redesigned for the near future.

Will advancements in speed, memory, and processing power eventually bring about the one great computer tasked with solving any and all great mysteries? Nope. They will more likely stimulate continued competition among many different computer designs, provided other factors don’t create undesired impediments. Creativity and determination will be paramount to moving forward, but we can also learn from what has already been done if we take the time to look back.