
Days Grow Shorter

The great push to get all timely side projects done before the holidays begins: the “September and October surge” of tasks I know I should start before the snow and ice sweep across the land.
Summer is over and suddenly, as if by total surprise, the seasonally imposed deadlines become the priority. “Didn’t I just come off the Spring–Summer surge?” It’s the yearly, temperature-influenced push that some might misinterpret as productivity preparedness. If you plan on working with some level of comfort from the cold, you should start thinking about the things you can do before the weather changes. Pace yourself and avoid the last-minute frenzy. Most likely there will be a few late nights when the wind is howling and the cold has to be kept at bay in order to focus on work. Maybe now is a good time to find my hoodies.

Of course this is a bit melodramatic, but it does make sense to get as much done as I can before the holidays begin and frosty mornings become the norm.
This time of year has some positives; there will be less daylight to distract me from the tasks at hand, which limits the temptation to step out into the winter sun – if there is any this year.
Yes, the dreary weather and gloomy skies will be perfect for staring at the computer screen for hours and hours, but first any outdoor work that can be done before the approaching ice age will have to be squeezed in. Managing time to accommodate both indoor and outdoor work is sometimes a challenge.

Once I think I have everything perfect for working late into the night, I can settle in and focus on writing, learning new computer skills, or testing out new Linux developments. I’ll have the radio playing quietly in the background. Sometimes I’ll just sit and read through a few technical manuals for hours.

Hot coffee from a thermos I bring out to my workshop is usually close at hand. It may not seem too exciting, but I enjoy working late into the night. There’s always something new to learn, or sometimes old skills I rediscover.

At least I won’t have to worry about the heat and humidity that limited the amount of time I worked in my shop this summer.

I’ve been following the Hurricane Dorian developments. There’s not much impact, if any, expected where I am, but as the tracking changes it does make me wonder how well we actually predict the weather. It does underscore the need to prepare for rough weather even when skies are clear. I don’t see any particular weather event in the near future, but I’m positive the cold is coming, as it does every year.

Minding the Machine

Most computer systems aren’t much good if they are not provisioned, maintained, or applied effectively to perform a task or set of objectives.
For the most part this can all get very complicated or very simple depending on how everything is coordinated, from data in to data out. A lot of human intervention still prevails as the norm, in such a way that human error often becomes the weak link in the chain. The opportunity for human error must be minimized across many different operational applications. Put simply – it’s not unreasonable to think a human could impact a system in such a way as to render it unstable or unreliable.
It is true that bad applications can be built on poorly structured code, but that code was usually written by an individual or individuals who could impart their ineptness onto the final instruction set. It is also true that data transport into and out of a system – via a network-supported connection – runs over human-designed networks, where data loss from external and internal factors could be mitigated but often isn’t, and that can diminish a good computer system’s reliability.
Data latency and packet loss give the impression that a system is not working correctly. System resource burdens that delay instruction completion can give the impression that a network constraint is at fault, or that the network is the source of the apparent poor performance.
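As a rough illustration of how one might start to tell the two apart, the sketch below (Python, with a hypothetical host you would substitute for your own) times a bare TCP connect as a proxy for network delay, then a full HTTP request, which also includes the remote system’s own processing time. It’s a diagnostic sketch under those assumptions, not a complete monitoring solution.

```python
# A minimal sketch (hypothetical host, substitute your own) for separating
# network latency from remote processing time: the TCP handshake mostly
# reflects the network, while a full request also includes whatever work
# the remote system has to do.
import socket
import time
import urllib.request

HOST = "example.com"   # hypothetical target for illustration
PORT = 80

# 1. Time just the TCP connect: a rough proxy for network round-trip delay.
start = time.perf_counter()
with socket.create_connection((HOST, PORT), timeout=5):
    pass
connect_time = time.perf_counter() - start

# 2. Time a full HTTP request: network delay plus remote processing.
start = time.perf_counter()
with urllib.request.urlopen(f"http://{HOST}/", timeout=10) as resp:
    resp.read()
total_time = time.perf_counter() - start

print(f"TCP connect (network):    {connect_time * 1000:.1f} ms")
print(f"Full request (net + work): {total_time * 1000:.1f} ms")
print(f"Rough processing share:   {(total_time - connect_time) * 1000:.1f} ms")
```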
Aside from weather, security issues, and unavoidable power loss – which are equipment and/or component stability related – backup and redundancy mitigation, or the lack thereof, is another area where human error comes into play.
Error checking, security, network health monitoring, and system implementation require thoughtful planning and execution from effective management systems, which ultimately (for now) rely on individuals with proven skill sets.
One of these skill sets is leadership.
I disagree with the premise that effective leadership is a subset of time management alone. Leadership is a multifaceted mechanism that can either produce positive results or allow negative outcomes. “Team leadership” is another loosely used term that often borders on empty promises and hollow abstract references such as the “high-level view” – a largely useless approach to understanding the nuts and bolts of a complex operation or system. It’s great for a “macro” approach, but useless for a detailed understanding of the critical inner workings of anything much more complicated than sharpening a number 2 pencil. It’s a popular buzzword phrase, but I usually interpret such noise as either a lack of interest or a lack of understanding of whatever subject is being bantered about in the context of any immediate topic in need of serious review. Yes, BS only wastes time and resources.
Great computer systems need great support teams, who require great leaders, who require vision, courage, and the ability to support such values in their team members and stakeholders.
Communication among all team members is critical to distilling resolutions that might never develop in an information vacuum. It is true that there will always be the rare individual who can do it all, but why not combine efforts and always look for alternative views, which often lead to improvement?
I myself prefer to work alone when I can, but that isn’t always practical or productive. I know that I can fix some things, but not all things.

At some point it may become common for computers to design and build other systems without the plausible limitation of human intervention. There would be no need for support groups or managed leadership based on rank or assigned importance – all decisions reduced to algorithms and soft-coded deductive parameters. In the meantime we still need bright, hardworking people to oil the levers, adjust the springs, and care for the error-free operation of the many computer systems currently deployed into service for a wide variety of operations. Computers are a wonderful resource and a powerful tool, but so is the human mind – and the human heart.
Together the future can be an amazing journey.

Turing In-Complete (part 2)

From the ABC (the first digital electronic computer), named after its creators John V. Atanasoff and Clifford Berry, to the fictional Multivac written about by the great science fiction author Isaac Asimov, there is a lot of development that would need to take place. We started the first computer revolution with vacuum tubes and switches using the concepts of binary arithmetic and logic, progressed to integrated circuits and chips, and now push modern silicon limitations beyond today’s 5 GHz peak. Here we approach the possible saturation limit of Moore’s law (doubling the number of transistors in a dense integrated circuit roughly every 18 months to two years).
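Just to put that doubling rule in perspective, here is a back-of-the-envelope sketch. The starting count is loosely based on the roughly 2,300 transistors of an early-1970s microprocessor, and the 18-month period is the popular figure quoted above; both are illustrative assumptions, not a model of any particular product line.

```python
# A back-of-the-envelope sketch of the doubling rule mentioned above:
# starting from an assumed transistor count, project forward under an
# assumed doubling period (18 months here, purely for illustration).
DOUBLING_MONTHS = 18
START_COUNT = 2_300          # roughly the early-1970s microprocessor era, for scale
YEARS = 50

for year in range(0, YEARS + 1, 10):
    doublings = (year * 12) / DOUBLING_MONTHS
    count = START_COUNT * 2 ** doublings
    print(f"year {year:2d}: ~{count:,.0f} transistors")
```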

If we reach the limits of modern manufacturing techniques, we would theoretically slow the rapid advancement of computer technology and probably never build the ultimate computer we are destined to create. Today’s supercomputers, built with many processors running custom variations of Linux, are incredibly powerful compared to today’s consumer and enterprise systems, but they will be much more powerful in years to come because we are never satisfied. We look back at early vacuum tube technology and re-apply it with modern engineering techniques that could push processing speeds to 1,000 times that of today’s silicon-based designs. Development moves from GHz-measured processing speeds toward terahertz speeds thanks to air-channel transistors and graphene nanoribbon technology, which can use magnetic fields to control current flow, and on to light-induced super circuits, which potentially bring quantum computing into play.

If you’ve been out looking for a new laptop recently you might suspect that Moore’s Law has already hit a wall. Dennard scaling was based on the theory that as transistors get smaller, their power density stays constant. That didn’t really pan out, so in the last decade the push to increase the performance of modern computers has relied less on raw increases in processing frequency and more on multiple cores. You would think that more cores would equal faster, more efficient computers, but that doesn’t always work out, due to power dissipation (which causes heat) and workloads that don’t actually get divided among all the cores. This is why you don’t see the 18-month doubling of computer power in the selection of available consumer products like you may remember just a decade ago.
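One common way to put numbers on that last point is Amdahl’s law: if part of a workload can’t be divided among cores, the serial part caps the overall speedup no matter how many cores you add. The sketch below assumes a hypothetical workload that is 80% parallelizable, purely for illustration.

```python
# A quick sketch of Amdahl's law, one common way to explain why piling on
# cores gives diminishing returns when part of a workload cannot be split
# among them. The parallel fraction below is an assumption for illustration.
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Theoretical speedup for a workload that is only partly parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

PARALLEL_FRACTION = 0.80  # assume 80% of the work can be divided across cores

for cores in (1, 2, 4, 8, 16, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(cores, PARALLEL_FRACTION):.2f}x speedup")
```

With those assumed numbers, even 64 cores deliver a bit under a 5x speedup, which is roughly the flavor of diminishing return described above.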

The idea that computers would continue to get smaller and faster at a constant rate indefinitely may need some revising. Some components will benefit from advancements in nanotechnology, allowing smaller intelligent devices to perform many functions that much larger systems once did. This will also allow larger assemblies of these tiny computer components to work together in forms that may resemble the first and second generations of digital computers. In effect, computing systems have already gotten much larger – an example being today’s server farms used for cloud computing. The combined computing power of multiple servers still has limitations. A glimpse of this will be seen as IoT (Internet of Things) devices and the 5G technology already impacting modern networking designs become more commonplace. This won’t necessarily result in smarter computers, but it could result in better services. It will also result in more noise and more power consumption.

All this amazing technology is thanks to the “triode,” an improvement on the early diode valve vacuum tube designs. We can thank Lee De Forest for inventing the “audion” in 1906. This was the vacuum tube design that enabled development of the early digital computers. The transistor wasn’t invented until 1947, and even then the vacuum tube was still widely used for quite a while, on into the late twentieth century. Now vacuum transistors may usher in the next generation of computers. The early designs of the past make it possible to build the improved components that could accelerate modern computer development. In some ways, what’s old is what may be redesigned for the near future.

Will advancements in speed, memory, and processing power eventually bring about the one great computer that will be tasked with solving any and all great mysteries? Nope – it will probably encourage or stimulate continued competition among many different computer designs, if other factors don’t create any undesired impediments. Creativity and determination will be paramount to moving forward, but we will also learn from what has already been done if we take the time to look back.

Turing In-Complete (part 1)

Before man built machines that could be used to calculate all the same mathematical problems we now regard as computation, we humans were regarded as the “computers,” not the artificial machines – which explains the label “manually” calculated. Man built the machines. This has only been true for a relatively short period of time compared to the timeline man has existed in his current evolutionary state.

This technology goes back much farther than the existence of our most popular desktop PCs, laptops, tablets, or smartphones. Major developments in the twentieth century progressed at a very rapid pace, not with the help of extraterrestrial beings, but by some very brilliant humans. Maybe you could make a case for “math” from outer space in ancient history, and you’d be technically close when you factor in the influence of the orbits of planets and the positions of stars that inspired the desire to figure out what was seen in the skies.

The abacus is the earliest device currently known for crunching numbers. The Sumerian abacus is thousands of years old and noted throughout ancient history. This isn’t what I would regard as an early computer, but it was, and still is, an impressive design.

The Analytical Engine, an improvement over the Difference Engine – both designed by Charles Babbage in the early 1800s – could be considered the foundation of modern computing. Ada King, Countess of Lovelace, created the first computer program for the Analytical Engine – a machine that was never actually completed. The design was finished, but not a fully functional machine. So the idea or design for the device came before the actual machine – as did a program that could have run on it.

I always felt that this part of history was a bit murky, but within the fog, there was a spark. The point is that this was a starting point that others could build upon.

Could the Analytical Engine be categorized as the first Turing Complete machine?

If we consider all modern programming languages Turing complete, then could it have run a program that would solve any calculation initially performed manually? In theory – possibly; in practical application, I am skeptical.
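For what it’s worth, the program Lovelace described in her notes was a procedure for computing Bernoulli numbers. The sketch below is a minimal modern rendering of that same calculation in Python, using the standard recurrence rather than her actual sequence of engine operations – an illustration of the kind of work the machine was meant to do, not a translation of her program.

```python
# A minimal sketch of the calculation Lovelace's notes described: Bernoulli
# numbers, here via the standard recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0
# (which yields the B_1 = -1/2 convention), computed with exact fractions.
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list:
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        total = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-total, m + 1))
    return B

for i, b in enumerate(bernoulli(8)):
    print(f"B_{i} = {b}")
```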

To consider the current concern about Artificial Intelligence taking over every aspect of man’s future, in both a positive and a negative light, you should look back through its short history of advancements. Computers have come a long way, not fully envisioned by their early creators, but it is still a very short time compared to man’s intellectual development.

Turing completeness requires a system to make decisions based on data manipulated by rule sets. Remember those “IF,” “THEN,” and “GOTO” statements from BASIC (Beginner’s All-Purpose Symbolic Instruction Code)? Maybe you remember the ’90s version, QBasic. If you don’t, no problem. Just know that there was some amazing progress in computer development through the 1950s and 1960s that used instructions which could be considered Turing complete – theoretically, not always in practice. This may not be the best way to explain this, but I think I’m in the ballpark.
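To make the “decisions based on data manipulated by rule sets” idea concrete, here is a toy sketch in Python with an entirely made-up, BASIC-flavored instruction set (not real QBasic). Mutable storage, a conditional test, and a jump are enough to express loops and branching, which is the flavor of expressiveness Turing completeness is getting at (a truly Turing-complete machine also needs unbounded memory, which no real hardware has).

```python
# A toy sketch (made-up instruction set, not real BASIC or QBasic) showing
# the ingredients mentioned above: mutable storage, an IF test, and a GOTO
# jump are enough to express loops and decisions.
def run(program):
    vars, pc = {}, 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "SET":             # SET var value
            vars[args[0]] = args[1]
        elif op == "ADD":           # ADD var amount
            vars[args[0]] += args[1]
        elif op == "IF_LT_GOTO":    # jump to a line number if var < limit
            if vars[args[0]] < args[1]:
                pc = args[2]
                continue
        elif op == "PRINT":
            print(args[0], "=", vars[args[0]])
        pc += 1
    return vars

# Count to 5 using nothing but SET, ADD, a conditional jump, and PRINT.
run([
    ("SET", "i", 0),            # line 0
    ("ADD", "i", 1),            # line 1
    ("IF_LT_GOTO", "i", 5, 1),  # line 2: loop back to line 1 while i < 5
    ("PRINT", "i"),             # line 3
])
```

Running it counts i up to 5 using nothing but a conditional jump back to an earlier line – the same trick those early IF/GOTO programs relied on.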

I’m not disregarding Turing’s calculating machine design from the ’30s, but things started to ramp up in the ’50s.

Consider the fact that we still use the Fortran and LISP programming languages, both from the 1950s. Yes, I should also mention assembly language, which dates back to the late ’40s.

You can look back at Remington Rand’s MATH-MATIC (AT-3) from 1957, used as a compiler and programming language for the UNIVAC I. Charles Katz led the team tasked with developing the MATH-MATIC programming language under the direction of Grace Hopper, who was notable in the movement toward “machine-independent” programming languages, which helped lead to the development of high-level programming languages.

This was all done in the 1950s and 1960s. This is the era of big computers like the DATATRON 200 series, weighing in at over 3,000 lbs – big computers working with a word size of 10 decimal digits. All this incredibly amazing computer development would later lead to the machines we now fear. It’s amazing to think we would later spin up the development of AI, which initially required the sophisticated computer code that came from these early systems. The history of computers and programming languages is very interesting and usually not referenced enough when we look at our current state of affairs and how much we depend on them. Man built these with the intent to improve his condition, and in most cases they have. What may be getting lost through time is the appreciation of all those who contributed over the last two centuries to the existence and development of all this amazing technology. It continues today, and it still requires some very brilliant minds to continue the advancement for the good of man. This is just the beginning. We are still in the early stages of computing, and we are still the computers.
