As the end of summer approaches in late August (not officially the end of summer, but for practical purposes, in the north it's over at the end of August), I start to think about winter. This time of year you realize you may only have a few opportunities left to get ready for the cold weather, the shorter daylight hours, and the snow. This is especially true if you have a lot of work to do outside before the weather turns bad. While I'm writing this, a tornado warning has come across every phone in the area. It's raining again, so we are already losing time for any outdoor work I should be doing in preparation for a long winter. Lately the winters haven't been as brutal as they normally are, but what is normal anymore? Aside from regular day-to-day living, when preparing for winter there are al
Most computer systems aren't much good if they are not provisioned, maintained, and applied effectively to perform a task or set of objectives. For the most part, this can all become very complicated or stay simple depending on how everything is coordinated from data in to data out. Human intervention still prevails as the norm, in such a way that human error often becomes the weak link in the chain. The opportunity for human error must be minimized in many different operational applications. Put simply, it's not unreasonable to think a human could impact a system in such a way as to render it unstable or unreliable. It is true that bad applications can be built on poorly structured code, but usually that code was written by an individual or individuals who could impart one's ineptn
From the ABC (the first digital electronic computer), named after its creators John V. Atanasoff and Clifford Berry, to the fictional Multivac written about by the great science fiction author Isaac Asimov, there is a lot of development that would need to take place. We started the first computer revolution with vacuum tubes and switches using the concepts of binary arithmetic and logic, progressed to integrated circuits and chips, and pushed modern silicon limitations beyond today's 5 GHz peak. Here we approach the possible saturation limit of Moore's law (the number of transistors in a dense integrated circuit doubles roughly every two years, often quoted as every 18 months). If we reach the limits of modern manufacturing techniques, we would theoretically slow the rapid advancement of computer technology and probably never build ...
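The doubling behind Moore's law is easy to sketch as back-of-the-envelope arithmetic. The snippet below is a hypothetical illustration (the starting count and time span are my own example figures, not from this article), assuming one doubling every two years:

```python
def project_transistors(initial_count, years, doubling_period_years=2):
    """Project a transistor count under Moore's law:
    the count doubles once per doubling period."""
    doublings = years // doubling_period_years
    return initial_count * 2 ** doublings

# Illustrative: the Intel 4004 (1971) had roughly 2,300 transistors.
# Fifty years of doubling every two years gives 25 doublings:
print(project_transistors(2300, 50))  # 2300 * 2**25 = 77,175,193,600
```

A projection like this lands in the tens of billions of transistors, which is the right order of magnitude for today's largest chips, and it also shows why the curve cannot continue indefinitely once manufacturing limits are reached.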
Before man built machines that could calculate all the same mathematical problems we now regard as computation, we humans were regarded as the "computers," not the artificial machines. This explains the label "manually" calculated: man built the machines. This has only been true for a relatively short period of time compared to the timeline man has existed in our current evolutionary state. This technology goes back much farther than the existence of our most popular desktop PCs, laptops, tablets, or smartphones. Major developments in the twentieth century progressed at a very rapid pace, not with the help of extraterrestrial beings, but by some very brilliant humans. Maybe you could make a case for "math" from outer space in ancient history, and you'd be technic
Lemon-lime Gatorade over crushed ice is (almost; coffee is still #1) my new favorite beverage. I know it seems wrong, but it's pretty nice on hot, humid days, especially when I'm waiting for the Windows 10 Update Assistant to finish upgrading Windows 10 on my Lenovo ThinkPad. So slow. I know I should be patient, so I finish some other work I'm chipping away at on another Linux laptop. I also downloaded the latest Windows 10 64-bit ISO, just in case I have to build from scratch, but I wanted my current configuration updated by the Update Assistant so I could experience the method many users would choose. Thus, the cold beverage on such a hot day (which I just spilled). This entire exercise started this morning as a plan to play around with the new Windows command line. The ISO I had on hand was ve