
Nifty Linux Monitoring Tool “Netdata”

This week, while supplementing my usual coffee intake with Dr Pepper’s Venom Black Mamba energy drink (not a sponsored plug – I just like it), I’ve been test driving the new MX Linux 19 Beta 1.

I installed it on a few low-powered laptops and wanted to get a good idea of how it was really performing. So far it’s been a very smooth experience, but the most interesting part of this endeavor has been using Netdata to monitor my machine in my browser. Usually I’ll use Htop, Glances, and Nmon along with built-in Linux shell-based tools to analyze and monitor my systems. I decided to give Netdata a spin, and I think I like it.

You’ll probably find this tool’s full potential more applicable to server builds, but I can also see it being useful on a standalone machine.

The latest MX Beta I installed – it’s no secret I’m an MX fan.

The latest MX Beta 1 simplified the installation process using apt.

Netdata’s available options – I stuck with all the defaults.

Advanced options

Follow Netdata on Twitter @linuxnetdata or Facebook for more detailed information and updates.

The web view is very cool. Everyone likes cool graphics. Netdata doesn’t disappoint.

You can view a lot of detail and see what your system is doing: CPU usage, memory, processes, network health, system applications, and much more.
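If you want to pull those same numbers without the dashboard, Netdata also serves them over a local REST API. Here’s a minimal Python sketch, assuming the agent is running on its default port 19999, the v1 data endpoint is available, and the chart name system.cpu exists on your install (check your own agent if anything differs):

```python
# Minimal sketch: pull recent CPU data from a local Netdata agent's REST API.
# Assumes Netdata listens on its default port (19999) and that the JSON
# response uses the usual "labels" and "data" keys.
import json
import urllib.request

NETDATA = "http://localhost:19999"

def recent_chart(chart="system.cpu", seconds=10):
    url = f"{NETDATA}/api/v1/data?chart={chart}&after=-{seconds}&format=json"
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)

if __name__ == "__main__":
    data = recent_chart()
    print(data["labels"])      # column names, e.g. time, user, system, ...
    for row in data["data"]:   # one row per collected second
        print(row)
```

That’s roughly what the dashboard graphs are built from, so it’s handy if you ever want to script your own checks.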

Yes, I still have a Windows 10 machine running the May 2019 release, so naturally I have WSL (Windows Subsystem for Linux) installed to use Debian.

Naturally I had to see if Netdata would work under WSL on Windows 10, and it did!

This was from the Windows browser – no need for a Debian GUI.

The info page to view Netdata’s configuration on your system.

This was just a brief glance at what you can see with Netdata. You might want to check it out and see if it works for you. You also might want to keep an eye on the next version of MX Linux and future Windows 10 WSL developments. While you do that, I think I’ll try a few more energy drinks this week.

Hold steady for now

Another week, another post, but not what I had planned. I’ve been spending more time working with the latest Windows 10 upgrade. Today I just shut it down and went back to MX. There’s nothing about the new version that I really need. It’s not a bad design, and I think it’s probably the best incarnation of the Windows OS I’ve ever worked with, but I don’t really need it. It won’t be my daily driver; I tried, but I seemed to spend more time navigating around than I needed to. Having Ubuntu supported was useful, but I could see how things were starting to get a bit busy: if I’m just working in the Linux shell, why do I need Windows?
It feels like carrying around a huge toolbox full of tools when I probably won’t need any of them, or when all I really need is a Swiss Army knife.
Too much noise when I’m really trying to simplify how I work.
I would recommend the new May update to anyone who is still using Windows 7. It works. I like it a lot better than 7, but I also don’t recommend Linux to anyone who is already very productive with Windows.
I used to, because this version of Windows 10 wasn’t available until this summer. Now it’s here, and a lot of Windows OS users should be very happy. I’ll just keep one machine set up for Win 10, but go back to using Linux as a daily driver, and my Chromebook as an occasional light travel alternative. What works for you may not work for me, and what works for me may not work for you.
I would suggest trying out different operating systems if you can and picking what works best for you. I’ll still use Windows when I need to, but I don’t always need to.
I’ve been following the news lately, and if there is any economic slowdown or recession, I think I’ll try to get more mileage out of my old laptops. I haven’t seen any reason to upgrade my current hardware, and I really don’t foresee any compelling reason to on the horizon.
I still get a lot of use out of my budget-buy Chromebook. I’m still hoping to get by for the most part with just the Chromebook, but I find that I can do everything I need with MX Linux on an old ThinkPad. I also don’t feel the need to upgrade my phone just yet, but that too could change. I’d be more inclined to upgrade my phone than to buy a new laptop or even pick up a reasonably priced tablet. My phone is always with me, which at times can be annoying, but it has become my main computer. I still happen to fall back to working with a laptop for some odd reason. Maybe it’s because I can see the screen a lot easier, and even though I can use a folding keyboard with the phone, the laptop remains what I’m more comfortable with. I try to get the most out of everything I purchase. Sometimes it can be a challenge, but such a challenge can also be fun. The next few months could be very interesting for seeing where the economy is heading. The market’s going up, the market’s going down – who knows?
It used to be that a new operating system from Microsoft came with the perception that one had to upgrade their CPU, RAM, storage, or just replace their computer completely.
I don’t think that’s a thing anymore. In fact, the new Windows 10 has probably extended the life of many systems. That was something I relied on Linux for – to squeeze more useful life out of my laptop.
Throw in a Chromebook for most users, and you probably save even more money. There is, however, usually a new MacBook Pro lurking just around the corner for some, and that would be a nice machine to work with, but I think I’ll muddle through for a while with what I already have. It works, and that’s usually an important consideration.

Summer’s End

As the end of summer approaches in late August (not officially the end of summer, but for practical purposes, in the north it’s over at the end of August), I start to think about winter. This time of year you realize that you may only have a few opportunities to get ready for the cold weather, shorter daylight hours, and snow. This is especially true if you have a lot of work to do outside before the weather turns bad. While I’m writing this, a tornado warning has come across every phone in the area. It’s raining again, so we are already losing time for any outdoor work I should be doing in preparation for a long winter. Lately the winters haven’t been as brutal as they normally are, but what is normal anymore?
Aside from the regular day-to-day preparations for winter, there are also a few things I can do to prepare for any downtime should the weather limit travel. Along with making sure my computers are all running their best – which is a continuous challenge – I like to have a few good books on hand to read.
This usually consists of a few up-to-date Linux books – of which there seem to be fewer released each year. Nevertheless, I have a few that are still relevant, and I use them for reference on occasion if I come across noteworthy challenges. Of course there is always the Internet, which everyone seems to rely upon for information, but I still prefer using books for any research I may wish to do.
If the Internet connection is down, a book will still work – even by candlelight.
I don’t necessarily believe everything I see on the Internet, or on television for that matter.
I usually have a few laptops loaded with various Linux distributions to tinker with. I’ll usually have one laptop to run the latest Windows release on, but it’s not something I would use for much more than running Windows versions of applications similar to the ones I use on Linux – mostly for comparison.
I have a few tools for any minor repairs I might need to attempt in a pinch, and a nice little workshop to work in, or just read in.
In the winter I can still work here in the shop provided it’s not too cold outside. A thermos of hot coffee makes everything seem perfect. If it gets too bitterly cold, I go back to the house and work by the fireplace. That is also not so bad. At least it won’t be as humid as the summer has been. The last few years have been cold and wet most of the year. The nice warm summers we all look forward to in the winter tend to go from a cold, damp spring to a hot, humid summer with very few super nice days. There are a few each year, and we try to make the most of those days, but it’s a given that we will see some colder weather soon enough. I suppose it’s best to get ready for another cold autumn and winter.

I’m sure I still have a few trips to the local hardware store ahead for all sorts of miscellaneous stuff, and I’m certainly planning on stocking up on chicken soup as usual. If the weather is bad, I like having the option of staying in.

The old adage of not going out in a storm if you don’t have to is good advice. New tires for my truck are also not a bad idea. Who thinks about picking up wool socks in late August when it’s hot out? Well, I do, and while I’m at it, it might be time to get some new hoodies.

If retail stores can start advertising for Christmas, then I can start thinking about getting ready for snow.

Minding the Machine

Most computer systems aren’t much good if they are not provisioned, maintained, or applied effectively to perform a task or set of objectives.
For the most part, this can all get very complicated or stay simple depending on how everything is coordinated from data in to data out. A lot of human intervention still prevails as the norm, in such a way that human error often becomes the weak link in the chain. The opportunity for human error must be minimized in many different operational applications. Simply put – it’s not unreasonable to think a human could impact a system in such a way as to render it unstable or unreliable.
It is true that bad applications can be built on poorly structured code, but usually that code was written by an individual or individuals who could impart their ineptness onto the final instruction set. It is also true that data transport into and out of a system – over network connections built upon human-designed networks – can diminish a good computer system’s reliability when data loss from external and internal factors could be mitigated but often isn’t.
Data latency and packet loss can give the impression that a system is not working correctly. Conversely, system resource burdens that slow instruction completion can give the impression that a network constraint is to blame, or that the network is creating the apparent poor performance.
Aside from weather, security issues, and unavoidable power loss – which are equipment and/or component stability related – backup and redundancy mitigation, or the lack thereof, are other areas where human error comes into play.
Error checking, security, network health monitoring, and system implementation require thoughtful planning and execution from effective management systems which ultimately (for now) rely on individuals with proven skill sets.
One of these skill sets is leadership.
I disagree with the premise that effective leadership is a subset of time management alone. Leadership is a multifaceted mechanism that can either produce positive results or allow negative outcomes. “Team leadership” is another loosely used term that often borders on empty promises and hollow abstract references such as the “high-level view” – a totally useless approach to understanding the nuts and bolts of a complex operation or system. Great for a “macro” approach, useless for detailed understanding of the critical inner workings of anything much more complicated than sharpening a number 2 pencil. It’s a popular buzzword phrase, but I usually interpret such noise as either a lack of interest or a lack of understanding about whatever subject is being bantered about in the context of any immediate topic in need of serious review. Yes, BS only wastes time and resources.
Great computer systems need great support teams, who require great leaders, who require vision, courage, and the ability to support such values in their team members and stakeholders.
Communication among all team members is critical to distilling resolutions that might not develop in an information vacuum. It is true that there will always be rare individuals who can do it all, but why not combine efforts and always look for alternative views, which often lead to improvement?
I myself prefer to work alone when practical, but that isn’t always practical or productive. I know that I can fix some things, but not all things.

At some point it may become common for computers to design and build other systems without the plausible limitation of human intervention. There would be no need for support groups or managed leadership based on rank or assigned importance. All decisions would be reduced to algorithms and soft-coded deductive parameters. In the meantime we still need bright, hardworking people to oil the levers, adjust the springs, and care for the error-free operation of the many computer systems currently deployed into service for a wide variety of operations. Computers are a wonderful resource and a powerful tool, but so is the human mind – and the human heart.
Together the future can be an amazing journey.

Turing In-Complete (part 2)

From the ABC (the first digital electronic computer), named after its creators John V. Atanasoff and Clifford Berry, to the fictional Multivac written about by the great science fiction author Isaac Asimov, there is a lot of development that would need to take place. We started the first computer revolution with vacuum tubes and switches using the concepts of binary arithmetic and logic, progressed to integrated circuits and chips, and are now pushing modern silicon limitations beyond today’s roughly 5 GHz peak. Here we approach the possible saturation of Moore’s law (the observation that the number of transistors in a dense integrated circuit doubles roughly every two years, often quoted as every 18 months).
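To put that doubling rate in rough numbers, here’s a small back-of-the-envelope sketch; the one-million-transistor starting point and the 15-year horizon are made-up figures purely for illustration:

```python
# Back-of-the-envelope Moore's law projection: transistor count doubling
# every 18 months. The starting figure is illustrative, not a real part.
start_transistors = 1_000_000        # hypothetical 1M-transistor chip
months = 15 * 12                     # project 15 years out
doublings = months / 18
projected = start_transistors * 2 ** doublings
print(f"{doublings:.1f} doublings -> ~{projected:,.0f} transistors")
```

Ten doublings in 15 years turns one million transistors into roughly a billion, which is why even a modest slowdown in that cadence is such a big deal.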

If we reach the limits of modern manufacturing techniques, we would theoretically slow the rapid advancement of computer technology and probably never build the ultimate computer we are destined to create. Today’s supercomputers, built with many processors running custom variations of Linux, are incredibly powerful compared to today’s consumer and enterprise systems, but they will be much more powerful in years to come because we are never satisfied. We look back at early vacuum tube technology and re-apply it with modern engineering techniques to push processing speeds toward 1,000 times that of today’s silicon-based designs. Development moves from GHz-measured processing speeds toward terahertz speeds thanks to air channel transistors and graphene nanoribbon technology, which can use magnetic fields to control current flow, and on to light-induced super circuits, which potentially bring quantum computing into play.

If you’ve been out looking for a new laptop recently, you might suspect that Moore’s Law has already hit a wall. Dennard scaling was based on the observation that as transistors get smaller, their power density stays constant. That didn’t really pan out, so in the last decade we’ve seen the move to increase the performance of modern computers with higher processing frequencies and multiple cores. You would think that more cores would equal faster, more efficient computers, but that doesn’t always work out, due to power dissipation (which causes heat) and workloads that don’t actually get divided among all cores. This is why you don’t see the 18-month doubling of computer power in the selection of available consumer products like you may remember just a decade ago.
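I didn’t name it above, but the classic way to show why extra cores stop paying off is Amdahl’s law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the work that can actually run in parallel and n is the core count. A quick sketch, with 90% chosen purely as an illustrative parallel fraction rather than any measured workload:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the workload and n the number of cores. The 0.9 value is an
# assumption for illustration only.
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16, 64):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
# Even with 64 cores, a 90%-parallel job tops out below a 10x speedup.
```

The serial 10% of the job never shrinks, so piling on cores gives rapidly diminishing returns – which is exactly what you see in the laptop aisle.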

The idea that computers would continue to get smaller and faster at a constant rate indefinitely may need some revising. Some components will benefit from advancements in nanotechnology and allow smaller intelligent devices to perform many functions that much larger systems once did. This will also allow larger assemblies of these tiny computer components to work together in forms that may resemble the first and second generations of digital computers. In effect, computing systems have already gotten much larger, an example being today’s server farms used for cloud computing. The combined computing power of multiple servers still has limitations. A glimpse of this will be seen as IoT (Internet of Things) devices and the 5G technology already impacting modern networking designs become more commonplace. This won’t necessarily result in smarter computers, but it could result in better services. It will also result in more noise and more power consumption.

All this amazing technology comes thanks to the “triode”, an improvement on the early diode valve vacuum tube designs. We can thank Lee De Forest for inventing the “audion” in 1906. This was the vacuum tube design that enabled development of the early digital computers. The transistor wasn’t invented until 1947, and even then the vacuum tube was still widely used well into the late twentieth century. Now vacuum transistors may usher in the next generation of computers. The early designs of the past make it possible to build the improved components that could accelerate modern computer development. In some ways, what’s old is what may be redesigned for the near future.

Will advancements in speed, memory, and processing power eventually bring about the one great computer that will be tasked with solving any and all great mysteries? Nope – they will probably encourage or stimulate continued competition among many different computer designs, if other factors don’t create any undesired impediments. Creativity and determination will be paramount to moving forward, but we will also learn from what has already been done if we take the time to look back.
