Understanding virtualization for industrial automation
Virtualization brings benefits in scalability, portability, security, agility and speed. That is why understanding what virtualization is, and how it works, is a key factor in the development and deployment of industrial applications.
Dedicated vs virtual
Building industrial solutions out of devices based on dedicated hardware makes them more complex and more expensive. When a solution is engineered with dedicated equipment, companies are forced over time to invest in more and more devices, and then to spend additional effort finding ways to interconnect and integrate them. The relationship becomes strictly linear: every time you want to add another function, you have to add another piece of hardware.
Faced with a scenario like this, virtualization becomes very relevant. With this technology, equipment is defined in software and runs on common hardware platforms, so when you want to add a feature, you simply add another software application, extending the value and flexibility of the hardware you already own.
This approach fits today's digital transformation environment well: it not only reduces the amount of hardware that has to be purchased, it also increases agility and adaptability, making it easier to respond to changing business requirements. And virtualization allows this without large rip-and-replace cycles, reducing vendor lock-in.
From data centers to industry
Virtualization is not new; it has been used on servers and in data centers for years. In that world, virtual machines work by adding a layer of software called a hypervisor on top of a piece of computer hardware. The hypervisor allows that single piece of hardware to run multiple virtual machines.
The virtual machines sit on top of that hypervisor, and each one bundles an operating system, the application itself and any dependencies, libraries or configuration needed to run that application. These virtual machines are easily replicated across different hardware platforms and are easy to scale up or down.
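As a minimal illustration, the sketch below (assuming a KVM/QEMU host with the libvirt Python bindings installed, which the text itself does not specify) lists the virtual machines running on a single piece of hardware:

```python
import libvirt

# Connect to the local hypervisor (KVM/QEMU via libvirt in this sketch).
conn = libvirt.open("qemu:///system")

# One piece of hardware, many virtual machines: each domain listed here
# carries its own OS, application and dependencies.
for domain in conn.listAllDomains():
    state = "running" if domain.isActive() else "stopped"
    print(f"{domain.name()}: {state}")

conn.close()
```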
Translating this concept to industrial applications, the virtual machine and its hypervisor would not sit in a climate-controlled data center, but at the edge of a network or embedded within packaging equipment, where they could be subjected to higher temperatures and lower airflow. This is where the concept of containers (Docker) comes into play.
Containers are a lighter-weight form of virtualization because they share the host operating system rather than replicating an operating system inside each container. Only one operating system is needed on the system, and each container hosts just the application and its dependencies, libraries and settings.
If the operating system has libraries that are shared between containers, they do not need to be replicated in each container. The containers themselves are isolated from each other and from the outside world; where interconnection is needed, it is created through virtual networks between containers or out to the outside world.
Because each container essentially has its own virtual network and, by default, no access to external sockets or to other containers, containers can dramatically reduce the number of attack vectors on the network, provided this configuration is managed properly.
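As a rough sketch of that isolation, using the Docker SDK for Python with placeholder network and image names, an internal network can be created so containers reach each other but have no route to the outside world:

```python
import docker

client = docker.from_env()

# An "internal" bridge network: containers attached to it can talk to
# each other, but there is no route out to the wider network.
client.networks.create("cell-net", driver="bridge", internal=True)

# Two application containers (placeholder images) attached only to that
# isolated network, so their attack surface is limited to each other.
client.containers.run("example/protocol-gateway:1.0", detach=True,
                      name="gateway", network="cell-net")
client.containers.run("example/historian:1.0", detach=True,
                      name="historian", network="cell-net")
```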
Additional aspects
Beyond the security advantages of containers, another aspect is worth highlighting: resilience. In a monolithic application that bundles the user interface and databases, if one part of that compiled application crashes, everything goes down and must be restarted. In a container architecture, each of these functions is isolated from the others, so a crashed application does not take down the entire machine. You only have to restart that particular container, and this can be configured to happen automatically.
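A minimal sketch of that automatic-restart behaviour, again with the Docker SDK for Python and a placeholder image name:

```python
import docker

client = docker.from_env()

# If the process inside this container crashes, Docker restarts only this
# container (up to 5 attempts); everything else on the host keeps running.
client.containers.run(
    "example/hmi-app:2.3",
    detach=True,
    name="hmi",
    restart_policy={"Name": "on-failure", "MaximumRetryCount": 5},
)
```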
An additional advantage is portability. Since containers package both the application and its dependencies, they can be moved across hardware platforms regardless of the type of processor underneath. This is vitally important in the industrial field, where equipment is normally designed for very long life cycles. If you need to replace hardware in five years, for example, it is quite possible that the same processor, memory and drive configuration can no longer be purchased. The application still needs to run smoothly on the new hardware, without local dependencies, and containers provide exactly that.
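In practice, moving across processor architectures usually relies on a multi-architecture image: the same tag then resolves to the variant that matches whatever CPU the replacement hardware uses. A minimal sketch, with the public python image used only as a stand-in:

```python
import docker

client = docker.from_env()

# Pulling a multi-architecture image: the registry serves the variant
# matching this host's CPU (amd64, arm64, ...), so the same deployment
# script works unchanged on replacement hardware.
image = client.images.pull("python", tag="3.11-slim")
print("architecture:", image.attrs["Architecture"])
```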
Another characteristic of virtualization is scalability. Since new containers can be deployed in a matter of seconds, it is very easy to roll out a single solution across thousands of pieces of hardware in thousands of locations. Containers also eliminate the problem of an application working on one machine but not on another, along with the debugging cycles spent correcting those problems.
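A rollout across a fleet of edge devices could look something like the sketch below; the host names, image name and the use of a plain TCP endpoint to reach each device's Docker engine are illustrative assumptions (a production setup would secure that endpoint):

```python
import docker

# Hypothetical edge devices exposing the Docker Engine API.
EDGE_HOSTS = ["tcp://packaging-line-1:2375", "tcp://packaging-line-2:2375"]

for url in EDGE_HOSTS:
    client = docker.DockerClient(base_url=url)
    # The same image and the same run parameters on every device, so the
    # application behaves identically everywhere it is deployed.
    client.images.pull("example/historian", tag="1.0")
    client.containers.run("example/historian:1.0", detach=True, name="historian")
    client.close()
```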
Speed and agility are another set of benefits that containers offer. If you need to update one of the containerized applications, you can simply stop that container and update it without taking down the underlying hardware or the rest of the applications. You may also want applications to always update to the most recent version, or, on the contrary, to stay pinned to a specific version and never change. Either way, this can be achieved with the appropriate settings.
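Updating a single containerized application in place, without touching the rest of the host, might look like this minimal sketch (Docker SDK for Python, placeholder image and container names):

```python
import docker

client = docker.from_env()

# Pull the new version of just this one application image.
client.images.pull("example/hmi-app", tag="2.4")

# Stop and remove only the old container; other containers and the host
# hardware are untouched.
old = client.containers.get("hmi")
old.stop()
old.remove()

# Start the application again from the updated image.
client.containers.run("example/hmi-app:2.4", detach=True, name="hmi",
                      restart_policy={"Name": "on-failure"})
```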