By: Michael Burke, Account Director

For those not in the IT profession, “virtualization” tends to be a confusing topic, and possibly for good reason. It deals with the abstraction of computer hardware, and “abstraction” by definition is, well…abstract. Also, it’s something that for the most part you don’t see because you’re not supposed to see it. It’s supposed to make your life easier by shielding you from underlying complexities in IT architecture.

If you’re wondering whether virtualization has anything to do with virtual reality, it actually does, at least conceptually. With virtual reality, you’re immersing someone in a reality that looks and feels real, but actually isn’t. Virtualization, likewise, according to Webopedia, involves creating “a virtual version of a device or resource, such as a server, storage device, network or even an operating system where the framework divides the resource into one or more execution environments.” In simpler terms, it involves creating something that looks and feels like an independent computer, but in actuality is just a portion of what is running on some other computer, somewhere else. In some cases, it’s not just “some other computer” but a configuration of servers and other resources, and your “virtual machine” is pulling resources from a variety of areas to function. And it isn’t only that a virtual machine appears to be a real machine to you–it appears as a single resource to other devices, applications and human beings as well.


The different kinds of virtualization

All of this has positive implications for security, disaster recovery and other aspects of IT resource management, such as storage. Which brings us to the main point: IT departments employ virtualization as a way of simplifying operations and infrastructure, and of saving time and money. As a result, the concept of virtualization is put to work in all kinds of ways:

    • Storage and server virtualization: coordinating the use of multiple storage devices or servers so that they function as a single resource for powering applications or storing data (a simplified sketch of this idea follows this list)
    • Operating system virtualization: running multiple operating systems on the same hardware (or grouping of hardware) in such a way that each behaves as if it were a totally separate system running on its own physical device
    • Network virtualization: the resources of a single network are partitioned or segmented so that they operate as separate networks (even though it’s all really just one big, happy network)
    • Application virtualization: the application appears to run like a normal application on a single computing resource, but in actuality is “distributed” across a bunch of different resources  
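
To make the server-and-storage flavor a bit more concrete, here’s a deliberately tiny Python sketch of the pooling idea. None of these class or method names come from a real product; they’re invented purely for illustration. The point is simply that several physical servers get presented as one pooled resource, and “virtual servers” are carved out of that pool without the consumer ever knowing (or caring) which physical box they landed on.

    # Toy illustration only: these class and method names are invented for
    # this sketch and don't belong to any real virtualization product.

    class PhysicalServer:
        def __init__(self, name, cpu_cores, ram_gb):
            self.name = name
            self.cpu_cores = cpu_cores   # unallocated CPU cores
            self.ram_gb = ram_gb         # unallocated RAM, in GB

    class ServerPool:
        """Presents many physical servers as one logical resource."""

        def __init__(self, servers):
            self.servers = servers

        def total_capacity(self):
            # The consumer only ever sees these pooled numbers.
            return (sum(s.cpu_cores for s in self.servers),
                    sum(s.ram_gb for s in self.servers))

        def provision_vm(self, vm_name, cpu_cores, ram_gb):
            # Place the VM on any physical server with room; the caller
            # never learns (or needs to know) which one was used.
            for s in self.servers:
                if s.cpu_cores >= cpu_cores and s.ram_gb >= ram_gb:
                    s.cpu_cores -= cpu_cores
                    s.ram_gb -= ram_gb
                    return f"'{vm_name}' is ready: {cpu_cores} vCPUs, {ram_gb} GB"
            raise RuntimeError("pool exhausted")

    pool = ServerPool([PhysicalServer("rack1-a", 32, 128),
                       PhysicalServer("rack1-b", 32, 128)])
    print("Pooled capacity (cores, GB):", pool.total_capacity())
    print(pool.provision_vm("web-01", 4, 16))
    print(pool.provision_vm("db-01", 8, 64))

Real server and storage virtualization layers are enormously more sophisticated (think live migration, redundancy, thin provisioning), but the consumer-facing idea is the same: one pooled resource, many virtual slices.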

In fact, there are so many varied use cases for virtualization that it starts to get a little difficult to see what they have in common. The thread that runs through all of these and other examples is that virtualization consolidates computing resources so that you can do more with less. To understand how this is beneficial, think about your PC or laptop. It runs applications that are installed on your actual computer, but it also runs a ton of stuff that isn’t on your computer, like Google services, social media sites, etc. In reality, your computer could never run all of these resources itself; you’d run out of computing power before you could say Yahoo! But in “virtual” reality, it is able to coordinate actions taking place in hundreds of different computing resources simultaneously. While this isn’t normally considered “virtualization,” the IT concept isn’t that much different in theory.


How virtualization is done

A full explanation is far beyond the scope of any single article (or even a single really thick book), but in brief, virtualization uses software to create a “virtual layer” of simplification over underlying computer architecture that is actually very complex. In the case of server virtualization, this is done by means of a hypervisor: software that sits between the physical hardware and the virtual machines and hands each one its own slice of processing power, memory and storage.
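
As a slightly more hands-on (and hedged) example: on a Linux host running the KVM/QEMU hypervisor with the libvirt management library and its Python bindings installed, which is just one possible setup among many, a few lines of Python can ask the hypervisor what physical hardware it controls and which virtual machines it is currently carving that hardware into. This is a sketch, not a how-to; details vary by environment.

    # Assumes a Linux host with KVM/QEMU, libvirt, and the libvirt Python
    # bindings installed, plus permission to open the local hypervisor URI.
    import libvirt

    conn = libvirt.open("qemu:///system")          # connect to the local hypervisor
    try:
        model, mem_mb, cpus, *_ = conn.getInfo()   # the physical host's hardware
        print(f"Physical host: {cpus} CPUs, {mem_mb} MB RAM ({model})")

        for dom in conn.listAllDomains():          # every VM this hypervisor manages
            state, max_mem_kb, _, vcpus, _ = dom.info()
            print(f"  VM '{dom.name()}': {vcpus} vCPUs, "
                  f"{max_mem_kb // 1024} MB RAM allocated")
    finally:
        conn.close()

Each guest operating system in that list believes the vCPUs and memory it sees are real, dedicated hardware; the hypervisor is the layer maintaining that illusion.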

There is a downside, as SearchServerVirtualization points out: performance can degrade somewhat compared with running directly on hardware, because a virtualized workload doesn’t necessarily have access to the full resources of a dedicated machine. But, as they also point out, most operating systems and applications don’t actually need those full resources, so the concept tends to work as intended.
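
That last point is worth making concrete with some back-of-the-envelope arithmetic (the numbers below are made up purely for illustration). A host with 32 physical cores can comfortably carry virtual machines whose allocated vCPUs add up to far more than 32, because each VM is only busy a fraction of the time:

    # Made-up numbers, purely to illustrate why "overcommitting" usually works.
    physical_cores = 32

    # Each VM: (vCPUs allocated, average fraction of those vCPUs actually busy)
    vms = [(4, 0.15)] * 10 + [(8, 0.25)] * 4   # 10 small VMs plus 4 larger ones

    allocated = sum(vcpus for vcpus, _ in vms)
    avg_demand = sum(vcpus * busy for vcpus, busy in vms)

    print(f"vCPUs promised to VMs:   {allocated}")       # 72 vCPUs allocated
    print(f"Physical cores on host:  {physical_cores}")  # only 32 really exist
    print(f"Average cores in use:    {avg_demand:.1f}")  # about 14 actually busy

The trade-off shows up only when many VMs get busy at the same moment and demand outstrips the physical cores; that is the degradation-versus-dedicated-hardware caveat described above.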


Who is doing it?

Pretty much everyone. VMware is, of course, the kingpin of all kingpins in the virtualization market, and its solutions are used to some degree by just about every sizable organization in the free world.


Why virtualization matters

Aside from being a big time, money and resource saver for companies, and having certain security and disaster recovery benefits, virtualization is a foundational technology that paves the way for all kinds of other important things in business and government today. Perhaps the most notable of these is cloud computing. While virtualization and the cloud are talked about as separate approaches, in reality cloud computing is fundamentally built on virtualization, and could never be achieved without it. In fact, so closely are the two concepts coupled that it’s tough to tell where one ends and the other begins (which, I suppose, shouldn’t be too surprising, since it’s all virtual!). It doesn’t end there: “big data,” or the ability to collect, store and analyze mountains of data, would not be possible without heavy reliance on virtualization. Automation, IoT, the list goes on…virtualization, it turns out, is a major pillar of modern computing.