Fifth Generation Computing: Virtualization And How To Manage It

Virtualization is the key to opening up a world of possibilities, and the technology has ushered in an explosive transformation of the IT industry.

Govindaraj Rangan

Imagine we are in a time machine and decide to go back more than 50 years to the year 1964. While there would be many differences, some things aren’t as different as we might initially think. From 1964 to 1971, computer users interacted with computers through keyboards and monitors. They also interfaced with an operating system that allowed the machine to run many different applications at one time, with a central program managing memory. Sound familiar? Now, let’s fast-forward back to the present day.

The first generation of computers used vacuum tubes (1940-1956); the second generation used transistors (1956-1963); the third generation used integrated circuits (1964-1971); the fourth generation used microprocessors (1971-present); and we have officially entered the fifth generation of computing, with artificial intelligence (AI) at the core. In fact, an accelerated trend of virtualization kick-started the fifth generation of computing, enabling web-scale workloads such as social, big data and the emerging areas of machine learning and AI.

Virtualization enables software-defined and programmable IT. This programmability, extended through RESTful APIs, makes it easy for various systems to integrate and communicate with each other, which enables fluidity, flexibility and agility. It also facilitates intelligent scaling, resource reclamation, iterative development models and distributed computing, all of which are necessary to meet on-demand requirements at affordable cost.
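To make the idea of programmable, API-driven infrastructure concrete, here is a minimal sketch of how a system might provision a virtual machine through a RESTful API. The endpoint URL, payload fields and resource names are hypothetical assumptions for illustration, not any particular vendor's API.

```python
import json
import urllib.request

# Hypothetical base URL for a software-defined infrastructure API.
# The endpoint path and payload schema below are illustrative only.
API_BASE = "https://cloud.example.com/api/v1"

def build_create_vm_request(name, cpus, memory_gb, image):
    """Build the HTTP POST request that would provision a VM via a REST API."""
    payload = {"name": name, "cpus": cpus, "memory_gb": memory_gb, "image": image}
    return urllib.request.Request(
        url=f"{API_BASE}/instances",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (but do not send) a provisioning request.
req = build_create_vm_request("web-01", cpus=2, memory_gb=4, image="ubuntu-lts")
print(req.get_method(), req.full_url)
```

Because the whole interaction is just structured data over HTTP, the same call can be issued by a script, a CI/CD pipeline or another service, which is what makes the integration and automation described above possible.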

Although the concept of virtualization has roots as far back as the third generation of computing, its relevance has only increased with time, making it more important today than it was 50 years ago. Not only was virtualization a crucial inflection point that ushered in the social, mobile, analytics and cloud (SMAC) trend, but it is also fueling the digital and as-a-service economy. However, as is true of all technological advances, there can be some drawbacks. Below are some things all companies considering virtualization should keep in mind.

Advantages and Disadvantages

Today, less than 10 percent of IT executives consider their companies’ data centers and operations to be fully optimized for efficiency and cost-effectiveness. One might expect any measure that increases efficiency and cost-effectiveness to go unchallenged, yet persuading decision makers to implement virtualization can still take some effort, so it’s important to be well-versed in its benefits. These include:

  • Fewer physical servers
  • Reduced maintenance and energy costs, less downtime and fewer unplanned outages
  • More efficient IT structure and reduced operating costs
  • Easier cloud migration and instant offsite access to files and applications

The benefits of virtualization do come at the cost of new challenges, but those challenges can be managed if the appropriate measures are taken. Below are five challenges and tips on how to manage them:

  • VM sprawl: Because it is so easy to create virtual resources, capacity is often wasted on resources that are created but never used. Companies can manage this through continuous resource reclamation: monitoring what’s being used and ensuring that instances are shut down when not in use.
  • Noisy neighbor: In cloud environments, you never know which other workloads are sharing resources with yours, making performance unpredictable unless the provider commits resources. Companies can manage this through performance commitments. For example, applications needing dedicated resources should leverage the provider’s pre-committed offerings.
  • Configuration drift: Virtual instances are created from a pre-built image, but over the course of usage, many configuration changes are never applied back to the source image. If the image isn’t updated on a regular basis, returning to a working configuration after a failure can require a lot of manual work. Companies can manage this through application modeling and blueprinting. Specifically, any application moving into a software-defined virtual environment should have an associated model, and that model should be kept up to date to keep pace with configuration drift. This also helps maintain a service-aware CMDB (configuration management database).
  • Licensing: Managing licensing in virtual environments can be a nightmare if not controlled. Instances need to acquire licenses in an automated fashion and release them when they shut down, and perpetual licensing models prove expensive in virtual environments. It’s therefore important for companies to understand vendors’ licensing requirements to prevent an uptick in costs, and to favor innovative as-a-service, pay-per-use licensing of operating systems, applications and middleware instead of perpetual models.
  • Security and control: Availability, confidentiality and integrity take on different dimensions in virtual environments. Companies can incorporate a built-in audit system that leverages tools to automate security checks wherever possible.
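The continuous-reclamation tip for VM sprawl above can be sketched in a few lines: periodically scan the instance inventory and flag running instances that have sat idle past a threshold. The record fields and the 14-day cutoff are assumptions for illustration, not any specific platform's schema or policy.

```python
from datetime import datetime, timedelta

# Assumed policy: anything idle longer than this is a reclamation candidate.
IDLE_THRESHOLD = timedelta(days=14)

def find_idle_instances(inventory, now):
    """Return names of running instances that are idle past the threshold."""
    return [
        vm["name"]
        for vm in inventory
        if vm["state"] == "running" and now - vm["last_used"] > IDLE_THRESHOLD
    ]

# Illustrative inventory records; a real scan would pull these from the
# provider's API rather than a hard-coded list.
now = datetime(2018, 6, 1)
inventory = [
    {"name": "build-agent-7", "state": "running", "last_used": datetime(2018, 3, 2)},
    {"name": "web-01", "state": "running", "last_used": datetime(2018, 5, 30)},
    {"name": "old-test", "state": "stopped", "last_used": datetime(2018, 1, 1)},
]
print(find_idle_instances(inventory, now))  # only build-agent-7 is idle and running
```

Run on a schedule, a scan like this feeds the shutdown (or at least review) of wasted capacity that continuous reclamation calls for.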
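Similarly, the blueprinting tip for configuration drift boils down to comparing a live instance's configuration against its model and reporting the differences. The dictionaries below are a hypothetical, simplified stand-in for a real application model and CMDB record.

```python
# Minimal sketch of drift detection against an application blueprint.
# The keys and values are illustrative, not a real CMDB schema.
def detect_drift(blueprint, live):
    """Return settings where a live instance differs from its blueprint."""
    return {
        key: {"expected": expected, "actual": live.get(key)}
        for key, expected in blueprint.items()
        if live.get(key) != expected
    }

blueprint = {"runtime_version": "8", "heap_gb": 4, "tls_enabled": True}
live = {"runtime_version": "8", "heap_gb": 8, "tls_enabled": True}
print(detect_drift(blueprint, live))  # heap_gb has drifted from the model
```

A drift report like this can either trigger an update to the model (a legitimate change) or a remediation of the instance (unauthorized drift), which is exactly the decision the blueprinting practice is meant to force.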

The Future of Virtualization

Over half of all IT leaders are investing in new technologies to improve IT optimization in the next 12 months, with private cloud topping the list at 62 percent. In addition, by 2020, IDC predicts that 40 percent of data in the digital universe will be “touched” by the cloud, meaning either stored or processed in some way.

The cloud is one of the largest outcomes of the power of virtualization. For example, Platform as a Service (PaaS) is evolving to provide virtually unlimited computing capacity, elastic enough to change on demand, which keeps costs under control. Docker, an OS-level virtualization technology, is revolutionizing application architectures by providing new ways of packaging applications, and newer PaaS platforms are emerging to support the management, monitoring and scaling of workloads built on OS virtualization technologies. Meanwhile, bots and learning machines are becoming virtual humans, and automation has taken newer forms, with intelligent machines reconfiguring themselves based on continuous learning.

What’s Next?

Virtualization has opened up a world of possibilities and driven an explosive transformation of the IT industry. Ultimately, it’s important for enterprises to deploy the right controls to ensure that this change yields real value to the business, instead of turning into another form of technical debt.

Govindaraj Rangan is Practice Director and Head of Cloud Innovation at Wipro Limited.
