BitDepth 660 - December 30

Microsoft introduces server-level virtualisation tools.
Virtualisation from Microsoft

Microsoft's Jonathan Jagai explains the company's new virtualisation product for servers. Photo by Mark Lyndersay.

Virtualisation used to be an act of desperate software voodoo, a way of running software on computers too new or different to support it.
It's essentially a software trick that creates a discrete file on the computer (an image, in techspeak) which behaves like a separate machine, compatible with the code you want to run.
It's been a big thing for Mac users who need to run Windows on their computers for almost two decades now, and gamers have long used emulators like MAME to resurrect long-lost titles like Defender and Centipede.
An old Mac product, Virtual PC, was bought by Microsoft in 2003 and remains in the product lineup as a way for PC users to create virtual environments for older operating systems.

Microsoft recently introduced its new product for the server backroom, Windows Server 2008 Hyper-V, which replaces its older product, Virtual Server 2005. Virtual PC, defined by Microsoft as a personal solution, remains available.
Jonathan Jagai, Microsoft Technology Specialist for the West Indies, introduced the product to a room of IT professionals two weeks ago, positioning the new server-based virtualisation product as a way to consolidate server iron and make more use of today's faster computers.

Optimising server hardware
"Virtualisation," he said, "will make more use of what you already have."
The details of how this gets done are seriously geeky but sensible. The emphasis for the product is less on emulation than on making more use of today's fast, multiprocessor computer systems.
Jagai's presentation offered halcyon situations in which multiple older computers are turned into disk images and run in virtual mode on newer, faster hardware.

This allows IT administrators to reduce the number of machines in the server backroom and to build distributed computing systems that press fewer physical computers into the task of running the corporate IT backbone.
Virtual server images can be set up to emulate a wide range of processors and can mirror their contents, using geoclustering, between multiple computers in different physical locations.
It's possible, then, to reduce the absolute count of computers in a well-equipped server room, along with its energy and cooling costs, and to replicate critical services at another location.
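For the technically curious, here's a back-of-envelope sketch of that consolidation arithmetic as a short Python script. The machine counts and memory allocations are illustrative assumptions for this column, not figures from Microsoft or from Jagai's presentation.

# Hypothetical consolidation estimate; all figures below are assumptions
# made for illustration, not Microsoft specifications.
HOST_RAM_GB = 16        # a modern host with 16 GB of memory
PARENT_RESERVE_GB = 2   # memory assumed to be held back for the host itself
RAM_PER_VM_GB = 2       # assumed allocation for each converted legacy server

legacy_servers = 20     # physical machines to be turned into disk images

# How many virtual machines fit on one host, and how many hosts are needed.
vms_per_host = (HOST_RAM_GB - PARENT_RESERVE_GB) // RAM_PER_VM_GB
hosts_needed = -(-legacy_servers // vms_per_host)  # ceiling division

print(f"Each host can carry about {vms_per_host} virtual machines.")
print(f"{legacy_servers} legacy servers consolidate onto {hosts_needed} hosts.")

Under those assumptions, twenty ageing servers collapse onto three physical machines, which is the kind of reduction in hardware, power and cooling the pitch is built on.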

Virtualisation 360
Administration of these virtual machines follows IT best practices, and Microsoft has made a crucial nod to its primary competitor in this business space - Hyper-V servers can convert existing VMware virtual machine images.
Hyper-V-enabled servers will run best on modern server iron, and Jagai suggests equipping such machines with 8-16 GB of memory.
Servers running multiple virtual machines will handle their own load balancing and memory optimisation automatically.

Even if Microsoft's solution offers a more secure microkernel design, why would customers switch from an installation with, say, VMware virtualisation?
"Virtualisation 360," said Jagai, "we can virtualise the entire experience."
The level of control that server administrators can exercise over multiple virtual machines in a server installation is truly intimidating and will probably result in long meetings about procedures and access limits in any IT department that adopts Hyper-V virtualisation.

Related:
Notes on Hyper-V