From the July 2010 Issue
Virtualizing your business can save time and money, help your applications run faster, make remote access easier, and be much safer in a business continuity/disaster recovery situation. What is not to like?
Well, that would be the difficulty of finding a competent installer who understands your applications and how to make them work. We continue to be appalled at the number of marginally competent installers in the United States who portray themselves as experts, claim their way is the best way, and then take your money and leave you with a poorly implemented system.
I’m going to try to make virtualization as simple to understand as I can in this article, at the risk of not being 100 percent technically accurate. If I were strictly accurate, the concept, and its value from a management point of view, would be buried in computer jargon. This is a summary of the state of the art of virtualization, and how you can capitalize on this technology to serve your clients better.
First, there are three main competitive product families for Windows application servers and workstations: VMware ESX and Workstation, Citrix XenServer and XenDesktop, and Microsoft Hyper-V and App-V. These products all allow one or more instances of an operating system to run on a single piece of hardware. For small businesses, this means you can use one or two hardware servers to replace the functionality of five to eight servers, without installing everything on a single server, which is still a big no-no.
For example, you can have Exchange, SQL, Terminal Servers or Citrix, File and Print, Web servers, QuickBooks 2010, QuickBooks 2009, QuickBooks 2008 and specialty applications each on its own instance of a server. Isolating them this way protects every other application from its neighbors, instead of mixing them all together on one or two servers or buying the six to nine physical machines this example would otherwise require.
Next, you can use virtualization for your desktop applications, as well. This is not Terminal Services or Citrix, but a true replacement of the native install of Windows and your other applications on a laptop or a desktop. This technology is just beginning to work acceptably, though not quite well enough for me to recommend it yet. Where it does work, it is phenomenal, driving IT costs down 30 to 75 percent and increasing performance 25 to 33 percent. Updates to workstation software are far simpler, since one master copy is updated and used by all.
The downsides of this approach include the need to use open licensing, the need for a hardware Storage Area Network (SAN) for reliability, the need for software to control user settings (a product like AppSense, for example), and the biggest objection of all: getting laptop users to remember to disconnect properly. Most of these issues have become smaller, easier and less expensive to address over the last year. Both server and desktop virtualization package all the complexities of installation into a single file or very few files, and eliminate many of the hardware dependencies.
Conceptually, users don’t care what plumbing is required to run their applications and do their work. They care that the products work consistently and correctly. I agree. Applications have been written to run on mainframes, then minicomputers, then microcomputers, and most recently on local area networks and database servers. The complexity of implementation increased; meanwhile, communication costs fell and speeds rose, particularly over the Internet, until it became cost effective to centralize applications again.
To centralize these applications, several strategies can be used. Consider the following: 1) Internet browser applications delivered as Software as a Service (SaaS), 2) hosted applications, which we originally called Application Service Providers (ASP) and some now call SoSaaS (Same Old Software delivered as a Service), and 3) virtualized servers, desktops or applications hosted in secure data centers or in your own business.