Virtualization and the Private Cloud

Cloud computing may be the Next Big Thing, but many companies are not yet ready to fully embrace the notion of putting all of their data and applications in the public cloud, along with the loss of control that goes with it. One might say the idea of a “private cloud” has been around at least since 1965, when Mick Jagger sang, “Hey! You! Get off of my cloud.” But when it comes to computing, the term has only recently been popularized.

Cloud computing itself was hinted at in the 1960s, too, when John McCarthy predicted that one day computing might be organized as a public utility. And indeed, today’s cloud computing concept is based on a utility-like model for computing resources, whereby economies of scale allow providers to operate IT services more efficiently and effectively than individual companies can on their own, and to offer those services at reasonable prices.

Evolution of the Utility Model
We are in the habit of thinking of utility providers as governmental, quasi-governmental or government-regulated entities, but the utility model evolved over time. Once upon a time, every house had its own water well; then families that lived close together formed water co-ops, and today most of us get our water from municipal water departments or water districts that serve entire cities or groups of cities. That frees individual households from the necessity of digging and maintaining a well. At the same time, though, it makes us less self-sufficient and more dependent on the entity that provides the service – and for that reason some households still, in this modern age, prefer to maintain their own water supplies. It might cost more and it might require more work, but it gives them more control over water quality and treatment, and it makes them less dependent on the public entity: if someone poisons the public water supply, theirs is still safe. In addition, they aren’t at the mercy of a public entity that can raise their rates at any time.

Likewise, today most organizations are very cognizant of the potential cost savings and administrative benefits of cloud computing but many of them don’t yet trust the public cloud. They want to know exactly where their data is residing, just as the well owner wants to know exactly where his water is coming from. They want control over what encryption protocols are used to protect it, as the well owner wants control over what chemicals, in what amounts, are put into his water. They don’t want to be vulnerable to an attack on the public network that could bring their business to a standstill, and they don’t want to be at the mercy of a cloud provider who locks them in and then raises their rates.

What they do want is the on-demand nature of the utility model, where applications and data are accessible when and where they’re needed and computers “just work” – where users aren’t continually frustrated by systems that won’t boot and IT personnel aren’t constantly troubleshooting various problems on hundreds or thousands of PCs.

The Private Cloud
How can companies reap some of the benefits of the cloud without giving up control? The answer is an emerging solution that attempts to offer the best of both worlds: the private cloud. In this networking architecture, the company doesn’t have to give up its on-premises datacenter, but the datacenter adopts the technologies and practices used in the public cloud. For some organizations, this may prove to be the best permanent solution. For others, as with the water co-ops that were directly owned and run by the members who got their water from them, the private cloud may serve as a temporary solution to ease the transition from the traditional company-owned and managed datacenter to the public cloud.

Some say the private cloud is just a new name for the corporate datacenter, but the difference lies in the underlying architecture and how the computing services are provided. First we have to understand that the term “cloud” has been redefined over time. At one time, “the cloud” referred to the telephone network; then it became a common analogy for the Internet itself – in fact, in most network diagrams, the Internet is represented by a picture of a cloud. However, the advent of “cloud computing” (which some say is nothing but a new name for Software as a Service or SaaS, which itself was a new name for the Application Service Provider or ASP model) has given the term a much more specific meaning. Today the cloud represents not just where computing resources are located, but how they are provided. Some elements of cloud computing include:

  • Dynamic resource allocation
  • Dynamic provisioning/reprovisioning
  • Metered services

All of these can take place over the Internet (public cloud), or within a local network (private cloud). When an enterprise implements a private cloud, the datacenter effectively becomes a service provider and other departments/divisions within the company become its “customers.” The private cloud differs from its public counterpart in that its customers are limited and the whole “cloud” infrastructure resides behind the corporate firewall – but it relies heavily on the same technologies used by public cloud providers.
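
To make those three elements a little more concrete, here is a minimal Python sketch of how a private cloud’s datacenter might track dynamically provisioned, metered capacity for its internal “customers.” Every class and method name here is a hypothetical illustration, not any vendor’s API:

```python
# Illustrative toy model of the three cloud elements listed above:
# dynamic allocation, provisioning/reprovisioning, and metered service.
# All names are hypothetical, not any real product's API.
from dataclasses import dataclass, field
import time

@dataclass
class Allocation:
    tenant: str            # internal "customer" (department/division)
    vcpus: int
    started: float = field(default_factory=time.time)

class PrivateCloudPool:
    def __init__(self, total_vcpus: int):
        self.total_vcpus = total_vcpus
        self.allocations: dict[str, Allocation] = {}

    def provision(self, tenant: str, vcpus: int) -> None:
        """Dynamically allocate capacity if the pool can satisfy it."""
        used = sum(a.vcpus for a in self.allocations.values())
        if used + vcpus > self.total_vcpus:
            raise RuntimeError("pool exhausted; add hosts or reclaim capacity")
        self.allocations[tenant] = Allocation(tenant, vcpus)

    def release(self, tenant: str) -> float:
        """Reprovision: free the capacity and return metered vCPU-hours."""
        alloc = self.allocations.pop(tenant)
        hours = (time.time() - alloc.started) / 3600
        return alloc.vcpus * hours   # basis for internal chargeback

pool = PrivateCloudPool(total_vcpus=64)
pool.provision("engineering", vcpus=8)
print(f"metered usage: {pool.release('engineering'):.4f} vCPU-hours")
```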

Where Virtualization Comes In
Virtualization is an important factor for some of the major players in the public cloud space, especially those who offer “infrastructure as a service.” For example, Amazon’s EC2 platform is built on Xen. In the private cloud, virtualization becomes even more of a key enabler. The private cloud can benefit from most types of virtualization:

  • Server OS virtualization: Running server operating systems in virtual machines allows companies to consolidate multiple servers onto fewer physical machines, saving on hardware, power and heating/cooling costs. It also makes for easier disaster recovery, as virtual machines can be moved to a different physical machine when hardware fails, with little downtime (see the sketch after this list).
  • Desktop virtualization: Virtual desktop infrastructure (VDI) products allow server-hosted desktop operating systems to be streamed to thin or thick clients, which gives more control to administrators and makes for easier maintenance and better accessibility, as the user can access his/her desktop from any computer.
  • Application virtualization: With application virtualization client software (e.g., Microsoft’s App-V client) installed on the client computer, applications can be streamed or cached locally and executed in a sandbox, for centralized management and ease of deployment as well as better scalability.
  • Presentation virtualization: Sometimes called “screen scraping,” this refers to products such as Microsoft Terminal Services/Remote Desktop Services, where the OS or application(s) run on a remote server and only the user interface is transmitted over the network to the thick or thin client machine.

The private cloud may combine any or all of these virtualization technologies to provide cost-effective, easily managed and updated desktops and applications to users while maximizing hardware utilization. Virtualization plays a key role in bringing the benefits of cloud computing “inside,” where the company has complete control.
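
As an illustration of the server-consolidation scenario in the first bullet, the sketch below uses the open-source libvirt Python bindings (not a 2X component) to list the guests on one host and live-migrate a guest to another. The connection URIs and guest name are placeholders for your environment, and a real live migration also requires shared or pre-copied storage:

```python
# Illustrative sketch, not a 2X feature: inspect VMs on one hypervisor
# and live-migrate a guest to a second host -- the kind of consolidation
# and failover move described in the first bullet above.
import libvirt

src = libvirt.open("qemu:///system")              # local hypervisor
dst = libvirt.open("qemu+ssh://host2/system")     # target hypervisor (placeholder)

# List running guests and their vCPU counts on the source host.
for dom in src.listAllDomains(libvirt.VIR_CONNECT_LIST_DOMAINS_ACTIVE):
    print(dom.name(), dom.maxVcpus())

# Live-migrate one guest so the source host can be drained or serviced.
guest = src.lookupByName("app-server-01")         # placeholder guest name
guest.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()
```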

The 2X Solution
2X has the private cloud covered, working with industry-leading products such as Microsoft Hyper-V, Microsoft Terminal Services/Remote Desktop Services, VMware (ESX, ESXi, vSphere), Sun VirtualBox, Citrix XenServer, Parallels and Virtual Iron, and serving Windows, Linux and Mac clients. A major problem with many business networks is that they are hybrid networks that “just grew that way,” with server and client products from different vendors deployed at different times or in different departments, all of it strung together with the electronic equivalent of duct tape. Other networks are in transition, in the process or planning stages of moving from one platform to another. There are obvious advantages to a vendor-independent solution.

Building the private cloud on Microsoft Terminal Services/Remote Desktop Services can be a very cost-effective solution, because the feature is included as a server role in most versions of Windows Server. The really big cost savings come when you use thin clients, or older and less powerful machines running older desktop operating systems, to connect to the terminal server. In a large organization, upgrading client operating systems in order to run the latest applications can mean a large capital expenditure, and supporting those local desktops can incur major administrative overhead. Delivering virtual desktops/applications lets your users experience all the benefits of an upgraded client OS without the need for a company-wide deployment.

The problem with TS/RDS alone is that it doesn’t have all the features of something like Citrix Presentation Server. 2X fills in those gaps, at a lower cost. And even though Windows Server 2008/2008 R2 includes the ability to publish applications, it lacks the resource-based load balancing and server farm capabilities of 2X, and it doesn’t give you centralized management for published applications or the dynamic web portal.

A problem with some third-party solutions is that they only run on the latest version of Windows Server, so you have to upgrade your infrastructure to use them, incurring a large expense, possible downtime and lost productivity during the learning curve. Or they only work with certain client operating systems, so you have to migrate your Linux and Mac users to Windows to take advantage of them. 2X will run on Windows 2000, 2003 or 2008/2008 R2 for application publishing, and you can bring your published applications to Linux and Mac OS X computers running the 2X Client software as well as to those running Windows.

Virtual Desktops
If it’s not done right, virtualization can be confusing for end users. Having to launch a virtual machine or manually connect to a Remote Desktop server with an RDP client puts an extra burden on users who aren’t tech savvy. The 2X VirtualDesktopServer (VDS) integrates the remote desktop and applications with the user’s local desktop and taskbar so that users don’t even have to know which applications are local and which are remotely hosted. The remote apps are accessible from the local taskbar and/or desktop just like those installed on the local machine.

From the administrative side, control is what the private cloud is all about, and the 2X management console gives you the ability to manage settings for all your users and configure 2X VDS or 2X ApplicationServer and LoadBalancer from a centralized graphical interface. Admins don’t have to physically visit individual PCs for updates or troubleshooting; it’s easy to provide remote support when users have problems, and if a problem can’t be fixed, you can simply delete the virtual machine and replace it with a new one that works.

Application Serving
If the users in your private cloud implementation don’t need the full virtual desktop experience, but only need to run applications from the terminal server/Remote Desktop server on their existing client computers, the 2X ApplicationServer and LoadBalancer lets you publish applications that integrate seamlessly into the users’ local desktops, using the native Remote Desktop Protocol. Publishing gives you control over which applications users can run, and they will only see the apps to which you’ve given them access.

The Publishing Wizard makes it easy to publish applications, as well as folders, desktops and documents. You can publish a “pre-defined” application, which includes common built-in Windows applications such as Windows Explorer, Control Panel, Printers & Faxes, Network Connections and so forth. You can publish single or multiple applications that are already installed on the server from a list of installed applications, or publish a single standalone application by navigating to that program’s executable in the file system hierarchy. If you have multiple servers in the farm, you can choose which individual server or server group to publish the application(s) from.

Admins can control whether the applications open in a normal window, full screen, or minimized when launched by the user, and you can create shortcuts on the client’s desktop and/or in the client’s Start Menu folder. You can also associate particular file extensions with the published application on the client machine, and you can allow users to start only one instance of the app, or isolate each application to one session so that if the same app is launched twice, it runs in the same isolated session.

You have complete control over who can access the application and what machine(s) can be used to access it: you can create filters to publish the application to specific users, groups of users, clients, individual IP addresses or address ranges.

Load Balancing
In organizations that have more than one terminal server/Remote Desktop server, the 2X LoadBalancer component queries the 2X Terminal Server Agent software that you install on each of the terminal servers. The LoadBalancer receives data about each server’s system resources and current sessions, and then uses either the round robin or the resource-based method to allocate the workload across the terminal servers. Resource-based load balancing is particularly effective because, when a new client connects, the connection is sent to whichever terminal server has the most resources available.
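
As a rough illustration of the difference between the two methods, here is a Python sketch. This is not 2X’s implementation, and the per-server metrics are invented stand-ins for whatever the Terminal Server Agent actually reports:

```python
# Illustrative sketch of round robin vs. resource-based balancing.
# Server names and metrics are invented for the example.
import itertools

servers = {
    "ts1": {"free_memory_mb": 2048, "sessions": 40},
    "ts2": {"free_memory_mb": 6144, "sessions": 12},
    "ts3": {"free_memory_mb": 4096, "sessions": 25},
}

# Round robin: rotate through the servers regardless of load.
rr = itertools.cycle(servers)

def pick_round_robin() -> str:
    return next(rr)

# Resource-based: send the new connection to the least-loaded server,
# here scored crudely as session count weighted against free memory.
def pick_resource_based() -> str:
    return min(servers, key=lambda s: servers[s]["sessions"] /
               max(servers[s]["free_memory_mb"], 1))

print(pick_round_robin())      # ts1, then ts2, then ts3, ...
print(pick_resource_based())   # ts2 (the most headroom)
```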

Not only does 2X support server farms, it even allows clients to connect to multiple farms, and the single sign-on module relieves users of the extra “hassle factor” of trying to remember different passwords for accessing different applications. It’s inevitable that from time to time in a busy network environment, user sessions may become disconnected. The 2X LoadBalancer has the ability to query all of the terminal servers and reconnect the user to the right terminal server. You can also configure the LoadBalancer to limit users to one session per desktop, so that when the user reconnects to a terminal server that still has an active session for that user, the user will be connected to that active session.
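
Conceptually, that reconnect behavior might look like the following sketch (hypothetical data structures, not 2X internals): check for an existing session before balancing the new connection.

```python
# Sketch of the reconnect logic described above; names are hypothetical.
active_sessions = {"alice": "ts3", "bob": "ts1"}    # user -> server with live session
session_counts = {"ts1": 40, "ts2": 12, "ts3": 25}  # current load per server

def route(user: str) -> str:
    if user in active_sessions:          # one-session-per-desktop policy
        return active_sessions[user]     # rejoin the existing session
    server = min(session_counts, key=session_counts.get)  # least loaded
    active_sessions[user] = server
    session_counts[server] += 1
    return server

print(route("alice"))   # ts3 -- reconnected to her active session
print(route("carol"))   # ts2 -- balanced to the least-loaded server
```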

You can also enable the CPU Load Balancer, which has the ability to control processes that are using excessive CPU resources and give those processes a low priority, so that other sessions and applications can continue to work normally.
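
The idea of demoting CPU-hungry processes can be sketched with the cross-platform psutil library. This is purely illustrative of the technique, not how the 2X CPU Load Balancer is implemented, and the 25% threshold is an arbitrary example:

```python
# Illustrative only: demote processes that hog the CPU so other
# sessions stay responsive. Not 2X's implementation.
import psutil

CPU_THRESHOLD = 25.0   # percent; arbitrary example value

for proc in psutil.process_iter(["pid", "name"]):
    try:
        if proc.cpu_percent(interval=0.1) > CPU_THRESHOLD:
            # Lower the process priority (Windows priority class,
            # or a higher nice value on Unix-like systems).
            proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS
                      if psutil.WINDOWS else 10)
            print(f"lowered priority: {proc.info['name']} ({proc.pid})")
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue   # process exited or we lack rights; skip it
```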

You have a great deal of control over the load balancing options and can create rules, so that, for example, connections from a specific IP address will be load balanced to a particular terminal server, to a server group, or to all servers in the farm. You can also deny connections from a specified IP address by configuring the Load Balancing rule to forward those connections to “None.”
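
A rule table like the one described can be modeled with Python’s standard ipaddress module. The rules below are hypothetical examples, not 2X’s configuration format:

```python
# Sketch of IP-based balancing rules: map a client's source address
# to a target server/group, with None meaning the connection is denied.
import ipaddress

RULES = [
    (ipaddress.ip_network("10.1.0.0/16"), "ts-group-finance"),
    (ipaddress.ip_network("192.168.5.7/32"), None),   # denied client
]
DEFAULT_TARGET = "all-servers"

def resolve_target(client_ip: str):
    addr = ipaddress.ip_address(client_ip)
    for net, target in RULES:
        if addr in net:
            return target          # None -> refuse the connection
    return DEFAULT_TARGET

print(resolve_target("10.1.4.9"))     # ts-group-finance
print(resolve_target("192.168.5.7"))  # None -> connection denied
print(resolve_target("172.16.0.2"))   # all-servers
```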

Cloud computing is gaining traction, but there is still a great deal of resistance to the idea of “going public” with sensitive data and mission-critical applications. For many companies, a much more attractive option is to reap the benefits of cloud-like services while maintaining control – thus the private cloud was born. A private cloud solution requires high levels of reliability, accessibility and security, along with ease of deployment and transparency for users. A private cloud built on popular virtualization solutions such as those from Microsoft and VMware, with desktops and/or applications delivered to users over presentation virtualization technologies such as Microsoft Terminal Services/Remote Desktop Services or Citrix, accomplishes all this – but these solutions still leave feature gaps or come at a high cost.

2X Software provides the answer to this problem, working with Hyper-V, VMware ESX and more, filling in the feature gaps left by Terminal Services/Remote Desktop Services and coming in at a far lower cost than Citrix.

About The Author
DEBRA LITTLEJOHN SHINDER, MCSE, MVP is a technology consultant, trainer and writer who has authored a number of books on computer operating systems, networking and security. Deb is a tech editor, developmental editor and contributor to over 20 additional books, and is a lead author and blogger who edits the popular WXPnews and Win7News newsletters. She has authored training material, corporate whitepapers, marketing material, training courseware and product documentation for Microsoft Corporation and other software and hardware companies. Deb currently specializes in security issues and Microsoft products and is a Microsoft MVP.

Giorgio Bonuccelli is Marketing and Communications Director at Parallels. Giorgio has extensive experience in cloud computing and virtualization, with a background of many years in multinational corporations (Dell, EMC and McAfee). In his career he has held a variety of roles, from sales to training and marketing. This wide-ranging experience helps him simplify concepts and write content that is easy to read and understandable even by newcomers to the subject. As a blogger and technical writer he has published more than 1,000 articles.
