This is Part 5 of a 5-part series based on my more than a decade of virtualization experience with large enterprises and service providers, and on my time running strategic planning for one of the two largest virtualization vendors. The series covers 10 types of situations in which you should consider not virtualizing some of your applications.
Reason 9: When you want to save money on all desktops by virtualizing them
Servers cost more than cheap desktops. You still have to buy a PC, tablet, or thin client – and manage and secure it too. Virtual desktops are great for security and compliance, but they are not a lower-cost option for all types of employees.
Reason 10: When you are running virtualization platform components
If your virtualization platform and hypervisors rely on Active Directory or DNS servers, and those servers are themselves virtualized, you'll run into a catch-22: the hypervisor can't start because it's waiting for services from the VMs that run on it. Ouch. You also need to consider whether your virtualization management software (vCenter, etc.) can and should run on servers it manages.
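The catch-22 above is easy to check for mechanically: compare what each hypervisor host needs at boot against what it hosts. Here's a minimal sketch in Python; the host names, VM names, and inventory structure are hypothetical placeholders, not output from any real virtualization API.

```python
# Sketch: flag the catch-22 where a hypervisor depends on services
# (DNS, Active Directory) provided by VMs running on that same host.
# All names and the inventory format below are hypothetical examples.

# Which VMs each hypervisor host runs.
vms_on_host = {
    "esx-01": ["dns-01", "ad-01", "app-01"],
    "esx-02": ["dns-02", "app-02"],
}

# Which service VMs each hypervisor host needs at boot time.
host_dependencies = {
    "esx-01": ["dns-01", "ad-01"],  # depends on VMs it hosts itself!
    "esx-02": ["dns-01"],           # depends on a VM on another host: OK
}

def find_bootstrap_conflicts(vms_on_host, host_dependencies):
    """Return (host, vm) pairs where a host depends on a VM it runs."""
    conflicts = []
    for host, needed_vms in host_dependencies.items():
        hosted = set(vms_on_host.get(host, []))
        for vm in needed_vms:
            if vm in hosted:
                conflicts.append((host, vm))
    return conflicts

for host, vm in find_bootstrap_conflicts(vms_on_host, host_dependencies):
    print(f"WARNING: {host} depends on {vm}, which runs on {host} itself")
```

In this example, esx-01 is flagged because it can't boot cleanly without DNS and AD, yet both live on esx-01; esx-02 is fine because its DNS dependency lives elsewhere. The usual fixes are keeping at least one physical domain controller/DNS server, or pinning those VMs to hosts that don't depend on them.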
So there you have it, the 10 reasons you might not want to virtualize an application or a server. To recap, here are all 10 from start to finish:
- When you have static, predictable computing needs
- When you can’t get a virtualization-friendly license
- When it just won’t work very well
- When time drift will hurt your apps
- When you work for a cheapskate
- When you’re already running servers at high capacity
- When you don’t have a way to manage encryption keys
- When you use clustered apps with built-in failover
- When you want to save money on all desktops by virtualizing them
- When you are running virtualization platform components
I hope this was helpful for those of you who are actually in the trenches planning virtualization and cloud implementations. Don’t drink the virtualization kool-aid until you’ve done your due diligence against this list of reasons not to virtualize!