End-user computing aims to provide a seamless experience to workers as they use various devices to access a range of applications. The IT infrastructure services needed to provide this experience, however, are anything but seamless.
The storage and networking roadblocks to virtual desktop infrastructure (VDI) are well-known. Converged infrastructure and advancements in bandwidth optimization address many of these challenges, but virtual desktop and application management has only grown more complex. Meanwhile, mobility forces organizations to rethink their approaches to network access, data storage, security and more. And many IT professionals don't understand the full extent of this required transformation to IT infrastructure services before they deploy end-user computing technology.
"Most organizations don't know what they're getting into until they find themselves standing in knee-deep mud," said Bob Egan, CEO of Sepharim Group, a mobility analyst firm in Falmouth, Mass.
The cost of external arrays with enough I/O and capacity to support virtual desktops traditionally hindered widespread adoption of the technology. And network latency created a subpar user experience (UX) that caused many a VDI project to fail.
Those issues are no longer the impediments they once were. Data deduplication and all-flash arrays took significant strides toward improving storage capacity, performance and cost for VDI. Regarding network latency, advancements in remote display protocols and client-side graphics processing addressed most of the UX problems it caused.
The real game-changer was the emergence of converged and hyper-converged infrastructure. These systems integrate the appropriate amount of server, storage and networking resources required to support a specific number of virtual desktops, making VDI simpler to implement and easier to scale.
"You just buy enough of them to hit your user count," said Alastair Cooke, a virtualization trainer and consultant in New Zealand.
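Cooke's sizing rule of thumb reduces to simple arithmetic: divide the user count by the desktop density of one node and round up. The sketch below is illustrative only; the per-node density figure is an assumption that varies widely by vendor, hardware generation and desktop workload profile:

```python
import math

def nodes_needed(user_count: int, desktops_per_node: int) -> int:
    """Return the number of hyper-converged nodes required to host
    a given number of virtual desktops, rounding up to whole nodes."""
    return math.ceil(user_count / desktops_per_node)

# Assumed density: 100 knowledge-worker desktops per node (varies widely).
print(nodes_needed(1200, 100))  # 12 nodes
print(nodes_needed(1250, 100))  # 13 nodes -- partial nodes round up
```

In practice the density figure comes from vendor sizing guides and pilot testing, and GPU-backed desktops lower it considerably.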
But like a game of Whac-A-Mole, as some IT infrastructure services problems disappear, others have popped up to further complicate the VDI picture.
For one, organizations often overlook the need for hardware-accelerated graphics to support their virtual desktops and applications, said Christian Mohn, senior solutions architect at Proact, a solutions provider based in Oslo, Norway. Graphics processing units (GPUs) were originally designed for image-intensive applications, but their high processing power now lends itself to all types of applications, both physical and virtual.
"Even Microsoft Office or your browser uses hardware-accelerated graphics now," Mohn said. "More and more software utilizes the GPU."
This reliance on GPUs limits the hardware on which organizations can run VDI, because not all servers support 3D graphics cards, Mohn said. The technology also adds to the cost of a VDI project, he added. IT departments should account for these factors and budget for them in advance so they aren't caught by surprise midway through a deployment.
Additionally, management of virtual desktops and applications has grown more complex. Modern VDI shops use some combination of monitoring, application layering, data management, user personalization and other tools. Working with so many products, which often come from multiple vendors, can be just as frustrating as dealing with the networking and storage problems that plagued VDI in its early days.
In this regard, organizations should not view the cloud as a magic pill. Desktop as a service hosts desktops and their support infrastructure in the cloud, but it doesn't offer all of today's advanced management capabilities as a service, Cooke said.
Rethink mobile access
Enterprise mobility also raises concerns around IT infrastructure services, although not to the same extent as VDI.
The BYOD trend forced many organizations to examine and upgrade their wireless bandwidth capacity, which would have happened anyway as the security benefits of Wi-Fi, such as default encryption, became clearer, Mohn said.
"Earlier, Wi-Fi was an add-on," he said. "Now, it's a primary delivery method."
And the sheer amount of data that users create -- especially the photos and videos that replace or augment written documents -- means organizations will have to re-evaluate their storage strategies. That doesn't necessarily mean buying more storage, but rather developing policies about what types of data to retain and for how long, Cooke said.
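A retention policy of the kind Cooke describes boils down to a mapping from data type to retention window, checked against each file's age. The data types and retention periods below are hypothetical examples, not recommendations:

```python
from datetime import date, timedelta

# Hypothetical retention windows, in days, per data type.
RETENTION_DAYS = {
    "photo": 365,
    "video": 180,
    "document": 365 * 7,
}

def should_retain(data_type: str, created: date, today: date) -> bool:
    """Return True if the item is still inside its retention window.
    Unknown data types fall back to a one-year default."""
    limit = timedelta(days=RETENTION_DAYS.get(data_type, 365))
    return today - created <= limit

print(should_retain("video", date(2024, 1, 1), date(2024, 5, 1)))  # True
```

The point of codifying the policy is that expired data can be archived or deleted automatically instead of accumulating on primary storage.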
The bigger issue is that IT departments must rethink their approach to how users access corporate systems. Today's workers use multiple device types running various operating systems over wired, wireless and cellular networks. The typical method of managing and securing individual connections to corporate resources is untenable at this scale, Egan said.
Instead, organizations should create policies that govern which applications and data users can access based on their role, device, network, location and other factors.
"Do you have the systems in place to exert that control?" Egan asked.
Those systems include identity management and enterprise mobility management software, plus other technologies designed to store and secure data and detect and prevent intrusion.
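The contextual access control Egan describes can be sketched as a set of rules evaluated on every request. The roles, resource names and rules below are hypothetical examples for illustration, not the behavior of any particular identity management or EMM product:

```python
def allow_access(user_role: str, device_managed: bool,
                 network: str, resource: str) -> bool:
    """Decide whether to grant access based on role, device posture
    and network context (hypothetical example rules)."""
    # Unmanaged devices never reach sensitive resources.
    if resource == "finance-data" and not device_managed:
        return False
    # Finance data is restricted to the finance role.
    if resource == "finance-data" and user_role != "finance":
        return False
    # Email is allowed anywhere from managed devices,
    # but only from the corporate network otherwise.
    if resource == "email":
        return device_managed or network == "corporate"
    # Default: managed devices on trusted networks only.
    return device_managed and network in ("corporate", "vpn")

print(allow_access("finance", True, "corporate", "finance-data"))  # True
print(allow_access("sales", True, "corporate", "finance-data"))    # False
```

Real deployments evaluate many more attributes (location, time of day, device compliance state), but the shape is the same: policy as data, applied uniformly, rather than per-connection exceptions managed by hand.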
Organizations typically buy IT infrastructure services on an as-needed basis, not as part of a grand plan. So they don't always see how big the big picture is until it's too late, said Ira Grossman, CTO at MCPc, an IT solutions provider in Cleveland. As a result, more of MCPc's customers are investing in asset management and telecom expense management software to ensure that their investments in end-user computing are worth it, he said.
"The process of allocating cost per user has gotten more complex," he added.