Altitude Technology Group prides itself on providing “a higher perspective” on technology trends. On the macro level, that means observing technology trends by decade rather than by month. Looking back over the past ten years, it’s clear we are completing a massive consolidation phase. More to the point, over the past five years maturing burst-innovation technologies have generated new market momentum, and the big fish have circled and swallowed most of the front-runners in an attempt to grow slowing revenues, penetrate new markets, and capture a full-stack, “soup to nuts” solution for end-users.
The bad news is that when this happens, those leading products’ performance, support, and development generally crawl to a stop while the corporations and suits figure out whom to fire, what to keep, and what to throw out. At best, a few years later you end up with a duct-taped, poorly integrated implementation that has lost all of its unique features and value.
Our current market, however, is even more out of whack. The burst “market opportunities” of what was perceived as “VDI” and “cloud” computing didn’t actually exist, yet vendors rushed to create companies servicing them, ignoring that the end-user market hadn’t digested the technologies, hadn’t matured enough to adopt them, and in most cases hadn’t even defined the terms.
Over the past weeks we’ve been deep-diving on a number of soon-to-be-released cloud storage, computing, and virtualization products.
Who’s running “VDI” in the “CLOUD”?
Who’s leveraging a “fully dynamic, resource-responsible, virtualized, on-demand computing environment”? Nobody. And at the heart of it, what’s the huge difference between VDI and more traditional methods of server-based computing? Not all that much. Maybe 5% of the large enterprises in the world are pushing the cutting edge in these areas. Why? Because for most of the world, those are marketing terms, in most cases pushing products that should be in beta or a first-version iteration at best. But they’re effective marketing engines. They’re here before their time, and they’re imploring technologists everywhere to get with it or fall into uselessness and unemployment.
While VDI solutions and cloud services certainly exist, the great divide between adoption and rejection still resides in the nitty-gritty. Can an enterprise currently reliant on traditional computing seamlessly leverage these next-generation technologies? Afraid not. In fact, to crowbar, slap on, and crazy-glue a solution together today would be near insanity for most production shops, hence the perpetual science projects and lack of true traction for these obviously powerful infrastructure tools.
The year of 2010 – “Enter the Enablers”
The truth is there is a market opportunity, because these technology deliveries do hold genuine merit. The problem is the gap between the infrastructure of today and the ability to leverage the infrastructure of tomorrow. Over the past 18 months, a flurry of new start-ups and product redefinitions across software, appliance, storage, and hardware have targeted the enablement of virtualization and cloud technologies.
This is a good thing for end-users. The science project is being completed by these new entrants, providing real avenues to begin leveraging these next-generation technologies.
Over the next decade this will be delivered in two formats:
– The Swiss Army knife:
A slew of product solutions will hit the market over the next months providing heterogeneous support for cloud service provider products. Get your data to these devices and they will take care of the rest. Full integration with SaaS and cloud providers is delivered through baked-in APIs, leaving the end-user to configure once, deposit data, VMs, and applications, and forget. Or so they say. We’re tracking no fewer than six impressive vendor technologies in this space. We’ll introduce each as they hit the market and begin to prove out their value propositions. (e.g., today Google assigns a team of developers to any enterprise Google Apps customer. This isn’t just to migrate customers off of existing “trusted” traditional computing; it’s also to provide added support and credibility to the cloud offering.)
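To make the “configure once, deposit, and forget” idea concrete, here’s a minimal sketch of what such a gateway appliance does under the hood. All class and method names here are hypothetical illustrations, not any vendor’s actual API: each cloud backend is hidden behind one common interface, and the gateway replicates deposits across whatever providers the administrator configured.

```python
from abc import ABC, abstractmethod

class CloudStorageProvider(ABC):
    """Hypothetical provider-agnostic contract: every backend,
    regardless of vendor, is wrapped behind the same put/get calls."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryProvider(CloudStorageProvider):
    """Stand-in backend; a real adapter would call a vendor API here."""
    def __init__(self):
        self._store = {}
    def put(self, key, data):
        self._store[key] = data
    def get(self, key):
        return self._store[key]

class StorageGateway:
    """The 'swiss army knife' appliance in miniature: callers never
    see which provider actually holds the data."""
    def __init__(self, providers):
        self._providers = providers
    def deposit(self, key, data):
        # Naive policy: replicate to every configured backend.
        for p in self._providers:
            p.put(key, data)
    def retrieve(self, key):
        # Read from the first backend that has the object.
        for p in self._providers:
            try:
                return p.get(key)
            except KeyError:
                continue
        raise KeyError(key)
```

The design point is that adding a new cloud provider means writing one adapter class, not reworking the applications that deposit data.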
– The standardized approach:
With full acceptance that end-users must be led to water, standards bodies are building protocol definitions to enable leveraging the cloud and cloud providers. We’re paying close attention to the Cloud Data Management Interface (CDMI) standard from SNIA and similar protocol standardization efforts. Vendors that support these established protocols will have an early leg up on cloud infrastructure delivery. (e.g., within 12–18 months Google and Amazon will simply have to provide support for emerging standards; the rest will be plug-and-play vendor product support. This will provide a solid foundation for more intense and efficient migration to cloud technologies where applicable.) Furthermore, this will lead to the cloud being leveraged independently of hypervisor-enabled or hypervisor-supported environments. Any IT function could technically interface with these new standards to seamlessly leverage an external and/or distributed flexible resource (the cloud).
“The Cloud Data Management Interface defines the functional interface that applications will use to create, retrieve, update and delete data elements from the Cloud. As part of this interface the client will be able to discover the capabilities of the cloud storage offering and use this interface to manage containers and the data that is placed in them. In addition, metadata can be set on containers and their contained data elements through this interface.
This interface is also used by administrative and management applications to manage containers, accounts, security access and monitoring/billing information, even for storage that is accessible by other protocols. The capabilities of the underlying storage and data services are exposed so that clients can understand the offering.”
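In practice, CDMI rides on RESTful HTTP with JSON bodies. As a rough sketch, assuming a hypothetical endpoint URL and object path, here is how a client might construct (without sending) a CDMI PUT that stores a data object with attached metadata; the header names and body shape follow the SNIA spec’s conventions:

```python
import json

def cdmi_put_object(base_url, path, value, metadata=None):
    """Build a CDMI data-object PUT request as a plain dict.
    base_url and path are illustrative; the X-CDMI-Specification-Version
    header and application/cdmi-object content type come from the spec."""
    return {
        "method": "PUT",
        "url": f"{base_url.rstrip('/')}/{path.lstrip('/')}",
        "headers": {
            "X-CDMI-Specification-Version": "1.0",
            "Content-Type": "application/cdmi-object",
            "Accept": "application/cdmi-object",
        },
        "body": json.dumps({
            "mimetype": "text/plain",
            # Arbitrary user metadata travels with the object itself,
            # which is what lets management tools discover and bill it.
            "metadata": metadata or {},
            "value": value,
        }),
    }
```

Because the object carries its own metadata and the server advertises its capabilities, any conforming client can talk to any conforming provider, which is exactly the plug-and-play property the standard is after.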
– The Swiss Army knife:
A group of “end-point management” technologies will release heterogeneous support for flexible end-point computing delivery. Most “VDI” opportunities today, when handled correctly, become a mixture of heterogeneous vendor technologies. Building successful, true VDI implementations is still somewhat of an art, requiring point solutions working in conjunction. Most enterprises find that 50–70% of what “VDI” means to them is end-point management, user management, application management, and data management: abstracting the components that combine to deliver end-point computing (profile, OS, application) into independently manageable parts. Near-term solutions will wrap all of that up with a bow and provide a unified interface to manage the multiple components delivering the solution.
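The abstraction described above — profile, OS, and application as independently manageable parts composed into one desktop — can be sketched as a small data model. The class and field names are invented for illustration, not drawn from any shipping product:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Layer:
    """One independently managed component of an end-point:
    kind is 'os', 'application', or 'profile'."""
    kind: str
    name: str
    version: str

@dataclass
class Endpoint:
    """A desktop assembled from independent layers, so each layer
    can be patched, replaced, or versioned on its own schedule."""
    user: str
    layers: list = field(default_factory=list)

    def attach(self, layer: Layer) -> "Endpoint":
        self.layers.append(layer)
        return self

    def manifest(self) -> dict:
        # What a unified management interface would hand to a broker
        # to provision this user's desktop.
        return {"user": self.user,
                "stack": [f"{l.kind}:{l.name}@{l.version}" for l in self.layers]}
```

The payoff of this separation is that swapping an application version touches one layer record, not a monolithic desktop image.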
– The standardized approach:
In a few years, once the components of delivering end-point IT productivity are abstracted, they can truly be delivered independently through standard mechanisms. Thin, thick, laptop, and mobile end-points will be centrally managed AND provisioned, leveraging the correct delivery stack where appropriate. The key requirements are unified management, unified provisioning, and efficiencies of scale. Newly ratified protocols fit to deliver this new content over existing networks will quickly gain market and product support, extending this delivery flexibility to the entire enterprise.
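“Leveraging the correct delivery stack where appropriate” is essentially a policy decision per endpoint. As a toy illustration (the categories and rules below are our own assumptions, not part of any standard), a provisioning broker might choose like this:

```python
def choose_delivery(endpoint_type: str, offline_required: bool) -> str:
    """Toy policy engine: pick a delivery stack per endpoint.
    endpoint_type is 'thin', 'thick', 'laptop', or 'mobile'."""
    if offline_required:
        # Devices that must work disconnected need local execution,
        # so cache the image locally rather than host it centrally.
        return "client-side virtualization (cached image)"
    if endpoint_type == "thin":
        # Stateless terminals are best served by a hosted desktop.
        return "hosted desktop (VDI/SBC)"
    if endpoint_type in ("thick", "laptop"):
        return "application virtualization over local OS"
    if endpoint_type == "mobile":
        return "published applications"
    raise ValueError(f"unknown endpoint type: {endpoint_type}")
```

The point is not these particular rules but that, once delivery is standardized, the choice of stack becomes configuration rather than a per-project science experiment.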
Here’s the good news. This “enablement” phase in the market will serve as a mini-innovation phase, finally delivering on multiple promises of technology value to the market in tangible, production-ready implementations.
End-users will leverage these technologies where applicable. Net-new buildouts that fit the requirements can and should be based on these soon-to-be-standard delivery models. Over-arching management suites will be leveraged to enable consolidated management of heterogeneous environments. Server-based computing delivery will increase as end-point productivity is defined and supported through traditional (SBC) and next-generation delivery (VDI, application virtualization).
Early adopters will enjoy mainstream support for their environments, while the hesitant and risk-averse will gain built-in support for these environments as traditional products mature to support these new standards and associated delivery protocols (unified end-point flexibility).