Back in September 2009, the Burton Group (acquired by Gartner in January 2010) posted its predictions for the year to come, identifying and detailing three macro trends: externalization of IT, consumerization of IT, and democratization of IT.
Consumerization of IT
“The growing trend of allowing information workers to choose their own equipment imposes significant burdens on the enterprise; it’s hard to secure equipment you don’t own, and it’s hard to manage and support a very diverse hardware and software base.” (Chris Howard)
Currently just behind cloud computing in “hotness” on the “buzzword bingo” list, but catching up fast, are the IT trends related to consumerization of IT. These are challenging times for corporate IT, and they will only get more so as it tries to keep up with and control shadow IT within the enterprise.
There are already several examples of consumerization, although some are still in an early state. For instance, the bring your own device (BYOD) trend lets employees select the endpoint device (laptop, tablet, mobile) they feel most comfortable with. This is in contrast to forcing employees onto a single mandated endpoint so the organization can control the end-to-end IT landscape. But control is actually hardest to enforce at the endpoint. So creating a controlled mechanism to access the landscape, such as virtual desktops, frees the endpoint from some areas of control, though not all.
The BYOD trend is also a good way for organizations to keep up with the mobile revolution, where connectivity, access, and participation are growing rapidly and smart devices are becoming the primary route to getting connected. Devices are getting smarter as they are increasingly enriched by mobile apps. IBM, with a very mobile workforce, also has a BYOD strategy, which is really about supporting employees in the way they want to work.
I want to take this one step further: I actually see consumerization extending to bring your own application (BYOA) in the near future, where employees also select the application they are most comfortable with. This fits the current “app” culture. Although everyone has a preference, which app you use does not matter, as long as it gets the job done.
As an example, let’s say you want to navigate from A to B. Maybe you prefer a purchased navigation application such as TomTom or Navigon, but if for some reason it is not functioning, you can use another one such as Google Maps… as long as it gets the job done. I see this trend emerging especially as a result of mergers, and mainly for commoditized functionality, such as CRM this year.
The expression “the consumer is king” suggests that consumerization is here to stay, but will it also be financially viable for the organizations implementing it for their employees? I think it already is, or at least will become so in the near future, for the following reasons:
- Applications are increasingly consumed on a pay-for-use basis. You will not pay double because half of your users use application A and the other half use application B.
- In the merger scenario, no additional effort has to be put into retraining users on a single “enterprise-selected” application, nor into making the non-selected application obsolete.
- Integration “as a service” and standardized cloud APIs are evolving. IBM, for instance, has Cast Iron, enabling data, process, and user interface integration right out of the box—available for “private cloud” implementations as a software product or appliance, and “as a service” from our “shared cloud” IBM SmartCloud Enterprise or IBM SmartCloud Enterprise+.
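The pay-for-use argument above can be illustrated with a quick back-of-the-envelope calculation. This is a minimal sketch with hypothetical numbers—the seat price and function name are mine, not from any vendor price list—assuming both applications are priced identically per user:

```python
# Hypothetical pay-for-use pricing: each user seat costs the same per month,
# regardless of which of the two equivalent applications the seat is assigned to.
PRICE_PER_SEAT = 10.0  # hypothetical monthly price, identical for app A and app B

def monthly_cost(seats_app_a: int, seats_app_b: int) -> float:
    """Total monthly licensing cost when usage is split across two apps."""
    return (seats_app_a + seats_app_b) * PRICE_PER_SEAT

# 1,000 employees all on one app vs. split 50/50 across two apps:
single_app = monthly_cost(1000, 0)    # everyone on app A
split_apps = monthly_cost(500, 500)   # half on A, half on B

assert single_app == split_apps  # the split adds no licensing cost
```

Under these assumptions the organization pays for total usage, not for the number of distinct applications—which is what makes BYOA financially neutral on the licensing side.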
So, that is my blog posting about the IT trends that match the macro trend of consumerization of IT. I’m looking forward to reading your opinion on this trend or on the post itself.
Watch for the last blog posting in this series, where I discuss democratization of IT and how it is influencing us on a daily basis.