Consumerization is the reorientation of product and service designs to focus on (and market to) the end user as an individual consumer, in contrast with an earlier era of only organization-oriented offerings (designed solely for business-to-business or business-to-government sales). Technologies whose first commercialization was at the inter-organization level thus have potential for later consumerization. The emergence of the individual consumer as the primary driver of product and service design is most commonly associated with the IT industry, as large business and government organizations dominated the early decades of computer usage and development. Thus the microcomputer revolution, in which electronic computing moved from exclusively enterprise and government use to include personal computing, is a cardinal example of consumerization. But many technology-based products, such as calculators and mobile phones, have also had their origins in business markets, and only over time did they become dominated by high-volume consumer usage, as these products commoditized and prices fell. An example of enterprise software that became consumer software is optical character recognition software, which originated with banks and postal systems (to automate cheque clearing and mail sorting) but eventually became personal productivity software.
In a different sense, consumerization of IT is the proliferation of personally owned IT in the workplace (in addition to, or even instead of, company-owned IT), which originates in the consumer market and is used for professional purposes.[1] This bring your own device trend has significantly changed corporate IT policies, as employees now often use their own laptops, netbooks, tablets, and smartphones on the hardware side, and social media, web conferencing, cloud storage, and software as a service on the software side.
Consumerization has existed for many decades; for example, the consumerization of refrigeration occurred from the 1910s through the 1950s. The term consumerization of IT is believed to have been first used regularly by Douglas Neal and John Taylor of the Leading Edge Forum in 2001; the first known paper on the topic was published by the LEF in June 2004.[2] The term is now used widely throughout the IT industry and is the topic of numerous conferences and articles. One of the first articles was a special insert in "The Economist" magazine on October 8, 2011.[3] The term has since been used ambiguously, and in an effort to structure its amorphous nature, researchers have suggested taking three distinct perspectives: individual, organizational, and market.[4]
The technology behind the consumerization of computing can be said to have begun with the development of eight-bit, general-purpose microprocessors in the early 1970s and eventually the personal computer in the late 1970s and early 1980s. However, it is significant that the great success of the IBM PC in the first half of the 1980s was driven primarily by business markets. Business preeminence continued during the late 1980s and early 1990s with the rise of the Microsoft Windows PC platform. Meanwhile, other technology-based products, such as calculators, fax machines, and mobile phones, also had their origins in business markets, and only over time did they become dominated by high-volume consumer usage, as these products commoditized and prices fell.
It was the growth of the World Wide Web in the mid-1990s that began to reverse this pattern. In particular, the rise of free, advertising-based services such as email and search from companies such as Hotmail and Yahoo began to establish the idea that consumer IT offerings based on a simple Internet browser were often viable alternatives to traditional business computing approaches. Meanwhile, it is argued that the consumerization of IT embodies more than the diffusion of consumer IT: it also offers a chance for considerable productivity gains. It "reflects how enterprises will be affected by, and can take advantage of, new technologies and models that originate and develop in the consumer space, rather than in the enterprise IT sector".[5]
The primary impact of consumerization is that it is forcing businesses, especially large enterprises, to rethink the way they procure and manage IT equipment and services. Historically, central IT organizations controlled the great majority of IT usage within their firms, choosing or at least approving the systems and services that employees used. Consumerization enables alternative approaches. Today, employees and departments are becoming increasingly self-sufficient in meeting their IT needs. Products have become easier to use, and cloud-based, software-as-a-service offerings are addressing an ever-widening range of business needs in areas such as video conferencing, digital imaging, business collaboration, sales force support, and systems back-up.
Similarly, there is increasing interest in so-called bring your own device strategies, where individual employees can choose and often own the computers and/or smartphones they use at work. The Apple iPhone and iPad have been particularly important in this regard. Both products were designed for individual consumers, but their appeal in the workplace has been great. They have demonstrated that elements of choice, style, and entertainment are now critical computer industry dimensions that businesses cannot ignore.
Equally important, large enterprises have become increasingly dependent upon consumerized services such as search, mapping, and social media. The capabilities of firms such as Google, Facebook, and Twitter are now essential components of many firms' marketing strategies. One of the most important consumerization questions going forward is to what extent such advertising-based services will spread into major corporate applications such as email, customer relationship management (CRM), and intranets.
One of the more serious negative implications of consumerization is that security controls have been adopted more slowly in the consumer space. As a result, there is an increased risk to the information assets accessed through these less trustworthy consumerized devices. In a CSO Online article, Joan Goodchild reported on a survey which found that, when asked about the greatest barriers to enabling employees to use personal devices at work, 83 percent of IT respondents cited "security concerns".[6] This shortcoming may soon be remedied by chip manufacturers with technologies such as Intel's "Trusted Execution Technology"[7] and ARM's "TrustZone",[8] which are designed to increase the trustworthiness of both enterprise and consumer devices.
In addition to the mass-market changes above, consumer markets are now changing large-scale computing as well. The giant data centers that have been and are being built by firms such as Google, Apple, and Amazon are far larger and generally much more efficient than the data centers used by most large enterprises. For example, Google is said to support over 300 million Gmail accounts while executing more than 1 billion searches per day.
Supporting these consumer-driven volumes requires new levels of efficiency and scale, and this is transforming many traditional data center approaches and practices. Among the major changes are reliance on low-cost, commodity servers, N+1 system redundancy, and largely unmanned data center operations. The associated software innovations are equally important in areas such as algorithms, artificial intelligence, and big data. In this sense, consumerization seems likely to transform much of the overall computing stack, from individual devices to many of the most demanding large-scale challenges.