When the Harvard Business Review (HBR) published “IT Doesn’t Matter” in May 2003, the point was to start an argument, or, as they say in the more genteel world of academia, a debate. The provocative title of the article and its timing — at the tail end of a long slump in technology spending — ensured that a dustup would ensue. The resulting debate has been impassioned and often revealing, and is still going on.
For those who may have missed it or might welcome a reminder, the central point of the essay, written by Nicholas G. Carr, then editor at large of HBR and now a consultant and author, was that there is nothing all that special about information technology (IT). He declared that information technology is inevitably going the way of the railroads, the telegraph, and electricity, which all became, in economic terms, just ordinary factors of production, or “commodity inputs.” “From a strategic standpoint, they became invisible; they no longer mattered,” Mr. Carr wrote.
“That is exactly what is happening to information technology today.” The reaction was swift. Within weeks, Mr. Carr was branded a heretic by many technologists, consultants, and — especially — computer industry executives.
Intel’s Craig Barrett, Microsoft’s Steve Ballmer, IBM’s Sam Palmisano, and others felt compelled to weigh in with varying degrees of fervor to reassure corporate customers. Their message: Don’t listen to this guy. Keep the faith in IT’s power to deliver productivity gains, cost savings, and competitive advantage. And the reaction continued.
HBR got so many responses that it set aside a portion of its Web site to accommodate them, and Mr. Carr kept the controversy bubbling on his own Web site.
He became a traveling celebrity of sorts, defending his stance in forums across the country, from the Harvard Club in New York City to the Moscone Convention Center in San Francisco, where he traded verbal jabs with Sun Microsystems’ Scott McNealy. The article became fodder for countless columns in newspapers, business magazines, and trade journals. In the interest of full disclosure, I should note that I contributed to the phenomenon.
I did not know Mr. Carr before his article was published, but HBR had sent me an advance copy of the manifesto, which I quoted in a long Sunday business piece for the New York Times on the maturing of the IT industry. To the best of my knowledge, it was the first mention of Mr. Carr’s article in the press. Two weeks later, I cited Mr. Carr again in a piece headlined “Has Technology Lost Its ‘Special’ Status?” When “IT Doesn’t Matter” was published in HBR, I thought Mr.
Carr had delivered an important, thought-provoking reconsideration of the role of IT in the economy and inside companies. Now that his analysis has been expanded to book length, I still do. This time, his ideas are packaged with a less incendiary title, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage (Harvard Business School Press, 2004). But his message is unchanged, though more fleshed out and nuanced. Mr. Carr’s thinking, in my view, is flawed — at times seriously flawed — but not necessarily in ways that undermine his essential thesis.
So let’s first examine what his fundamental point is, and what it is not. The title of the original HBR article was misleading. Mr. Carr is not arguing that information technology doesn’t matter. Of course it does. Among other things, IT improves productivity by reducing communications, search, and transaction costs, and by automating all sorts of tasks previously done by humans. Mr. Carr asserts that as IT matures, spreads, and becomes more standardized, the strategic advantage any individual firm can gain from technology diminishes. Paradoxically, the more the economy gains from technology, the narrower the window of opportunity for the competitive advantage of individual companies.
This was the pattern for railroads, electricity, and highways, which all became utilities. In the IT world, Mr.
Carr sees evidence of mature standardization all around him. The strategic implication, according to Mr.
Carr, is clear. “Today, most IT-based competitive advantages simply vanish too quickly to be meaningful,” he writes. Thus, IT strategy for most companies should become a game of defense. The shrewd executive, Mr.
Carr writes, will in most cases keep his or her company focused on the trailing, rather than the leading, edge of technology. He offers four guidelines for IT strategy: “Spend less; follow, don’t lead; innovate when risks are low; and focus more on vulnerabilities than opportunities.” In Mr. Carr’s view, there are two kinds of technologies: “proprietary technologies” and “infrastructural technologies.” The first yields competitive gain, whereas the second is just plumbing, at least from a strategic standpoint. Technologies shift from proprietary to infrastructure as they mature.
When a technology is young, companies can gain a big strategic advantage, and Mr. Carr deftly describes how companies like Macy’s, Woolworth, and Sears, Roebuck exploited the new economics of retailing made possible by rapid, long-distance shipments by rail, and how a new breed of national high-volume manufacturers like American Tobacco, Pillsbury, Procter & Gamble, Kodak, and Heinz sprang up by gaining advantage from modern transportation, the telegraph, and electricity. Once a technology moves into the infrastructure category, however, corporate opportunity wanes. In IT these days, Mr. Carr sees just about everything being folded into the infrastructure, including the Internet, Linux, Web services, and Windows.
Mr. Carr is particularly insightful on the subject of enterprise software, such as SAP’s enterprise resource planning offerings and Siebel’s customer relationship management programs. As he does throughout the book, he succinctly draws the analogy between the present and an earlier technology. In this case, enterprise software is depicted as the modern version of machine tools. Before the 20th century, machine tools were bespoke gadgets made by each factory for its own requirements. But then machine-tool vendors emerged. Their economies of scale brought lower costs and standardization to the machine-tool industry. Innovation continued, but it was the vendors who developed and distributed those innovations for all manufacturers — and thus no competitive advantage accrued to any individual manufacturer.
Mr. Carr sees a similar “vendorization” in enterprise software, where core business processes like supply chain management and customer relationship management are handled by standard software packages. The result is a straitjacket of standardization, leaving little room for a company to distinguish itself. Small wonder, Mr.
Carr writes, that in the late 1990s enterprise systems came to be called “companies-in-a-box.” Even the companies that seem to be IT-based success stories — notably Dell Computer and Wal-Mart — are not, Mr. Carr tells us. Yes, Wal-Mart was a leader in using advanced computing and private networks to link sales, inventory, and supply information. But Wal-Mart’s real edge today, Mr. Carr says, is the scale of its operation, which enables it to strong-arm suppliers and zealously pursue efficiencies everywhere in its operations. And Dell, he contends, has an edge over rivals because of its direct marketing and build-to-order strategy. “It’s true that IT has buttressed Dell’s advantage, but it is by no means the source of that advantage,” Mr. Carr writes.
More generally, Mr. Carr observes, strategic advantage derives not from technology itself but “from broad and tightly integrated combinations of processes, capabilities, and, yes, technologies.” Translation: How you use technology, not the technology itself, is the crucial variable. “Indeed,” Mr. Carr writes in his preface, “as the strategic value of the technology fades, the skill with which it is used on a day-to-day basis may well become even more important to a company’s success.” It has the ring of innocuous truism, but wait a moment: Does that statement really apply to a utilitylike infrastructure technology? Does the skill with which we use electricity, commuter rail service, or the telephone have anything to do with corporate success or failure?
No one seeks insights from research firms, like Gartner, or advice from consultants, now including Mr. Carr, on how to use real infrastructure technologies. This suggests that information technology may be a bit different after all.
The main difference between computing and the industrial technologies Mr. Carr cites is that the stored-program computer is a “universal” tool, which can be programmed to do all manner of tasks.
The general-purpose nature of computing — especially software, a medium without material constraints — makes it more like biology than like railroads or electricity. It has the ability to evolve and take new forms. Speech recognition, natural language processing, and self-healing systems are just three of the next evolutionary steps on the computing horizon.
Mr. Carr might dismiss such comments as romanticized nonsense — and he certainly could be right. Yet understanding the nature of the technology is crucial to determining whether computing is truly graying or, more likely, whether some parts of the industry are maturing while new forms emerge further up the computing food chain.
Are we seeing old age — or merely the end of one stage in a continuing cycle of renewal? Mr. Carr notes that the technology bubble of the 1990s resembled the booms and busts of railway and telegraph investment, which marked the passing of youthful exuberance in those industries. In the computer industry, however, there already had been two previous boom-and-bust cycles — in the late 1960s, when mainframe time-sharing services appeared to be the computing utilities of their day, and in the mid-1980s, when legions of personal computer companies were founded and soon perished.
Again, the pattern seems to be cyclical and evolutionary, as innovations accumulate and eventually cross a threshold, opening doors to broader market opportunities. Let’s take one potential example, Web services. The nerdy term refers to the software protocols that could allow a new stage of automation as data and applications become able to communicate with each other over the Internet. More broadly, Web services are seen as the building blocks of a new “services-based architecture” for computing.
Mr. Carr briskly brushes Web services into his “vendorization” bucket. He writes, “Here, too, however, the technical innovations are coming from vendors, not users.” The vendors — IBM, Microsoft, Sun Microsystems, and others — are working jointly only on the alphabet soup of software protocols: XML, UDDI, WSDL, and so on.
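To make that alphabet soup a little more concrete, here is a minimal sketch of what one program asking another for data over the Internet looks like, written in Python using only the standard library. XML carries the message, WSDL is the format in which a service describes its operations, and UDDI is a directory for finding them; the endpoint, operation, and namespace below are invented for illustration, not taken from any real service.

```python
# A minimal SOAP-style web service call, sketched with Python's standard
# library. The endpoint, operation, and namespace are hypothetical.
import urllib.request

SOAP_BODY = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/stockquote">
      <Symbol>IBM</Symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    url="http://example.com/services/stockquote",  # hypothetical endpoint
    data=SOAP_BODY.encode("utf-8"),                # sent as an HTTP POST
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://example.com/stockquote/GetQuote",
    },
)

# The response is itself an XML document, parsed by the calling program:
# application talking to application, with no browser or human in the loop.
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))
```

The plumbing is deliberately generic; the point of contention is whether anything built on top of it can still be distinctive.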
Yet when technologists talk of a services-based architecture, they are speaking of a new computing platform that they see as the next big evolutionary step in decentralizing the means and tools of innovation — much as the minicomputer was a new platform that decentralized computing away from the mainframe, and then the personal computer put power in many more users’ hands. Computer scientists regard the Web as a “dumb” medium in a sense. It is, to be sure, a truly remarkable low-cost communications tool for search, discovery, and transactions, but the Web is mostly raw infrastructure because it is not very programmable.
Web services hold the promise of making the Internet a programmable computing platform, which is where differentiation and potentially strategic advantage lie. I cite this as only one example of where Mr. Carr’s desire to fit everything neatly into his thesis leads him astray. There are others. He mentions Linux, and its adoption by Internet pacesetters such as Google and Amazon, as proof that commodity technology is plenty good enough for almost any need. Linux, the open source operating system, does allow those companies to build vast computing power plants on low-cost hardware from the PC industry.
But the other great appeal of Linux — and open source software in general — is that it also frees those companies from the vendors. The rocket scientists at Google and Amazon can tweak the software and change it without seeking permission from Microsoft or Sun Microsystems or anyone else. Today, Google is both a brand name and a verb. But technological differentiation has been the bedrock of its competitive advantage. It is the better mousetrap in Internet search. As an example, Google undermines, rather than supports, Mr. Carr’s point.
His thesis often becomes the same kind of straitjacket of standardization that, as he says, packaged software is for companies. Mr. Carr approvingly cites studies showing a random relationship between total IT spending and corporate profits. But these merely demonstrate that aggregate technology spending is neither the only nor the crucial variable in determining corporate profitability.
That is hardly surprising. Again, it is how companies use the technology — integrating the tools with people and processes — that counts the most. Mr. Carr can be quite selective in citing the work of others. He points to research from Paul Strassmann, an industry consultant, that supports his case while gliding over the fact that Mr.
Strassmann was a prominent critic of Mr. Carr’s original HBR article.
Still, these can all be seen as quibbles. They do not necessarily shake the accuracy of Mr. Carr’s central point — that the period of sustainable advantage a company can derive from technology is diminishing. But is that really surprising?
Everything, it seems, moves faster than it did 10, 20, or 30 years ago, including technology. To say that the advantages technology gives a business are more fleeting than they once were is not to say those advantages aren’t worth pursuing. Dawn Lepore, vice chairman in charge of technology at Charles Schwab, estimates that a lead in new IT-based financial products lasts from one to 18 months. “You still get competitive advantage from IT, but there is no silver bullet,” she observes. Mr. Carr’s book is a thoughtful, if at times overstated, critique of faith-based investment in technology, and it makes a real contribution to the field of technology strategy.
Yet Mr. Carr understates the strategic importance of defense. The old adage in baseball is that defense and pitching win championships; in basketball it is defense and rebounding. In business, if you don’t make the defensive technology investments to keep up with the productivity and efficiency gains of your industry peers, you simply lose. The drift toward more standardized technology that Mr. Carr describes also points to a different kind of pursuit of strategic advantage. It may not be IT-based, but it is certainly dependent on technology.
This is what Irving Wladawsky-Berger, a strategy executive at IBM, calls the “post-technology era.” The technology still matters, but the steady advances in chips, storage, and software mean that the focus is less on the technology itself than on what people and companies can do with it. The trend is already evident in companies and in universities. The elite business schools and computer science programs are increasingly emphasizing multidisciplinary approaches, educating students to be fluent not only in the technology but also in how to apply it. In companies, the same is true. The value is not in the bits and bytes, but up a few levels in the minds of the skilled businesspeople using the tools. Large chunks of the technology may be commoditizing, but how you use it isn’t.
That is where competitive advantage resides.

Author Profile: Steve Lohr, who covers technology for the New York Times, is the author of a history of computer programming, Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts — The Programmers Who Created the Software Revolution (Basic Books, 2002).

Knowledge Review/Books in Brief by David K. Hurst

Strategy Maps: Converting Intangible Assets into Tangible Outcomes, by Robert S. Kaplan and David P. Norton (Harvard Business School Press, 2004), 324 pages, $35.00

Strategy Maps is the third in the “balanced scorecard” series of books by the originators of this now well-known concept of performance measurement.
Robert S. Kaplan, a professor at the Harvard Business School, and David P. Norton, a consultant, write for managers who are leading or implementing strategic change, and here they introduce and develop the “strategy map” as a tool to bridge the gap between strategy formulation and execution. This is a consultant’s casebook, based upon hundreds of examples from both private and public sectors, and is a much finer-grained representation of the process than were their previous books. The authors specify four generic strategic positions a firm can choose: operational excellence, customer intimacy, product leadership, and system lock-in (when the firm creates an industry standard that all must follow). In their previous book, The Strategy-Focused Organization: How Balanced Scorecard Companies Thrive in the New Business Environment (Harvard Business School Press, 2000), the authors went to some lengths to emphasize that although the description of strategy could be scientific, its formulation was an art.
The reader could easily conclude that strategy formulation is a onetime event at the top of the organization, whereas its implementation by those below will continue indefinitely. But in a world of disruptive, tectonic change, when the earth is shifting beneath our feet, where does that leave the managers and workers who must implement change — the users of the strategy maps?
The map metaphor, together with the complexity of the diagrams in the book, reminds readers that the scale of the map they choose is crucial to effective navigation: Too little detail leaves you lost, but fine-grained detail can leave you paralyzed. The people on the ground must have some creative latitude, because there will come a time when the features on a map are unrecognizable in the real world. At that point, everybody, regardless of his or her position in the organization, will need an artist’s intuitive sense of direction if the corporation is to navigate successfully.
Fools Rush In: Steve Case, Jerry Levin, and the Unmaking of AOL Time Warner, by Nina Munk (HarperBusiness, 2004), 68 pages, $26.95

Approximately 70 percent of business mergers fail, with the only beneficiaries being the investment bankers, lawyers, and other advisors on the deal, and — sometimes — the shareholders of the organization being sold. The reasons mergers fail are legion, but the most prevalent is the clash of corporate cultures.
Business journalist Nina Munk has written a compelling story of one of the largest such fiascos, the acquisition of Time Warner by America Online. In Fools Rush In, Ms. Munk documents the colossal collision of these companies, calling on extensive interviews with most of the personalities involved in the deal. The story is factual but reads like fiction, giving readers the dramatic sense that they are present at the scene.
The individual corporations were hardly homogeneous. Time Warner had been formed in 1990, when the patricians of Time Inc. had tried to transform their stodgy Wall Street image by acquiring high-flying Warner Communications, fending off a hostile takeover bid from Paramount Communications in the process. The resulting corporate culture was one of feuding fiefdoms.
Jerry Levin, who would later agree to sell Time Warner to AOL, emerged out of the chaos as an accomplished corporate infighter, and set about ensuring his own rise to the top. AOL’s Steve Case, on the other hand, was a serial entrepreneur, focused on a messianic vision of his young company: supplying dumbed-down Internet services to the masses. As AOL had grown, however, he had surrounded himself with managers who did not share his vision. Many were short-term operators, and some seem simply to have been hustlers who wanted to get rich quick.
As the Internet bubble continued to expand at the end of 1999, Steve Case, with exquisite timing, parlayed the grossly overvalued price of AOL stock into the acquisition of Time Warner. Although the stock price deflated somewhat before the deal was signed, AOL would end up with 55 percent of the joint company. The resulting combination of operations shows why a “merger of equals” is seldom, if ever, seen in such deals: For every job there are at least two candidates, and disputes are usually resolved on the basis of “who bought whom.” In the testosterone-fueled struggles that followed, as managers tried to meet the fantastical forecasts they had concocted for the joint operation, $200 billion in shareholder value was vaporized — vastly abetted, of course, by the collapse of the dot-com and telecom sectors of the economy. The only shareholders who came out ahead were those who sold their stock early.

The Future of Work: How the New Order of Business Will Shape Your Organization, Your Management Style, and Your Life, by Thomas W. Malone (Harvard Business School Press, 2004), 304 pages, $29.95

The Future of Work is an ambitious book whose title promises more than it delivers. Mr. Malone is a professor of management at MIT’s Sloan School of Management and an entrepreneur in the software industry. His expertise is in understanding and designing organizational processes using software concepts such as object-oriented programming, and in managing systems dependencies through the use of coordination theory. In this book, which seems to be pitched at business school students rather than practitioners, he lifts this specialized framework out of its narrow IT context and applies it to both societies and corporations.
The primary insight — an “amazing pattern,” in his view — is that over time we have moved from living in small, independent hunting bands to centralized kingdoms and that we are now moving back into decentralized democracies. This is hardly new; it was the central thesis of Alvin Toffler’s 1980 book, The Third Wave.
The framework seems to put far too much stress both on formal decision making as the central organizational dynamic and on the reduction in communication costs as the prime cause of this change. No evidence is shown, however, that the high cost of communication has ever been a constraint on decentralization.
In addition, the concepts of decentralization and centralization may be a good deal more complex than they seem. Organizations that appear highly centralized to people at the top often seem quite decentralized to those below, and vice versa. The central message is that managers must move from a philosophy of command-and-control to one of coordinate-and-cultivate. Whether they make this move is largely a matter of choice; there is no technical imperative to do so. Professor Malone is at his best when he is discussing the ways in which technology can facilitate such moves — for example, by setting up internal markets for manufacturing capacity.
But the suspicion lingers that the balance between centralized and decentralized management may in fact simply be a part of adaptation as corporations organize in one way to take advantage of one set of circumstances and then reorganize in another way when the contexts change.

IT Resources: Works mentioned in this review

Nicholas G. Carr, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage (Harvard Business School Press, 2004), 208 pages, $26.95.
Does IT Matter? An HBR Debate: letters to the Harvard Business Review.

“IT Doesn’t Matter”: responses, articles, and resources related to Nicholas Carr.

Steve Lohr, “Is the Technology Business Still a Growth Industry?” originally published in the New York Times.

Remarks by Bill Gates, chairman and chief software architect, Microsoft Corporation, CEO Summit 2003, Redmond, Wash.
IEEE Engineering Management Review

IT Doesn’t Matter, by Nicholas G. Carr
The Idea in Brief

To beat your competitors, are you devoting more than 50% of your capital expenditures to information technology? If so, you’re not alone.
Businesses worldwide pump $2 trillion a year into IT. But like many broadly adopted technologies—such as railways and electrical power—IT has become a commodity. Affordable and accessible to everyone, it no longer offers strategic value to anyone. Scarcity—not ubiquity—makes a business resource truly strategic. Companies gain an edge by having or doing something others can’t have or do. In IT’s earlier days, forward-looking firms trumped competitors through innovative deployment of IT; for example, Federal Express’s package-tracking system and American Airlines’ Sabre reservation system.
Now that IT is ubiquitous, however, we must focus on its risks more than its potential strategic advantages. Consider electricity. No company builds its strategy on its electrical usage—but even a brief lapse in supply can be devastating. Today, an IT disruption can prove equally paralyzing to your company’s ability to make products, deliver services, and satisfy customers.
But the greatest IT risk is overspending—putting your company at a cost disadvantage. Make IT management boring. Instead of aggressively seeking an edge through IT, manage IT’s costs and risks with a frugal hand and pragmatic eye—despite any renewed hype about its strategic value. Worrying about what might go wrong isn’t glamorous, but it’s smart business now.

The Idea in Practice

To avoid overinvesting in IT:

Spend Less. Rigorously evaluate expected returns from IT investments.
Separate essential investments from discretionary, unnecessary, or counterproductive ones. Explore simpler and cheaper alternatives, and eliminate waste. Example: Businesses buy 100 million+ PCs annually—yet most workers use PCs for simple applications that require a fraction of their computing power. Start imposing hard limits on upgrade costs—rather than buying new computers and applications every time suppliers roll out new features.
Negotiate contracts ensuring long-term usefulness of your PC investments. If vendors balk, explore cheaper solutions, including bare-bones network PCs. Also assess your data storage, which accounts for 50%+ of many companies’ IT expenditures—even though most saved data consists of employees’ e-mails and files that have little relevance to making products or serving customers.

Follow, Don’t Lead. Delay IT investments to significantly cut costs and decrease your risk of buying flawed or soon-to-be obsolete equipment or applications. Today, smart IT users hang back from the cutting edge, buying only after standards and best practices solidify.
They let more impatient rivals shoulder the high costs of experimentation. Then they sweep past them, paying less while getting more.

Focus on Risks, Not Opportunities. Many corporations are ceding control over their IT applications and networks to vendors and other third parties.
The consequences of moving from tightly controlled, proprietary systems to open, shared ones? More and more threats in the form of technical glitches, service outages, and security breaches. Focus IT resources on preparing for such disruptions—not deploying IT in radical new ways.

In 1968, a young Intel engineer named Ted Hoff found a way to put the circuits necessary for computer processing onto a tiny piece of silicon. His invention of the microprocessor spurred a series of technological breakthroughs—desktop computers, local and wide area networks, enterprise software, and the Internet—that have transformed the business world.
Today, no one would dispute that information technology has become the backbone of commerce. It underpins the operations of individual companies, ties together far-flung supply chains, and, increasingly, links businesses to the customers they serve.
Hardly a dollar or a euro changes hands anymore without the aid of computer systems. As IT’s power and presence have expanded, companies have come to view it as a resource ever more critical to their success, a fact clearly reflected in their spending habits. In 1965, according to a study by the U.S. Department of Commerce’s Bureau of Economic Analysis, less than 5% of the capital expenditures of American companies went to information technology. After the introduction of the personal computer in the early 1980s, that percentage rose to 15%.
By the early 1990s, it had reached more than 30%, and by the end of the decade it had hit nearly 50%. Even with the recent sluggishness in technology spending, businesses around the world continue to spend well over $2 trillion a year on IT. But the veneration of IT goes much deeper than dollars. It is evident as well in the shifting attitudes of top managers. Twenty years ago, most executives looked down on computers as proletarian tools—glorified typewriters and calculators—best relegated to low-level employees like secretaries, analysts, and technicians. It was the rare executive who would let his fingers touch a keyboard, much less incorporate information technology into his strategic thinking.
Today, that has changed completely. Chief executives now routinely talk about the strategic value of information technology, about how they can use IT to gain a competitive edge, about the “digitization” of their business models. Most have appointed chief information officers to their senior management teams, and many have hired strategy consulting firms to provide fresh ideas on how to leverage their IT investments for differentiation and advantage. Behind the change in thinking lies a simple assumption: that as IT’s potency and ubiquity have increased, so too has its strategic value. It’s a reasonable assumption, even an intuitive one. But it’s mistaken.
What makes a resource truly strategic—what gives it the capacity to be the basis for a sustained competitive advantage—is not ubiquity but scarcity. You only gain an edge over rivals by having or doing something that they can’t have or do. By now, the core functions of IT—data storage, data processing, and data transport—have become available and affordable to all. Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none. IT is best seen as the latest in a series of broadly adopted technologies that have reshaped industry over the past two centuries—from the steam engine and the railroad to the telegraph and the telephone to the electric generator and the internal combustion engine. For a brief period, as they were being built into the infrastructure of commerce, all these technologies opened opportunities for forward-looking companies to gain real advantages.
But as their availability increased and their cost decreased—as they became ubiquitous—they became commodity inputs. From a strategic standpoint, they became invisible; they no longer mattered. That is exactly what is happening to information technology today, and the implications for corporate IT management are profound.

Vanishing Advantage

Many commentators have drawn parallels between the expansion of IT, particularly the Internet, and the rollouts of earlier technologies. Most of the comparisons, though, have focused on either the investment pattern associated with the technologies—the boom-to-bust cycle—or the technologies’ roles in reshaping the operations of entire industries or even economies.
Little has been said about the way the technologies influence, or fail to influence, competition at the firm level. Yet it is here that history offers some of its most important lessons to managers.
A distinction needs to be made between proprietary technologies and what might be called infrastructural technologies. Proprietary technologies can be owned, actually or effectively, by a single company. A pharmaceutical firm, for example, may hold a patent on a particular compound that serves as the basis for a family of drugs. An industrial manufacturer may discover an innovative way to employ a process technology that competitors find hard to replicate.
A company that produces consumer goods may acquire exclusive rights to a new packaging material that gives its product a longer shelf life than competing brands. As long as they remain protected, proprietary technologies can be the foundations for long-term strategic advantages, enabling companies to reap higher profits than their rivals. Infrastructural technologies, in contrast, offer far more value when shared than when used in isolation. Imagine yourself in the early nineteenth century, and suppose that one manufacturing company held the rights to all the technology required to create a railroad.
If it wanted to, that company could just build proprietary lines between its suppliers, its factories, and its distributors and run its own locomotives and railcars on the tracks. And it might well operate more efficiently as a result.
But, for the broader economy, the value produced by such an arrangement would be trivial compared with the value that would be produced by building an open rail network connecting many companies and many buyers. The characteristics and economics of infrastructural technologies, whether railroads or telegraph lines or power generators, make it inevitable that they will be broadly shared—that they will become part of the general business infrastructure.
In the earliest phases of its buildout, however, an infrastructural technology can take the form of a proprietary technology. As long as access to the technology is restricted—through physical limitations, intellectual property rights, high costs, or a lack of standards—a company can use it to gain advantages over rivals. Consider the period between the construction of the first electric power stations, around 1880, and the wiring of the electric grid early in the twentieth century. Electricity remained a scarce resource during this time, and those manufacturers able to tap into it—by, for example, building their plants near generating stations—often gained an important edge. It was no coincidence that the largest U.S. manufacturer of nuts and bolts at the turn of the century, Plumb, Burdict, and Barnard, located its factory near Niagara Falls in New York, the site of one of the earliest large-scale hydroelectric power plants.
Companies can also steal a march on their competitors by having superior insight into the use of a new technology. The introduction of electric power again provides a good example.
Until the end of the nineteenth century, most manufacturers relied on water pressure or steam to operate their machinery. Power in those days came from a single, fixed source—a waterwheel at the side of a mill, for instance—and required an elaborate system of pulleys and gears to distribute it to individual workstations throughout the plant. When electric generators first became available, many manufacturers simply adopted them as a replacement single-point source, using them to power the existing system of pulleys and gears.
Smart manufacturers, however, saw that one of the great advantages of electric power is that it is easily distributable—that it can be brought directly to workstations. By wiring their plants and installing electric motors in their machines, they were able to dispense with the cumbersome, inflexible, and costly gearing systems, gaining an important efficiency advantage over their slower-moving competitors.
In addition to enabling new, more efficient operating methods, infrastructural technologies often lead to broader market changes. Here, too, a company that sees what’s coming can gain a step on myopic rivals. In the mid-1800s, when America started to lay down rail lines in earnest, it was already possible to transport goods over long distances—hundreds of steamships plied the country’s rivers. Businessmen probably assumed that rail transport would essentially follow the steamship model, with some incremental enhancements.
In fact, the greater speed, capacity, and reach of the railroads fundamentally changed the structure of American industry. It suddenly became economical to ship finished products, rather than just raw materials and industrial components, over great distances, and the mass consumer market came into being. Companies that were quick to recognize the broader opportunity rushed to build large-scale, mass-production factories. The resulting economies of scale allowed them to crush the small, local plants that until then had dominated manufacturing. The trap that executives often fall into, however, is assuming that opportunities for advantage will be available indefinitely. In actuality, the window for gaining advantage from an infrastructural technology is open only briefly. When the technology’s commercial potential begins to be broadly appreciated, huge amounts of cash are inevitably invested in it, and its buildout proceeds with extreme speed.
Railroad tracks, telegraph wires, power lines—all were laid or strung in a frenzy of activity (a frenzy so intense in the case of rail lines that it cost hundreds of laborers their lives). In the 30 years between 1846 and 1876, reports Eric Hobsbawm in The Age of Capital, the world’s total rail trackage increased from 17,424 kilometers to 309,641 kilometers. During this same period, total steamship tonnage also exploded, from 139,973 to 3,293,072 tons. The telegraph system spread even more swiftly. In Continental Europe, there were just 2,000 miles of telegraph wires in 1849; 20 years later, there were 110,000.
The pattern continued with electrical power. The number of central stations operated by utilities grew from 468 in 1889 to 4,364 in 1917, and the average capacity of each increased more than tenfold. (For a discussion of the dangers of overinvestment, see the sidebar “Too Much of a Good Thing.”)

Too Much of a Good Thing

As many experts have pointed out, the overinvestment in information technology in the 1990s echoes the overinvestment in railroads in the 1860s. In both cases, companies and individuals, dazzled by the seemingly unlimited commercial possibilities of the technologies, threw large quantities of money away on half-baked businesses and products. Even worse, the flood of capital led to enormous overcapacity, devastating entire industries. We can only hope that the analogy ends there.
The mid-nineteenth-century boom in railroads (and the closely related technologies of the steam engine and the telegraph) helped produce not only widespread industrial overcapacity but a surge in productivity. The combination set the stage for two solid decades of deflation. Although worldwide economic production continued to grow strongly between the mid-1870s and the mid-1890s, prices collapsed—in England, the dominant economic power of the time, price levels dropped 40%. In turn, business profits evaporated. Companies watched the value of their products erode while they were in the very process of making them.
As the first worldwide depression took hold, economic malaise covered much of the globe. “Optimism about a future of indefinite progress gave way to uncertainty and a sense of agony,” wrote historian D.S. Landes.
It’s a very different world today, of course, and it would be dangerous to assume that history will repeat itself. But with companies struggling to boost profits and the entire world economy flirting with deflation, it would also be dangerous to assume it can’t.
By the end of the buildout phase, the opportunities for individual advantage are largely gone. The rush to invest leads to more competition, greater capacity, and falling prices, making the technology broadly accessible and affordable. At the same time, the buildout forces users to adopt universal technical standards, rendering proprietary systems obsolete. Even the way the technology is used begins to become standardized, as best practices come to be widely understood and emulated. Often, in fact, the best practices end up being built into the infrastructure itself; after electrification, for example, all new factories were constructed with many well-distributed power outlets.
Both the technology and its modes of use become, in effect, commoditized. The only meaningful advantage most companies can hope to gain from an infrastructural technology after its buildout is a cost advantage—and even that tends to be very hard to sustain. That’s not to say that infrastructural technologies don’t continue to influence competition. They do, but their influence is felt at the macroeconomic level, not at the level of the individual company.
If a particular country, for instance, lags in installing the technology—whether it’s a national rail network, a power grid, or a communication infrastructure—its domestic industries will suffer heavily. Similarly, if an industry lags in harnessing the power of the technology, it will be vulnerable to displacement. As always, a company’s fate is tied to broader forces affecting its region and its industry. The point is, however, that the technology’s potential for differentiating one company from the pack—its strategic potential—inexorably declines as it becomes accessible and affordable to all.

The Commoditization of IT

Although more complex and malleable than its predecessors, IT has all the hallmarks of an infrastructural technology. In fact, its mix of characteristics guarantees particularly rapid commoditization. IT is, first of all, a transport mechanism—it carries digital information just as railroads carry goods and power grids carry electricity.
And like any transport mechanism, it is far more valuable when shared than when used in isolation. The history of IT in business has been a history of increased interconnectivity and interoperability, from mainframe time-sharing to minicomputer-based local area networks to broader Ethernet networks and on to the Internet. Each stage in that progression has involved greater standardization of the technology and, at least recently, greater homogenization of its functionality. For most business applications today, the benefits of customization would be overwhelmed by the costs of isolation. IT is also highly replicable.
Indeed, it is hard to imagine a more perfect commodity than a byte of data—endlessly and perfectly reproducible at virtually no cost. The near-infinite scalability of many IT functions, when combined with technical standardization, dooms most proprietary applications to economic obsolescence. Why write your own application for word processing or e-mail or, for that matter, supply-chain management when you can buy a ready-made, state-of-the-art application for a fraction of the cost?
But it’s not just the software that is replicable. Because most business activities and processes have come to be embedded in software, they become replicable, too. When companies buy a generic application, they buy a generic process as well. Both the cost savings and the interoperability benefits make the sacrifice of distinctiveness unavoidable.
The arrival of the Internet has accelerated the commoditization of IT by providing a perfect delivery channel for generic applications. More and more, companies will fulfill their IT requirements simply by purchasing fee-based “Web services” from third parties—similar to the way they currently buy electric power or telecommunications services.
Most of the major business-technology vendors, from Microsoft to IBM, are trying to position themselves as IT utilities, companies that will control the provision of a diverse range of business applications over what is now called, tellingly, “the grid.” Again, the upshot is ever greater homogenization of IT capabilities, as more companies replace customized applications with generic ones. (For more on the challenges facing IT companies, see the sidebar “What About the Vendors?”)

What About the Vendors?

Just a few months ago, at the 2003 World Economic Forum in Davos, Switzerland, Bill Joy, the chief scientist and cofounder of Sun Microsystems, posed what for him must have been a painful question: “What if the reality is that people have already bought most of the stuff they want to own?” The people he was talking about are, of course, businesspeople, and the stuff is information technology.
With the end of the great buildout of the commercial IT infrastructure apparently at hand, Joy’s question is one that all IT vendors should be asking themselves. There is good reason to believe that companies’ existing IT capabilities are largely sufficient for their needs and, hence, that the recent and widespread sluggishness in IT demand is as much a structural as a cyclical phenomenon. Even if that’s true, the picture may not be as bleak as it seems for vendors, at least those with the foresight and skill to adapt to the new environment. The importance of infrastructural technologies to the day-to-day operations of business means that they continue to absorb large amounts of corporate cash long after they have become commodities—indefinitely, in many cases. Virtually all companies today continue to spend heavily on electricity and phone service, for example, and many manufacturers continue to spend a lot on rail transport. Moreover, the standardized nature of infrastructural technologies often leads to the establishment of lucrative monopolies and oligopolies.
Many technology vendors are already repositioning themselves and their products in response to the changes in the market. Microsoft’s push to turn its Office software suite from a packaged good into an annual subscription service is a tacit acknowledgment that companies are losing their need—and their appetite—for constant upgrades. Dell has succeeded by exploiting the commoditization of the PC market and is now extending that strategy to servers, storage, and even services.
(Michael Dell’s essential genius has always been his unsentimental trust in the commoditization of information technology.) And many of the major suppliers of corporate IT, including Microsoft, IBM, Sun, and Oracle, are battling to position themselves as dominant suppliers of “Web services”—to turn themselves, in effect, into utilities. This war for scale, combined with the continuing transformation of IT into a commodity, will lead to the further consolidation of many sectors of the IT industry. The winners will do very well; the losers will be gone.

Finally, and for all the reasons already discussed, IT is subject to rapid price deflation. When Gordon Moore made his famously prescient assertion that the density of circuits on a computer chip would double every two years, he was making a prediction about the coming explosion in processing power. But he was also making a prediction about the coming free fall in the price of computer functionality. The cost of processing power has dropped relentlessly, from $480 per million instructions per second (MIPS) in 1978 to $50 per MIPS in 1985 to $4 per MIPS in 1995, a trend that continues unabated.
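Those three price points imply a strikingly steady rate of decline. A quick back-of-the-envelope check, sketched here in Python using only the figures quoted above, makes the point:

```python
# Back-of-the-envelope arithmetic on the MIPS prices cited in the article:
# what compound annual rate of price decline do the three figures imply?
prices = {1978: 480.0, 1985: 50.0, 1995: 4.0}  # dollars per MIPS

years = sorted(prices)
for start, end in zip(years, years[1:]):
    rate = (prices[end] / prices[start]) ** (1 / (end - start)) - 1
    print(f"{start}-{end}: {rate:.1%} per year")

# Prints roughly -28% and -22% per year, close to the pace implied by a
# halving of price every two years (about -29% per year), the flip side
# of the Moore's Law doubling of circuit density.
```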
Similar declines have occurred in the cost of data storage and transmission. The rapidly increasing affordability of IT functionality has not only democratized the computer revolution, it has destroyed one of the most important potential barriers to competitors. Even the most cutting-edge IT capabilities quickly become available to all. It’s no surprise, given these characteristics, that IT’s evolution has closely mirrored that of earlier infrastructural technologies. Its buildout has been every bit as breathtaking as that of the railroads (albeit with considerably fewer fatalities).
Consider some statistics. During the last quarter of the twentieth century, the computational power of a microprocessor increased by a factor of 66,000.
In the dozen years from 1989 to 2001, the number of host computers connected to the Internet grew from 80,000 to more than 125 million. Over the last ten years, the number of sites on the World Wide Web has grown from zero to nearly 40 million. And since the 1980s, more than 280 million miles of fiber-optic cable have been installed—enough, as BusinessWeek recently noted, to “circle the earth 11,320 times.” (See the exhibit “The Sprint to Commoditization.”)

As with earlier infrastructural technologies, IT provided forward-looking companies many opportunities for competitive advantage early in its buildout, when it could still be “owned” like a proprietary technology. A classic example is American Hospital Supply. A leading distributor of medical supplies, AHS introduced in 1976 an innovative system called Analytic Systems Automated Purchasing, or ASAP, that enabled hospitals to order goods electronically.
Developed in-house, the system used proprietary software running on a mainframe computer, and hospital purchasing agents accessed it through terminals at their sites. Because more efficient ordering enabled hospitals to reduce their inventories—and thus their costs—customers were quick to embrace the system. And because it was proprietary to AHS, it effectively locked out competitors. For several years, in fact, AHS was the only distributor offering electronic ordering, a competitive advantage that led to years of superior financial results. From 1978 to 1983, AHS’s sales and profits rose at annual rates of 13% and 18%, respectively—well above industry averages. AHS gained a true competitive advantage by capitalizing on characteristics of infrastructural technologies that are common in the early stages of their buildouts, in particular their high cost and lack of standardization.
Within a decade, however, those barriers to competition were crumbling. The arrival of personal computers and packaged software, together with the emergence of networking standards, was rendering proprietary communication systems unattractive to their users and uneconomical to their owners. Indeed, in an ironic, if predictable, twist, the closed nature and outdated technology of AHS’s system turned it from an asset to a liability. By the dawn of the 1990s, after AHS had merged with Baxter Travenol to form Baxter International, the company’s senior executives had come to view ASAP as “a millstone around their necks,” according to a Harvard Business School case study. Myriad other companies have gained important advantages through the innovative deployment of IT. Some, like American Airlines with its Sabre reservation system, Federal Express with its package-tracking system, and Mobil Oil with its automated Speedpass payment system, used IT to gain particular operating or marketing advantages—to leapfrog the competition in one process or activity.
Others, like Reuters with its 1970s financial information network or, more recently, eBay with its Internet auctions, had superior insight into the way IT would fundamentally change an industry and were able to stake out commanding positions. In a few cases, the dominance companies gained through IT innovation conferred additional advantages, such as scale economies and brand recognition, that have proved more durable than the original technological edge. Wal-Mart and Dell Computer are renowned examples of firms that have been able to turn temporary technological advantages into enduring positioning advantages.
But the opportunities for gaining IT-based advantages are already dwindling. Best practices are now quickly built into software or otherwise replicated. And as for IT-spurred industry transformations, most of the ones that are going to happen have likely already happened or are in the process of happening. Industries and markets will continue to evolve, of course, and some will undergo fundamental changes—the future of the music business, for example, continues to be in doubt.
But history shows that the power of an infrastructural technology to transform industries always diminishes as its buildout nears completion. While no one can say precisely when the buildout of an infrastructural technology has concluded, there are many signs that the IT buildout is much closer to its end than its beginning. First, IT’s power is outstripping most of the business needs it fulfills. Second, the price of essential IT functionality has dropped to the point where it is more or less affordable to all. Third, the capacity of the universal distribution network (the Internet) has caught up with demand—indeed, we already have considerably more fiber-optic capacity than we need.
Fourth, IT vendors are rushing to position themselves as commodity suppliers or even as utilities. Finally, and most definitively, the investment bubble has burst, which historically has been a clear indication that an infrastructural technology is reaching the end of its buildout. A few companies may still be able to wrest advantages from highly specialized applications that don’t offer strong economic incentives for replication, but those firms will be the exceptions that prove the rule. At the close of the 1990s, when Internet hype was at full boil, technologists offered grand visions of an emerging “digital future.” It may well be that, in terms of business strategy at least, the future has already arrived.

From Offense to Defense

So what should companies do? From a practical standpoint, the most important lesson to be learned from earlier infrastructural technologies may be this: When a resource becomes essential to competition but inconsequential to strategy, the risks it creates become more important than the advantages it provides. Think of electricity.
Today, no company builds its business strategy around its electricity usage, but even a brief lapse in supply can be devastating (as some California businesses discovered during the energy crisis of 2000). The operational risks associated with IT are many—technical glitches, obsolescence, service outages, unreliable vendors or partners, security breaches, even terrorism—and some have become magnified as companies have moved from tightly controlled, proprietary systems to open, shared ones. Today, an IT disruption can paralyze a company’s ability to make its products, deliver its services, and connect with its customers, not to mention foul its reputation. Yet few companies have done a thorough job of identifying and tempering their vulnerabilities. Worrying about what might go wrong may not be as glamorous a job as speculating about the future, but it is a more essential job right now.
(See the sidebar “New Rules for IT Management.”)

New Rules for IT Management

With the opportunities for gaining strategic advantage from information technology rapidly disappearing, many companies will want to take a hard look at how they invest in IT and manage their systems. As a starting point, here are three guidelines for the future:

Spend less. Studies show that the companies with the biggest IT investments rarely post the best financial results. As the commoditization of IT continues, the penalties for wasteful spending will only grow larger. It is getting much harder to achieve a competitive advantage through an IT investment, but it is getting much easier to put your business at a cost disadvantage.

Follow, don’t lead.
Moore’s Law guarantees that the longer you wait to make an IT purchase, the more you’ll get for your money. And waiting will decrease your risk of buying something technologically flawed or doomed to rapid obsolescence.
In some cases, being on the cutting edge makes sense. But those cases are becoming rarer and rarer as IT capabilities become more homogenized.

Focus on vulnerabilities, not opportunities. It’s unusual for a company to gain a competitive advantage through the distinctive use of a mature infrastructural technology, but even a brief disruption in the availability of the technology can be devastating. As corporations continue to cede control over their IT applications and networks to vendors and other third parties, the threats they face will proliferate. They need to prepare themselves for technical glitches, outages, and security breaches, shifting their attention from opportunities to vulnerabilities.

In the long run, though, the greatest IT risk facing most companies is more prosaic than a catastrophe.
It is, simply, overspending. IT may be a commodity, and its costs may fall rapidly enough to ensure that any new capabilities are quickly shared, but the very fact that it is entwined with so many business functions means that it will continue to consume a large portion of corporate spending.
For most companies, just staying in business will require big outlays for IT. What’s important—and this holds true for any commodity input—is to be able to separate essential investments from ones that are discretionary, unnecessary, or even counterproductive. At a high level, stronger cost management requires more rigor in evaluating expected returns from systems investments, more creativity in exploring simpler and cheaper alternatives, and a greater openness to outsourcing and other partnerships. But most companies can also reap significant savings by simply cutting out waste. Personal computers are a good example.
Every year, businesses purchase more than 100 million PCs, most of which replace older models. Yet the vast majority of workers who use PCs rely on only a few simple applications—word processing, spreadsheets, e-mail, and Web browsing. These applications have been technologically mature for years; they require only a fraction of the computing power provided by today’s microprocessors. Nevertheless, companies continue to roll out across-the-board hardware and software upgrades. Much of that spending, if truth be told, is driven by vendors’ strategies.
Big hardware and software suppliers have become very good at parceling out new features and capabilities in ways that force companies into buying new computers, applications, and networking equipment much more frequently than they need to. The time has come for IT buyers to throw their weight around, to negotiate contracts that ensure the long-term usefulness of their PC investments and impose hard limits on upgrade costs. And if vendors balk, companies should be willing to explore cheaper solutions, including open-source applications and bare-bones network PCs, even if it means sacrificing features. If a company needs evidence of the kind of money that might be saved, it need only look at Microsoft’s profit margin.

In addition to being passive in their purchasing, companies have been sloppy in their use of IT. That’s particularly true with data storage, which has come to account for more than half of many companies’ IT expenditures. The bulk of what’s being stored on corporate networks has little to do with making products or serving customers—it consists of employees’ saved e-mails and files, including terabytes of spam, MP3s, and video clips.
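Quantifying that sloppiness is not hard. The sketch below is illustrative only—the mount point and the list of “non-business” file types are hypothetical choices of ours, not figures or methods from the article—but it shows how an IT department might tally what media files are consuming on a shared drive.

    # Illustrative Python sketch: measure how much of a shared drive is
    # taken up by media files. SHARE_ROOT and MEDIA_EXTS are hypothetical.
    import os

    SHARE_ROOT = "/mnt/shared"                     # hypothetical file-server mount
    MEDIA_EXTS = {".mp3", ".avi", ".mpg", ".mov"}  # assumed non-business types

    media_bytes = total_bytes = 0
    for dirpath, _dirs, files in os.walk(SHARE_ROOT):
        for name in files:
            try:
                size = os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # unreadable file; skip it
            total_bytes += size
            if os.path.splitext(name)[1].lower() in MEDIA_EXTS:
                media_bytes += size

    if total_bytes:
        print(f"Media: {media_bytes / 1e9:.1f} GB of {total_bytes / 1e9:.1f} GB "
              f"({100 * media_bytes / total_bytes:.0f}%)")

A report like this, run periodically, gives managers the bottom-line number they need before imposing the storage restrictions discussed above.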
Computerworld estimates that as much as 70% of the storage capacity of a typical Windows network is wasted—an enormous unnecessary expense. Restricting employees’ ability to save files indiscriminately and indefinitely may seem distasteful to many managers, but it can have a real impact on the bottom line. Now that IT has become the dominant capital expense for most businesses, there’s no excuse for waste and sloppiness.

Given the rapid pace of technology’s advance, delaying IT investments can be another powerful way to cut costs—while also reducing a firm’s chance of being saddled with buggy or soon-to-be-obsolete technology. Many companies, particularly during the 1990s, rushed their IT investments either because they hoped to capture a first-mover advantage or because they feared being left behind. Except in very rare cases, both the hope and the fear were unwarranted. The smartest users of technology—here again, Dell and Wal-Mart stand out—stay well back from the cutting edge, waiting to make purchases until standards and best practices solidify.
They let their impatient competitors shoulder the high costs of experimentation, and then they sweep past them, spending less and getting more. Some managers may worry that being stingy with IT dollars will damage their competitive positions. But studies of corporate IT spending consistently show that greater expenditures rarely translate into superior financial results. In fact, the opposite is usually true. In 2002, the consulting firm Alinean compared the IT expenditures and the financial results of 7,500 large U.S. companies and discovered that the top performers tended to be among the most tightfisted. The 25 companies that delivered the highest economic returns, for example, spent on average just 0.8% of their revenues on IT, while the typical company spent 3.7%.
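To see what that percentage gap means in dollars, consider a hypothetical company with $10 billion in annual revenue. The revenue figure is ours, chosen purely for illustration; only the percentages come from the Alinean study.

    # The Alinean percentages applied to a hypothetical $10 billion-revenue firm.
    revenue = 10_000_000_000  # hypothetical annual revenue, in dollars

    top_performer_it = 0.008 * revenue  # 0.8% of revenue: top-25 average
    typical_it = 0.037 * revenue        # 3.7% of revenue: typical company

    print(f"Top performer: ${top_performer_it / 1e6:,.0f} million a year on IT")
    print(f"Typical firm:  ${typical_it / 1e6:,.0f} million a year on IT")
    print(f"Gap:           ${(typical_it - top_performer_it) / 1e6:,.0f} million")

On identical revenue, that is a gap of $290 million a year—money the tightfisted performers were free to spend elsewhere.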
A recent study by Forrester Research showed, similarly, that the most lavish spenders on IT rarely post the best results. Even Oracle’s Larry Ellison, one of the great technology salesmen, admitted in a recent interview that “most companies spend too much on IT and get very little in return.” As the opportunities for IT-based advantage continue to narrow, the penalties for overspending will only grow. IT management should, frankly, become boring. The key to success, for the vast majority of companies, is no longer to seek advantage aggressively but to manage costs and risks meticulously. If, like many executives, you’ve begun to take a more defensive posture toward IT in the last two years, spending more frugally and thinking more pragmatically, you’re already on the right course.
The challenge will be to maintain that discipline when the business cycle strengthens and the chorus of hype about IT’s strategic value rises anew.

“Information technology” is a fuzzy term. In this article, it is used in its common current sense, as denoting the technologies used for processing, storing, and transporting information in digital form.
ONE year ago, Nicholas Carr was just another 40-something editor at the Harvard Business Review—interested in information technology (IT), sometimes contributing as a writer, but otherwise as unknown to the outside world as such editors tend to be. Then, last May, he published a simple, jargon-free, eight-page article in the HBR, called “IT Doesn’t Matter”. “I figured I’d ruffle a few feathers for a week or two,” he recalls. What happened instead remains puzzling to this day, not least to Mr Carr himself. The entire trillion-dollar IT industry, it seemed, took offence and started to attack Mr Carr’s argument. Chief information officers (CIOs), the people in charge of computer systems at large companies, heard the noise and told their secretaries to dig out the article and put it on their desks. Analysts chimed in.
Rebuttals were rebutted. Suddenly, Mr Carr was the hottest number for anyone organising a techie conference.
Within months, he was expanding the original article into a book, “Does IT Matter?”, which is coming off the presses this month. Already its detractors and supporters are lining up for round two of the controversy. Part of Mr Carr’s trick, it seems, is simply choosing great titles. “IT Doesn’t Matter” is viscerally threatening to people such as Scott McNealy, the chief executive of Sun Microsystems, a maker of fancy and expensive computers.
Like other tech bosses, Mr McNealy basked in gee-whiz celebrity status during the dotcom bubble but has been spending the past three years defending Sun's relevance to suddenly sceptical customers and investors. A confident showman, he challenged Mr Carr to a debate, on stage and on webcast. It was a debacle. “Sun does matter,” Mr McNealy seemed to be arguing, or even “I still matter.” Even Mr Carr's critics in the audience wondered whether Mr McNealy had actually bothered to read the article. And this is the other explanation for Mr Carr's great impact. His argument is simple, powerful and yet also subtle.
He is not, in fact, denying that IT has the potential to transform entire societies and economies. On the contrary, his argument is based on the assumption that IT resembles the steam engine, the railway, the electricity grid, the telegraph, the telephone, the highway system and other technologies that proved revolutionary in the past. For commerce as a whole, Mr Carr is insistent, IT matters very much indeed. But this often has highly ironic implications for individual companies, thinks Mr Carr.
Electricity, for instance, became revolutionary for society only when it ceased to be a proprietary technology, owned or used by one or two factories here and there, and instead became an infrastructure—ubiquitous, and shared by all. Only in the early days, and only for the few firms that found proprietary uses for it, was electricity a source of strategic—ie, more or less lasting—advantage. Once it became available to all firms, however, it became a commodity, a factor of production just like office supplies or raw materials, a cost to be managed rather than an edge over rivals, a risk (during blackouts) rather than an opportunity. Computer hardware and software, Mr Carr argues, have been following the same progression from proprietary technology to infrastructure.
American Airlines, for example, gained a strategic advantage for a decade or two after it rolled out Sabre, a proprietary computerised reservation system, in 1962. In time, however, its rivals replicated the system, or even leap-frogged to better ones. Today, the edge that a computer system can give a firm is fleeting at best. IT, in other words, has now joined history’s other revolutionary technologies by becoming an infrastructure, not a differentiator. In that sense, and from the point of view of individual firms, “IT no longer matters.”

And what’s IT all about?

Surely, though, Mr Carr’s critics counter, IT is different from electricity or steam engines.
Even if hardware tends to become a commodity over time, software seems, like music or poetry, to have infinite potential for innovation and malleability. True, it may have, answers Mr Carr, but what matters is not whether a nifty programmer can still come up with new and cool code, but how quickly any such programme can be replicated by rival companies. Besides, today's reality in the software industry has nothing to do with poetry or music.
Many companies are furious about the bug-ridden, pricey and over-engineered systems that they bought during the bubble era and are doing their best to switch to simple, off-the-shelf software, offered in “enterprise-resource planning” packages and the like. If there is any customisation at all, it tends to be done by outside consultants, who are likely to share their favours with other clients. But surely Mr Carr does not appreciate the impressive pipeline of new technologies that is about to hit the market—the wireless gadgets, the billions of tiny radio-frequency identification tags that will turn aspirin bottles, shirt collars, refrigerator doors and almost everything else into smart machines, and so on?
Those are impressive indeed, says Mr Carr. But again, the issue is whether they will be proprietary technologies or open infrastructures. And everything points to the latter.
This is not a debate for the ivory tower. Since IT can no longer be a source of strategic advantage, Mr Carr urges CIOs to spend less on their data-centres, to opt for cheaper commodity equipment wherever possible, to follow their rivals rather than trying to outdo them with fancy new systems, and to focus more on IT's vulnerabilities, from viruses to data theft, than on its opportunities. As it happens, CIOs have been taking exactly this approach for the past three years, and their bosses like it that way. Perhaps all they needed was an erudite way to justify this new sobriety. “It seemed like there was this role open and I just wandered into it,” says Mr Carr.