McKinsey Quarterly

Are you ready for the resource revolution?


Most cars spend more than 95 percent of their time sitting in garages or parking lots. Even when cars are in use, average occupancy is well below two people, though most have five seats. Roads are likewise extremely inefficient. Freeways can operate at peak throughput (around 2,000 cars a lane per hour) only when they are less than 10 percent covered by cars. Add more, and congestion lowers speeds and reduces throughput. Most roads reach anything like peak usage only once a day and typically in only one direction (exhibit).
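
To see what those figures imply, consider the back-of-the-envelope calculation sketched below in Python. It combines the idle-time figure with an assumed average occupancy of 1.5 people (the text above says only that occupancy is well below two); the result is that roughly 1.5 percent of total passenger-seat capacity is ever in use.

    # Back-of-the-envelope estimate of effective passenger-seat utilization,
    # using the figures above and an assumed average occupancy of 1.5 people.
    HOURS_IN_USE_SHARE = 0.05   # cars are driven roughly 5% of the time
    ASSUMED_OCCUPANCY = 1.5     # illustrative; "well below two people"
    SEATS = 5

    seat_utilization = ASSUMED_OCCUPANCY / SEATS                # 30%
    effective_utilization = HOURS_IN_USE_SHARE * seat_utilization

    print(f"Seat utilization while driving: {seat_utilization:.0%}")
    print(f"Effective passenger-seat utilization: {effective_utilization:.1%}")
    # => roughly 1.5% of total passenger-seat capacity is ever used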

The story is similar for utilities. Just 20 to 40 percent of the transmission and distribution capacity in the United States is in use at a given time, and only about 40 percent of the capacity of power plants. The heat-rate efficiency of the average coal-fired power plant has not significantly improved in more than 50 years—an extreme version of conditions in many industries over the past century. Automotive fuel-efficiency improvement, for example, has consistently lagged behind economy-wide productivity growth.

Stuck in neutral: cars are notoriously underutilized and inefficient.

Underutilization and chronic inefficiency cannot be solved by financial engineering or offshoring labor. Something more fundamental is required. We see such challenges as emblematic of an unprecedented opportunity to produce and use resources far more imaginatively and efficiently, revolutionizing business and management in the process. Indeed, rather than facing a crisis of resource scarcity, the world economy will be revitalized by an array of business opportunities that will create trillions of dollars in profits.

To put this new era in context, think back to Adam Smith’s The Wealth of Nations (1776), which identified three primary business inputs: labor, capital, and land (defined broadly as any resource that can be produced or mined from land or disposed of as waste on it). The two industrial revolutions the world has thus far seen focused primarily on labor and capital. The first gave us factories and limited-liability corporations to drive growth at scale. The second, from the late 1800s to the early 1900s, added petroleum, the electric grid, the assembly line, cars, and skyscrapers with elevators and air-conditioning, and it created scientific management, thus enabling corporate globalization. But neither revolution focused on Smith’s third input: land and natural resources.

Our argument is relatively simple:

  • Combining information technology, nanoscale-materials science, and biology with industrial technology yields substantial productivity increases.
  • Achieving high-productivity economic growth in the developing world to support the 2.5 billion new members of the middle class presents the largest wealth-creation opportunity in a century.
  • Capturing these opportunities will require new management approaches.

Rather than settling for historic resource-productivity improvement rates of one to two percentage points a year, leaders must deliver productivity gains of 50 percent or so every few years.
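
The gap between those two trajectories compounds quickly. The sketch below assumes, purely for illustration, a three-year cadence for the 50 percent gains ("every few years" is not more specific than that) and compares it with 2 percent annual improvement over the same nine-year horizon.

    # Illustrative comparison of the two productivity trajectories described above.
    years = 9
    incremental = 1.02 ** years          # ~2% improvement a year
    step_change = 1.50 ** (years // 3)   # ~50% gain every three years (assumed cadence)

    print(f"After {years} years at 2% a year:        +{incremental - 1:.0%}")   # ~ +20%
    print(f"After {years} years at 50% every 3 years: +{step_change - 1:.0%}")  # ~ +238%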

The outlines of this next industrial revolution are starting to come into sharper focus: resource productivity is the right area of emphasis, and the opportunities for companies are extraordinary. In this article, we’ll explore the business approaches most likely to unlock the potential and then highlight ways senior managers can integrate tomorrow’s new technologies, customers, and ways of working with the realities of today’s legacy business environment.

Winning the revolution

We believe the businesses that capitalize most successfully on the resource revolution will employ five distinct approaches, either individually or in some combination. We explore all five of them in our new book, Resource Revolution, but focus here on three: substitution (the replacing of costly, clunky, or scarce materials with less scarce, cheaper, and higher-performing ones); optimization (embedding software in resource-intensive industries to improve, dramatically, how companies produce and use scarce resources); and virtualization (moving processes out of the physical world). The remaining two are circularity (finding value in products after their initial use)1 and waste elimination (greater efficiency, achieved by means including the redesign of products and services). For more on the waste-elimination approach, see “Bringing lean thinking to energy.”

Businesses that have harnessed these five models include Tesla Motors, Uber, and Zipcar (now owned by Avis) in transportation; C3 Energy, Opower, and SolarCity in power; Hampton Creek Foods and Kaiima in agriculture; and Cree, DIRTT, and Nest Labs in buildings. As we show in our book, these companies have the potential to upend traditional competitors and create previously unimagined business models. For examples of what this might look like at scale, download “Twelve companies of tomorrow” (PDF–522KB).

Substitution

The guiding principle for substitution is to consider every resource a company uses in its core products and every resource customers use or consume and then to look for higher-performing and less expensive, less risky, or less scarce materials that might work as substitutes. But don’t think of the new resources as replacements for the current bill of materials. Look instead at how substitution might deliver superior overall performance, much as electric motors are more efficient and provide better safety and acceleration than traditional internal-combustion ones. Carbon fiber, for instance, not only saves weight but allows companies to build quieter, better-performing, more efficient, more comfortable, and more beautiful cars (Tesla) or airplanes (Boeing’s Dreamliner).

These opportunities are extraordinary because many new materials have begun to reshape industrial and consumer products. A much richer understanding of materials science at the nanoscale level, combined with advanced computer-processing power, has catalyzed a broad revolution in surface properties, absorption characteristics, and optical and electrical properties.

For example, activated carbon, typically made of nanoparticles with custom-engineered pore sizes, is dramatically improving the efficiency of water filters, electrodes in batteries, and potentially even power-plant exhaust scrubbers. For the first time since the development of leaded crystal, centuries ago, glass is being reinvented—from high-bandwidth optical-networking fiber to Corning’s Gorilla Glass, which allows touch screens to capture the imagination in portable devices and, soon, on larger interactive screens. A company called View is even creating “dynamic glass,” which changes its visible- and infrared-light transmission characteristics so that windows can be programmed to block the sun on hot days but to capture sunlight in the depths of winter. That would reduce the need for heating and air-conditioning in Mediterranean climates, where cool nights mix with hot days.

Substitution extends even to food production. Hampton Creek Foods, for instance, has developed a plant-based egg substitute for baked and processed foods. Called Beyond Eggs, it uses peas, sorghum, beans, and other plants to make a product that tastes like eggs and has the same nutritional properties. The company says its process is already nearly 20 percent less expensive than the production of eggs, and costs will fall as scale increases. Hampton Creek also says its product will suffer less from drought. At the moment, about 70 percent of an egg’s cost comes from corn, a crop susceptible to drought and increasingly linked to the price of oil, while Hampton Creek uses hardier crops and therefore does not compete with biofuels (or risk salmonella infections). So, Hampton Creek’s egg substitute may cut costs and risks for major food producers.

Spotting substitution opportunities takes hard work. Apple and GE have gone through the periodic table element by element, assessing which ones pose the biggest risks for supply, costs, and regulation. These companies have developed substitution opportunities for each risky element. Similarly, we recently completed a review for a major oil company, looking at the resource risk in its supply chain, and found that the lack of available water would probably cut its growth sharply below expectations over the next decade. Looking a decade ahead gives companies a time advantage over competitors in responding to potential constraints.

Optimization

Another way for companies to boost the productivity of existing resources is to optimize their use—for instance, by integrating software into traditional industrial equipment or providing heavy equipment as a service, something most businesses can do at every level of activity.

GE, for example, outfits its jet engines with advanced software and sensors that yield important real-time maintenance data midflight. As a result, planes can radio ahead their spare-part and servicing requirements before they land. GE often prices its maintenance per hour of flight, so anticipating and streamlining maintenance activities is critical to business profitability.

Komatsu, the industrial-equipment manufacturer, goes even further, optimizing the use of its equipment essentially by creating a market that lets customers rent to and from each other. Need a $300,000 earth mover for just a few days? Komatsu will help find one that would otherwise be sitting idle. Have unused equipment? Komatsu will help find a company to rent it.

Some methods of optimization are surprisingly straightforward. UPS reduced fuel consumption and improved safety and speed by rerouting its trucks to avoid left turns. We helped a large utility shave 30 percent off its meter-reading costs just by restructuring service routes to reflect new traffic conditions and customer-use patterns. And the US Air Force is optimizing fuel consumption by having some of its planes fly in convoys. The new patterns, which copy the way geese “vortex surf” in V-formation, save up to 20 percent on fuel—a huge amount for one of the world’s largest fuel consumers. Implementing the new configuration was not expensive. Maintaining the precise separations between planes required nothing more than changing a few lines of code in the autopilot. Pilots also needed some training not to override it manually.

As companies consider which opportunities have the most potential, the guiding principles should be these: What expensive assets could be integrated with software and sensors? Which pieces of equipment are used only for a small portion of the time? What energy-intensive equipment is active without performing a function? This could be construction equipment, shipping containers that go back empty, or simply planes circling airports waiting for congestion to clear. All lend themselves to IT solutions that optimize routing, timing, loading, or sharing.
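
A first screen for such opportunities can be as simple as measuring how often each asset is actually in use. The sketch below uses hypothetical usage logs and an illustrative 20 percent utilization threshold to flag equipment that might be candidates for sharing, rental, or rescheduling.

    from datetime import datetime, timedelta

    # Hypothetical usage log: (asset, start, end) intervals over one week.
    usage = [
        ("excavator-07", datetime(2014, 3, 3, 8), datetime(2014, 3, 3, 12)),
        ("excavator-07", datetime(2014, 3, 5, 9), datetime(2014, 3, 5, 15)),
        ("crane-02",     datetime(2014, 3, 4, 7), datetime(2014, 3, 4, 19)),
    ]
    window_hours = timedelta(days=7).total_seconds() / 3600

    hours_used = {}
    for asset, start, end in usage:
        hours_used[asset] = hours_used.get(asset, 0) + (end - start).total_seconds() / 3600

    for asset, hours in hours_used.items():
        utilization = hours / window_hours
        flag = "candidate for sharing or rental" if utilization < 0.2 else "ok"
        print(f"{asset}: {utilization:.0%} utilized -> {flag}")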

Virtualization

As a thought experiment, create a list of physical objects or products that you no longer own or use, even though they were an everyday part of your life just five or ten years ago. For many people, that list might well include traditional calculators, paper calendars, cameras, alarm clocks, or photo albums. All of these have been rendered virtual by smartphone technology.

Virtualization means moving activities out of the physical world, or simply not doing them at all because they have been automated; both possibilities challenge business models. Companies struggle to embrace virtualization because they don’t want people to stop doing things that generate revenue, which always seems to drop more than costs do when activities move into the virtual realm. Look at newspapers, which get from a digital ad just 16 percent of the revenues they got from a comparable print ad.

Likewise, car companies don’t want people to drive less, but that’s what’s happening in developed countries. Miles driven per capita peaked in 2004 in the United States and have declined steadily since. The reasons aren’t entirely clear yet: the decline started before the recent recession and has continued even as the economy rebounded. Higher gas prices are surely a factor, but probably more important is the fact that many people are doing things virtually that they used to do by hopping into cars. For example, the recent holiday shopping season demonstrated how much Americans now rely on online purchases. Even US teenagers have shown a declining interest in driving, according to statistics on the age when Americans get their first license (the ability to connect via social media being a possible reason). Skype and other video-chat applications further reduce the need to drive somewhere to see someone. Work, too, is becoming more virtual as people increasingly use online media and virtual private networks to connect productively without needing an office. Virtualization will happen whether companies want it or not, so they need to prepare themselves.

Nest Labs, a start-up purchased by Google, has already shown what’s possible. The company took a traditional, boring, analog piece of equipment—the thermostat—and turned it into a digital platform that provides dynamic energy and security services (and could one day deliver entertainment, health care, security, and communication services to homes). Several years ago, it would have been hard to imagine ordinary alarm clocks going virtual.

What’s next? Could everyday items like eyeglasses, keys, money, and wallets soon disappear in the same way? Do cars and trucks need drivers? Should drones deliver packages? Can IBM’s Watson and other expert systems provide better and safer maintenance advice in industrial settings?

The integration challenge

Making the most of any of these models represents a huge change to the way companies operate, organize, and behave. The influence of big technological changes, among them the rise of big data and the Internet of Things,2 guarantees that for most companies, the biggest initial challenge will be systems integration: embedding software in traditional industrial equipment. Building and running these systems represents one of the biggest managerial challenges of the 21st century.

Going far beyond the current networks of phones, roads, and the like, the most complicated and powerful network yet is now being built. In it, devices embedded in power lines, household appliances, industrial equipment, and vehicles will increasingly talk to one another without the need for any human involvement. For example, by the end of the decade, cars will communicate directly with each other about speed, direction, and road conditions.

The reach of these integration capabilities will go far beyond infrastructure and manufacturing. Today, for example, clinicians diagnose depression through a lengthy assessment. But simply matching call patterns and GPS signals on a phone to determine whether someone has become a hermit is a more accurate diagnostic approach, not to mention a better early-warning signal.3 To make the most of such opportunities, health-care companies must figure out how to integrate systems far beyond the hospital.
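
As an illustration of the kind of signal involved, the sketch below computes two crude indicators from hypothetical smartphone data: how far a person ranges from their usual locations and how many outgoing calls they place. The coordinates, thresholds, and simple distance formula are all assumptions for illustration, not a clinical method described above.

    import math

    # Hypothetical week of smartphone data for one user: GPS fixes (lat, lon)
    # and a count of outgoing calls. A shrinking movement radius plus falling
    # call volume is the kind of withdrawal signal the passage describes.
    gps_fixes = [(47.6062, -122.3321), (47.6097, -122.3331), (47.6070, -122.3318)]
    outgoing_calls = 2

    def km(a, b):
        # Rough planar distance in kilometers; adequate at city scale.
        dlat = (a[0] - b[0]) * 111.0
        dlon = (a[1] - b[1]) * 111.0 * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    centroid = (sum(p[0] for p in gps_fixes) / len(gps_fixes),
                sum(p[1] for p in gps_fixes) / len(gps_fixes))
    radius_km = max(km(p, centroid) for p in gps_fixes)

    # Illustrative thresholds only.
    if radius_km < 1.0 and outgoing_calls < 5:
        print(f"Possible withdrawal signal: radius {radius_km:.2f} km, {outgoing_calls} outgoing calls")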

Systems integration has been a discipline for a long time, but, frankly, most companies aren’t very good at it. This is especially true in resource-intensive areas where technologies have been in place for decades or longer (the electric transformer outside your house, for example, was invented in the 1880s). One reason is that the problems are intrinsically hard, often involving billions more data permutations and combinations. Systems integration is more like trying to manage an ever-evolving ecosystem than solving the sort of finance problem one encounters in business school.

Despite the challenges, companies can do three things to increase the odds of success greatly: create simple software building blocks, expand frontline analytical talent, and apply computational-modeling techniques whenever possible—then test, test, test to learn and refine.

Recognize the scope

Simply realizing that systems are subtle and that lots of variables are interacting simultaneously will give any company a head start. Starting with a few simple software building blocks lays the foundation for success. The case of US power distribution is instructive.

The build-out of the US electric grid has been called the 20th century’s greatest engineering achievement, but the grid’s basic technology has changed little since the time of Edison and Westinghouse. The average circuit is 40 years old, and some have been around for more than a century. The grid is showing its age.

This translates into declining reliability and increasing costs and risks for utilities and their customers. The average utility generally learns about problems with its power lines when customers call in to complain rather than by receiving information on the problems directly. Issues at substations often have to be addressed by sending maintenance workers into the field to flip a switch, not by having someone in a central control room make the change—or, better yet, having the grid sense the problem and either fix it automatically or route electricity around it.

Utilities have to overcome their own inefficiencies and adapt to the rapidly shifting contemporary environment. Homeowners, for instance, are putting solar panels on their roofs, depriving utilities of many of their most profitable customers. Utilities will now have to figure out how to integrate into the grid the power these homes sometimes make available.

Once electric vehicles are deployed in large numbers, utilities will have to get used to the power equivalent of a commercial building unplugging, moving, and plugging back in somewhere else. Utilities must develop capabilities for integrating—in real time—not only what they are doing but also what all the related interconnected players are doing.

The era of big data will also have a huge effect. At the moment, the average utility collects about 60 million data points each year—five million customers and a dozen monthly bills. When smart meters, distributed generation, and electric vehicles come into widespread use, the average utility may have to handle five billion data points each day. The grid will almost need to be redesigned from scratch to get the full benefit of the new types of solid-state transformers, as well as the ability to sense problems and solve them automatically and, essentially, to have little power plants on millions of rooftops as solar prices keep coming down.
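
The arithmetic behind those figures is straightforward, as the sketch below shows. Today’s volume is simply five million customers times a dozen bills; the future volume assumes, for illustration, a reading every 15 minutes with roughly ten telemetry channels per reading (consumption, voltage, solar export, electric-vehicle charging, and so on), which lands in the range of five billion points a day.

    # Rough scale comparison for the data volumes described above.
    customers = 5_000_000

    # Today: one reading per customer per monthly bill.
    today_per_year = customers * 12                  # 60 million points a year

    # Tomorrow (assumptions): smart meters reporting every 15 minutes,
    # with about ten telemetry channels per reading.
    readings_per_day = 24 * 4                        # 96 readings a day
    channels = 10                                    # assumed for illustration
    tomorrow_per_day = customers * readings_per_day * channels

    print(f"Today:    {today_per_year:,} data points per year")
    print(f"Tomorrow: {tomorrow_per_day:,} data points per day")  # ~4.8 billion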

Expand frontline analytical capabilities

Mastering the building blocks of the resource revolution will also require intelligent organizational design and excellent talent management. In some cases, the specialized knowledge and know-how won’t be at hand, because companies are dealing with new problems, so managers will need to find whatever expertise is available. Software skills, specialized engineering, nanotechnology, and ultralow-cost manufacturing are just four of the many areas where talent will be scarce. In some instances, it will make sense for companies to form partnerships with businesses in other industries to gain access to specialized expertise.

In other cases, companies will have to develop new management skills from scratch. Some of the need will occur at the top of organizations, among leaders. The leadership skills required to deliver 10 to 15 percent annual productivity gains for a decade or more are a far cry from the incremental-improvement skills that marked the generation of leaders after World War II. Business-model innovation will no longer be just for start-ups or technology companies.

Frontline workers too will have to learn how to use massive amounts of analytical data to perform heavy industrial tasks. These frontline workers will need to be educated, whether by schools, the government, or employers, to undertake this technical work. For example, resource productivity requires frontline gas-leak detection teams to make sophisticated decisions based on big data and advanced analytics, leveraging technology to find and fix leaks rather than just walking the block with the technological equivalent of a divining rod. Many traditional frontline workers need a knowledge worker’s skills, such as the ability to analyze data, evaluate statistics, identify the root causes of problems, set parameters on machines, update algorithms, and collaborate globally.

The good news is that while the search for new organizational models and new talent in new places will be extraordinarily taxing, just about all of the competition will face the same problems. The sooner management starts confronting a company’s gaps, the sooner it is likely to close them and gain an edge on competitors that don’t.

Model, then test

Because systems are so complex, the only way to know for sure whether a process works is to test it. But, these days, a company can do an awful lot of that testing through computer models. For instance, the US national labs—notably Lawrence Livermore, Los Alamos, and Sandia—have maintained the nation’s nuclear capabilities without testing live warheads for decades, by using advanced computational methods. Now companies can deploy these same techniques to accelerate product development. One defense contractor used computer modeling to test thousands of potential new materials at the atomic level to find a few superlight, high-performance, and very reliable composites for next-generation jet engines. The best manufacturers of batteries can test their performance for thousands of hours, across an extremely broad range of operating conditions, in the Argonne National Laboratory battery-testing facility outside Chicago, dramatically accelerating product innovation.

For example, when ATMI, a materials-technology company, went looking for a better way to extract gold from electronic waste than traditional smelting methods or baths of toxic acids, it resorted to computational modeling of combinatorial chemistries. The resulting eVOLV process uses a water-based solution that’s safe to drink and is dramatically cheaper than the traditional methods. Moreover, the process allows the collected computer chips to be reused, since they are never exposed to high temperatures or acids (the toxic solder is collected as a by-product). The equipment can even be placed on a truck for processing e-waste at collection sites. This is what we mean when we say a resource revolution will open up solutions that are not only cheaper and more efficient but also better.


The resource revolution represents the biggest business opportunity in a century. However, success requires new approaches to management. Companies that try to stick to the old “2 percent solution” (just improve performance by 2 percent annually and you will be fine) are going to become obsolete quickly. Businesses that can deliver dramatic resource-productivity improvements at scale will become the great companies of the 21st century.
