Introductory Overview to Our Four-Part Series
As the world recovers from Covid-19, business leaders more than ever need strategies that profitably deliver superior value to customers. Few companies better illustrate the power of applying––and the downside of neglecting––these principles of value delivery than General Electric, across its long history.
Jack Welch, GE’s legendary CEO––1981-2001––famously urged businesses to “Control your own destiny.” By 2001, GE had become the world’s most valued and admired company, apparently fulfilling Welch’s vision of a destiny controlled. However, in retrospect, the company’s greater and more lasting achievements came in its earlier, partially forgotten first century––1878-1980. Using principles of value-delivery strategy, GE built a corporate giant on a base of great product-innovation.
Then, however, despite its much-celebrated apparent triumph, the Welch strategy and its continuation under successor Jeff Immelt largely abandoned those strategic principles. Instead, the company chased self-deceiving illusions, ultimately losing control of its destiny and declining precipitously––though it has stabilized now. The four-part Value-Delivery-Blog series introduced here aims to identify the still-relevant key principles of GE’s historic strategy that, in its first century, made it so important––and whose later neglect and decay eventually led to its major decline and missed opportunities.
Note from the author: special thanks to colleagues Helmut Meixner and Jim Tyrone for tremendously valuable help on this series.
Part One: Customer-Focused, Science-Based, and Synergistic Product-Innovation (1878-1980)
GE built a corporate giant in its first century via major product-innovations that profitably delivered real value to the world. These innovations followed three principles of value-delivery strategy, discussed in Part One––they were customer-focused, science-based, and where possible, powerfully synergistic among GE businesses.
Following the first, most fundamental of these three principles, GE innovations in that first century were customer-focused––integrated around profitably delivering superior value to customers. Thus, GE businesses delivered winning value propositions––superior combinations of customer-experiences, including costs. We term a business managed in this way a “value-delivery-system,” an apt description of most GE businesses of that era.
Part One discusses two key examples of these GE value-delivery-systems successfully created and expanded in that century after 1878. First is the company’s electric lighting-and-power value-delivery-system. It included incandescent-lamps––light bulbs––and the supply of electricity to operate the lighting. Edison and team studied indoor-lighting users’ experiences––with gas-lighting––and designed ways to improve those experiences.
Thus, users were offered a superior value proposition––proposing that they replace their gas-lighting system with Edison’s electric one. As discussed in Part One, users in return got a superior set of experiences, including greater safety, more comfortable breathing––the air no longer “fouled” by gas––with higher-quality light, shorter-but-adequate lamp-life, and equally convenient operation, all at lower total cost than gas lighting. Profitably delivering this value proposition was enabled by key elements of lamp-design, more efficient electricity generation-and-distribution, and other important inventions.
This brilliant, integrated system was GE’s founding and most important innovation. Aside from the lamps, consider just electric-power. Discussing the world’s infrequent, truly-breakthrough advances in technology, historian of science Vaclav Smil writes:
And there has been no more fundamental, epoch-making modern innovation than the large-scale commercial generation, transmission, distribution, and conversion of electricity.… the electric system remains Edison’s grandest achievement: an affordable and reliably available supply of electricity has opened doors to everything electrical, to all great second-order innovations ranging from gradually more efficient lighting to fast trains, from medical diagnostic devices to refrigerators, from massive electrochemical industries to tiny computers governed by microchips.
Then, in the first fifteen years of the twentieth century, the company expanded this electric lighting-and-power value-delivery-system. Its new power-innovations made electricity more widely available by helping enable long-distance transmission. Its new lighting innovations produced major advances in lamp efficiency.
A second key GE value-delivery-system discussed in Part One is medical-imaging. Starting with advances to the X-ray tube, GE much later developed the first practical magnetic-resonance-imaging (MRI) machine. These innovations delivered a superior value proposition––first making diagnoses more accurate and reliable, and later, via MRI, safer.
In the great product-innovations of its first century, GE’s first strategic principle was to be customer-focused. As a result, it repeatedly created winning value-delivery-systems––integrating activity around profitable delivery of superior value propositions.
Its second strategic principle was taking a science-based approach to its product-innovation efforts. In that era, GE’s major product-innovations were typically led by highly knowledgeable physicists, chemists, or other scientists, making extensive use of mathematics and data. The company led all of industry in establishing a formal laboratory devoted to scientific research in the context of developing practical, valuable products. This science-based approach increased the likelihood that GE’s product-innovations would result in truly different, meaningfully superior customer experiences. It also helped make delivery of these experiences profitable and proprietary (e.g., patent-protected) for the company.
Rigorous application of scientific theory and data enabled many GE innovations. Edison’s lighting-and-power project developed its key insight––concerning electrical-resistance––by using the early science of electricity. Other examples of the importance of GE’s science-based approach included the work enabling alternating current (AC), improvements to the X-ray tube, use of tungsten-knowledge to improve lamps, and later the extensive physics and engineering work to make MRI a practical reality. Notably, it also transformed GE Plastics––from providing secondary support for the electrical businesses, to a major business––which we also discuss in Part One.
GE also applied a third key principle––where possible, it identified or built, then reinforced, powerful synergies among its businesses. These synergies were shared strengths––technologies and customer-insights reapplied from one GE business to another––that enabled new GE innovations, further profitably delivering superior value.
In its first century, GE had great success developing synergies related to its electrical and electromagnetic roots. All its major businesses––except plastics––shared these electrical roots. They thus synergistically shared valuable knowledge and insights that were customer-related and science-based. The company’s founding innovation––the Edison lighting-and-power value-delivery-system––was inherently synergistic. Power enabled lighting; lighting justified and promoted power. GE later emulated this synergy by developing its household-appliance business. As discussed in Part One, perhaps the company’s greatest example of synergies between seemingly unrelated businesses was the sharing of turbine technology, first developed for power, then reapplied to aviation.
GE would also fail to realize some synergistic opportunities tied to its electrical roots. Most surprisingly, the company gave up easily on the crucially important markets for semiconductors and computers, while embracing plastics, its first major non-electrical business. This subtle drifting away from its electrical roots foreshadowed GE’s later, ultimately problematic focus on non-electrical businesses––especially the financial kind. Still, GE’s first century was a tour de force of value-delivery-driven product innovation. After 1981, Welch and team should have better understood and reinvigorated those past strengths.
Part Two: Unsustainable Triumph of the Welch Strategy (1981-2010)
For over 20 years after 1981, GE triumphed with Jack Welch’s radical new strategy that included a significant reliance on financial services, a strategy continued by Welch successor Jeff Immelt until about 2010. As discussed in Part Two, this complex strategy ingeniously melded some strengths of GE’s huge and reliably profitable traditional businesses with its aggressive, new financial ones. It enabled the latter to benefit uniquely from GE’s low corporate cost-of-borrowing––due to its huge, profitable, and exceptionally credit-worthy traditional businesses.
This hybrid strategy was largely not available to financial competitors as they lacked GE’s huge traditional product businesses. Thus, with this crucial cost advantage, the new GE financial businesses could profitably deliver a clearly superior value proposition to financial customers. Therefore, these financial businesses grew dramatically and profitably in the Welch era, leading GE’s overall exceptional profitable growth, making GE the world’s most valued and admired corporation by 2001. Yet, there were longer-term flaws hidden in this triumphant strategy.
* * *
Through our value-delivery lens we see many businesses’ strategies as what we term “internally-driven.” Managers of an internally-driven business think inside-out, deciding what products and services to offer based on what they believe their business does best, or best leverages its assets and “competencies.” They are driven by their own internal agenda, more than what customers would potentially value most. Even though GE’s financial businesses delivered a profitable, superior value proposition for two decades, the overall Welch strategy increasingly became internally-driven.
Instead of insisting on new product-innovation in the traditional industrial businesses, GE increasingly used the old core businesses to help enable the rapid, highly profitable growth of the financial business. Moreover, the hybrid strategy’s advantages and resulting growth were inherently unsustainable much past 2000. Its heady successes likely led Welch’s and later Immelt’s teams to embrace the illusion that GE could master financial or any other business, no matter how new and complex. However, the high returns in GE Capital (GEC, its finance business) reflected high risk, not just financial innovation––and higher risk than GE fully understood. These risks very nearly wrecked the company in 2008-09.
In addition, as will be discussed in Part Two, the unique cost advantage of the GE financial businesses (GEC) could not drive much-faster growth for GEC than for the non-financial businesses indefinitely. Given the accounting and regulatory rules of the time, GEC’s cost advantage was only available if GEC’s revenues remained less than fifty percent of total GE. That threshold was reached in 2000. After that point, GE needed either to reduce GEC’s growth rate or to dramatically increase the growth rate of the traditional businesses. Welch was likely aware of this problem in 2000 when he attempted the gigantic acquisition of Honeywell, but European regulators rejected the move on antitrust grounds. The limitations of the strategy, and the pressures on it, were clearly emerging by 2000.
Thus, the Welch and then Immelt teams needed to replace the GEC-centered strategy, creating a new surge of product-innovation in the non-financial businesses, in order to grow the aging company. Complicating this challenge, however, GE’s internally-driven instincts focused increasingly on financial engineering, efficiency, and optimization. Eventually, its historically crucial capabilities of customer-focused, science-based product-innovation, with shared synergies, all weakened. GE’s industrial product-innovation capabilities had atrophied. As Gerard Tellis of the USC Marshall School of Business writes:
Ironically, GE rose by exploiting radical innovations: incandescent light bulbs, electric locomotives, portable X-ray machines, electric home appliances, televisions, jet engines, laser technologies, and imaging devices. Sadly, in the last few decades, that innovation-driven growth strategy gave way to cash and efficiency-focused acquisitions.
R&D spending continued, but the historical science-based focus on product-innovation breakthroughs gave way to a focus on cost, quality and information technology (IT). Moreover, a new synergy was achieved between the traditional and financial businesses, but the historic emphasis on synergy in product-innovation declined.
Welch’s high-octane strategy was not only flawed by being an internally-driven approach but was also unsound in that it was what we term “customer-compelled.” Again, through our value-delivery lens, we see many businesses trying to avoid the obvious risk of internally-driven thinking––the risk of neglecting or even ignoring what customers want. Therefore, they often follow a well-meaning but misguided customer-compelled strategic map, just as GE did during and after Welch. They strive to “be close and listen” to customers, promising “total satisfaction,” but they may fail to discover what experiences customers would most value, often because customers do not know what they would most value. That logical error was a limitation in one of Welch’s key, most widely influential initiatives––the Six Sigma model.
This demanding, data-intensive technique can be very powerful in reducing variances in product performance, highly useful in achieving manufactured products (or services) that much more precisely meet customers’ requested performance. However, it is only effective if we understand correctly what the most valuable dimensions of that performance are. Otherwise, we only get a customer-compelled result, not a very valuable result for the customer. Moreover, the Six Sigma approach could also be used in an internally-driven way. A business may select a dimension of performance for which it knows how to reduce variance, but which may not be a dimension of highest value to the customer. The technique will then lead again to less variance, but not more value for the customer.
Welch was sometimes hailed not only for his––ultimately too clever––strategic hybrid of the industrial and financial businesses. He was also lionized, until the recent years of GE’s decline, for his major emphasis on efficiency––reducing cost and incrementally improving quality, including via Six Sigma. These initiatives achieved meaningful cost reduction––a part of delivering value to customers. But by the 2010s (after the financial crisis), many business thinkers had changed their tone. In 2012, for example, Ron Ashkenas wrote in HBR:
Six Sigma, Kaizen, and other variations on continuous improvement can be hazardous to your organization’s health. While it may be heresy to say this, recent evidence from Japan and elsewhere suggests that it’s time to question these methods… Looking beyond Japan, iconic six sigma companies in the United States, such as Motorola and GE, have struggled in recent years to be innovation leaders. 3M, which invested heavily in continuous improvement, had to loosen its sigma methodology in order to increase the flow of innovation… As innovation thinker Vijay Govindarajan says, “The more you hardwire a company on total quality management, [the more] it is going to hurt breakthrough innovation. The mindset that is needed, the capabilities that are needed, the metrics that are needed, the whole culture that is needed for discontinuous innovation, are fundamentally different.”
In September 2001, Jeff Immelt took over as GE CEO. During 2001-2007, Immelt’s GE increased R&D and produced incremental improvements in product cost, quality, and performance, and advances in IT. However, GE needed to return, more fundamentally, to its historical strengths of customer-focused, science-based product-innovation. The company needed to fundamentally rethink its markets and its value-delivery-systems––deeply studying the evolution of customer priorities and the potential of technology and other capabilities to meet those priorities.
They might have discovered major new potential value propositions, and implemented new, winning value-delivery-systems, perhaps driven again by product-innovation as in GE’s first century. However, Immelt and team seemed to lack the mindset and skills for strategic exploration and reinvention. They sensed a need to replace the Welch strategy but depended on GEC for profits. They thus proved, not surprisingly, too timid to act until too late, resulting in near-fatal losses in the 2008 financial crisis. This flirting with disaster was followed by a decade of seemingly-safe strategies, lacking value-delivery imagination.
Part Three: Digital Delusions (2011-2018)
Thus, we saw that the Welch finance-focused strategy, which seemed so triumphant in 2001, turned out to be tragically flawed longer-term. It had led the company largely to abandon its earlier, historical commitment to value-delivery-based product-innovation in the traditional, non-financial businesses. In addition, the shiny new finance-focused strategy was riskier than the company understood, until it was too late. Finally, the strategy’s apparent triumph must have encouraged the Welch and Immelt teams to believe they could succeed in any business. That belief was nurtured by some successes without electrical roots––especially GE Plastics, and perhaps NBC––and further by the management-training program at Crotonville. The myth of GE invincibility in any business would be dismantled, first by the financial crisis and then further in the years after 2011.
Once the GEC-centered strategy imploded, Immelt and GE hoped to replace it with a grand, industrial-related strategy that could drive nearly as much growth as the Welch strategy had. However, such a strategy needed to identify and deliver value propositions centrally important to industrial customers’ long-term success, including by product innovation. GE did focus on a technology for marginally optimizing operational efficiency. Though not the hoped-for sweeping growth strategy, this opportunity could have had value, but unfortunately GE converted it into a grandiose, unrealistic vision it could not deliver.
In 2011, the company correctly foresaw the emerging importance, combined with Big Data Analytics, of what it coined the “industrial internet of things” (IIoT). This technology, GE argued, would revolutionize predictive maintenance––one day eliminating unplanned downtime––saving billions for GE and others. However, no longer a financial powerhouse and now refocusing on its industrial business, GE did not want to be seen as a big-iron dinosaur; it wanted to be cool––it wanted to be a tech company.
So, focused on the IIoT, GE dubbed itself the world’s “first digital industrial company.” It would build the dominant IIoT software platform, on which GE and others would then develop the analytics-applications needed for its predictive-maintenance vision. Immelt told a 2014 GE managers’ meeting, “If you went to bed last night as an industrial company, you’re going to wake up this morning as a software and analytics company.” This initiative was applauded as visionary and revolutionary. HBS’ Michael Porter noted:
“It is a major change, not only in the products, but also in the way the company operates…. This really is going to be a game-changer for GE.”
However, GE’s game-changer proved to be both overreach and strategically incoherent, ultimately changing little in the world. Going back to the Welch era, GE leaders had long believed they could master any business using GE managerial principles and techniques (e.g., Welch’s vaunted systems of professional management and development, the celebrated Six Sigma, the insistence on dominant position in every market, and others). Even though overconfidence had already contributed to GE’s financial-business disaster, perhaps we still shouldn’t be surprised by the 2016 stated “drive to become what Mr. Immelt says will be a ‘top 10 software company’ by 2020.” As leadership told the MIT Sloan Review in 2016:
GE wants to be “unapologetically awesome” at data and analytics… GE executives believe [GE] can follow in Google’s footsteps and become the entrenched, established platform player for the Industrial Internet — and in less than the 10 years it took Google.
The company implausibly declared its intent to lead and control the IIoT/analytics technology, including its complex software and IT. Such control would include application to all industrial equipment owned by GE, its customers, and its competitors. This vision was quixotic, including its inherent conflicts (e.g., competitors and some customers were uninterested in sharing crucial data, or otherwise enabling GE’s control).
GE’s Big Data initiative did offer a value proposition––eliminating unplanned downtime, which sounds valuable––to be achieved via IIoT/analytics. However, how much value could be delivered, relative to its costs and tradeoffs, and the best way to perform such analytics, will obviously vary dramatically among customers. The initiative seemed more rooted in a vision of how great GE would be at IIoT/analytics than in specific value propositions that would be valued by specific segments. It thus betrayed the same internal focus that had plagued the company earlier.
In contrast, Caterpillar––as discussed in Part Three––used IIoT/analytics technology strategically, to better deliver its clear, core value proposition. Developed by deeply studying and living with its customers, Cat’s value proposition focuses on lower total-life equipment cost for construction and mining customers, provided via superior uptime. Cat also understood, realistically, that it was not trying to become the Google of the industrial internet; rather, it would depend heavily on a partner with deep expertise in analytics, not imagining it could do it all itself. GE’s IIoT initiative was highly enthusiastic about the power of data analytics, but it seems plausible that GE never grasped, or acted on, the importance of the in-depth customer-understanding that Cat applied with such success.
In addition, the GE strategy did not seem intended to help develop major new product-innovations as part of delivering winning value propositions. IIoT/analytics technology could possibly enhance continuous improvement in efficiency, cost, and quality, but not help return the company’s focus, as needed, to breakthrough product-innovation. Innovation consultant Greg Satell noted that GE saved billions in cost by “relentlessly focusing on continuous improvement.” At the same time, he attributes the scarcity of significant inventions at GE since the 1970s to a “lack of exploration.” He writes, “Its storied labs continuously improve its present lines of business, but do not venture out into the unknown, so, not surprisingly, never find anything new.”
Value can be delivered using IIoT/analytics. GE hired hundreds of smart data scientists and other capable people; no doubt GE delivered value for a few customers and could have done more had it continued investing. Yet, it spent some $4 billion, made some illusory claims to supplying––and controlling––everything in the industrial internet, and ultimately achieved minimal productive result. GE found it could not compete with the major cloud suppliers (e.g., Amazon and Microsoft). More important, effective analytics on the IIoT required vastly greater customization––to specific sectors and individual customers––than GE had assumed, not a single, one-size-fits-all platform. Before giving up in 2018, GE had convinced only 8% of its industrial customers to use its Predix platform.
Part Four: Energy Misfire (2001-2019)
So, after 2001 GE first tried to ride the GEC tiger and was nearly eaten alive in the process. Then it chased the illusion of IIoT dominance, with not much to show for it. Meanwhile, GE was still not addressing its most pressing challenge––its need for a new, synergistic strategy for its industrial businesses, to replace the growth engine previously supplied by the finance businesses. Especially important were its core but inadequately understood energy-related markets. In these, GE harbored costly, internally-driven illusions, instead of freshly and creatively identifying potentially winning, energy-related value propositions.
After the financial crisis, the eventual exit from the financial businesses (except for financing specifically to support the sale of GE industrial products) was already very likely. With two-thirds of the remaining company being energy-related, strategy for this sector clearly should have been high priority. However, in its industrial businesses, including energy, GE had long been essentially internally-driven––prone to pursue what it was comfortable doing. (Admittedly, the IIoT venture was an aberration, where GE was not comfortable––for good reason––developing new software, big data analytics, or AI.) Otherwise, however, in the energy related businesses, GE would stay close to what it saw as core competencies, rather than rethinking and deeply exploring what these markets would likely most value in the foreseeable future.
In these markets, GE focused on fuels based on its immediate customers’ preferences, and on where GE saw competitive advantage. Although GE developed a wind-power business, it largely stayed loyal to fossil fuels. A 2014 interview in Fast Company explains why:
Immelt’s defining achievement at GE looked to be his efforts to move the company in a greener direction [i.e., its PR-marketing campaign, “Ecomagination”].… But GE can only take green so far; this organization fundamentally exists to build or improve the infrastructure for the modern world, whether it runs on fossil fuels or not. Thus, GE has simultaneously enjoyed a booming business in equipment for oil and gas fracking and has profited from the strong demand in diesel locomotives thanks to customers needing to haul coal around the country.… In the end, though, it will serve its customers, wherever they do business.
Serving immediate customers can be a good practice. However, it can become counter-productively customer-compelled––if a business ignores unmistakable trends in technology costs and the preferences of many end-users. With a seemingly safe, internally-driven view, GE made a myopic bet on fossil fuels in 2014, acquiring the gas-turbine operations of the French company Alstom. GE badly underestimated the rise of renewable energy, and the result was a $23 billion write-down.
As Part Four will review in some detail, this loss on gas turbines capped GE’s more fundamental, long-term failure, starting in about 2000, to realize its great historic opportunity––creatively leading and capitalizing on the global transition to zero-emission energy, especially renewables. The value proposition that most energy end-users worldwide wanted was increasingly clear after 2000––energy with no tradeoffs in reliable, safe performance, zero (not fewer) emissions, and lower cost.
In this failure in the energy market, GE no doubt saw itself as following its customers’ demands. In customer-compelled fashion, GE kept listening to its more immediate customers’ increasing hunger for fossil-fuel-based energy generation. The company would have needed a much more customer-focused, science-based perspective to see, and help catalyze, the market’s direction. It needed to study end-users, not just power-generators, and to project the increasingly inevitable major reductions in renewable-energy costs––anticipating the shift away from gas and other fossil fuels by many years, not just a few months.
Given its historical knowledge and experience, GE was uniquely well positioned to benefit long-term from the world’s likely multi-trillion-dollar investment in this value proposition by 2050. We can acknowledge that, starting in the early 2000s, GE touted its “Ecomagination” campaign, and built a good-sized wind-power business. However, taking this value proposition seriously––leading a global energy transformation––would have required that GE take a more proactively-creative strategic approach. That would have meant designing comprehensive value-delivery-systems needed to lead, accelerate, and long-term profitably capitalize on that global energy transition.
Fundamentally important, those new value-delivery-systems would have needed to include an aggressive return to the central focus on major product-innovation that built the company in its first century. Many people in the first decade after 2000, including most energy experts, were highly skeptical that a global transition to zero-emission energy was possible in less than most of the current century. Yet the basic technology needed for the energy transition in electricity and ground-transportation––a crucial bulk of greenhouse emissions––had been identified by 2000: renewable energy, especially solar and wind, bolstered by batteries for storage. As has largely been confirmed in the last twenty years, that technology––not major new, unimagined breakthroughs––only needed to be massively refined and driven down the experience-based cost curves.
As will be discussed in Part Four, key parts of solar energy technologies not only dropped in cost faster than anyone expected, but also became unprofitable commodities. As GE and many others learned in the early 2000s, producing solar panels in competition with the Chinese makers became a mostly losing business. However, there are many other elements of renewable energy––wind, storage (e.g., batteries), long-distance transmission, etc. Some of these will inevitably be profitable, even if cranking out commodity solar panels is never among them. The increasingly clear reality since 2000 has been that renewable energy will dominate the global market, and GE should have been playing a leading role.
Other technology, perhaps including hydrogen and some as-yet-undeveloped battery technologies, will be needed for some industrial and other sources of emissions not likely replaceable by solar or wind. However, for electricity and ground transport, today we know that these now-proven renewable-energy technologies are lower cost than the fossil-fuel incumbents. In the wide range of product-innovations needed to enable this part of the energy transition, GE could and should have been leading the world these past twenty years, rather than just issuing PR lip-service with meager substance.
Such actions to lead the energy transition, discussed in Part Four, would have included aggressive, imaginative value-delivery-systems by GE, involving not just power generation, but also long-distance transmission, electrification of energy uses (e.g., heating, cooling, transportation, industrial processes), storage––especially battery––technology, and energy-focused financing. Sharing a zero-emissions value proposition could have created great synergy across these GE product-innovations and related businesses. GE could also have used its once-great credibility to influence public policy, including support for economically rational zero-emission technologies––that is, without asking end-users to accept higher costs or other tradeoffs to save the planet. GE could have proactively, creatively generated great corporate wealth if policy had evolved to allow zero-emission solutions to deliver their potentially superior value.
More strategically fundamental than GE’s inept bet on natural-gas turbines in 2014, GE’s historic lost opportunity in its core energy markets was this enormous and inevitable transition to zero emissions. The same customer-focused, science-based, synergistic strategy, including major product-innovation, that characterized GE’s first century could have and should have been central to the company since 2000.
Eventual impact of GE’s last four decades––on GE and the world
After 1981, GE achieved a twenty-year meteoric rise behind the Welch strategy, but fundamentally failed to extend the creative, persistent focus on superior value-delivery via product innovation that drove the company in its first century. The cumulative effect of the company’s strategic illusions and inadequate value-delivery was that GE eventually lost control of its destiny, along with most of its market-value and importance. Even after the company’s recent strengthening, its market value is still only about 20% of its 2000 peak after the Welch era, and less than 50% of its 2016 value––its post-financial-crisis peak.
The world also incurred losses, including regrettable influences on business practice. Rushing to emulate GE, many companies embraced Six Sigma’s focus on continuous marginal improvement, and an idolatry of maximizing shareholder value. For GE and its shareholders, these practices yielded some benefits. However, they also coincided with reduced value delivery, including a sharp decline in major product-innovation. We can see some results of this influence in the world’s lagging innovation and productivity, notwithstanding the often-overrated inventions of the finance and tech sectors. Also lost to the world was the value that GE might have delivered, had it seriously acted on its key opportunity in energy. These losses were the tragedy in GE’s lost control of its destiny, in the four decades after 1981.
Accordingly, after this Introductory Overview, the series continues with four parts:
Part One: Customer-Focused, Science-Based, and Synergistic Product-Innovation (1878-1980)
Part Two: Unsustainable Triumph of the Welch Strategy (1981-2010)
Part Three: Digital Delusions (2011-2018); and
Part Four: Energy Misfire (2001-2019)
Footnotes––GE Series Introductory Overview:  though of course not using our present-day, value-delivery terminology
Big Data Can Help Execute, But Not Discover, New Growth Strategies
Business investment in “big data analytics” (or “big data”) continues growing. Some of this growth is a bet that big data will not only improve operations, but reveal hidden paths to new breakthrough growth strategies. This bet rests partly on the tech giants’ famous use of big data. We have recently seen the emergence, in part also using big data, of major new consumer businesses, e.g., Uber and Airbnb––discussed in this second post of our data-and-growth-strategy series. Again using big data, some promising growth-strategies have also emerged recently in old industrials, including GE and Caterpillar (discussed in our next post). Many thus see big data as a road map to major new growth.
Typical of comments on the new consumer businesses, Maxwell Wessel’s HBR article, How Big Data Is Changing Disruptive Innovation, called Uber and others “data-enabled disruptions.” Indefatigable big-data fan Bernard Marr wrote that Uber and Airbnb were only possible with “big data and algorithms that drive their individual platforms… [or else] Uber wouldn’t be competitive with taxi drivers.” In late 2016, McKinsey wrote, “Data and analytics are changing the basis of competition. [Leaders] launch entirely new business models…[and] the next wave of disruptors [e.g. Uber, Airbnb are] … predicated on data and analytics.” MIT’s SMR Spring 2017 report, Analytics as a Source of Business Innovation, cited Uber and Airbnb as “poster children for data-driven innovation.”
Like a compass, big data only helps once you have a map––a growth-strategy. To discover one, first “become the customer”: explore and analyze their experiences; then creatively imagine a superior scenario of experiences, with positives (benefits), costs, and any tradeoffs, that in total can generate major growth. Thus, formulate a “breakthrough value proposition.” Finally, design how to profitably deliver (provide and communicate) it. Data helps execute, not discover, new growth strategies. Did Uber and Airbnb use data to discover, not just execute, their strategies? Let’s see how these discoveries were made.
Garrett Camp and Travis Kalanick co-founded Uber. Its website claims that, in Paris in late 2008, the two “…had trouble hailing a cab. So, they came up with a simple idea—tap a button, get a ride.” Kalanick later made real contributions to Uber, but the original idea was Garrett Camp’s, as confirmed by Travis’ 2010 blog, and his statement at an early event (quoted by Business Insider) that “Garrett is the guy who invented” the app.
However, the concept did not just pop into Camp’s mind one evening. As described below, the idea’s genesis lay in his and his friends’ frustrating experiences using taxis. He thought deeply about these, experimented with alternatives, and imagined ideal scenarios––essentially the strategy-discovery methodology mentioned above, which we call “become the customer.” He then recognized, from his own tech background, the seeds of a brilliant solution.
Camp was technically accomplished. He had researched collaborative systems, evolutionary algorithms and information retrieval while earning a Master’s in Software Engineering. By 2008 he was a successful, wealthy entrepreneur, having sold his business, StumbleUpon, for $75M. This “discovery engine” finds and recommends relevant web content for users, using sophisticated algorithms and big data technologies.
As Brad Stone’s piece on Uber in The Guardian recounted earlier this year, the naturally curious Camp, then living in San Francisco, had time to explore and play with new possibilities. He and friends would often go out for dinner and bar hopping, and frequently be frustrated by the long waits for taxis, not to mention the condition of the cars and some drivers’ behavior. Megan McArdle’s 2012 Atlantic piece, Why You Can’t Get a Taxi, captured some of these complaints typical of most large cities:
Why are taxis dirty and uncomfortable and never there when you need them? Why is it that half the time, they don’t show up for those 6 a.m. airport runs? How come they all seem to disappear when you most need them—on New Year’s Eve, or during a rainy rush hour? Women complain about scary drivers. Black men complain about drivers who won’t stop to pick them up.
The maddening experience of not enough taxis at peak times reflected the industry’s strong protection. For decades, regulation limited taxi licenses (“medallions”), constraining competition, and protecting revenue and medallion value. The public complained, but cities’ attempts to increase medallions always met militant resistance; drivers would typically protest at city hall. And if you didn’t like your driver or car, you could leave no tip or complain, but not with any real impact.
Irritated, Camp restlessly searched for ways around these limits. As hailing cabs on the street was unreliable, he tried calling, but that was also problematic. Dispatchers would promise a taxi “in 10 minutes,” but it often wouldn’t show; he’d call again, but they might lie or not remember him. Next, he started calling and reserving them all, taking the first to arrive––but this stopped working once they blacklisted Camp’s mobile phone.
He then experimented with town-cars (or “black cars”). These were reliable, clean and comfortable, with somewhat more civil drivers, though more expensive. However, as he told Stone, their biggest problem, exacerbating their cost, was filling dead-time between rides––which, to a lesser degree, also affected regular taxis. As McArdle wrote, “Drivers turn down [some long-distance fares since] they probably won’t get a return fare, and must instead burn time and gas while the meter’s off,” which can wipe out the day’s profit.
Camp could now imagine much better ride-hailing experiences, but how to implement this vision? Could technology somehow balance supply and demand, and improve the whole experience for riders and drivers? At that moment (as he later told Stone) Camp recalled a futuristic James Bond scene he loved and often re-watched, from the 2006 Casino Royale. While driving, Agent 007’s phone shows an icon of his car, on a map, approaching The Ocean Club, his destination. Camp immediately recognized that such a capability could bring his ride-hailing vision to life.
When the iPhone launched in 2007, it not only included Google Maps; Camp knew it also had an accelerometer, which could detect whether the car was moving. Thus, the phone could function like a taxi meter, charging for time and distance. And in the summer of 2008, Apple had also just introduced the App Store.
Camp knew this meant he could invent a “ride-hailing app” that would deliver benefits––positive experiences. With it, riders and drivers would digitally book a ride, track and see estimated arrival, optimize routes, make payments, and even rate each other. The app would also use driver and rider data to match supply and demand. This match would be refined by dynamic (“surge”) pricing, adjusting in real time as demand fluctuates. Peak demand would be a “tradeoff”: higher price but, finally, reliable supply of rides.
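The supply-demand matching and surge-pricing mechanics described above can be illustrated with a minimal sketch. All names, rates, and the surge cap here are illustrative assumptions, not Uber’s actual algorithm:

```python
def surge_multiplier(ride_requests: int, available_drivers: int, cap: float = 3.0) -> float:
    """Toy surge factor: price rises when demand outstrips supply, capped at `cap`."""
    if available_drivers == 0:
        return cap
    return min(cap, max(1.0, ride_requests / available_drivers))

def fare_estimate(minutes: float, miles: float, multiplier: float,
                  base: float = 2.00, per_min: float = 0.30, per_mile: float = 1.25) -> float:
    """Meter-style fare (time plus distance, like the phone-as-meter idea), scaled by surge."""
    return round((base + per_min * minutes + per_mile * miles) * multiplier, 2)

# Quiet evening: supply exceeds demand, so no surge applies.
normal = fare_estimate(20, 5.0, surge_multiplier(10, 20))   # 2.00 + 6.00 + 6.25 = 14.25
# Rainy rush hour: demand far outstrips supply; the multiplier hits the 3x cap.
peak = fare_estimate(20, 5.0, surge_multiplier(60, 10))     # 14.25 * 3.0 = 42.75
```

The tradeoff in the text shows up directly: at peak demand the same ride costs three times more, but the higher price is what draws enough drivers onto the road to make supply reliable.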
At this point, Camp grew more excited, sensing how big the concept might be, and pressed his friend Travis Kalanick to become CEO (Camp would refocus on StumbleUpon). Despite Kalanick’s famous controversies (ultimately leading to his stepping down as CEO) many of his decisions steered Uber well. Camp’s vision still included fleets of town-cars, which Kalanick saw as unnecessary cost and complexity. Drivers should use their own cars, while Uber would manage the driver-rider interface, not own assets. Analyzing the data, Kalanick also discovered the power of lower pricing: “If Uber is lower-priced, then more people will want it…and can afford it [so] you have more cars on the road… your pickup times are lower, your reliability is better. The lower-cost product ends up being more luxurious than the high-end one.”
With these adjustments, Uber’s strategy was fully developed. Camp and later Kalanick had “become the customer,” exploring and reinventing Uber’s “value propositions”––the ride-hailing experiences, including benefits, costs and tradeoffs, that Uber would deliver to riders and drivers. These value propositions were emerging as radically different from, and clearly superior to, the status quo. It was time to execute, which required big data, first to enable the app’s functionalities. Kalanick also saw that Uber must expand rapidly, to beat imitators into new markets; analytics helped identify the characteristics of likely drivers and riders, and the cities where Uber was likely to succeed. Uber became a “big data company,” with analytics central to its operations. It is still not profitable today and faces regulatory threats, so its future success may increasingly depend on innovations enabled by data. Nonetheless, for today at least, Uber is valued at $69B.
Yet, to emulate Uber’s success, remember that its winning strategy was discovered not by big data, but by becoming its users. Uber studied and creatively reinvented key experiences––thus, a radically new value proposition––and designed an optimal way to deliver them. Now let’s turn to our second major new consumer business frequently attributed to big data.
Airbnb’s launch was relatively whimsical and sudden. It was not the result of frustration with an existing industry, as with ride-hailing. Rather, the founders stumbled onto a new concept that was interesting, and appealing to them, but not yet ready to fly. Their limited early success may have helped them stay open to evolving their strategy.
They embarked on an extended journey to “become” their users, both hosts and guests; they would explore, deeply rethink, and reinvent Airbnb’s key customer-experiences and its value proposition. Big data would become important to Airbnb’s execution, but the evolutionary discovery of its strategy was driven by capturing deep insight into customer experiences. Providing and communicating its value proposition, Airbnb outpaced imitators and other competitors, and is valued today at over $30B. Like Uber, Airbnb is a great example of the approach and concepts we call “value delivery,” as discussed and defined in our overview, Developing Long-Term Growth Strategy.
In 2007, Joe Gebbia and Brian Chesky, both twenty-seven and friends from design school, shared a San Francisco apartment. They hoped for entrepreneurial opportunities, but needed cash when their rent suddenly increased. A four-day design conference was coming to town, and most hotels were booked. Gebbia suggested “turning our place into ‘designers bed and breakfast’–offering…place to crash…wireless internet…sleeping mat, and breakfast. Ha!” The next day, they threw together a web site, airbedandbreakfast.com. Surprisingly, they got three takers; all left happy, paying $80 per night (covering the rent). They all also felt rewarded by getting to hear each other’s stories; the guests even offered advice on the new business. Chesky and Gebbia gave each other a look of, “Hmmm…” and a new business had been born.
The concept seemed compatible with the founders’ values; as The Telegraph later wrote, “Both wanted to be entrepreneurs, but [not] ‘create more stuff that ends up in landfill.’ …a website based on renting something that was already in existence was perfect…”
However, they first underestimated and may have misunderstood their concept. An early headline on the site read, “Finally, an alternative to expensive hotels.” Brian Chesky says:
We thought, surely you would never stay in a home because you wanted to…only because it was cheaper. But that was such a wrong assumption. People love homes. That’s why they live in them. If we wanted to live in hotels, more homes would be designed like hotels.
They were soon joined by a third friend, engineer Nathan Blecharczyk, who says that this mix of design and engineering perspectives [view 01:19-01:40 in interview]:
…was pretty unusual and I actually attribute a lot of our success to that combination. We see things differently. Sometimes it takes a while to reconcile those different perspectives but we find when we take the time to do that we can come up with a superior solution, one that takes into account both points of view.
Soon after the initial launch, they used the site frequently, staying with hosts, and gathering insights into experiences. Two key early discoveries were: 1) Payments, then settled in cash from guest to host, created awkward “So, where’s my money?” moments, and should be handled instead with credit card, through the Airbnb site; and 2) while they originally assumed a focus on events, when hotels are over-booked and expensive, users surprised them by asking about other travel, so they realized that they had landed in the global travel business. A 2008 headline read, “Stay with a local when traveling.”
In August of 2008, the team thought they had made a breakthrough. Obama would accept the Democratic nomination before 100,000 people in Denver, which only had 30,000 hotel rooms. So, Airbnb timed its second launch for the convention. Sure enough, they saw a huge booking spike…but it promptly dropped back to near-zero, days later.
Searching for a promotional gift for hosts, they pulled off a scrappy, startup stunt. They designed and hand-assembled 500 boxes of cereal, with covers they convinced a printer to supply for a share of sales: Obama Os (“Hope in every bowl”) and Cap’n McCain’s (“Maverick in each bite”). CNN ran a story on it, helping sell some at $40 per box, for over $30K total––enough to survive a few more months.
But by mid-2009, they were still stalled, and about to give up. Roughly fifteen investors had ignored Airbnb or seen nothing in it. Then Paul Graham, a founder of Y Combinator (YC, a highly regarded, exclusive startup accelerator), granted Airbnb his standard five-minute interview, which he spent telling them to find a better idea (“Does anyone actually stay in one of these? …What’s wrong with them?”) But on the way out the door, thinking all was lost anyway, Chesky handed a box of Obama Os to Graham, who asked, “What’s this?” When told, Graham loved this story of scrappy, resourceful unwillingness to die: if they could sell this cereal, maybe they could get people to rent their homes to strangers, too.
Joining YC led to some modest investments, keeping Airbnb alive. Still, weekly revenues were stuck at $200. Paul pushed them to analyze all their then-forty listings in their then-best market, New York. Poring over them, Gebbia says, they made a key discovery:
We noticed a pattern…similarity between all these 40 listings…the photos sucked…People were using their camera phones [of poor quality in 2009] or…their images from classified sites. It actually wasn’t a surprise that people weren’t booking rooms because you couldn’t even really see what it is that you were paying for.
Paul urged them to go to New York immediately, spend lots of time with the hosts, and upgrade all the amateur photography. They hesitated, fearing professional photography was too costly to “scale” (i.e., to deploy at large scale). Paul told them to ignore that (“do things that don’t scale”). They took that as license to simply discover what a great experience would be for hosts, and worry about scale economics later. A week later, the test results came in, showing far better response and doubling total weekly revenue to $400. They got it.
The team reached all 40 New York hosts, selling them on the (free) photos, but also building relationships. As Nathan Blecharczyk explains [view 18:50-21:27 in his talk], they could follow up with suggestions that they could not have made before, e.g., enhancements to wording in listings or, with overpriced listings, “start lower, then increase if you get overbooked.”
Of course, hi-res photography is common on websites today, even craigslist, and seems obvious now, as perhaps it is to think carefully about wording and pricing in listings. However, these changes made a crucial difference in delivering Airbnb’s value proposition, especially in helping hosts romance and successfully rent their home. This time-consuming effort greatly increased success for these hosts. After that, “people all over the world started booking these places in NY.” The word spread; they had set a high standard, and many other hosts successfully emulated this model.
To even more deeply understand user experiences, the team used “becoming the customer,” or what Gebbia calls, “being the patient,” shaped by his design-thinking background.
[As students, when] working on a medical device we would go out [and] talk with…users of that product, doctors, nurses, patients and then we would have that epiphany moment where we would lay down in the bed in the hospital. We’d have the device applied to us…[we’d] sit there and feel exactly what it felt like to be the patient…that moment where you start to go aha, that’s really uncomfortable. There’s probably a better way to do this.
As Gebbia explained, “being the patient” is still an integral piece of Airbnb’s culture:
Everybody takes a trip in their first or second week [to visit customers, document and] share back to the entire company. It’s incredibly important that everyone in the company knows that we believe in this so much…
They gradually discovered that hosts were willing to rent larger spaces: from air beds, to rooms, to entire apartments and houses. They also further expanded Airbnb’s role, such as hosting reviews and providing a platform for host/guest communications. “Becoming the customer,” they discovered a “personal” dimension of the experience; in 2011, Gebbia recounted [view 19:41-20:36] being a guest in “an awesome West Village apartment,” owned by a woman named Katherine (away then), where he was greeted with:
…a very personalized welcome letter…a Metro Card for me, and [menus from] Katherine’s favorite take-out places…I just felt instantly like, ‘I live here!’ [And on the street] felt like I have an apartment here, I’m like a New Yorker! So, it’s this social connection…to the person and their spaces; it’s about real places and real people in Airbnb. This is what we never anticipated but this has been the special sauce behind Airbnb.
This idea of personal connection may have helped address Airbnb’s crucial problem of trust (“who is this host, or guest?”). Again, they thought deeply about the problem, both redesigning experiences and applying digital solutions. One was “Airbnb Social Connections,” launched in 2011. As TechCrunch wrote, a prospective guest can:
…hook up the service to your social graph via Facebook Connect. Click one button, opt-in, and [in] listings for cities around the world you’ll now see an avatar if a Facebook friend of yours is friends with the host or has reviewed the host. It’s absolutely brilliant.” [Cheskey said it also helps guests and hosts] “have something in common…and helps you find places to stay with mutual friends, people from your school or university…
To further build trust, users were required to “verify,” meaning share their “email…online and offline identity.” Hosts were asked “to include large photos of themselves on their profiles.” Hosts and guests were urged to communicate before each stay.
Next, the team searched for yet more dimensions of users’ positive experiences. As Leigh Gallagher wrote in Fortune, the team (in 2012) pondered, “Why does Airbnb exist? What’s its purpose?” The global head of community interviewed 480 employees, guests, and hosts, and found that guests don’t want to be “tourists” but to engage with people and culture––to be “insiders.” The idea of “belonging” emerged, and a new Airbnb mission: “to make people around the world feel like they could ‘belong anywhere.’” Chesky explains that “cities used to be villages. But…that personal feeling was replaced by ‘mass-produced and impersonal travel experiences,’ and along the way, ‘people stopped trusting each other.’”
They adopted the “Bélo” icon to echo this idea. Some mocked all this. TechCrunch declared it a “hippy-dippy concept”; others suggested that users just wanted a “cheap and cool place to stay,” not “warm and fuzzy” feelings. But Gallagher argues that “belonging” can be more substantive than “having tea and cookies with [your host]”:
It was much broader: It meant venturing into neighborhoods that you might not otherwise be able to see, staying in places you wouldn’t normally be able to, bunking in someone else’s space, [an experience] “hosted” for you, regardless of whether you ever laid eyes on him or her.
“Belonging” may have been a little warm and fuzzy, or even hokey, but Gallagher cites evidence that the idea resonated with many users. In late 2012, wanting to build further on this notion and having read an article in Cornell Hospitality Quarterly, Chesky began thinking that Airbnb should focus more on “hospitality.” He read a book by Chip Conley, founder of a successful boutique-hotel chain. Conley wanted guests to check out, after three days, as a “better version of themselves”; he talked of democratizing hospitality, which had become “corporatized.” Airbnb hired him.
Conley gave talks to Airbnb hosts and employees worldwide, and established a “centralized hospitality-education effort, created a set of standards, and started a blog, a newsletter, and an online community center where hosts could learn and share best practices.” He also started a mentoring program in which experienced hosts could teach new ones about good hospitality.
Airbnb took longer than Uber to discover its strategy, but it got there––again by climbing into the skin of users, “becoming the customer” (or “being the patient”), living their users’ experiences. Like Uber, Airbnb then used big data extensively to help execute. For example, building on the early New York experiments to help hosts set optimal prices, Airbnb used big data to create, as Forbes wrote, “Pricing Tips.” This “constantly updating guide tells hosts, for each day of the year, how likely it is for them to get a booking at the price they’ve currently chosen.” Airbnb’s machine-learning package further helps hosts quantitatively understand factors affecting pricing in their market.
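Airbnb’s actual model is proprietary; as a purely hypothetical sketch of the “Pricing Tips” idea, one could model booking likelihood as a curve that falls as a listing’s price rises above its market reference. The logistic shape, the steepness value, and the guidance thresholds below are all assumptions for illustration:

```python
import math

def booking_probability(price: float, market_avg: float, steepness: float = 4.0) -> float:
    """Toy logistic curve: booking likelihood falls as price rises above the market average."""
    return 1.0 / (1.0 + math.exp(steepness * (price / market_avg - 1.0)))

def pricing_tip(price: float, market_avg: float) -> str:
    """Turn the probability into the kind of day-by-day guidance a host might see."""
    p = booking_probability(price, market_avg)
    if p >= 0.7:
        return f"Likely to book ({p:.0%})"
    if p >= 0.4:
        return f"Moderate chance ({p:.0%})"
    return f"Unlikely at this price ({p:.0%}); consider starting lower"
```

In this toy model, pricing at the market average yields a 50% chance, while pricing 30% above it drops the estimate sharply––echoing the founders’ early advice to hosts: “start lower, then increase if you get overbooked.”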
In combination with the above initiatives to strengthen trust, personal connection, and even that sense of “belonging anywhere,” big data helped Airbnb continue to improve the value proposition it delivered. Its great strategic insights into user experiences, and its superb execution, allowed Airbnb to outdistance its imitative competitors.
So, both Uber and Airbnb made great use of big data. Yet, for both these game-changing businesses, “becoming the customer” (or “being the patient”) was key to discovering the insights underlying their brilliant, breakthrough-growth strategies.
* * *
This post follows our first, Who Needs Value Propositions, in this data/strategy series. Our next post (#3––Big Data as a Compass––Industrial/B2B Businesses) looks at the industrial examples mentioned earlier (GE and Caterpillar), to compare their lessons with those in the above consumer cases. We then plan three more posts in this series.
Posted June 14, 2017
In the past decade, “big data analytics” (or “big data”) has grown dramatically in business. Big data can help develop and execute strategy—as in Google and Facebook’s ad businesses, and Amazon and Netflix’s recommendation engines. Seeing these huge tech successes, many decided to just emulate how big data is used, hoping that big data analytics alone can drive development of long-term growth strategy. This belief is a delusion.
Winning strategies require that businesses discover new or improved experiences that customers would most value (though cannot yet articulate), and redesign their businesses to profitably deliver those experiences. Big data can increase communication efficiency and short-term sales, or “clicks,” but changing the most crucial customer experiences can transform behaviors, attitudes, and loyalty, leading to major growth. In many businesses, such insight is best found through in-depth exploration and analysis of individual customers––and cannot be found in the correlations of big data. Some questions are most easily answered with big data, but the availability of data should not drive which questions to ask. Data-driven priorities can obscure fundamental strategic questions, e.g., what could customers gain by doing business with us––what value proposition should we deliver, and how?
Discovering such insights requires deeply understanding customers’ usage of relevant products or services. In some businesses, such as online retailers, customers’ buying-experiences constitute usage of the service; these businesses do have usage data, and can use big data in developing strategy. For most, however, such as product producers, usage happens only after purchase, so they have purchase data but not usage data, and cannot properly use big data to develop strategy. Feeling compelled to use big data, such businesses may apply it to the data they do have, which can help achieve short-term sales, but not develop long-term growth strategy. These businesses still can––and must––develop insights into which usage experiences to focus on changing, and how.
Digital marketing now plays a major role in developing business strategy, and heavily uses big data. Big-data predictive algorithms analyze customers’ past transactions and purchase or shopping behaviors, to increase the efficiency of matching customers with marketing offers and strengthen short-term sales. But sustained major growth requires more than ratcheting up reach-efficiency and tweaking the weekend promotional tally. It requires creative exploration of customers’ current experiences, to discover breakthrough value propositions, and design ways to profitably provide and communicate them. This post and its follow-ups discuss these concerns and suggest solutions.
Predicting transactions is not strategy
As illustration, a Customer Experience Management (CEM) system by Sitecore helps fictional apparel maker “Azure” (Sitecore’s name) use big data to customize marketing to individual customers. Here, Azure intervenes with consumer “Kim” on her decision journey. When she visits their site anonymously, the data shows her matching their active-mother profile. Clicking on a shoes ad, she signs up for email alerts, providing her name and email. Azure begins building her profile. They email a belts promotion to customers predicted by the data as potentially interested—Kim buys one. Later, real-time location data shows Kim near an Azure store, so CEM texts an in-store discount on a new boots line; Azure is confident she’ll like it based on her past actions. Scanning the coupon on Kim’s phone, the CEM enables the clerk to offer Kim another product, a child’s backpack, based on Kim’s profile. Kim is impressed—Azure understands her interests, tracking her every action. She joins Azure’s loyalty program, giving her sneak peeks at coming products. With data showing that Kim most often accesses the site by smart phone, Azure offers her their new mobile app. Via big data, Azure has improved the shopping and buying experiences, and efficiently stimulated short-term sales.
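The profile-building and offer-matching loop in the Azure scenario can be sketched in a few lines. This is a hypothetical illustration of the pattern, not Sitecore’s CEM API; the class, function names, and overlap-scoring rule are all assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    """Toy profile accumulated from observed interactions, as Azure does with 'Kim'."""
    name: str = "anonymous"
    segments: set = field(default_factory=set)

    def record_click(self, category: str) -> None:
        self.segments.add(category)

    def record_purchase(self, category: str) -> None:
        self.segments.add(category)

def next_offer(profile: CustomerProfile, campaigns: dict) -> str:
    """Pick the campaign whose target segments best overlap the customer's profile."""
    best, best_score = "generic newsletter", 0   # fallback when nothing matches
    for offer, target_segments in campaigns.items():
        score = len(profile.segments & target_segments)
        if score > best_score:
            best, best_score = offer, score
    return best

# Kim clicks a shoes ad, then buys a belt from an accessories promotion...
kim = CustomerProfile("Kim")
kim.record_click("shoes")
kim.record_purchase("accessories")
# ...so when location data places her near a store, the boots campaign is chosen.
offer = next_offer(kim, {"new boots line discount": {"shoes", "footwear"},
                         "garden tools promo": {"outdoor"}})
```

The sketch makes the chapter’s larger point concrete: everything here predicts the next transaction from past behavior; nothing in it can reveal what new experience would change Kim’s behavior.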
In applications of big data for marketing and growth-strategy, data scientists search for previously unknown correlations among customers’ transactional and behavioral data. For growth strategy, however, more understanding and creative thought is needed: why customers do what they do, what the consequential experiences have been, what is imperfect in those experiences, and how the business might create new or different ones. These are typically unarticulated opportunities for improved customer experiences. Identifying them requires skilled observation and creative interpretation of current experiences––not replicable in most businesses by data collection and analytics alone. Such analysis, including customers’ product-usage behaviors, not just purchases, is crucial to developing value propositions that can generate major new growth.
Urging us to “Use big data to create value for customers, not just target them,” Niraj Dawar said in HBR that big data holds out big promises for marketers, including answers to “who buys what, when?” Marketers “trained their big data telescopes at a single point: predicting each customer’s next transaction,” producing detailed portraits of each consumer’s media preferences, shopping habits, and interests, to reveal her next move.
In the Azure narrative, Azure is “pretty confident” of what Kim will want, where and when, based on understanding her interests and interactions. In addition to targeting, big data allows “personalizing”—using our knowledge and references to customers’ past purchases and interests, to make our marketing more relevant and thus more effective in winning that next short-term promotional sale. This saga, of Kim’s “well-guided shopping journey” with Azure, leaves Kim happy (though not entirely of her own free will). In this way, it is reminiscent of Minority Report’s mall scene. The novel and 2002 film focused on prediction (“precognition”) of crimes not yet committed (supernaturally foreseen by “PreCogs”). We can hope this premonition is only a dystopic nightmare, but marketers may find the film’s futuristic marketing a utopian dream. The marketing is precisely targeted and highly personalized—ads and holographic greeter automatically recognize and call out the character’s name, reminding him of a recent purchase.
Fifteen years ago, the sci-fi film’s marketing technology was showing us the future: ever more accurate predictions of each customer’s next purchase. Big data is thus a kind of commercial precognition; data scientists are PreCogs using big data, not supernatural powers. Both narratives are fictional, but they illustrate the big-data logic for marketing and growth-strategy. Able to predict the customer’s next transaction, the CEM produces targeted marketing, more efficient in customer-reach. Personalized marketing is more relevant, helping it stimulate short-term sales. A fundamental problem with this paradigm is that growth strategy needs more than accurate predictions of transactions. Such strategy must transform the behaviors, attitudes, and loyalty of customers and other players in the chain, based on insights about the causality underlying correlations.
Summary: Strategy is More than Prediction
Marketers are right to have yearned for much more factual data on what customers do, i.e., their behaviors. With big data, however, it has been easy and commonplace to overemphasize customers’ behavior, especially their buying process, without adequately understanding and analyzing the rest of their relevant experience. Businesses must understand customers’ usage experience, not just buying. They must also explore what is imperfect about this experience, how it could be improved for the customer, what value proposition the business should deliver to them, and how. Such exploration must discover the most powerful, unarticulated customer opportunities for greater value delivery, and redesign the business to profitably realize them. This is the essence of how strategy differs from prediction—strategy must focus on what we want to make happen and how, not just on what we might bet will happen.
Kim’s past transactional behavior is analyzed to predict what she will likely want next; but the analysis needs to be pushed further, to discover the experiences and value propositions that could influence her and yield long-term growth. (See a similar complaint about the limitations of data from Clayton Christensen et al.) Actions—including product and service improvements, and an intense focus of marketing communications on customer benefits—must then be designed to optimally deliver these value propositions.
Growth of big data in tandem with digital marketing
IDC estimates that global revenue for business data analytics will exceed $200B by 2020. As a recent review noted, this expansion was enabled by several trends: the continued rapid expansion of data, doubling every three years, from online digital platforms, mobile devices, and wireless sensors; huge capacity increases and cost reductions in data storage; and major advances in analytic capabilities, including computing power and the evolution of sophisticated algorithms. Previously, “industry poured billions into factories and equipment; now the new leaders invest…in digital platforms, data, and analytical talent.” This investment expands the ability to predict the next short-term transaction and to increase marketing-communications efficiency and promotional impact. But it also drains resources needed for the more difficult and, as argued here, more strategically crucial work: exploring customers’ usage experiences and discovering breakthrough-growth value propositions.
The digital marketing function—using digital technology to market products and services—has risen rapidly. Last year, for the first time, US digital ad spending surpassed TV, the traditionally dominant medium. And digital marketing, both the source of most big data and the easiest place to apply it, increasingly leads the development of business strategy.
Efficiency and relevance: important but usually much less so than effectiveness
More efficient marketing is desirable, but only if it is also effective—which is often taken for granted in the focus on efficiency. Much digital marketing faith is put in the four-part promise of “the right ad or offer, to the right person, at the right time, via the right place” (see here, and here). Most big-data efforts focus on the last three, which mostly enhance efficiency, rather than on the “right ad,” which determines effectiveness.
Hunger for efficiency also drives the focus on targeting. Personalizing, when possible and affordable, can also make customers more willing to hear the message, increasing efficiency—and possibly effectiveness—by its greater relevance.
However, effectiveness is the far more crucial issue. If a message does not credibly persuade customers, it is of little use to the business, however “efficient.” And targeting and personalizing marketing typically do not identify which behaviors and attitudes to change, or how to change them. That more fundamental strategic goal requires deeper understanding of the unarticulated value to customers of improved experiences, and detailed, creative exploration of the business’s potential to profitably cause those improvements.
Reinforcing the predominantly near-term, efficiency-oriented focus of big data in digital marketing is the nature of the online sources typically available for big data. McKinsey estimated that, “so much data comes from short-term behavior, such as signing up for brand-related news and promotions on a smartphone or buying a product on sale. That short-term effect typically comprises 10 to 20 percent of total sales, while the brand…accounts for the rest.” The short-term nature of the readily available data biases marketers toward short-term transactional results.
Location-based, real-time big data—another advance in short-term promotion
It seems worth noting here that location-based marketing excites the digital marketing world, which sees in it the “next big thing.” Below are examples from Skyy Vodka and Starbucks:
As location data gets more accurate (problematic today), this approach will again improve promotional efficiency. In one illustrative test recounted in Ad Age, Brown-Forman, supplier of Herradura tequila, teamed with Foursquare (a search-and-discovery mobile app that helps users find “perfect places [food, entertainment, etc.]”). Foursquare used Brown-Forman’s list of accounts where Herradura is sold to target mobile and other Herradura ads to consumers whose mobile devices were close to (or had been in) the shops, bars, or restaurants on that list. They saw a 23% increase in visits to those accounts, a positive signal.
Because big data was applied early by direct-marketing companies, big data today (as further illustrated above by advances in location-based marketing) works more like direct-response marketing than demand generation. The problem, as noted earlier, is that businesses more than ever also need the latter—demand-generating activity that creates loyalty, and thus the behavioral changes that yield long-term growth. To some businesses, these may seem unneeded luxuries when cheap, automated big-data options—digital PreCogs—proliferate.
But most businesses do need to make these serious strategic investments, in addition to and complementary with big-data analytics. Having digitally captured countless petabytes of data describing Kim’s every shopping and buying action, managers now need to spend time with Kim, learning about her usage of that apparel. What were her experiences before and during usage of those shoes, the belt, and the other items? And what of her daughter’s experiences with the backpack? What was imperfect, what could better experiences be, what would an improved, superior value proposition look like, and what would it take to provide and communicate that proposition effectively and profitably? These intensively customer-focused actions can enable the discovery and activation of powerful insights for profitably influencing the behavior of customers (and others), a key basis for generating major, profitable growth over time.
* * *
As mentioned above, this blog series will expand on these concerns about the way big data analytics has evolved for use in growth strategy, including digital marketing, and on the solutions recommended above for marketers and businesses, including how those solutions apply to most businesses.
Make Value Propositions the Customer-Focused Linchpin to Business Strategy
We suggest that businesses should be understood and managed as integrated systems, focused single-mindedly on one thing – profitably delivering superior Value Propositions – what we call delivering profitable value (DPV). Most, however, are not. Some readers may assume this discussion can only be a rehash of the obvious – surely everyone ‘knows’ what Value Propositions (VPs) are and why they matter. But we suggest that most applications of the VP concept actually reflect a fundamentally poor understanding of it, and fail to get the most out of it for developing winning strategies. In this post I’ll summarize 4 areas where businesses can improve on delivering profitable value, using the VP concept far more effectively – as the customer-focused linchpin to your strategy.
Delivering Profitable Value – Let’s first recap the key components of this approach:
Real & Complete Value Proposition – A key element of strategy; internal doc (not given to customers); as quantified as possible; makes 5 choices (discussed in depth here):
Deliver the chosen VP – A real VP identifies what experiences to deliver, not how; so manage each business as a Value Delivery System with 3 integrated high-level functions:
Profitable Value? – If customers conclude that a VP is superior to the alternatives, it generates revenues; if the cost of delivering it is less than those revenues, then the business creates profit (or shareholder wealth) – thus, it is delivering profitable value.
* * * * *
4 areas where many businesses can improve on delivering profitable value:
Now let’s consider each of these 4 areas in more detail:
The table below summarizes some common misperceptions about Value Propositions, followed by some discussion of the first two.
Of these misperceptions, the first two are perhaps most fundamental, being either:
It’s not your Elevator Speech – a VP is strategy, not execution – For much greater strategic clarity and cross-functional alignment, avoid the common misunderstanding that equates a VP with messaging. Execution, including messaging, is of course important. A VP, as part of your strategy, should obviously drive execution, including messaging; but strategy and execution are not the same thing. If you only articulate a message, without the guiding framework of a strategy, you get a (partial) execution – the communication element – of some unidentified strategy. But you forgot to develop and articulate the strategy! The execution might still work, but based more on luck than on an insightful, fact-based understanding of customers and your market.
This common reduction of a VP to mere messaging not only confuses execution with strategy, but also addresses only one of the two fundamental elements of execution. That is, a VP must be not only Communicated, but also Provided – made to actually happen, for example via your products and services. If customers buy into your messaging – the communication of your VP – but your business does not actually Provide that VP, customers might notice (at least eventually). Some businesses attempt this approach – promising great things, but not making good – and may even get away with it for a limited time; but a sustainable business obviously requires not only promising (Communicating) but actually Providing what’s promised.
And it’s not about us – focus VPs on customers, not our products, services, etc. – The other common misuse of the VP concept starts by treating it (rightly) as a strategic decision and internal document. But then (again missing the point) such a so-called VP is focused primarily on us, our internal functions and assets, rather than on the customer and resulting experiences we will deliver to them.
Here it’s helpful to recall the aphorism quoted by the great Marketing professor Ted Levitt, that people “don’t want quarter-inch drill bits, they want quarter-inch holes.” A real VP is focused on detailed description of the ‘hole’ – the resulting experiences due to using the drill bit – not on description of the drill bit. Of course, the drill bit is very important to delivering the VP, since the customer must use the drill bit, which must have the optimal features and performance attributes, to get the desired quarter-inch hole. But first, define and characterize the VP, in adequately detailed, measurable terms; then separately determine the drill-bit characteristics that will enable the desired hole.
It is impossible to link Providing and Communicating value without a defined VP. Even with a chosen VP, however, it is vital to explicitly link its resulting experiences to the requirements for Providing it (e.g., product and service) and for Communicating it. Companies can improve market share and profitability by rigorously defining the VP(s) they aspire to deliver and then rigorously linking them to the Providing and Communicating processes. Failure to make the right links yields a good idea, not well implemented. (See more discussion of this Value Delivery framework here.)
Most businesses are in a value delivery chain – simple, or long and complex, e.g.:
Many companies survey their immediate, direct customers, asking them what they most value. Less often, they may ask what that customer thinks players further down the chain want. In either case, they rely for insight on that next customer, who may or may not be knowledgeable about others in the chain. There is often great value in exploring entities further down the chain, to understand how each link can provide value to other links, including final customers – often with important implications for the manufacturer, among others. (See more discussion of Value Delivery Chains here.)
Businesses often conduct research that essentially asks customers, in various forms, to specify their needs. A limitation of such research is that customers often filter their answers through what they believe are the supplier’s capabilities. We believe a better way is to deeply study what entities (at several links in the chain) actually do. First, capture a virtual “Video One,” documenting current experiences, including how an entity uses relevant products/services. Then create “Video Two,” an improved scenario enabled by potential changes by the business in benefits provided and/or price, which would deliver more value to the entity than in Video One. Then construct a third virtual video capturing competing alternatives. Finally, extract the new, superior VP implied by this exploration. The results come much closer to a winning VP than asking customers what they want. (See here for more discussion of creatively exploring the market using this methodology.)
In his recent interview with Michael Lanning (shared in the previous post of this Blog), Brian Carroll asked Mike about the role of Value Props in B2B strategy; Brian wrote about this conversation in the B2B Lead Blog.
Recently Michael Lanning was interviewed by Brian Carroll, Executive Director, Revenue Optimization, MECLABS Institute, for the MECLABS MarketingExperiments Blog. Brian asked Mike, as the original creator of the Value Proposition concept in 1983 while a consultant for McKinsey & Company, about the evolution of this concept over the past three decades. Their discussion, “Real Value Props, Direct from the Source,” is here.
This Blog aims to help illuminate what makes for winning and losing business strategies, discussing what we hope are interesting and relevant examples. We will especially view these topics through the lens that we––the DPV Group, LLC––call ‘Delivering Profitable Value (DPV),’ which contends that strategy should focus single-mindedly on profitable, superior ‘Value Delivery.’ That is, sustainable business success, while elusive and subject to good and ill fortune, can most realistically be achieved by a concentrated, deliberate effort to creatively discover and then profitably Deliver (i.e., Provide and Communicate) one or more superior Value Propositions to customers.
While many would perhaps say this idea is ‘obvious,’ most businesses do not actually develop or execute their strategies based on this understanding of the central role of the Value Proposition. A Value Proposition should not be a slogan or marketing device for positioning products/services, but rather the driving, central choice of strategy – fundamental to success or failure of the business. In this Blog we will try to bring that idea further to life for a variety of industries and markets.
As background, Michael Lanning first created and coined some of these concepts, including the ‘Value Proposition,’ the ‘Value Delivery System,’ and the related concept of ‘Value Delivery.’ He did so (after his first career, in Brand Management at Procter & Gamble) while a consultant for McKinsey & Company, working closely with Partner Ed Michaels in Atlanta in the early 1980s.
In life after McKinsey, he and his colleagues built on those seminal concepts – first with Lynn Phillips, then a professor at the Stanford and Berkeley business schools, and later with his DPV Group colleagues, including long-time senior P&G manager Helmut Meixner, former McKinsey consultant and long-time paper-industry executive Jim Tyrone, and others. The expanded concepts of Value Delivery include the Customer’s Resulting Experience and the notion that organizations, rather than being Value-Delivery-Focused, can be understood as ‘Internally-Driven’ and/or ‘Customer-Compelled.’
Today, the DPV Group tries to help clients appreciate, internalize, and apply these Value Delivery strategy concepts, to pursue profitable, breakthrough growth in a wide range of B2B and B2C businesses. We hope that this Blog will also contribute to that goal.