The Path to Dominant Design in Generative AI | by Geoffrey Williams | May, 2024

Musings on dominant design and the strategic factors driving the success or failure of generative AI technology in the race for dominance

Towards Data Science
Source: Image by the author with DALL-E

I. Introduction

The struggle to achieve dominant design within the lifecycle of technology innovation has been the topic of intense study over the last half century. These battles play out within research and development (R&D) labs, in discussions on strategy around commercialization and marketing, and in the media and consumer space, but ultimately in the hearts and minds of the customers who turn the tide of market share and product acceptance through their everyday choices among competing capabilities. It is why history remembers VHS and not Betamax, why we type on a QWERTY keyboard, and how the industry changed after the introduction of Google’s search engine or Apple’s iPhone. The emergence of ChatGPT was a signal to the market that we are in the midst of yet another battle for dominant design, this time involving generative AI.

Generative AI, capable of creating new content and performing complex actions, has the potential to revolutionize industries by enhancing creativity, automating tasks, and improving customer experiences. As a result, organizations are investing rapidly in this ecosystem of capabilities in order to remain relevant and competitive. As business leaders, government agencies, and investors make decisions on technologies within the rapidly evolving field of generative AI, from chips to platforms to models, they should keep in mind the concept of dominant design and how maturing technologies eventually coalesce around it.

II. The Battle for Dominant Design

The concept of dominant design was first articulated in the Abernathy-Utterback Model[i][ii] in 1975, although the term was not formally coined until twenty years later[iii]. The concept went on to become so foundational that it continues to be taught to MBA students in schools across the country. At its core, the model describes how a product’s design and manufacturing processes evolve over time through three distinct phases: a Fluid Phase, characterized by significant experimentation in both product design and process improvement as different approaches are developed and refined to meet market needs; a Transitional Phase, in which a dominant product design begins to emerge and the market gradually shifts toward product standardization while substantial process innovation continues; and a Specific Phase, marked by standardization across both product and process designs.

This work has since been expanded upon, most prominently by Fernando Suarez, to account for technological dominance dynamics that precede the introduction of a product to the market and can serve as a roadmap to navigate this process. In his integrative framework for technological dominance[iv], Suarez illustrates how product innovation progresses through five phases, covered below.

The Five Phases of Technological Dominance

  1. R&D Buildup: Begins as a mix of companies starts conducting applied research related to an emerging technological area. Given the diversity of technological trajectories, the emphasis is on technology and technological talent.
  2. Technical Feasibility: The creation of the first working prototype prompts all participating firms to evaluate their current research and their competitive positioning (e.g., continued independent pursuit, partnership/teaming, exit). Competitive dynamics emphasize firm-level factors and technological superiority, as well as regulation if applicable to the market.
  3. Creating the Market: The launch of the first commercial product sends a clear signal to the market and irreversibly changes the emphasis from technology to market factors. In this phase, technological differences between products become increasingly less important and strategic maneuvering by firms within the ecosystem becomes most important.
  4. The Decisive Battle: The emergence of an early front-runner from among several competitors with sizeable market share signals this phase. Of note, network effects (e.g., the ecosystem built around a product) and switching costs in the environment begin to have a stronger impact. In addition, the size of the installed base and complementary assets become critical as mainstream market consumers, who seek reliability and trustworthiness over performance and novelty, determine the winner(s).
  5. Post Dominance: The market adopts one of the alternative designs, which becomes the clear dominant technology, supported by a large installed base of users. This base serves as a natural defense against new entrants, especially in markets with strong network effects and switching costs. This phase continues until a new technological innovation emerges to replace the current one, restarting the cycle.

A firm’s success in navigating these phases to achieve technological dominance is influenced by a number of firm-level factors (e.g., technological superiority, complementary assets, installed user base, strategic maneuvering) and environmental factors (e.g., industry regulation, network effects, switching costs, appropriability regime, market characteristics). Each of these factors has a different level of importance in a given stage, and actions taken too early or too late in the process can have muted or unintended effects. Work has also been conducted to consider how sequential decisions on three key aspects (i.e., the market, the technology, and complementary assets) can help determine success or failure in the battle for dominance[v]. The first decision concerns the market: correctly visualizing it in order to drive the actions needed to achieve a superior installed user base. The second concerns whether the market standard is government or market driven, and includes weighing a strategy of proprietary control against one of openness. The third and final decision concerns the strategy to cultivate access to the complementary assets required to be competitive in a mainstream market.

An additional factor that one must consider is technological path dependency and the impact of previous outcomes (e.g., the cloud wars, AI chip investments) on the course of future events. Modern, complex technologies often operate within a regime of increasing returns to adoption: the more a technology is adopted, the more beneficial and entrenched it becomes[vi]. Within this context, small historical events can have a strong influence on which technology ultimately becomes dominant, despite the potential advantages of competing technologies. This results from several reinforcing mechanisms, such as learning effects, coordination effects, and adaptive expectations, that make switching to another technology costly and complex. Moreover, the transition of enterprise-scale generative AI from R&D to commercialized product with business and operational value is intertwined with cloud infrastructure dominance[vii], driven by the need for a common set of capabilities coupled with large-scale computational resources. To provide such capabilities, hyperscalers have seamlessly integrated cloud infrastructure, models, and applications into the cloud AI stack — accelerating the creation of complementary assets. It is through the lens of these considerations that the developments in generative AI are examined.

III. ChatGPT: The Shot Heard Around the World

The emergence of ChatGPT in November 2022 sent a clear signal that large language models (LLMs) had practical, commercial application on a broad scale. Within weeks, the term generative AI was known across generations of users, technical and non-technical alike. Even more profound was the realization by other participants in the market that they needed to either initiate or significantly accelerate their own efforts to deliver generative AI capability. This marked the transition from Phase 2: Technical Feasibility to Phase 3: Creating the Market. From there, the race was on.

In fairly quick succession, major technology providers began releasing their own generative AI platforms and associated models (e.g., Meta AI — February 2023, AWS Bedrock — April 2023, Palantir Artificial Intelligence Platform — April 2023, Google Vertex AI — June 2023, IBM — July 2023). Where technology and technological talent were of greatest importance in proving technical feasibility, this has given way to strategic maneuvering as firms work to position themselves for growth with a focus on building up the installed user base, developing complementary assets and ecosystems, and enhancing network effects. This is driving the current period of rapid expansion of strategic partnerships across hyperscaler ecosystems and with key AI providers, as organizations seek to form the alliances that will help them weather the decisive battle for dominance. We are also seeing hyperscalers leverage their existing cloud infrastructure assets to pull their generative AI assets through regulatory hurdles at an accelerated pace in niche markets with little to no competition.

As this plays out, organizations should remain cognizant of several risks. For one, AI companies that invest in alternative approaches that do not become the dominant design may find themselves at a disadvantage; adapting to or adopting the dominant design could require significant shifts in strategy, development, and investment beyond current sunk costs. Additionally, competition will continue to intensify as the generative AI market’s potential becomes more evident, increasing pressure on all market participants and eventually leading to consolidation and exits. Lastly, the pervasiveness of AI is leading governments and institutions to update regulatory frameworks to promote security and the responsible deployment of AI. This introduces added uncertainty, as organizations may face new compliance requirements that can be resource-intensive to implement. In heavily regulated markets, this may prove a barrier to offering generative AI tools at all if those tools do not meet baseline requirements.

However, this period is not without opportunities. As the market begins to identify front-runners in the battle for dominant design, organizations that can quickly align with the dominant design(s) or innovate within those frameworks are better positioned to capture significant market share. New niches can also emerge within the AI field even as the market standardizes around a dominant design; companies that identify and capitalize on these niches early can establish a strong presence and enjoy first-mover advantages.

IV. Indicators of Technological Dominance

As the race for dominant design continues, one can expect several indicators to emerge that can help predict which technologies or companies might establish market leadership and set the standard for generative AI applications in the evolving landscape.

  1. Leadership Emergence in Market Share: AI companies and platforms able to secure a large installed user base relative to the market may attain front-runner status. This could be evidenced by widespread adoption of their platforms, increased user engagement, growing sales figures, or clients within a specific market. An early lead in market share can be a significant indicator of potential dominance.
  2. Development and Expansion of Ecosystems: Strong, expansive ecosystems of complementary technologies can enhance the value of a generative AI platform. The strength of these ecosystems often plays a crucial role in the adoption and long-term viability of a technology.
  3. Switching Costs: Switching costs associated with moving away from one generative AI platform to another can deter users from moving to competing technologies, thereby strengthening the position of the current leader. These could include data integration issues, the need for retraining machine learning models, or contractual and business dependencies.
  4. Size of the Installed Base: A large installed base of users and solutions improves network effects and provides a critical mass that can attract further users due to perceived reliability, the support ecosystem, interoperability, and learning effects. This also activates the bandwagon effect, attracting risk-averse users who may otherwise avoid technology adoption[v].
  5. Reliability and Trustworthiness: Market sentiment regarding the reliability and trustworthiness of different generative AI technologies is a key signal. Mainstream consumers often favor reliability and trustworthiness over performance and novelty, so brands that are perceived as reliable and receive positive feedback for user support and robustness are likely to gain a competitive edge.
  6. Innovations and Improvements: Sustained investment in innovation within a company’s generative AI offerings can signal a bid for dominance. While the market may lean toward established, reliable technologies, continuous improvement and adaptation to user needs will be crucial for continued competitiveness.
  7. Regulatory Compliance and Ethical Standards: Companies and organizations that lead in developing ethically aligned AI in compliance with increasing regulations could be favored by the market, particularly in heavily regulated industries. This is especially important within the Federal market, where network accreditations and unique security requirements play an outsized role in the technologies that can be leveraged for operational value.

By monitoring these indicators, organizations can gain insights into which technologies might emerge as leaders in the generative AI space during the decisive battle phase. Understanding these dynamics is crucial when making investment, development, or implementation decisions on generative AI technologies.

V. Conclusion

Establishment of a dominant design in generative AI is an important step for market stability and industry-wide standardization, which will lead to increased market adoption and reduced uncertainty among businesses and consumers alike. Companies that can influence or adapt to emerging dominant designs will secure competitive advantages, establishing themselves as market leaders in the new technological paradigm. However, selecting a product ecosystem that ultimately does not become the standard will lead to diminishing market share and impose switching costs on firms that must then transition to the dominant design.

As the industry moves from the fluid to the specific, flowing with increasing viscosity toward a dominant design, strategic foresight and agility become more critical than ever if organizations intend to create value from and deliver impact with technology. The necessity to anticipate future trends and swiftly adapt to evolving technological landscapes means that organizations must stay vigilant and flexible, ready to pivot their strategies in response to new developments in AI technology and shifts in consumer demands. Businesses that can envision the trajectory of technological change and proactively respond to it will not only endure but also stand out as pioneers in the new era of digital transformation. Those that cannot, will be relegated to the annals of history.

All views expressed in this article are the personal views of the author.


[i] J. Utterback, W. Abernathy, “A dynamic model of process and product innovation,” Omega, Vol. 3, Issue 6. 1975

[ii] W. Abernathy, J. Utterback, “Patterns of Industrial Innovation,” Technology Review, Vol. 80, Issue 7. 1978

[iii] F. Suárez, J. Utterback, “Dominant Designs and the Survival of Firms,” Strategic Management Journal, Vol. 16, No. 6. 1995, 415–430

[iv] F. Suárez, “Battles for technological dominance: an integrative framework,” Research Policy, Vol. 33, 2004, 271–286

[v] E. Fernández, S. Valle, “Battle for dominant design: A decision-making model,” European Research on Management and Business Economics, Vol. 25, Issue 2. 2019, 72–78

[vi] W.B. Arthur, “Competing Technologies, Increasing Returns, and Lock-In by Historical Events,” The Economic Journal, Vol. 99, No. 394, 1989, 116–131

[vii] F. van der Vlist, A. Helmond, F. Ferrari, “Big AI: Cloud infrastructure dependence and the industrialisation of artificial intelligence,” Big Data & Society, January–March 2024, 1–16
