It’s hard to believe that ChatGPT is only a year old. The number of exciting new product launches over the past 12 months has been astonishing, and there is no sign of slowing down. In fact, quite the opposite. Earlier in November, OpenAI hosted DevDay, where the company announced extensive offerings across B2C and B2B markets. Cohere has doubled down on its knowledge search capabilities and private deployments. Amazon Web Services launched PartyRock, its no-code gen AI app-building playground.

Generative AI Trends for 2024 You Can Expect to See

We believe that last month’s activity sets the stage for 2024 in the gen AI space, with six major trends playing out across it. While the technology’s possibilities continue to grow, we believe there are four principles for CEOs to consider as they drive their gen AI agendas. These principles draw from our experience building gen AI applications with our clients throughout the year, as well as decades of delivering digital and analytics transformations.

Be Intentional: Set Gen AI Strategy Top-Down

Gen AI is a gold rush. Everyone from shareholders to employees to boards is scrambling to deploy the latest and most powerful gen AI tools, and many large organizations have more than 150 gen AI use cases on their backlogs. While we share their excitement and admire their ambition, allowing dozens of gen AI projects to spawn across an organization puts at-scale value creation at risk.

With recent developments in the gen AI space, the proliferation of use cases and opportunities will continue to split the already divided attention of leadership teams. C-suites must bring focus with a top-down gen AI strategy, constantly asking how the technology can create enduring strategic distance between the organization and its competitors. Here are some examples from first movers. Smart organizations are taking a 2×2 approach: identifying two fast use cases to register quick wins and excite the organization, while working on two slower, more transformational use cases that will change day-to-day business operations.

Reimagine Entire Domains Rather Than Isolated Use Cases

During 2023, most organizations began experimenting with gen AI, building one-off prototypes and buying off-the-shelf solutions. Yet, as these solutions are rolled out to end users, organizations are struggling to capture value. For example, some organizations that invested in GitHub Copilot have yet to figure out how that value flows back to the business.

Organizations need to reframe the opportunity from isolated use cases to the full software delivery lifecycle. Scrum teams need to commit to shipping more product features, or sales teams need to offer more competitive pricing to win more business. Stopping at buying a shiny new tool means the productivity gains will not translate into bottom-line gains.

This often means reimagining entire workflows and domains, which serves two purposes: 1) it creates a more seamless end-user experience by avoiding point solutions; and 2) it lets organizations more easily track value against clear business outcomes. For example, an insurer we worked with is reimagining its end-to-end claims process, from first notice of loss to payment. For each step along the way, the insurer has identified gen AI, digital, and analytics opportunities, while never losing sight of the claims adjuster’s experience. Ultimately, this comprehensive approach made a step-change impact on end-to-end handling time.
Buy Selectively, Build Strategically

Matching the pace of innovation, many new startups and software offerings are entering the market, leaving enterprises with a familiar question: “Buy or build?” On the “buy” side, organizations are wary of investing in capabilities that will eventually be available for a fraction of the cost. They are also skeptical of off-the-shelf solutions, unsure whether the software will perform at scale without significant customization. As these solutions mature and prove their value, “buy” strategies will continue to play a central role in any gen AI strategy.

Meanwhile, some organizations find compelling business cases to “build.” These players start by identifying use cases that create strategic competitive advantages over their peers by compounding existing strengths in domain expertise, workflow integration, or regulatory know-how. For example, deploying gen AI to accelerate drug discovery has become standard in the pharmaceutical industry.

Additionally, organizations are investing in data and IT infrastructure to enable their portfolio of gen AI use cases. Many organizations have made little to no investment in unstructured data governance. Now is the time.

Build Products, Not Proofs of Concept (POCs)

With the new tooling available, a talented engineer can build a proof of concept over a weekend. In some cases, this might be sufficient to serve an enterprise need, for example a summarization chatbot (a minimal sketch of such a POC appears at the end of this piece). However, for most use cases in a large enterprise context, proofs of concept are not sufficient. They do not scale well into production, and their performance degrades without the appropriate engineering and experimentation.

At OpenAI’s DevDay, engineers demonstrated how hard it is to turn a POC into a production-grade product. Initially, a demo POC achieved only 45% accuracy on a retrieval task. After a few months and numerous experiments (e.g., fine-tuning, re-ranking, metadata tagging, data labeling, model self-assessment, risk guardrails), the engineers reached 98% accuracy. The pipeline sketch at the end of this piece shows where those levers sit.

Implications of Generative AI Trends for 2024

This has two implications. First, organizations cannot seek near-perfection on every use case. They need to be selective about when it is worthwhile to invest scarce engineering talent in developing high-performance gen AI applications. In some situations, 45% accuracy may be sufficient to deliver business benefits.

Second, organizations need to scale their gen AI capabilities to meet their ambitions. Most organizations have identified hundreds of gen AI use cases, so they are turning to reusable code components to accelerate development. Dedicated engineers, often in a Center of Excellence (COE), codify best practices into these components, allowing subsequent gen AI efforts to build on the lessons learned from pioneering projects. We have seen these components accelerate delivery by 25% to 50% (a sketch of one such component closes this piece).

Throughout the past year, there has been an endless stream of gen AI news and hype. The coming year will likely be similar.
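To make concrete how little code a weekend POC can require, here is a minimal sketch of a summarization chatbot. It assumes the OpenAI v1 Python SDK; the model name, prompt, and input file are illustrative placeholders, not recommendations.

```python
# Minimal summarization-chatbot POC sketch (assumes the OpenAI v1 Python SDK).
# The model name, prompt wording, and input file are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize(document: str, question: str | None = None, model: str = "gpt-4o-mini") -> str:
    """Summarize a document, optionally focused on a user question."""
    instruction = "Summarize the document below in five bullet points."
    if question:
        instruction = f"Answer the question using only the document below.\nQuestion: {question}"
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "You are a concise enterprise assistant."},
            {"role": "user", "content": f"{instruction}\n\n---\n{document[:12000]}"},  # naive truncation
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("quarterly_report.txt") as f:  # hypothetical input document
        print(summarize(f.read()))
```

The limits of the sketch are the point: naive truncation, no retrieval, no evaluation, and no guardrails, which is exactly the gap between a POC and a product.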
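The DevDay example names the levers the engineers pulled (metadata tagging, re-ranking, self-assessment, guardrails) without showing where they sit. The sketch below is our own hedged reconstruction of a typical retrieval pipeline, not the DevDay team's code; every callable passed in (embedding, search, re-rank, generate, guardrails) is a hypothetical stand-in for whatever stack an organization actually runs.

```python
# Sketch of a hardened retrieve-then-generate pipeline: metadata filtering,
# re-ranking, and guardrails layered around basic vector search.
# All callables are hypothetical stand-ins supplied by the caller.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Chunk:
    text: str
    metadata: dict = field(default_factory=dict)  # e.g. {"doc_type": "policy", "year": 2023}

def answer(
    query: str,
    doc_type: str,
    embed: Callable[[str], List[float]],                     # embedding model
    search: Callable[[List[float], int], List[Chunk]],       # vector-store query
    rerank: Callable[[str, List[Chunk]], List[Chunk]],       # cross-encoder or LLM re-ranker
    generate: Callable[[str, List[str]], str],               # grounded generation
    passes_guardrails: Callable[[str, List[Chunk]], bool],   # self-assessment / risk checks
    top_k: int = 50,
    keep: int = 5,
) -> str:
    candidates = search(embed(query), top_k)                    # 1. broad recall
    candidates = [c for c in candidates                         # 2. metadata tagging pays off here:
                  if c.metadata.get("doc_type") == doc_type]    #    filter to the relevant document type
    candidates = rerank(query, candidates)[:keep]               # 3. keep only the best passages
    draft = generate(query, [c.text for c in candidates])       # 4. answer grounded in those passages
    if not passes_guardrails(draft, candidates):                # 5. refuse rather than hallucinate
        return "I can't answer that reliably from the available documents."
    return draft
```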
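As an illustration of the kind of reusable component a COE might publish, here is a minimal sketch: a vetted prompt template bundled with retry and output-validation policy behind a single call, so application teams do not re-solve those problems. All names and defaults are assumptions for illustration, not drawn from any specific organization.

```python
# Sketch of a reusable gen AI component: vetted prompt template plus retry and
# output-validation policy. The model client is injected, not assumed.
import time
from typing import Callable

class GenAITask:
    """A reusable unit a COE might publish for application teams."""

    def __init__(
        self,
        template: str,                      # reviewed, versioned prompt template
        call_model: Callable[[str], str],   # shared model client provided by the platform team
        validate: Callable[[str], bool],    # output check, e.g. schema, length, or policy
        max_retries: int = 2,
    ):
        self.template = template
        self.call_model = call_model
        self.validate = validate
        self.max_retries = max_retries

    def run(self, **kwargs) -> str:
        prompt = self.template.format(**kwargs)
        for attempt in range(self.max_retries + 1):
            output = self.call_model(prompt)
            if self.validate(output):
                return output
            time.sleep(2 ** attempt)        # back off before retrying
        raise ValueError("Model output failed validation after retries")

# Example: a hypothetical claims-summary task reused as-is across applications
claims_summary = GenAITask(
    template="Summarize this claim for an adjuster in under 120 words:\n{claim_text}",
    call_model=lambda prompt: "...",        # placeholder; swap in the real client
    validate=lambda text: 0 < len(text.split()) <= 120,
)
```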