In recent weeks, AI professionals had the privilege of attending hands-on workshops at the headquarters of two Silicon Valley giants, Salesforce and Google. These experiences offered a firsthand look at the contrasting approaches the two companies are taking to bring enterprise-grade large language model (LLM) applications to scale. As they immersed themselves in the cutting-edge world of AI development, they felt a sense of excitement and awe at the history unfolding around them.

The workshops provided a fascinating glimpse into the future of enterprise software, where AI is not just a buzzword but a transformative force reshaping how businesses operate. Salesforce and Google, each with their own strengths and philosophies, are at the forefront of this shift, pushing the boundaries of what's possible with LLMs and retrieval-augmented generation (RAG). As they worked through the hands-on exercises and engaged with the people behind these products, the attendees realized they were witnessing a pivotal moment in the history of Silicon Valley and of computing.

Salesforce: Low-Code, Business User-Friendly

At the "Build the Future with AI and Data Workshop," held at Salesforce Tower in downtown San Francisco, the focus was on empowering business users with a low-code, clicks-not-code approach. The workshop, attended by around 100 people, took place in a ballroom-sized auditorium. Each attendee received a free instance of a generative AI-enabled org, pre-populated with a luxury travel destination application, that expired after five days.

Data Cloud: Lots of Clicks

The workshop began with setting up data ingestion and objects to link AWS S3 buckets to Salesforce's Data Cloud. The process was intricate, introducing new nomenclature reminiscent of SQL views nested within views, and it required a considerable number of setup steps before reaching Prompt Builder. It should be noted that users working with Einstein Studio for the first time don't normally need to do this Data Cloud setup; the workshop included it so that Data Cloud embeddings could later be used in a Prompt Builder retrieval.

Prompt Builder: Easy to Use

Prompt Builder was the highlight of the workshop. It supports template variables and several prompt types, including the intriguing Field Prompt, which attaches a prompt to a field. When editing a record, clicking the wizard button on that field executes the prompt and fills in the field automatically. This feature has the potential to greatly enrich data, with numerous use cases across industries.

Integrating Flow and Apex with Prompt Builder demonstrated the platform's flexibility. They created an Apex class using Code Builder that returned a list Prompt Builder could use to formulate a reply (a sketch of this kind of class appears after the roadmap notes below). The seamless integration of these components showcased Salesforce's commitment to a cohesive, user-friendly experience.

Einstein Copilot, Salesforce's AI assistant, showed strong out-of-the-box capabilities when extended with custom actions. By creating a Flow and wiring it into a custom action, users could invoke Einstein Copilot to assist with a variety of tasks.

A Warmly Received Roadmap

Salesforce managers, including SVP of Product Management John Kucera, shared the generative AI roadmap during a briefing session. They emphasized upcoming features such as Recommended Actions, which package prompts into buttons, and improved context understanding for Einstein Copilot.
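For readers curious what that Apex integration looks like, here is a minimal sketch of the kind of invocable class built in the workshop: a Flow (and, through it, Prompt Builder) calls it and merges the returned list into a prompt. The object and field names (Destination__c, Average_Rating__c) are hypothetical stand-ins for the luxury travel sample data, not the workshop's actual schema.

```apex
// Minimal invocable Apex class: returns a list of destination names
// that a Flow or Prompt Builder template can merge into a prompt.
// Destination__c and Average_Rating__c are illustrative, not real schema.
public with sharing class TopDestinationProvider {

    @InvocableMethod(label='Get Top Destinations'
                     description='Returns destination names to ground a prompt')
    public static List<Results> getTopDestinations(List<Requests> requests) {
        Integer maxRows = requests[0].maxRows == null ? 5 : requests[0].maxRows;

        // Query the hypothetical custom object from the sample travel app
        List<Destination__c> rows = [
            SELECT Name
            FROM Destination__c
            ORDER BY Average_Rating__c DESC
            LIMIT :maxRows
        ];

        Results res = new Results();
        res.destinationNames = new List<String>();
        for (Destination__c d : rows) {
            res.destinationNames.add(d.Name);
        }
        return new List<Results>{ res };
    }

    public class Requests {
        @InvocableVariable(label='Max rows')
        public Integer maxRows;
    }

    public class Results {
        @InvocableVariable(label='Destination names')
        public List<String> destinationNames;
    }
}
```

From the business user's point of view, none of this code is visible: the class simply shows up as an action that can be dropped into a Flow and referenced from a prompt template.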
The atmosphere in the room was warm, with genuine excitement and a sense of collaboration between Salesforce staff and attendees. The workshop positioned Salesforce's AI solution as an alternative to hiring an AI programmer and building AI orchestration with tools like those used in the Google workshop. Salesforce's approach centers on a user-friendly interface for setting up data sources and custom actions, letting users apply AI without writing code. This low-code philosophy aims to democratize AI, making it accessible to a broader range of business users.

For organizations already invested in the Salesforce ecosystem, the platform's embedded AI capabilities offer a compelling way to build expertise and take advantage of Data Cloud. Salesforce's commitment to rapidly rolling out embedded AI enhancements, all building on the familiar Admin user experience, makes it an attractive option for businesses that want to adopt AI without the steep learning curve associated with coding.

While enthusiasm among attendees was palpable, the workshop also highlighted the complexity of setting up data sources and the challenge of learning new nomenclature. As Salesforce continues to refine its AI offerings, striking the right balance between flexibility and ease of use will be crucial to widespread adoption.

Google: Engineering-Centric, Code-Intensive

The "Build LLM-Powered Apps with Google" workshop, held on the Google campus in Mountain View, attracted around 150 attendees, primarily developers and engineers, who met in a large meeting room with circular tables. The event kicked off with a keynote presentation and detailed descriptions of Google's work on retrieval-augmented generation (RAG) pipelines. Attendees then moved into a hands-on exercise, building a RAG database for an "SFO Assistant" chatbot designed to help passengers at San Francisco International Airport.

Running Postgres and pgvector with BigQuery

Using Google Cloud Platform, they created a new VM running Postgres with the pgvector extension, then executed a series of commands to load the SFO database and connect Gemini to it (a minimal SQL sketch of this retrieval step appears at the end of this section). The workshop provided step-by-step guidance, with Google staff helping when needed. Ultimately, they successfully ran a chatbot backed by the RAG database.

The workshop also showcased BigQuery's ability to generate prompts at scale through SQL statements. By crafting SQL queries that combined prompt engineering with retrieved data, attendees learned how to create personalized content, such as emails, for a whole group of customers in a single step (a hedged sketch of this pattern also appears at the end of this section). This demonstration highlighted the potential for efficient, large-scale content generation with Google's tools.

Gemini Assistant

One of the most exciting discoveries of the workshop was the Gemini Assistant for BigQuery, a standout IT companion chatbot tailored to the GCP ecosystem. Comparable to GitHub Copilot Chat or ChatGPT Plus, Gemini Assistant demonstrated a deep understanding of GCP and the ability to generate code snippets in various programming languages. What distinguishes Gemini Assistant is its strong grounding in GCP knowledge, enabling it to provide contextually relevant answers.
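As a rough illustration of the retrieval side of the SFO Assistant exercise, here is a minimal pgvector sketch. It assumes a Postgres instance with the extension installed and embeddings produced by a Vertex AI text-embedding model; the table name, column names, and 768-dimension size are illustrative, not the workshop's actual schema.

```sql
-- Enable the pgvector extension on the Postgres instance
CREATE EXTENSION IF NOT EXISTS vector;

-- A table holding chunks of SFO reference content and their embeddings
-- (768 dimensions is typical of Vertex AI text-embedding models)
CREATE TABLE sfo_documents (
  id        bigserial PRIMARY KEY,
  content   text,
  embedding vector(768)
);

-- At query time, embed the passenger's question with the same model and
-- fetch the closest chunks by cosine distance to ground the chatbot.
-- The bracketed literal is a placeholder for the real query embedding.
SELECT content
FROM sfo_documents
ORDER BY embedding <=> '[0.012, -0.034, 0.051]'::vector
LIMIT 5;
```

In the workshop flow, the chatbot embeds the user's question, runs a query like the last one, and passes the retrieved chunks to Gemini as context for the answer.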
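The prompts-at-scale pattern can be sketched with BigQuery ML's ML.GENERATE_TEXT, assuming a remote Gemini model has already been registered in the dataset. The project, dataset, model, and table names below are hypothetical, and the workshop's actual queries may have differed.

```sql
-- One SQL statement drafts a personalized email for every customer row.
-- `my_project.sfo_demo.gemini_model` is assumed to be a remote model pointing
-- at a Gemini endpoint; the column fed to the model must be named `prompt`.
SELECT
  customer_name,
  ml_generate_text_llm_result AS draft_email
FROM ML.GENERATE_TEXT(
  MODEL `my_project.sfo_demo.gemini_model`,
  (
    SELECT
      customer_name,
      CONCAT(
        'Write a short, friendly email inviting ', customer_name,
        ' to try the new SFO Assistant chatbot. Mention their home airport: ',
        home_airport, '.'
      ) AS prompt
    FROM `my_project.sfo_demo.customers`
  ),
  STRUCT(0.4 AS temperature, 256 AS max_output_tokens, TRUE AS flatten_json_output)
);
```

Because the generation call is just another table-valued function, its output can be joined, filtered, or written back to a table like any other query result.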