Technology Archives - gettectonic.com - Page 8
LLMs and AI

Large Language Models (LLMs): Revolutionizing AI and Custom Solutions

Large Language Models (LLMs) are transforming artificial intelligence by enabling machines to generate and comprehend human-like text, making them indispensable across numerous industries. The global LLM market is experiencing explosive growth, projected to rise from $1.59 billion in 2023 to $259.8 billion by 2030. This surge is driven by increasing demand for automated content creation, advances in AI technology, and the need for improved human-machine communication.

Several factors are propelling this growth, including advancements in AI and natural language processing (NLP), the availability of large datasets, and the rising importance of seamless human-machine interaction. Additionally, private LLMs are gaining traction as businesses seek more control over their data and greater customization. These private models provide tailored solutions, reduce dependency on third-party providers, and enhance data privacy. This guide will walk you through building your own private LLM, offering valuable insights for both newcomers and seasoned professionals.

What are Large Language Models?

Large Language Models (LLMs) are advanced AI systems that generate human-like text by processing vast amounts of data using sophisticated neural networks, such as transformers. These models excel at tasks such as content creation, language translation, question answering, and conversation, making them valuable across industries, from customer service to data analysis. LLMs are generally classified into three broad types.

LLMs learn language rules by analyzing vast text datasets, much as reading numerous books helps someone understand a language. Once trained, these models can generate content, answer questions, and engage in meaningful conversations. For example, an LLM can write a story about a space mission based on knowledge gained from reading space adventure stories, or it can explain photosynthesis using information drawn from biology texts.

Building a Private LLM

Data Curation for LLMs

Recent LLMs, such as Llama 3 and GPT-4, are trained on massive datasets—Llama 3 on 15 trillion tokens and GPT-4 on 6.5 trillion tokens. These datasets are drawn from diverse sources, including social media (140 trillion tokens), academic texts, and private data, with sizes ranging from hundreds of terabytes to multiple petabytes. This breadth of training enables LLMs to develop a deep understanding of language, covering diverse patterns, vocabularies, and contexts. Common data sources for LLMs include social media, academic texts, and private or proprietary datasets.

Data Preprocessing

After data collection, the data must be cleaned and structured. Key steps typically include cleaning, deduplication, filtering of low-quality content, and tokenization.

LLM Training Loop

Key training stages include large-scale pretraining on the curated corpus, followed by fine-tuning for specific tasks.

Evaluating Your LLM

After training, it is crucial to assess the LLM's performance using industry-standard benchmarks. When fine-tuning LLMs for specific applications, tailor your evaluation metrics to the task. For instance, in healthcare, matching disease descriptions with appropriate codes may be a top priority.

Conclusion

Building a private LLM provides unmatched customization, enhanced data privacy, and optimized performance. From data curation to model evaluation, this guide has outlined the essential steps to create an LLM tailored to your specific needs. Whether you're just starting or seeking to refine your skills, building a private LLM can empower your organization with state-of-the-art AI capabilities. For expert guidance or to kickstart your LLM journey, feel free to contact us for a free consultation.
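To make the preprocessing and training-loop steps above more concrete, here is a minimal, illustrative PyTorch sketch of a next-token training loop. It assumes PyTorch is installed; the tiny model, vocabulary size, and random token batches are placeholders standing in for a real curated, tokenized corpus, not a production recipe.

```python
# Minimal sketch of an LLM-style next-token training loop (illustrative only).
# Assumes PyTorch is installed; model size, vocabulary, and random "token"
# batches are placeholders for a real curated, tokenized corpus.
import torch
import torch.nn as nn

VOCAB_SIZE = 1000   # placeholder vocabulary size
CONTEXT_LEN = 32    # tokens per training example

class TinyLM(nn.Module):
    """Toy causal language model: embedding -> transformer encoder -> logits."""
    def __init__(self, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, VOCAB_SIZE)

    def forward(self, tokens):
        # Causal mask so each position only attends to earlier tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.encoder(self.embed(tokens), mask=mask)
        return self.head(hidden)

model = TinyLM()
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):  # real pretraining runs for many thousands of steps
    batch = torch.randint(0, VOCAB_SIZE, (8, CONTEXT_LEN + 1))  # fake token ids
    inputs, targets = batch[:, :-1], batch[:, 1:]               # shift by one
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, VOCAB_SIZE), targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if step % 20 == 0:
        print(f"step {step}: loss {loss.item():.3f}")
```

In practice, the batches would be streamed from the cleaned, deduplicated corpus described above, and checkpoints would feed the benchmark evaluation step.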

Generative AI Energy Consumption Rises

Generative AI Energy Consumption Rises, but Impact on ROI Unclear

The energy costs associated with generative AI (GenAI) are often overlooked in enterprise financial planning. However, industry experts suggest that IT leaders should account for the power consumption that comes with adopting this technology. When building a business case for generative AI, some costs are evident, like large language model (LLM) fees and SaaS subscriptions. Other costs, such as preparing data, upgrading cloud infrastructure, and managing organizational changes, are less visible but significant.

Generative AI Energy Consumption Rises

One often overlooked cost is the energy consumption of generative AI. Training LLMs and responding to user requests—whether answering questions or generating images—demands considerable computing power. These tasks generate heat and necessitate sophisticated cooling systems in data centers, which, in turn, consume additional energy. Despite this, most enterprises have not focused on the energy requirements of GenAI.

However, the issue is gaining attention at a broader level. The International Energy Agency (IEA), for instance, has forecast that electricity consumption from data centers, AI, and cryptocurrency could double by 2026. By that time, data centers' electricity use could exceed 1,000 terawatt-hours, equivalent to Japan's total electricity consumption. Goldman Sachs has also flagged the growing energy demand, attributing it partly to AI. The firm projects that global data center electricity use could more than double by 2030, fueled by AI and other factors.

ROI Implications of Energy Costs

The extent to which rising energy consumption will affect GenAI's return on investment (ROI) remains unclear. For now, the perceived benefits of GenAI seem to outweigh concerns about energy costs. Most businesses have not been directly impacted, as these costs tend to affect hyperscalers more. For instance, Google reported a 13% increase in greenhouse gas emissions in 2023, largely due to AI-related energy demands in its data centers. Scott Likens, PwC's global chief AI engineering officer, noted that while energy consumption isn't a barrier to adoption, it should still be factored into long-term strategies. "You don't take it for granted. There's a cost somewhere for the enterprise," he said.

Energy Costs: Hidden but Present

Although energy expenses may not appear on an enterprise's invoice, they are still present. Generative AI's energy consumption is tied to both model training and inference—each time a user makes a query, the system expends energy to generate a response. While the energy used for an individual query is minor, the cumulative effect across millions of users adds up. How these costs are passed to customers is somewhat opaque. Licensing fees for enterprise versions of GenAI products likely include energy costs, spread across the user base. According to PwC's Likens, the costs associated with training models are shared among many users, reducing the burden on individual enterprises. On the inference side, GenAI vendors charge for tokens, which correspond to computational power. Although increased token usage signals higher energy consumption, the financial impact on enterprises has so far been minimal, especially as token costs have decreased. The situation is a bit like buying an EV to save on gas, only to spend hundreds of dollars and hours of time at charging stations.
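To make the "cumulative effect across millions of users" concrete, the back-of-the-envelope sketch below multiplies an assumed per-query energy figure by an assumed query volume; both numbers are illustrative assumptions, not measured values from any vendor.

```python
# Back-of-the-envelope estimate of cumulative GenAI inference energy.
# The per-query figure and query volume are assumptions for illustration only;
# real values vary widely by model, hardware, and data center efficiency.
WH_PER_QUERY = 3.0            # assumed watt-hours per LLM query (illustrative)
QUERIES_PER_DAY = 10_000_000  # assumed daily query volume (illustrative)

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000
annual_gwh = daily_kwh * 365 / 1_000_000

print(f"Daily energy:  {daily_kwh:,.0f} kWh")
print(f"Annual energy: {annual_gwh:,.1f} GWh")
# A few watt-hours per query adds up to roughly 11 GWh per year at this scale,
# which is why hyperscalers feel these costs long before individual enterprises.
```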
Energy as an Indirect Concern

While energy costs haven't been top of mind for GenAI adopters, organizations may address the issue indirectly by focusing on other deployment challenges, such as reducing latency and improving cost efficiency. Newer models, such as OpenAI's GPT-4o mini, are more economical and have helped organizations scale GenAI without prohibitive costs. Organizations may also use smaller, fine-tuned models to decrease latency and energy consumption. By adopting multimodel approaches, enterprises can choose models based on the complexity of a task, optimizing for both speed and energy efficiency.

The Data Center Dilemma

As enterprises weigh GenAI's energy demands, data centers face the challenge head-on, investing in more sophisticated cooling systems to handle the heat generated by AI workloads. According to the Dell'Oro Group, the data center physical infrastructure market grew in the second quarter of 2024, signaling the start of the "AI growth cycle" for infrastructure sales, particularly thermal management systems. Liquid cooling, which is more efficient than air cooling, is gaining traction as a way to manage the heat from high-performance computing. This method is expected to see rapid growth in the coming years as demand for AI workloads continues to increase.

Nuclear Power and AI Energy Demands

To meet AI's growing energy demands, some hyperscalers are exploring nuclear energy for their data centers. AWS, Google, and Microsoft are among the companies exploring this option, with AWS acquiring a nuclear-powered data center campus earlier this year. Nuclear power could help these tech giants keep pace with AI's energy requirements while also meeting sustainability goals. That said, tying broader AI accessibility to the construction of more nuclear power plants may alienate some of the technology's supporters.

As GenAI continues to evolve, both energy costs and efficiency are likely to play a greater role in decision-making. PwC has already begun including carbon impact as part of its GenAI value framework, which assesses the full scope of generative AI deployments. "The cost of carbon is in there, so we shouldn't ignore it," Likens said.

AI Agents and Digital Transformation

In the rapidly developing world of technology, artificial intelligence (AI) is revolutionizing industries and reshaping how we interact with digital systems. One of the most promising advancements within AI is the development of AI agents. These intelligent entities, often powered by Large Language Models (LLMs), are driving the next wave of digital transformation by enabling automation, personalization, and enhanced decision-making across various sectors. AI Agents and digital transformation are here to stay.

What is an AI Agent?

An AI agent, or intelligent agent, is a software entity capable of perceiving its environment, reasoning about its actions, and autonomously working toward specific goals. These agents mimic human-like behavior using advanced algorithms, data processing, and machine-learning models to interact with users and complete tasks.

LLMs to AI Agents — An Evolution

The evolution of AI agents is closely tied to the rise of Large Language Models (LLMs). Models like GPT (Generative Pre-trained Transformer) have showcased remarkable abilities to understand and generate human-like text. This development has enabled AI agents to interpret complex language inputs, facilitating advanced interactions with users.

Key Capabilities of LLM-Based Agents

LLM-powered agents possess several key advantages, including the ability to interpret natural language, reason over context, and act on a user's behalf.

Two Major Types of LLM Agents

LLM agents are classified into two main categories.

Multi-Agent Systems (MAS)

A Multi-Agent System (MAS) is a group of autonomous agents working together to achieve shared goals or solve complex problems. MAS applications span robotics, economics, and distributed computing, where agents interact to optimize processes.

AI Agent Architecture and Key Elements

AI agents generally follow a modular architecture, with components for perceiving inputs, reasoning about actions, and executing tasks.

Learning Strategies for LLM-Based Agents

AI agents utilize various learning techniques, including supervised, reinforcement, and self-supervised learning, to adapt and improve their performance in dynamic environments.

How Autonomous AI Agents Operate

Autonomous AI agents act independently of human intervention by perceiving their surroundings, reasoning through possible actions, and making decisions autonomously to achieve set goals. A minimal code sketch of this loop follows at the end of this article.

AI Agents' Transformative Power Across Industries

AI agents are transforming numerous industries by automating tasks, enhancing efficiency, and providing data-driven insights.

Platforms Powering AI Agents

The Benefits of AI Agents and Digital Transformation

AI agents offer several advantages, including automation of routine work, greater efficiency, and more personalized experiences.

The Future of AI Agents

The potential of AI agents is immense, and as AI technology advances, we can expect more sophisticated agents capable of complex reasoning, adaptive learning, and deeper integration into everyday tasks. The future promises a world where AI agents collaborate with humans to drive innovation, enhance efficiency, and unlock new opportunities for growth in the digital age.

AI Agents and Digital Transformation

By partnering with AI development specialists at Tectonic, organizations can access cutting-edge solutions tailored to their needs, positioning themselves to stay ahead in the rapidly evolving AI-driven market.
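As a rough, framework-free illustration of the perceive-reason-act loop described above, the sketch below walks a hypothetical overdue-invoice scenario; the environment events, tools, and decision rule are placeholders, not any specific vendor's agent platform.

```python
# Minimal perceive -> reason -> act agent loop (illustrative placeholder logic).
# A production LLM agent would replace `reason` with a model call and the tool
# table with real integrations; all names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Agent:
    goal: str
    memory: list = field(default_factory=list)

    def perceive(self, observation: str) -> None:
        """Store what the agent observes about its environment."""
        self.memory.append(observation)

    def reason(self) -> str:
        """Pick the next action; an LLM agent would prompt a model here."""
        if any("invoice overdue" in m for m in self.memory):
            return "send_reminder"
        return "wait"

    def act(self, action: str) -> str:
        """Execute the chosen action via a (placeholder) tool."""
        tools = {"send_reminder": lambda: "Reminder email queued",
                 "wait": lambda: "No action taken"}
        return tools[action]()

agent = Agent(goal="keep customer accounts current")
for event in ["new login", "invoice overdue for account 42"]:
    agent.perceive(event)
    result = agent.act(agent.reason())
    print(f"{event!r} -> {result}")
```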

GPUs and AI Development

Graphics processing units (GPUs) have become widely recognized due to their growing role in AI development. However, a lesser-known but critical technology is also gaining attention: high-bandwidth memory (HBM). HBM is a high-density memory designed to overcome bottlenecks and maximize data transfer speeds between storage and processors. AI chipmakers like Nvidia rely on HBM for its superior bandwidth and energy efficiency. Its placement next to the GPU's processor chip gives it a performance edge over traditional server RAM, which resides between storage and the processing unit. HBM's lower power consumption also makes it ideal for AI model training, which demands significant energy resources.

However, as the AI landscape transitions from model training to AI inferencing, HBM's widespread adoption may slow. According to Gartner's 2023 forecast, the use of accelerator chips incorporating HBM for AI model training is expected to decline from 65% in 2022 to 30% by 2027, as inferencing becomes more cost-effective with traditional technologies.

How HBM Differs from Other Memory

HBM shares similarities with other memory technologies, such as graphics double data rate (GDDR) memory, in delivering high bandwidth for graphics-intensive tasks. But HBM stands out due to its unique positioning. Unlike GDDR, which sits on the GPU's printed circuit board, HBM is placed directly beside the processor, enhancing speed by reducing the signal delays caused by longer interconnects. This proximity, combined with its stacked DRAM architecture, boosts performance compared to GDDR's side-by-side chip design.

However, this stacked approach adds complexity. HBM relies on through-silicon vias (TSVs), vertical electrical connections that pass through the stacked DRAM dies, which requires larger die sizes and increases production costs. According to analysts, this makes HBM more expensive and less efficient to manufacture than server DRAM, leading to higher yield losses during production.

AI's Demand for HBM

Despite its manufacturing challenges, demand for HBM is surging due to its importance in AI model training. Major suppliers like SK Hynix, Samsung, and Micron have expanded production to meet this demand, with Micron reporting that its HBM is sold out through 2025. In fact, TrendForce predicts that HBM will contribute to record revenues for the memory industry in 2025. The high demand for GPUs, especially from Nvidia, drives the need for HBM as AI companies focus on accelerating model training. Hyperscalers, looking to monetize AI, are investing heavily in HBM to speed up the process.

HBM's Future in AI

While HBM has proven essential for AI training, its future may be uncertain as the focus shifts to AI inferencing, which requires less intensive memory resources. As inferencing becomes more prevalent, companies may opt for more affordable and widely available memory solutions. Experts also see HBM following the same trajectory as other memory technologies, with continuous efforts to increase bandwidth and density. The next generation, HBM3E, is already in production, with HBM4 planned for release in 2026, promising even higher speeds.

Ultimately, the adoption of HBM will depend on market demand, especially from hyperscalers. If AI continues to push the limits of GPU performance, HBM could remain a critical component. However, if businesses prioritize cost efficiency over peak performance, HBM's growth may level off.
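To illustrate why memory bandwidth, and hence HBM, matters so much for AI workloads, the sketch below estimates the bandwidth needed to stream a model's weights once per generated token. The parameter count, weight precision, and token rate are illustrative assumptions, and "read all weights per token" is a common simplification for memory-bound, single-stream decoding.

```python
# Back-of-the-envelope: memory bandwidth needed for memory-bound LLM decoding.
# Assumes every generated token requires reading all model weights once; the
# model size, precision, and token rate below are illustrative assumptions.
params = 70e9            # assumed model size: 70B parameters
bytes_per_param = 2      # FP16/BF16 weights
tokens_per_second = 30   # assumed target generation speed

bytes_per_token = params * bytes_per_param
required_bw_gbs = bytes_per_token * tokens_per_second / 1e9

print(f"Weights read per token: {bytes_per_token / 1e9:.0f} GB")
print(f"Required bandwidth:     {required_bw_gbs:,.0f} GB/s")
# ~140 GB per token at 30 tokens/s works out to ~4,200 GB/s, far beyond what
# standard server DRAM delivers, which is why stacked HBM sits next to the GPU.
```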

Ambient AI Enhances Patient-Provider Relationship

How Ambient AI is Enhancing the Patient-Provider Relationship

Ambient AI is transforming the patient-provider experience at Ochsner Health by enabling clinicians to focus more on their patients and less on their screens. While some view technology as a barrier to human interaction, Ochsner's innovation officer, Dr. Jason Hill, believes ambient AI is doing the opposite by fostering stronger connections between patients and providers. Researchers estimate that physicians spend over 40% of consultation time focused on electronic health records (EHRs), limiting face-to-face interactions. "We have highly skilled professionals spending time inputting data instead of caring for patients, and as a result, patients feel disconnected due to the screen barrier," Hill said.

Additionally, increased documentation demands related to quality reporting, patient satisfaction, and reimbursement are straining providers. Ambient AI scribes help relieve this burden by automating clinical documentation, allowing providers to focus on their patients. Using machine learning, these AI tools generate clinical notes in seconds from recorded conversations. Clinicians then review and edit the drafts before finalizing the record. Ochsner began exploring ambient AI several years ago, but only with the advent of advanced language models like OpenAI's GPT did the technology become scalable and cost-effective for large health systems. "Once the technology became affordable for large-scale deployment, we were immediately interested," Hill explained.

Selecting the Right Vendor

Ochsner piloted two ambient AI tools before choosing DeepScribe for an enterprise-wide partnership. After the initial rollout to 60 physicians, the tool achieved a 75% adoption rate and improved patient satisfaction scores by 6%. What set DeepScribe apart were its customization features. "We can create templates for different specialties, but individual doctors retain control over their note outputs based on specific clinical encounters," Hill said. This flexibility was crucial in gaining physician buy-in. Ochsner also valued DeepScribe's strong vendor support, which included tailored training modules and direct assistance to clinicians. One example of this support was the development of a software module that allowed Ochsner's providers to see EHR reminders within the ambient AI app. "DeepScribe built a bridge to bring EHR data into the app, so clinicians could access important information right before the visit," Hill noted.

Ensuring Documentation Quality

Ochsner has implemented several safeguards to maintain the accuracy of AI-generated clinical documentation. Providers undergo training before using the ambient AI system, with a focus on reviewing and finalizing all AI-generated notes. Notes created by the AI remain in a "pended" state until the provider signs off. Ochsner also tracks how much text is generated by the AI versus added by the provider, using this as a marker for the level of editing required. Following the successful pilot, Ochsner plans to expand ambient AI to 600 clinicians by the end of the year, with the eventual goal of providing access to all 4,700 physicians. While Hill anticipates widespread adoption, he acknowledges that the technology may not be suitable for all providers. "Some clinicians have different documentation needs, but for the vast majority, this will likely become the standard way we document at Ochsner within a year," he said.
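A marker like Ochsner's "AI-generated versus provider-added text" can be approximated with a simple text-similarity ratio. The sketch below uses Python's standard difflib as a generic illustration; it is not DeepScribe's or Ochsner's actual metric, and the sample note is invented.

```python
# Rough proxy for how heavily a clinician edited an AI-drafted note.
# Standard library only; a generic illustration, not any vendor's metric.
from difflib import SequenceMatcher

def edit_ratio(ai_draft: str, final_note: str) -> float:
    """Return the fraction of the final note that differs from the AI draft."""
    similarity = SequenceMatcher(None, ai_draft, final_note).ratio()
    return 1.0 - similarity

ai_draft = "Patient reports mild headache for two days. No fever. Advised rest."
final_note = ("Patient reports mild headache for two days. No fever or visual "
              "changes. Advised rest, hydration, and follow-up in one week.")

print(f"Estimated provider edits: {edit_ratio(ai_draft, final_note):.0%}")
# Higher ratios flag notes (or clinicians) that may need extra review or training.
```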
Conclusion

By integrating ambient AI, Ochsner Health is not only improving operational efficiency but also strengthening the human connection between patients and providers. As the technology becomes more widespread, it holds the potential to reshape how clinical documentation is handled, freeing up time for more meaningful patient interactions.

The State of AI

The State of AI: How We Got Here (and What's Next)

Artificial intelligence (AI) has evolved from the realm of science fiction into a transformative force reshaping industries and lives around the world. But how did AI develop into the technology we know today, and where is it headed next? At Dreamforce, two of Salesforce's leading minds in AI—Chief Scientist Silvio Savarese and Chief Futurist Peter Schwartz—offered insights into AI's past, present, and future.

How We Got Here: The Evolution of AI

AI's roots trace back decades, and its journey has been defined by cycles of innovation and setbacks. Peter Schwartz, Salesforce's Chief Futurist, shared a firsthand perspective on these developments. Having been involved in AI since the 1970s, Schwartz witnessed the first "AI winter," a period of reduced funding and interest due to the immense challenges of understanding and replicating the human brain. In the 1990s and early 2000s, AI shifted from attempting to mimic human cognition to adopting data-driven models. This new direction opened up possibilities beyond the constraints of brain-inspired approaches. By the 2010s, neural networks re-emerged, revolutionizing AI by enabling systems to process raw data without extensive pre-processing.

Savarese, who began his AI research during one of these challenging periods, emphasized the breakthroughs in neural networks and their successor, transformers. These advancements culminated in large language models (LLMs), which can now process massive datasets, generate natural language, and perform tasks ranging from creating content to developing action plans. Today, AI has progressed to a new frontier: large action models. These systems go beyond generating text, enabling AI to take actions, adapt through feedback, and refine performance autonomously.

Where We Are Now: The Present State of AI

The pace of AI innovation is staggering. Just a year ago, discussions centered on copilots—AI systems designed to assist humans. Now, the conversation has shifted to autonomous AI agents capable of performing complex tasks with minimal human oversight. Peter Schwartz highlighted the current uncertainties surrounding AI, particularly in regulated industries like banking and healthcare. Leaders are grappling with questions about deployment speed, regulatory hurdles, and the broader societal implications of AI. While many startups in the AI space will fail, some will emerge as the giants of the next generation. Salesforce's own advancements, such as the Atlas Reasoning Engine, underscore the rapid progress. These technologies are shaping products like Agentforce, an AI-powered suite designed to revolutionize customer interactions and operational efficiency.

What's Next: The Future of AI

According to Savarese, the future lies in autonomous AI systems, which include two categories.

The Road Ahead

As AI continues to evolve, it's clear that its potential is boundless. However, the path forward will require careful navigation of ethical, regulatory, and practical challenges. The key to success lies in innovation, collaboration, and a commitment to creating systems that enhance human capabilities. For Salesforce, the journey has only just begun. With groundbreaking technologies and visionary leadership, the company is not just predicting the future of AI—it's creating it.

Salesforce Automotive Cloud

What is Salesforce Automotive Cloud?

In September 2022, Salesforce introduced Automotive Cloud, a robust all-in-one platform tailored for the automotive industry. At first glance, it appears to be an ideal solution for businesses in this sector, but how well does it serve car dealerships? Drawing on experience both as a former auto dealership employee and in building Salesforce Dealership Management Systems (DMS), an in-depth exploration was undertaken to determine whether this platform genuinely meets the needs of dealerships.

What is a Dealership Management System (DMS)?

A Dealership Management System (DMS) is a comprehensive software suite designed to manage the daily operations of a car dealership. It includes modules for sales, service, inventory management, vehicle lifecycle management, customer relationship management (CRM), and more. Essentially, it acts as the dealership's corporate operating system, housing and processing customer data to generate valuable insights.

What Does This Mean for Salesforce Consultants?

Salesforce consultants with specialized expertise often find it easier to secure jobs and command higher rates compared to their generalist peers. This is especially true in niche areas like Automotive Cloud, where demand for specialized knowledge is high and businesses are willing to invest in quality resources. In today's uncertain economic climate, job security is a priority, and developing expertise in niche areas like Automotive Cloud can be a strategic move. As more car dealerships adopt this new technology, consultants with relevant experience will find ample opportunities to leverage their skills and meet the growing demand for DMS solutions.

First Impressions of Automotive Cloud

At first glance, Automotive Cloud offers a promising set of tools for managing various aspects of dealership operations, from sales and service to inventory management and CRM. However, initial impressions were mixed. Some features, like Vehicle Definitions, were initially overwhelming and unclear in their application. For example, while Automotive Cloud aggregates information about a specific vehicle model and its components (engine, transmission, and so on), it lacks a CPQ (Configure, Price, Quote) feature. This omission is disappointing, as CPQ is crucial for configuring vehicles within the Salesforce interface. However, third-party CPQ tools are available to fill the gap.

On the flip side, Automotive Cloud's vehicle lifecycle management features are impressive. It allows for comprehensive tracking of a vehicle's lifecycle, including purchase, maintenance, and decommissioning cycles. This is especially beneficial for dealerships, as much of their profit comes from post-sale services like warranty maintenance.

What Salesforce Products Does It Use?

A closer examination of the components within Automotive Cloud reveals that it is a mix of several existing Salesforce products, along with customizations designed specifically for the automotive industry. For those interested in a more in-depth understanding, the Automotive Cloud documentation provides detailed explanations of the platform's use cases.

Automotive Cloud Data Model

One of the first steps in exploring a new product is examining its data model, which provides insights into the product's design and intended use.
In Automotive Cloud, Salesforce focuses on several key dimensions of the data model.

A Quick Overview of Capabilities

Based on a thorough understanding of dealership operations, Automotive Cloud's features most relevant to car dealers were evaluated.

Is Salesforce Automotive Cloud Worth Learning for Car Dealers?

The verdict is mixed. Automotive Cloud is not a perfect DMS for dealerships; it includes excessive features that may go unused while missing some critical functionality. However, it is a great fit for auto manufacturers or distributors due to its built-in functionality for managing dealerships and manufacturing-related tasks.

Is it worth learning? Absolutely. Automotive Cloud is a new offering from Salesforce, and currently there isn't an "Accredited Professional" badge available for it. By diving into Automotive Cloud early, Salesforce consultants can gain an edge over their peers and attract more employers. Moreover, Automotive Cloud combines multiple Salesforce Clouds, making it an excellent opportunity to learn Salesforce and become familiar with complex data models. With its limited number of Flows and little code, the learning curve is manageable, offering consultants a chance to build custom solutions that could become a selling point in their careers.
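For consultants exploring the data model discussed above, one quick way to poke at it is to query the org through the REST API. The sketch below uses the simple-salesforce Python library; the credentials are placeholders, and the Vehicle object and field names are assumptions about the Automotive Cloud schema that should be verified in your own org before use.

```python
# Exploratory query against an (assumed) Automotive Cloud data model.
# Requires `pip install simple-salesforce`; credentials are placeholders, and
# the Vehicle object/field names are assumptions to confirm in your org.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="you@example.com",     # placeholder credentials
    password="your-password",
    security_token="your-token",
)

# List a few vehicle records to see how lifecycle data is shaped (assumed fields).
soql = "SELECT Id, Name FROM Vehicle LIMIT 5"
for record in sf.query(soql)["records"]:
    print(record["Id"], record.get("Name"))

# A describe call shows which fields actually exist on the object in your org.
field_names = [f["name"] for f in sf.Vehicle.describe()["fields"]]
print(f"Vehicle exposes {len(field_names)} fields, e.g. {field_names[:5]}")
```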

Multi AI Agent Systems

Building Multi-AI Agent Systems: A Comprehensive Guide

As technology advances at an unprecedented pace, Multi-AI Agent systems are emerging as a transformative approach to creating more intelligent and efficient applications. This guide delves into the significance of Multi-AI Agent systems and provides a step-by-step tutorial on building them using advanced frameworks like LlamaIndex and CrewAI.

What Are Multi-AI Agent Systems?

Multi-AI Agent systems are a groundbreaking development in artificial intelligence. Unlike single AI agents that operate independently, these systems consist of multiple autonomous agents that collaborate to tackle complex tasks or solve intricate problems. Multi-agent systems are versatile and impactful across industries.

The Workflow of a Multi-AI Agent System

Building an effective Multi-AI Agent system requires a structured approach.

Building Multi-AI Agent Systems with LlamaIndex and CrewAI

Step 1: Define Agent Roles. Clearly define the roles, goals, and specializations of each agent.

Step 2: Initiate the Workflow. Establish a seamless workflow for agents to perform their tasks.

Step 3: Leverage CrewAI for Collaboration. CrewAI enhances collaboration by enabling autonomous agents to work together effectively.

Step 4: Integrate LlamaIndex for Data Handling. Efficient data management is crucial for agent performance.

Understanding AI Inference and Training

Multi-AI Agent systems rely on both AI inference and training. Key differences:

Purpose: training builds the model; inference uses the model for tasks.
Process: training is data-driven learning; inference is real-time decision-making.
Compute needs: training is resource-intensive; inference is optimized for efficiency.

Both processes are essential: training builds the agents' capabilities, while inference ensures swift, actionable results.

Tools for Multi-AI Agent Systems

LlamaIndex: an advanced framework for efficient data handling.

CrewAI: a collaborative platform for building autonomous agents.

Practical Example: Multi-AI Agent Workflow

A minimal code sketch illustrating this workflow appears after the conclusion below.

Conclusion

Building Multi-AI Agent systems offers unparalleled opportunities to create intelligent, responsive, and efficient applications. By defining clear agent roles, leveraging tools like CrewAI and LlamaIndex, and integrating robust workflows, developers can unlock the full potential of these systems. As industries continue to embrace this technology, Multi-AI Agent systems are set to revolutionize how we approach problem-solving and task execution.
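Here is a minimal sketch of Steps 1 through 4, assuming the crewai and llama-index packages are installed, a local data/ folder of documents exists, and an LLM key such as OPENAI_API_KEY is configured. Import paths and required arguments change between releases, so treat this as a starting point rather than definitive usage of either framework.

```python
# Sketch: two collaborating agents (CrewAI) grounded by a document index (LlamaIndex).
# Assumes `pip install crewai llama-index`, a local "data/" folder, and an LLM
# API key in the environment; import paths may differ in older releases.
from crewai import Agent, Task, Crew
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Step 4 (data handling): index local documents so the agents can ground their work.
documents = SimpleDirectoryReader("data").load_data()
query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()
background = query_engine.query("Summarize the key points of these documents.")

# Step 1 (roles): a researcher agent and a writer agent with distinct goals.
researcher = Agent(
    role="Researcher",
    goal="Collect accurate findings on the assigned topic",
    backstory="A meticulous analyst who verifies every claim.",
)
writer = Agent(
    role="Writer",
    goal="Turn research findings into a clear summary",
    backstory="A concise technical writer.",
)

# Step 2 (workflow): tasks handed from one agent to the next.
research_task = Task(
    description=f"Research the topic using this background: {background}",
    expected_output="A bullet list of findings",
    agent=researcher,
)
writing_task = Task(
    description="Write a one-paragraph summary of the findings.",
    expected_output="A short summary paragraph",
    agent=writer,
)

# Step 3 (collaboration): CrewAI orchestrates the agents over the task sequence.
crew = Crew(agents=[researcher, writer], tasks=[research_task, writing_task])
print(crew.kickoff())
```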

Battle of Copilots

Salesforce is directly challenging Microsoft in the growing battle of AI copilots, which are designed to enhance customer experience (CX) across key business functions like sales and support. In this competitive landscape, Salesforce is taking on not only Microsoft but also major AI rivals such as Google Gemini, OpenAI GPT, and IBM watsonx. At the heart of this strategy is Salesforce Agentforce, a platform that leverages autonomous decision-making to meet enterprise demands for data and AI abstraction.

Salesforce Dreamforce Highlights

One of the most significant takeaways from last month's Dreamforce conference in San Francisco was the unveiling of autonomous agents, bringing advanced GenAI capabilities to the app development process. CEO Marc Benioff and other Salesforce executives made it clear that Salesforce is positioning itself to compete with Microsoft's Copilot, rebranding and advancing its own AI assistant, previously known as Einstein AI. Microsoft's stronghold, however, lies in Copilot's seamless integration with widely used products like Teams, Outlook, PowerPoint, and Word. Furthermore, Microsoft has established itself as a developer favorite, especially with GitHub Copilot and the Azure portfolio, which are integral to app modernization in many enterprises.

"Salesforce faces an uphill battle in capturing market share from these established players," says Charlotte Dunlap, Research Director at GlobalData. "Salesforce's best chance lies in highlighting the autonomous capabilities of Agentforce—enabling businesses to automate more processes, moving beyond basic chatbot functions, and delivering a personalized customer experience." This emphasis on autonomy is vital, given that many enterprises are still grappling with the complexities of emerging GenAI technologies. Dunlap points out that DevOps teams are struggling to find third-party expertise that understands how GenAI fits within existing IT systems, particularly around security and governance concerns. Salesforce's focus on automation, combined with the integration prowess of MuleSoft, positions it as a key player in making GenAI tools more accessible and intuitive for businesses.

Elevating AI Abstraction and Automation

Salesforce has increasingly focused on the idea of abstracting data and AI, exemplified by its Data Cloud and low-level UI capabilities. Now, with models like the Atlas Reasoning Engine, Salesforce is looking to push beyond traditional AI assistants. These tools are designed to automate complex, previously human-dependent tasks, spanning functions like sales, service, and marketing.

Simplifying the Developer Experience

The true measure of Salesforce's success in its GenAI strategy will emerge in the coming months. The company is well aware that its ability to simplify the developer experience is critical. Enterprises are looking for more than just AI innovation—they want thought leadership that can help secure budget and executive support for AI initiatives. Many companies report ongoing struggles in gaining that internal buy-in, further underscoring the importance of strong, strategic partnerships with technology providers like Salesforce. In its pursuit to rival Microsoft Copilot, Salesforce's future hinges on how effectively it can build on its track record of simplifying the developer experience while promoting the unique autonomous qualities of Agentforce.

Impact of Generative AI

Generative AI has emerged as the most dominant trend in data management and analytics, overshadowing all other technologies. This prominence began with the launch of ChatGPT by OpenAI in November 2022, which significantly advanced the capabilities of large language models (LLMs) and demonstrated the transformative potential of generative AI (GenAI) for enterprises.

Generative AI's impact is profound, particularly in making advanced business intelligence tools accessible to a broader range of employees, not just data scientists and analysts. Before the advent of GenAI, complex data management and analytics platforms required computer science skills, statistical expertise, and extensive data literacy. Generative AI has reduced these barriers, enabling more people to leverage data insights for decision-making.

Another key advantage of generative AI is its ability to greatly enhance efficiency. It can automate time-consuming, repetitive tasks previously performed manually by data engineers and experts, acting as an independent agent in managing data processes.

The landscape of generative AI has evolved rapidly. Following the launch of ChatGPT, a wave of competing LLMs has emerged. Initially, the transformative potential of these technologies was theoretical, but it is now becoming tangible. Companies like Google are developing tools to help customers build and deploy their own generative AI models and applications, and enterprises are increasingly moving from pilot testing to developing and implementing production models.

Generative AI does not operate in isolation. Enterprises are also focusing on complementary aspects such as data quality and governance. Ensuring that the data feeding and training generative AI is reliable is crucial. Additionally, real-time data and automation are essential for making generative AI a proactive technology rather than a reactive one.

Generative AI has highlighted the need for a robust data foundation. The main challenge now is ensuring that enterprise data is trusted, governed, and ready for AI applications. With the rise of multimodal data, enterprises require a unified approach to manage and govern diverse data types effectively.

In addition to generative AI, other significant trends in data management and analytics include the focus on real-time data processing and automation. Integrating generative AI with real-time data streams and automated systems is expected to drive substantial business transformation. By enabling real-time insights and actions, businesses can achieve a level of operational efficiency previously unattainable.

The convergence of these technologies is transforming business operations. Unified and simplified technology stacks, integrating foundational technologies, LLMs, and real-time data platforms, are essential for driving this transformation. The industry is making strides toward creating integrated solutions that support comprehensive data management and analytics.

Christmas 2024

With artificial Christmas trees and holiday inflatables already appearing alongside Halloween decorations at big-box retailers (and in neighbors' yards before the first drop of pumpkin spice has been sipped), it's clear that the holiday season is beginning earlier than ever this year. However, according to a new forecast from Salesforce, the expected holiday sales boost may be somewhat modest. Salesforce projects a 2 percent increase in overall sales for November and December, a slight drop from the 3 percent increase seen in 2023.

The forecast highlights that consumers are facing higher debt due to elevated interest rates and inflation, which is likely to diminish their purchasing power compared to recent years. About 40 percent of shoppers plan to cut back on spending this year, while just under half intend to maintain their current spending levels. Adding to the challenge is the brief holiday shopping window between Thanksgiving and Christmas this year—only 27 days, the shortest since 2019. This data comes from Salesforce's analysis of over 1.5 billion global shoppers across 64 countries, with a focus on 12 key markets including the U.S., Canada, U.K., Germany, and France.

Shopping Trends and Strategies

In terms of shopping habits, bargain hunters are expected to turn to platforms like Temu, Shein, and other Chinese-owned apps, with nearly one in five holiday purchases anticipated from these sources. TikTok is seeing rapid growth as a sales platform, with a 24 percent increase in shoppers making purchases through the app since April. For businesses, the focus on price is likely to intensify. Two-thirds of global shoppers will let cost dictate their shopping decisions this year, compared to 46 percent in 2020, and less than a third will prioritize product quality over price when selecting gifts. This trend suggests a busy Black Friday and Cyber Monday, with two-thirds of shoppers planning to delay major purchases until Cyber Week to seek out bargains. Salesforce forecasts an average discount of 30 percent in the U.S. during this period. Caila Schwartz, director of strategy and consumer insights at Salesforce, notes, "This season will be competitive, intense, and focused heavily on pricing and discounting strategies."

Shipping and Technology Challenges

The shipping industry also poses a potential challenge, with container shipping costs becoming increasingly unstable. Brands and retailers are expected to incur an additional $197 billion in middle-mile expenses—a 97 percent increase from last year. To counter the threat from discount online retailers, stores with online capabilities should enhance their in-store pickup options. Salesforce predicts that buy online, pick up in store (BOPIS) will account for up to one-third of online orders globally in the week leading up to Christmas. Additionally, while still emerging, artificial intelligence (AI) is expected to play a role in holiday sales, with 18 percent of global orders influenced by predictive and generative AI, according to Salesforce. As retailers navigate these complexities, strategic pricing and efficient logistics will be key to capturing consumer attention and driving holiday sales.

SingleStore Acquires BryteFlow

SingleStore Acquires BryteFlow, Paving the Way for Real-Time Analytics and Next-Gen AI Use Cases

SingleStore, the world's only database designed to transact, analyze, and search petabytes of data in milliseconds, has announced its acquisition of BryteFlow, a leading data integration platform. This move enhances SingleStore's capabilities to ingest data from diverse sources—including SAP, Oracle, and Salesforce—while empowering users to operationalize data from their CRM and ERP systems.

With the acquisition, SingleStore will integrate BryteFlow's data integration technology into its core offering, launching a new experience called SingleConnect. This addition will complement SingleStore's existing functionality, enabling users to gain deeper insights from their data, accelerate real-time analytics, and support emerging generative AI (GenAI) use cases.

"This acquisition marks a pivotal step in our mission to deliver unparalleled speed, scale, and simplicity," said Raj Verma, CEO of SingleStore. "Customer demands are evolving rapidly due to shifts in big data storage formats and advancements in generative AI. We believe that data is the foundation of all intelligence, and SingleConnect comes at a perfect time to address this need."

BryteFlow's platform provides scalable change data capture (CDC) capabilities across multiple data sources, ensuring data integrity between source and target. It integrates seamlessly with major cloud platforms like AWS, Microsoft Azure, and Google Cloud, making it a powerful tool for cloud-based data warehouses and data lakes. Its no-code interface allows for easy and accessible data integration, ensuring that existing BryteFlow customers will experience uninterrupted service and ongoing support.

"By combining BryteFlow's real-time data integration expertise with SingleStore's capabilities, we aim to help global organizations extract maximum value from their data and scale modern applications," said Pradnya Bhandary, CEO of BryteFlow. "With SingleConnect, developers will find it easier and faster to access enterprise data sources, tackle complex workloads, and deliver exceptional experiences to their customers."

Nvidia and Salesforce

Salesforce and Nvidia have announced a groundbreaking collaboration to push the boundaries of AI, transforming both customer and employee experiences.

Redefining AI in Enterprise Software

As businesses worldwide face the complexities and costs of integrating AI into their operations, Salesforce and Nvidia are stepping in with a strategic partnership designed to redefine AI capabilities. This collaboration merges Salesforce's extensive CRM and enterprise software expertise with Nvidia's advanced AI and high-performance computing technologies. The goal is to create a new generation of AI agents and avatars that can operate autonomously, grasp complex business contexts, and engage with humans in a more natural, intuitive manner. Marc Benioff, Chair and CEO of Salesforce, states: "Together with Nvidia, we're spearheading the third wave of the AI revolution—moving beyond copilots to a seamless integration of humans and intelligent agents driving customer success."

Enhancing Salesforce's Platform

The partnership focuses on integrating Nvidia's accelerated computing and AI software to enhance Salesforce's platform performance. Key to this effort is the optimization of Salesforce Data Cloud, which harmonizes structured and unstructured customer data in real time. Nvidia's full-stack accelerated computing platform will significantly increase compute resources, leading to faster insights and improved AI performance across Salesforce's offerings.

AI-Powered Avatars and Beyond

A major innovation from this collaboration is the development of AI-powered avatars. By combining Nvidia ACE, a suite of digital human technologies, with Salesforce's new Agentforce platform, the companies aim to create more engaging, human-like experiences for interactions with customers and employees. These avatars will leverage multi-modal AI models for speech recognition, text-to-speech, and contextual visual responses, potentially revolutionizing business communication. Nvidia founder and CEO Jensen Huang envisions a future where "every company, every job will be enhanced by a wide range of AI agents—assistants that will transform how we work." He adds, "Nvidia and Salesforce are uniting our technologies to accelerate the development of AI agents, supercharging productivity for companies."

Transforming Business Operations

The Salesforce-Nvidia partnership is more than a technological alliance; it's a strategic move to meet the increasing demand for AI-driven enterprise solutions. The collaboration positions both companies at the forefront of the AI revolution in enterprise software, aiming to reshape how businesses interact with customers and manage their operations.

Real-World Applications

The potential applications of this technology are extensive.

Looking Ahead

As the Salesforce-Nvidia partnership unfolds, it promises not only technological advancements but a fundamental shift in how businesses leverage AI for growth, efficiency, and customer satisfaction. Marc Benioff highlights the potential: "By combining Nvidia's AI platform with Agentforce, we're amplifying AI performance and creating dynamic digital avatars, delivering more engaging, intelligent, and immersive customer experiences than ever before." This collaboration is set to lead the third wave of the AI revolution, integrating humans and intelligent agents to drive unprecedented customer success.
