Databricks Archives - gettectonic.com
Transform Customer Experiences

How to Transform Customer Experiences with AI and Sub-Second E2E Real-Time Data Sync

Introducing Data Cloud’s Sub-Second E2E Real-Time Feature

Deliver hyper-personalized experiences in real time, no matter how or where customers engage with your brand.

Exceptional customer experiences hinge on unifying interactions across every touchpoint. Yet fragmented data dispersed across systems, channels, and clouds often stands in the way. Salesforce Data Cloud eliminates these silos by delivering a synchronized, real-time customer data ecosystem, enabling brands to create personalized, seamless experiences instantly, regardless of how or where customers connect.

We’re excited to announce that the Sub-Second E2E Real-Time feature in Salesforce Data Cloud is now available. This innovation processes and analyzes data as it’s generated, empowering brands to make immediate, data-driven decisions. Combined with Einstein Personalization, which leverages advanced machine learning (ML) and rules-based automation, businesses can deliver individualized experiences across all channels, driving deeper engagement and improved outcomes.

What is Sub-Second Real-Time?

Sub-second real-time refers to the ability to process and deliver data or responses in less than one second, ensuring ultra-low latency and near-instantaneous results. This capability is critical for applications requiring immediate data updates, such as live analytics, responsive user interfaces, and time-sensitive decision-making. The Sub-Second E2E Real-Time feature empowers use cases like fraud detection, predictive maintenance, and real-time marketing with instant insights. By synchronizing data across systems, channels, and clouds, Data Cloud ensures a unified, real-time customer view, giving businesses a competitive edge.

Real-World Examples of Sub-Second Real-Time in Action

1. Real-Time Web Personalization

Imagine a user browsing a website.
As they interact with products, Data Cloud instantly captures this activity and updates their customer profile. Using Einstein Personalization, the system processes this data in milliseconds to tailor their browsing experience. For instance, personalized product recommendations can appear as the user clicks, leveraging insights from their behavior across platforms such as websites, point-of-sale systems, mobile apps, and other data sources.

This seamless personalization is made possible by Data Cloud’s integrations, including zero-copy ingestion from major data warehouses like Snowflake, Databricks, and Redshift. The result? A continuously updated, 360-degree customer view that enhances every touchpoint.

2. Real-Time Support with Agentforce

Now, consider a customer engaging in a live chat for assistance. As they browse, their actions are captured and updated in real time. When they initiate a chat, whether through Agentforce AI agents or human support, the agent has immediate access to their full activity history, updated within milliseconds. This enables the agent to provide tailored responses and solutions, ensuring a frictionless and engaging customer support experience.

Why Sub-Second Real-Time Matters

From personalization to support, the Sub-Second E2E Real-Time feature in Data Cloud ensures every customer interaction feels relevant, timely, and connected. By bridging the gap between data silos and intelligent automation, businesses can unlock new opportunities to exceed customer expectations, at scale and in real time. Explore how Salesforce Data Cloud can transform your customer experience today.
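The event-to-recommendation loop described above can be sketched in a few lines. This is purely illustrative and uses none of the Salesforce or Data Cloud APIs: the profile store, `update_profile`, `recommend`, and the sample catalog are all invented for the sketch. It shows the shape of the flow (an interaction event updates a unified profile, and a recommendation is computed from the freshest data within a sub-second budget), not the real implementation.

```python
import time

PROFILES = {}  # unified customer profiles keyed by customer id (illustrative)

def update_profile(customer_id, event):
    """Merge one interaction event into the customer's unified profile."""
    profile = PROFILES.setdefault(customer_id, {"viewed": [], "channels": []})
    profile["viewed"].append(event["product"])
    profile["channels"].append(event["channel"])
    return profile

def recommend(profile, catalog):
    """Recommend catalog items sharing a category with recently viewed products."""
    recent = set(profile["viewed"][-3:])
    categories = {catalog[p]["category"] for p in recent if p in catalog}
    return [name for name, item in catalog.items()
            if item["category"] in categories and name not in recent]

catalog = {
    "trail-shoe": {"category": "running"},
    "road-shoe": {"category": "running"},
    "rain-jacket": {"category": "outerwear"},
}

start = time.perf_counter()
profile = update_profile("cust-1", {"product": "trail-shoe", "channel": "web"})
recs = recommend(profile, catalog)
elapsed_ms = (time.perf_counter() - start) * 1000
print(recs)  # items related to what the customer just viewed
```

In a real deployment the hard part is not this loop but keeping the profile store synchronized across channels within the sub-second window, which is what the Data Cloud feature addresses.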
Snowflake Security and Development

Snowflake Unveils AI Development and Enhanced Security Features

At its annual Build virtual developer conference, Snowflake introduced a suite of new capabilities focused on AI development and strengthened security measures. These enhancements aim to simplify the creation of conversational AI tools, improve collaboration, and address data security challenges following a significant breach earlier this year.

AI Development Updates

Snowflake announced updates to its Cortex AI suite to streamline the development of conversational AI applications. These new tools focus on enabling faster, more efficient development while ensuring data integrity and trust. Highlights include:

These features address enterprise demands for generative AI tools that boost productivity while maintaining governance over proprietary data. Snowflake aims to eliminate barriers to data-driven decision-making by enabling natural language queries and easy integration of structured and unstructured data into AI models. According to Christian Kleinerman, Snowflake’s EVP of Product, the goal is to reduce the time it takes for developers to build reliable, cost-effective AI applications: “We want to help customers build conversational applications for structured and unstructured data faster and more efficiently.”

Security Enhancements

Following a breach last May, in which hackers accessed customer data via stolen login credentials, Snowflake has implemented new security features:

These additions come alongside existing tools like the Horizon Catalog for data governance. Kleinerman noted that while Snowflake’s previous security measures were effective at preventing unauthorized access, the company recognizes the need to improve user adoption of these tools: “It’s on us to ensure our customers can fully leverage the security capabilities we offer.
That’s why we’re adding more monitoring, insights, and recommendations.”

Collaboration Features

Snowflake is also enhancing collaboration through its new Internal Marketplace, which enables organizations to share data, AI tools, and applications across business units. The Native App Framework now integrates with Snowpark Container Services to simplify the distribution and monetization of analytics and AI products.

AI Governance and Competitive Position

Industry analysts highlight the growing importance of AI governance as enterprises increasingly adopt generative AI tools. David Menninger of ISG’s Ventana Research emphasized that Snowflake’s governance-focused features, such as LLM observability, fill a critical gap in AI tooling: “Trustworthy AI enhancements like model explainability and observability are vital as enterprises scale their use of AI.”

With these updates, Snowflake continues to compete with Databricks and other vendors. Its strategy focuses on offering both API-based flexibility for developers and built-in tools for users seeking simpler solutions. By combining innovative AI development tools with robust security and collaboration features, Snowflake aims to meet the evolving needs of enterprises while positioning itself as a leader in the data platform and AI space.
Databricks Tools

Databricks recently introduced Databricks Apps, a toolkit designed to simplify AI and data application development. By integrating native development platforms and offering automatic provisioning of serverless compute, the toolkit enables customers to develop and deploy applications more easily.

Databricks Apps builds on the existing capabilities of Mosaic AI, which allows users to integrate large language models (LLMs) with their enterprise’s proprietary data. However, the ability to develop interactive AI applications, such as generative AI chatbots, was previously missing. Databricks Apps addresses this gap, allowing developers to build and deploy custom applications entirely within the secure Databricks environment. According to Donald Farmer, founder and principal of TreeHive Strategy, Databricks Apps removes obstacles like the need to set up separate infrastructure for development and deployment, making the process easier and more efficient. The new features allow companies to go beyond implementing AI/ML models and create differentiated applications that leverage their unique data sets. Kevin Petrie, an analyst at BARC U.S., highlighted the significance of Databricks Apps in helping companies develop custom AI applications, which are essential for maintaining a competitive edge.

Databricks, founded in 2013, was one of the pioneers of the data lakehouse storage format, and over the last two years it has expanded its platform to focus on AI and machine learning (ML) capabilities. The company’s $1.3 billion acquisition of MosaicML in June 2023 was a key milestone in building its AI environment. Databricks has since launched DBRX, its own large language model, and introduced further functionality through product development. Databricks Apps, now available in public preview on AWS and Azure, advances these AI development capabilities, simplifying the process of building applications within a single platform.
Developers can use frameworks like Dash, Flask, Gradio, Shiny, and Streamlit, or opt for integrated development environments (IDEs) like Visual Studio Code or PyCharm. The toolkit also provides prebuilt Python templates to accelerate development. Additionally, applications can be deployed and managed directly in Databricks, eliminating the need for external infrastructure. Databricks Apps includes security features such as access control and data lineage through the Unity Catalog.

Farmer noted that the support for popular developer frameworks and the automatic provisioning of serverless compute could significantly impact the AI development landscape by reducing the complexity of deploying data architectures. While competitors like AWS, Google Cloud, Microsoft, and Snowflake have also made AI a key focus, Farmer pointed out that Databricks’ integration of AI tools into a unified platform sets it apart. Databricks Apps further enhances this competitive advantage.

Despite the added capabilities of Databricks Apps, Petrie cautioned that developing generative AI applications still requires a level of expertise in data, AI, and the business domain. While Databricks aims to make AI more accessible, users will still need substantial knowledge to leverage these tools effectively. Databricks’ vice president of product management, Shanku Niyogi, explained that the new features in Databricks Apps were driven by customer feedback. As enterprise interest in AI grows, customers sought easier ways to develop and deploy internal data applications in a secure environment.

Looking ahead, Databricks plans to continue investing in simplifying AI application development, with a focus on enhancing Mosaic AI and expanding its collaborative AI partner ecosystem. Farmer suggested that the company should focus on supporting nontechnical users and emerging AI technologies like multimodal models, which will become increasingly important in the coming years.
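To make the "data app" idea concrete: the kind of small web application deployed this way is typically a thin HTTP layer over governed data. The sketch below is invented for illustration (the sales data, the `/totals` route, and every name in it are our own, not a Databricks example), and it uses a bare WSGI callable, the interface Flask itself builds on, so it runs with the Python standard library alone; a real Databricks App would more likely use Flask, Dash, or Streamlit as named above and query data through Databricks rather than an in-memory list.

```python
import json
from urllib.parse import parse_qs

# Illustrative in-memory "dataset"; a real app would query governed tables.
SALES = [
    {"region": "EMEA", "amount": 1200},
    {"region": "AMER", "amount": 3400},
    {"region": "EMEA", "amount": 800},
]

def totals(region=None):
    """Aggregate sales per region, optionally filtered to a single region."""
    out = {}
    for row in SALES:
        if region is None or row["region"] == region:
            out[row["region"]] = out.get(row["region"], 0) + row["amount"]
    return out

def app(environ, start_response):
    """WSGI entry point: GET /totals?region=EMEA returns JSON aggregates."""
    query = parse_qs(environ.get("QUERY_STRING", ""))
    region = query.get("region", [None])[0]
    body = json.dumps(totals(region)).encode("utf-8")
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# To serve locally:
#   from wsgiref.simple_server import make_server
#   make_server("", 8080, app).serve_forever()
```

The value proposition in the article is that hosting, serverless compute, and access control for an app like this are provisioned by the platform rather than built by the developer.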
The introduction of Databricks Apps marks a significant step forward in Databricks’ AI and machine learning strategy, offering users a more streamlined approach to building and deploying AI applications.

Small Language Models Explained

Exploring Small Language Models (SLMs): Capabilities and Applications

Large Language Models (LLMs) have been prominent in AI for some time, but Small Language Models (SLMs) are now enhancing our ability to work with natural and programming languages. While LLMs excel in general language understanding, certain applications require more accuracy and domain-specific knowledge than these models can provide. This has created a demand for custom SLMs that offer LLM-like performance while reducing runtime costs and providing a secure, manageable environment.

In this insight, we dig into the world of SLMs, exploring their unique characteristics, benefits, and applications. We also discuss fine-tuning methods applied to Llama-2-13b, an SLM, to address specific challenges. The goal is to investigate how to make the fine-tuning process platform-independent. We selected Databricks for this purpose due to its compatibility with major cloud providers like Azure, Amazon Web Services (AWS), and Google Cloud Platform.

What Are Small Language Models?

In AI and natural language processing, SLMs are lightweight generative models with a focus on specific tasks. The term “small” refers to:

SLMs like Google Gemini Nano, Microsoft’s Orca-2-7b, and Meta’s Llama-2-13b run efficiently on a single GPU and include over 5 billion parameters.

SLMs vs. LLMs

Applications of SLMs

SLMs are increasingly used across various sectors, including healthcare, technology, and beyond. Common applications include:

Fine-Tuning Small Language Models

Fine-tuning involves additional training of a pre-trained model to make it more domain-specific. This process updates the model’s parameters with new data to enhance its performance in targeted applications, such as text generation or question answering.

Hardware Requirements for Fine-Tuning

The hardware needs depend on the model size, project scale, and dataset.
General recommendations include:

Data Preparation

Preparing data involves extracting text from PDFs, cleaning it, generating question-and-answer pairs, and then fine-tuning the model. Although GPT-3.5 was used for generating Q&A pairs, SLMs can also be used for this purpose, depending on the use case.

Fine-Tuning Process

You can use Hugging Face tools for fine-tuning Llama-2-13b-chat-hf. The dataset was converted into a Hugging Face-compatible format, and quantization techniques were applied to optimize performance. The fine-tuning lasted about 16 hours over 50 epochs, at a cost of around $100 (£83), excluding trial costs.

Results and Observations

The fine-tuned model demonstrated strong performance, with over 70% of answers being highly similar to those generated by GPT-3.5. The SLM achieved comparable results despite having fewer parameters. The process was successful on both AWS and Databricks platforms, showcasing the model’s adaptability.

SLMs have some limitations compared to LLMs, such as restricted knowledge bases, but they offer benefits in efficiency, versatility, and environmental impact. As SLMs continue to evolve, their relevance and popularity are likely to increase, especially with new models like Gemini Nano and Mixtral entering the market.
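A result like "over 70% of answers highly similar" implies a pairwise comparison between the fine-tuned model's answers and the reference answers. The sketch below shows one simple way to compute such a figure; the metric (difflib's sequence ratio), the 0.8 threshold, and the sample answers are all our own stand-ins, since the article does not specify how similarity was measured.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """String similarity in [0, 1], case-insensitive; 1.0 means identical."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def fraction_highly_similar(candidate_answers, reference_answers, threshold=0.8):
    """Share of candidate answers whose similarity to the paired reference
    answer meets the threshold."""
    hits = sum(similarity(c, r) >= threshold
               for c, r in zip(candidate_answers, reference_answers))
    return hits / len(reference_answers)

# Invented sample pairs: fine-tuned SLM answers vs. GPT-3.5 reference answers.
slm_answers = [
    "Fine-tuning updates the model's parameters with new, domain-specific data.",
    "Quantization reduces the memory needed to run the model.",
]
reference_answers = [
    "Fine-tuning updates model parameters with new domain-specific data.",
    "Quantization lowers the model's memory requirements.",
]

score = fraction_highly_similar(slm_answers, reference_answers)
print(f"{score:.0%} of answers are highly similar to the reference")
```

In practice, semantic metrics (embedding cosine similarity, BERTScore, or an LLM judge) are more robust than character-level ratios, but the aggregation step is the same.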
Salesforce Data Cloud and Zero Copy

As organizations across industries gather increasing amounts of data from diverse sources, they face the challenge of making that data actionable and deriving real-time insights. With Salesforce Data Cloud and zero copy architecture, organizations can streamline access to data and build dynamic, real-time dashboards that drive value while embedding contextual insights into everyday workflows. A session during Dreamforce 2024 with Joanna McNurlen, Principal Solution Engineer for Data Cloud at Salesforce, discussed how zero copy architecture facilitates the creation of dashboards and workflows that provide near-instant insights, enabling quick decision-making to enhance operational efficiency and competitive advantage.

What is zero copy architecture?

Traditionally, organizations had to replicate data from one system to another, such as copying CRM data into a data warehouse for analysis. This approach introduces latency, increases storage costs, and often results in inconsistencies between systems. Zero copy architecture eliminates the need for replication and provides a single source of truth for your data. It allows different systems to access data in its original location without duplication across platforms. Instead of using traditional extract, transform, and load (ETL) processes, systems like Salesforce Data Cloud can connect directly with external databases, such as Google Cloud BigQuery, Snowflake, Databricks, or Amazon Redshift, for real-time data access. Zero copy can also facilitate data sharing from within Salesforce to other systems. As Salesforce expands its zero copy partner network, opportunities to easily connect data from various sources will continue to grow.

How does zero copy work?

Zero copy employs virtual tables that act as blueprints for the data structure, enabling queries to be executed as if the data were local.
Changes made in the data warehouse are instantly visible across all connected systems, ensuring users always work with the latest information. While developing dashboards, users can connect directly to the zero copy objects within Data Cloud to create visualizations and reports on top of them.

Why is zero copy beneficial?

Zero copy allows organizations to analyze data as it is generated, enabling faster responses, smarter decision-making, and enhanced customer experiences. This architecture reduces reliance on data transformation workflows and synchronizations within both Tableau and CRM Analytics, where organizations have historically encountered bottlenecks due to runtimes and platform limits. Various teams can benefit from the following capabilities:

Unlocking real-time insights in Salesforce using zero copy architecture

Zero copy architecture and real-time data are transforming how organizations operate. By eliminating data duplication and providing real-time insights, the use of zero copy in Salesforce Data Cloud empowers organizations to work more efficiently, make informed decisions, and enhance customer experiences. Now is the perfect time to explore how Salesforce Data Cloud and zero copy can elevate your operations. Tectonic, a trusted Salesforce partner, can help you unlock the potential of your data and create new opportunities with the Salesforce platform. Connect with us today to get started.
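The virtual-table idea can be demonstrated with nothing but SQLite. This is an analogy, not Salesforce's implementation: the attached schema `wh` stands in for an external warehouse, and a view in the main schema plays the role of the zero copy virtual table. The query reads the rows in place, so a change in the "warehouse" is visible on the very next query, with no ETL job and no copied data.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS wh")  # stand-in for the external warehouse
con.execute("CREATE TABLE wh.orders (customer TEXT, amount REAL)")
con.execute("INSERT INTO wh.orders VALUES ('acme', 100.0), ('zenith', 250.0)")

# The 'virtual table': a view over the warehouse schema, not a copy of rows.
con.execute("CREATE VIEW main.orders_live AS SELECT * FROM wh.orders")

before = con.execute("SELECT SUM(amount) FROM orders_live").fetchone()[0]

# A change lands in the warehouse...
con.execute("INSERT INTO wh.orders VALUES ('acme', 50.0)")

# ...and is immediately visible through the view: no sync job ran.
after = con.execute("SELECT SUM(amount) FROM orders_live").fetchone()[0]
print(before, after)  # totals reflect the warehouse state at query time
</antml>```

The real systems add query federation, pushdown, and governance on top, but the core contract is the same: consumers query a definition, and the data stays where it lives.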
AI Agents Connect Tool Calling and Reasoning

AI Agents: Bridging Tool Calling and Reasoning in Generative AI

Exploring Problem Solving and Tool-Driven Decision Making in AI

Introduction: The Emergence of Agentic AI

Recent advancements in libraries and low-code platforms have simplified the creation of AI agents, often referred to as digital workers. Tool calling stands out as a key capability that enhances the “agentic” nature of Generative AI models, enabling them to move beyond mere conversational tasks. By executing tools (functions), these agents can act on your behalf and tackle intricate, multi-step problems requiring sound decision-making and interaction with diverse external data sources. This insight explores the role of reasoning in tool calling, examines the challenges associated with tool usage, discusses common evaluation methods for tool-calling proficiency, and provides examples of how various models and agents engage with tools.

Reasoning as a Means of Problem-Solving

Successful agents rely on two fundamental expressions of reasoning: reasoning through evaluation and planning, and reasoning through tool use. While both expressions are vital, they don’t always need to be combined to yield powerful solutions. For instance, OpenAI’s new o1 model excels in reasoning through evaluation and planning, having been trained to utilize chain of thought effectively. This has notably enhanced its ability to address complex challenges, achieving human PhD-level accuracy on benchmarks like GPQA across physics, biology, and chemistry, and ranking in the 86th-93rd percentile on Codeforces contests. However, the o1 model currently lacks explicit tool calling capabilities.

Conversely, many models are specifically fine-tuned for reasoning through tool use, allowing them to generate function calls and interact with APIs effectively. These models focus on executing the right tool at the right moment but may not evaluate their results as thoroughly as the o1 model.
The Berkeley Function Calling Leaderboard (BFCL) serves as an excellent resource for comparing the performance of various models on tool-calling tasks and provides an evaluation suite for assessing fine-tuned models against challenging scenarios. The recently released BFCL v3 now includes multi-step, multi-turn function calling, raising the standards for tool-based reasoning tasks. Both reasoning types are powerful in their own right, and their combination holds the potential to develop agents that can effectively deconstruct complex tasks and autonomously interact with their environments. For more insights into AI agent architectures for reasoning, planning, and tool calling, check out my team’s survey paper on arXiv.

Challenges in Tool Calling: Navigating Complex Agent Behaviors

Creating robust and reliable agents necessitates overcoming various challenges. In tackling complex problems, an agent often must juggle multiple tasks simultaneously, including planning, timely tool interactions, accurate formatting of tool calls, retaining outputs from prior steps, avoiding repetitive loops, and adhering to guidelines to safeguard the system against jailbreaks and prompt injections. Such demands can easily overwhelm a single agent, leading to a trend where what appears to an end user as a single agent is actually a coordinated effort of multiple agents and prompts working in unison to divide and conquer the task. This division enables tasks to be segmented and addressed concurrently by distinct models and agents, each tailored to tackle specific components of the problem.

This is where models with exceptional tool-calling capabilities come into play. While tool calling is a potent method for empowering productive agents, it introduces its own set of challenges.
Agents must grasp the available tools, choose the appropriate one from a potentially similar set, accurately format the inputs, execute calls in the correct sequence, and potentially integrate feedback or instructions from other agents or humans. Many models are fine-tuned specifically for tool calling, allowing them to specialize in selecting functions accurately at the right time. Key considerations when fine-tuning a model for tool calling include:

Common Benchmarks for Evaluating Tool Calling

As tool usage in language models becomes increasingly significant, numerous datasets have emerged to facilitate the evaluation and enhancement of model tool-calling capabilities. Two prominent benchmarks are the Berkeley Function Calling Leaderboard and the Nexus Function Calling Benchmark, both utilized by Meta to assess the performance of their Llama 3.1 model series. The recent ToolACE paper illustrates how agents can generate a diverse dataset for fine-tuning and evaluating model tool use. Here’s a closer look at each benchmark:

Each of these benchmarks enhances our ability to evaluate model reasoning through tool calling. They reflect a growing trend toward developing specialized models for specific tasks and extending the capabilities of LLMs to interact with the real world.

Practical Applications of Tool Calling

If you’re interested in observing tool calling in action, here are some examples to consider, categorized by ease of use, from simple built-in tools to fine-tuned models and agents with tool-calling capabilities. While the built-in web search feature is convenient, most applications require defining custom tools that can be integrated into your model workflows. This leads us to the next complexity level. To observe how models articulate tool calls, you can use the Databricks Playground. For example, select the Llama 3.1 405B model and grant access to sample tools like get_distance_between_locations and get_current_weather.
When prompted with, “I am going on a trip from LA to New York. How far are these two cities? And what’s the weather like in New York? I want to be prepared for when I get there,” the model will decide which tools to call and what parameters to provide for an effective response. In this scenario, the model suggests two tool calls. Since the model cannot execute the tools, the user must input a sample result to simulate the tool output.

Suppose you instead employ a model fine-tuned on the Berkeley Function Calling Leaderboard dataset. When prompted, “How many times has the word ‘freedom’ appeared in the entire works of Shakespeare?” the model will successfully retrieve and return the answer, executing the required tool calls without the user needing to define any input or manage the output format. Such models handle multi-turn interactions adeptly, processing past user messages, managing context, and generating coherent, task-specific outputs. As AI agents evolve to encompass advanced reasoning and problem-solving capabilities, they will become increasingly adept at managing
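The tool-selection flow from the trip example above can be sketched as a registry plus a dispatch loop. The tool names mirror the sample tools mentioned in the article, but everything else is invented: the schemas, the canned data, and the hard-coded model reply, since this sketch has no live LLM. In a real run, the tool schemas go to the model with the prompt, and the `tool_calls` list comes back from the model instead of being written by hand.

```python
import json

# Invented tool schemas in the common JSON-schema style.
TOOLS = [
    {"name": "get_distance_between_locations",
     "description": "Distance in miles between two cities.",
     "parameters": {"origin": "string", "destination": "string"}},
    {"name": "get_current_weather",
     "description": "Current weather for a city.",
     "parameters": {"location": "string"}},
]

# Stand-in implementations with canned data (a real agent would call APIs).
def get_distance_between_locations(origin, destination):
    return {("LA", "New York"): 2789}.get((origin, destination))

def get_current_weather(location):
    return {"New York": "62F, partly cloudy"}.get(location, "unknown")

REGISTRY = {
    "get_distance_between_locations": get_distance_between_locations,
    "get_current_weather": get_current_weather,
}

# What a model might emit for the trip prompt: two tool calls, with
# arguments serialized as JSON strings, as most tool-calling APIs do.
tool_calls = [
    {"name": "get_distance_between_locations",
     "arguments": json.dumps({"origin": "LA", "destination": "New York"})},
    {"name": "get_current_weather",
     "arguments": json.dumps({"location": "New York"})},
]

# Dispatch: parse each call's arguments and invoke the matching function.
results = [REGISTRY[c["name"]](**json.loads(c["arguments"])) for c in tool_calls]
print(results)  # tool outputs would be fed back to the model for the final answer
```

This also shows why the formatting challenges discussed earlier matter: a misspelled tool name or malformed argument JSON breaks the dispatch step before any reasoning happens.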

Agentforce - AI's New Role in Sales and Service

From Science Fiction to Reality: AI’s Game-Changing Role in Service and Sales

AI for service and sales has reached a critical tipping point, driving rapid innovation. At Dreamforce in San Francisco, hosted by Salesforce, we explored how Salesforce clients are leveraging CRM, Data Cloud, and AI to extract real business value from their Salesforce investments.

In previous years, AI features branded under “Einstein” had been met with skepticism. These features, such as lead scoring, next-best-action suggestions for service agents, and cross-sell/upsell recommendations, often required substantial quality data in the CRM and knowledge base to be effective. However, customer data was frequently unreliable, with duplicate records and missing information, and the Salesforce knowledge base was underused. Building self-service capabilities with chatbots was also challenging, requiring accurate predictions of customer queries and well-structured decision trees.

This year’s Dreamforce revealed a transformative shift. The advancements in AI, especially for customer service and sales, have become exceptionally powerful. Companies now need to take notice of Salesforce’s capabilities, which have expanded significantly.

Agentforce – AI’s New Role in Sales and Service

Some standout Salesforce features include:

At Dreamforce, we participated in a workshop in which an AI agent capable of responding to customer cases using product sheets and company knowledge was built within 90 minutes. This experience demonstrated how accessible AI solutions have become, no longer requiring developers or LLM experts to set up. The key challenge lies in mapping external data sources to a unified data model in Data Cloud, but once that is achieved, the potential for customer service and sales is immense.
How AI and Data Integrate to Transform Service and Sales

Businesses can harness the following integrated components to build a comprehensive solution:

Real-World Success and AI Implementation

OpenTable shared a successful example of building an AI agent for its app in just two months, using a small team of four. This was a marked improvement over the company’s previous chatbot projects, highlighting the efficiency of the latest AI tools. Many CEOs of large enterprises are exploring AI strategies, whether by developing their own LLMs or using pre-existing models. However, many of these efforts are siloed, and engineering costs are high, leading to clunky transitions between AI and human agents.

Tectonic is well-positioned to help our clients quickly deploy AI-powered solutions that integrate seamlessly with their existing CRM and ERP systems. By leveraging AI agents to streamline customer interactions, enhance sales opportunities, and provide smooth handoffs to human agents, businesses can significantly improve customer experiences and drive growth. Tectonic is ready to help businesses achieve similar success with AI-driven innovation.

Oracle Advertising Sundown

Oracle Shifts Focus to B2B CX, Introduces New Fusion Cloud Features

Despite winding down its online advertising products, Oracle is doubling down on its investment in customer experience (CX) technology, particularly in enabling B2B buying and supporting subscription and consumption models. During the Oracle CloudWorld conference on Wednesday, the company unveiled new capabilities for its Fusion Cloud Customer Experience and Unity Customer Data Platform. These enhancements empower Oracle CX users to analyze customer profiles to assemble B2B buying teams, leverage generative AI tools like native analytics, and utilize industry-specific accelerators to speed up the adoption of customer data tools.

Key features include the ability to create self-service sites for individual accounts, enabling customers to review and summarize contracts using generative AI, receive quotes, and renew subscriptions. Other features enhance “assisted buying experiences,” blending self-service and human interaction, while tools like account onboarding and AI-powered email drafting simplify full-service sales processes. Subscription models, though still in their early stages for B2B, offer a streamlined alternative to traditional procurement processes. As Liz Miller, an analyst at Constellation Research, noted, subscription-based buying is easier and quicker, avoiding the lengthy procurement cycles many B2B buyers are familiar with. “The pain of traditional B2B buying is still fresh in everyone’s mind,” she said.

Oracle Advertising Shuts Down

Oracle’s advertising product support will end on September 30, as confirmed by CEO Safra Catz during the company’s June earnings call. The Oracle Advertising Data Management Platform (DMP), built from its BlueKai acquisition, is being retired, following in the footsteps of Salesforce, which discontinued its Audience Studio in 2021. Despite Oracle winding down its ad platform, this move shouldn’t be seen as a shift away from customer experience.
Oracle founder Larry Ellison remains deeply involved in shaping the company’s CX strategy, with a focus on marketing tools and the Apex low-code platform, said Rob Pinkerton, Oracle’s senior vice president. Oracle’s modernized CX suite, built on the Fusion Cloud platform, has evolved significantly in recent years, though questions remain about whether it’s too late to regain market share. “Oracle as a CX platform has fallen off the radar for many buyers,” said Miller, adding that customers are no longer debating between Oracle, Microsoft, and Salesforce in the CX space.

New Industry-Specific Tools for CX

Oracle has also expanded its CX platform with industry-specific tools designed to accelerate the adoption of its customer data platform (CDP) across sectors such as high tech, manufacturing, professional services, telecommunications, utilities, financial services, travel, and retail. According to Rebecca Wettemann, CEO of research firm Valoir, Oracle’s Fusion platform has matured significantly and now supports the complexity of modern customer needs. Wettemann highlighted how common components like customer interaction summaries can be adapted for multiple industries, delivering faster results than traditional applications. Oracle’s Clinical Digital Assistant is one such example of this approach, illustrating the platform’s versatility and AI-driven enhancements.

With these developments, Oracle continues to refine its CX offerings to better meet the unique demands of B2B customers, providing tools that streamline operations and enhance customer experiences across various industries.

Uplimit AI-Powered ELP


Uplimit Secures $11M in Series A Funding to Enhance AI-Powered Enterprise Learning

SAN FRANCISCO, July 24, 2024 /PRNewswire/ — Uplimit, a leading provider of AI-powered enterprise learning solutions, has announced the successful completion of an $11M Series A funding round. The round, led by Salesforce Ventures with participation from existing investors GSV Ventures, Greylock Partners, and Cowboy Ventures, as well as new investors Translink Capital, Workday Ventures, and Conviction, underscores the growing importance of effective employee upskilling in response to the rapid advancement of generative AI technology.

“Helping employees stay ahead of technological advancements is now a critical priority for the organizations we serve,” said Claudine Emeott, Partner at Salesforce Ventures and Head of the Salesforce Ventures Impact Fund. “AI has the potential to equip both companies and individuals with the necessary skills to thrive, and Uplimit is at the forefront of integrating AI into education and training. We are excited to support their continued growth and look forward to seeing the significant impact they will have in the coming years.”

With this new funding, Uplimit plans to expand its enterprise platform offerings, aiming to provide comprehensive upskilling solutions to more organizations and employees. Traditional education systems often require extensive resources for content creation, personalized feedback, and support, which can hinder scalability.
While some scalable solutions exist, they often compromise on quality and outcomes. Uplimit is addressing this challenge with an approach that combines scale and effectiveness: its AI-driven platform enhances cohort management, learner support, and course authoring, enabling companies to deliver personalized learning experiences at scale. Uplimit’s recent introduction of AI-enabled role-play scenarios provides learners with immediate feedback, transforming training and development for roles such as managers, support teams, and sales professionals.

“Quality education has historically been a scarce resource, but AI is changing that,” said Julia Stiglitz, CEO and Co-founder of Uplimit. “AI allows us to create and update educational content rapidly, ensuring that learners receive personalized experiences even in large-scale courses. This is crucial as the demand for new skills, driven by the rapid evolution of AI technologies, continues to grow. Uplimit provides the tools needed for employees to quickly grasp new skills, tailored to their current knowledge and needs.”

Uplimit has collaborated with a diverse range of companies, from Fortune 500 giants like GE Healthcare and Kraft Heinz to innovative startups such as Procore. Databricks, a leader in AI-powered data intelligence, was an early adopter of Uplimit’s platform for customer education. “We needed a learning platform that could scale to hundreds of thousands of learners while maintaining high levels of engagement and completion,” said Rochana Golani, VP of Learning and Enablement at Databricks. “Uplimit’s platform offers the perfect blend of real-time human instruction and personalized AI support, along with valuable peer interaction.
This approach is set to be transformative for many of our customers.”

The new funding will enable Uplimit to further enhance its enterprise and customer education offerings, expanding its AI capabilities to include advanced cohort management tools, rapid course feedback integration, interactive practice and assessment modules, and AI-powered course authoring. Join us on August 14th for our launch event, where we will explore how this funding will accelerate our mission and demonstrate the impact our platform is having on enterprise learning.

About Uplimit

Uplimit is a comprehensive AI-driven learning platform designed to equip companies with the tools needed to train employees and customers in emerging skills. The platform leverages AI to scale learning programs effectively, offering features such as AI-powered learner support, generative AI for content creation, and live cohort management tools. This approach ensures high engagement and completion rates, significantly surpassing traditional online courses. Uplimit also offers a marketplace of advanced courses in AI, technology, and leadership, taught by industry experts.

Founded by Julia Stiglitz, Sourabh Bajaj, and Jake Samuelson, Uplimit is backed by Salesforce Ventures, Greylock Partners, Cowboy Ventures, GSV Ventures, Conviction, Workday Ventures, and Translink Capital, with contributions from the co-founders of OpenAI and DeepMind. Notable customers include GE Healthcare, Kraft Heinz, and Databricks. Uplimit has been featured in leading industry publications such as ATD, Josh Bersin, and Fast Company.

GitHub Copilot Autofix


On Wednesday, GitHub announced the general availability of Copilot Autofix, an AI-driven tool designed to identify and remediate software vulnerabilities. Originally unveiled in March and tested in public beta, Copilot Autofix integrates GitHub’s CodeQL scanning engine with GPT-4, heuristics, and Copilot APIs to generate code suggestions for developers. The tool provides prompts based on CodeQL analysis and code snippets, allowing users to accept, edit, or reject the suggestions.

In a blog post, Mike Hanley, GitHub’s Chief Security Officer and Senior Vice President of Engineering, highlighted the challenges developers and security teams face in addressing existing vulnerabilities. “Code scanning tools can find vulnerabilities, but the real issue is remediation, which requires security expertise and time—both of which are in short supply,” Hanley noted. “The problem isn’t finding vulnerabilities; it’s fixing them.”

According to GitHub, the private beta of Copilot Autofix showed that users could respond to a CodeQL alert and automatically remediate a vulnerability in a pull request in just 28 minutes on average, compared to 90 minutes for manual remediation. The tool was even faster for common vulnerabilities: cross-site scripting fixes averaged 22 minutes compared to three hours manually, and SQL injection flaws were fixed in 18 minutes on average versus almost four hours manually.

Hanley likened the efficiency of Copilot Autofix in fixing vulnerabilities to the speed at which GitHub Copilot, the company’s generative AI coding assistant released in 2022, produces code for developers. However, there have been concerns that GitHub Copilot and similar AI coding assistants could replicate existing vulnerabilities in the codebases they help generate.
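To make the SQL injection numbers concrete, here is a generic before-and-after sketch of the kind of fix involved. This is a hand-written Python example of the remediation pattern, not actual Copilot Autofix output; the `users` table and function names are hypothetical.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: user input is interpolated directly into the SQL string,
    # so input like "x' OR '1'='1" changes the query's meaning.
    cursor = conn.execute(
        "SELECT id, name FROM users WHERE name = '%s'" % username
    )
    return cursor.fetchall()

def find_user_safe(conn, username):
    # Remediated: a parameterized query binds the input strictly as data.
    cursor = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    )
    return cursor.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice')")
    # The injection payload matches every row in the unsafe version...
    print(find_user_unsafe(conn, "x' OR '1'='1"))  # [(1, 'alice')]
    # ...but matches nothing once the query is parameterized.
    print(find_user_safe(conn, "x' OR '1'='1"))    # []
```

The remediated version is what a suggestion for this class of alert typically looks like: the query shape is preserved while untrusted input is passed as a bound parameter.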
Industry analyst Katie Norton from IDC noted that while the replication of vulnerabilities is concerning, the rapid pace at which AI coding assistants generate new software could pose a more significant security issue. Chris Wysopal, CTO and co-founder of Veracode, echoed this concern, pointing out that faster coding speeds have led to more software being produced and a larger backlog of vulnerabilities for developers to manage. Norton also emphasized that AI-powered tools like Copilot Autofix could help alleviate the burden on developers by reducing these backlogs and enabling them to fix vulnerabilities without needing to be security experts. Other vendors, including Mobb and Snyk, have also developed AI-powered autoremediation tools.

Initially supporting JavaScript, TypeScript, Java, and Python during its public beta, Copilot Autofix now also supports C#, C/C++, Go, Kotlin, Swift, and Ruby.

Hanley also highlighted that Copilot Autofix would benefit the open-source software community. GitHub has previously provided open-source maintainers with free access to enterprise security tools for code scanning, secret scanning, and dependency management. Starting in September, Copilot Autofix will also be made available for free to these maintainers. “As the global home of the open-source community, GitHub is uniquely positioned to help maintainers detect and remediate vulnerabilities, making open-source software safer and more reliable for everyone,” Hanley said. Copilot Autofix is now available to all GitHub customers globally.

Everyone Is Implementing AI


AI is undoubtedly a generational change in software, with its full trajectory still unpredictable. There is a perceived divide between the “Haves” and “Have Nots.” Companies like OpenAI, Microsoft, and Databricks are seen as understanding AI’s potential, with Nvidia providing the necessary hardware support. Many hot start-ups are Gen AI native, continuing to attract unicorn valuations. Meanwhile, several SaaS leaders appear to be lagging behind. Yet in practice, everyone is implementing AI.

Marc Benioff stated on Salesforce’s latest quarterly call: “Now, we’re working with thousands of customers to power generative AI use cases with our Einstein Copilot, our prompt builder, our Einstein Studio, all of which went live in the first quarter. And we’ve closed hundreds of copilot deals since this incredible technology has gone GA. And in just the last few months, we’re seeing Einstein Copilot develop higher levels of capability. We are absolutely delighted and cannot be more excited about the success that we’re seeing with our customers with this great new capability.”

However, it remains unclear whether simply adding AI to classic B2B SaaS products accelerates growth. Despite significant investments in AI, companies like Salesforce, Asana, and ZoomInfo are growing at less than 10% annually. The main point is that while “AI washing” might impress some investors, AI must significantly accelerate revenue growth to achieve more than market parity. It is essential to see how AI can add real value and to integrate it effectively, but AI alone may not be a growth accelerant.

Recent data from Emergence Capital shows that 60% of VC-backed SaaS companies have already released GenAI features, with another 24% planning to do so. Achieving “AI parity” is crucial, but simply adding GenAI features may not be disruptive in the B2B space. Companies must go further to stand out, despite the challenges.

Salesforce Data Cloud Pioneer


While many organizations are still building their data platforms, Salesforce has made a significant leap forward. By seamlessly incorporating metadata integration, Salesforce has transformed the modern data stack into a comprehensive application platform known as the Einstein 1 Platform. Led by Muralidhar Krishnaprasad, executive vice president of engineering at Salesforce, the Einstein 1 Platform is built on the company’s metadata framework. The platform harmonizes metadata and integrates it with AI and automation, marking a new era of data utilization.

The Einstein 1 Platform: Innovations and Capabilities

Salesforce’s goal with the Einstein 1 Platform is to empower all business users (salespeople, service engineers, marketers, and analysts) to access, use, and act on all their data, regardless of its location, according to Krishnaprasad. The open, extensible platform not only unlocks trapped data but also equips organizations with generative AI functionality, enabling personalized experiences for employees and customers. “Analytics is very important to know how your business is doing, but you also want to make sure all that data and insights are actionable,” Krishnaprasad said. “Our goal is to blend AI, automation, and analytics together, with the metadata layer being the secret sauce.”

In a conversation with George Gilbert, senior analyst at theCUBE Research, Krishnaprasad discussed the platform’s metadata integration, open-API technology, and key features. They explored how its extensibility and interoperability enhance usability across various data formats and sources.

Metadata Integration: Accommodating Any IT Environment

The Einstein 1 Platform is built on Trino, the federated open-source query engine, and Spark for data processing. It offers a rich set of connectors and an open, extensible environment, enabling organizations to share data between warehouses, lakehouses, and other systems.
“We use a hyper-engine for sub-second response times in Tableau and other data explorations,” Krishnaprasad explained. “This in-memory overlap engine ensures efficient data processing.” The platform supports various machine learning options and allows users to integrate their own large language models. Whether using Salesforce Einstein, Databricks, Vertex, SageMaker, or other solutions, users can operate without copying data.

The platform includes three levels of extensibility, enabling organizations to standardize and extend their customer journey models. Users can start with basic reference models, customize them, and then generate insights, including AI-driven insights. Finally, they can introduce their own functions or triggers to act on these insights. The platform continuously performs unification, allowing users to create different unified graphs based on their needs. “We’re a multimodal system, considering your entire customer journey,” Krishnaprasad said. “We provide flexibility at all levels of the stack to create the right experience for your business.”

The Triad of AI, Automation, and Analytics

The platform’s foundation ingests, harmonizes, and unifies data, resulting in a standardized metadata model that offers a 360-degree view of customer interactions. This approach unlocks siloed data, much of which is in unstructured forms like conversations, documents, emails, audio, and video. “What we’ve done with this customer 360-degree model is to use unified data to generate insights and make these accessible across application surfaces, enabling reactions to these insights,” Krishnaprasad said. “This unlocks a comprehensive customer journey.” For instance, when a customer views an ad and visits the website, salespeople know what they’re interested in, service personnel understand their concerns, and analysts have the information needed for business insights. These capabilities enhance customer engagement.
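The ingest-harmonize-unify flow described above can be pictured at toy scale: records from two sources are mapped onto a shared schema, then merged into one profile per matching key. This is a conceptual sketch in plain Python; the source records, field names, and mappings are invented for illustration and are not Data Cloud’s actual model.

```python
# Two hypothetical sources with different field names for the same person.
crm_records = [
    {"Email__c": "dana@example.com", "FullName": "Dana Lee", "Stage": "Prospect"},
]
web_events = [
    {"email": "dana@example.com", "page": "/pricing", "ts": "2024-06-01T12:00:00Z"},
]

def harmonize(record, mapping):
    """Rename source-specific fields to the shared schema, dropping the rest."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

def unify(*sources):
    """Merge harmonized records into one profile per match key (email)."""
    profiles = {}
    for source in sources:
        for rec in source:
            profiles.setdefault(rec["email"], {}).update(rec)
    return profiles

crm = [harmonize(r, {"Email__c": "email", "FullName": "name", "Stage": "stage"})
       for r in crm_records]
web = [harmonize(r, {"email": "email", "page": "last_page"}) for r in web_events]

profiles = unify(crm, web)
# One unified view: the sales stage from CRM plus the latest page from the web.
print(profiles["dana@example.com"])
```

A real unification step also handles fuzzy identity matching and conflict resolution; the point here is only that harmonization maps every source into one schema before profiles are merged.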
“Couple this with generative AI, and we enable a lot of self-service,” Krishnaprasad added. “We aim to provide accurate answers, elevating data to create a unified model and powering a unified experience across the entire customer journey.”

Lakeflow for Data Engineering


Databricks unveiled Databricks LakeFlow last week, a new tool designed to unify all aspects of data engineering, from data ingestion and transformation to orchestration.

What is Databricks LakeFlow?

According to Databricks, LakeFlow simplifies the creation and operation of production-grade data pipelines, making it easier for data teams to handle complex data engineering tasks. The solution aims to meet the growing demand for reliable data and AI by providing an efficient, streamlined approach.

The Current State of Data Engineering

Data engineering is crucial for democratizing data and AI within businesses, yet it remains a challenging field for data teams.

How LakeFlow Addresses These Challenges

LakeFlow offers a unified experience for all aspects of data engineering, simplifying the entire process.

Key Features of LakeFlow

LakeFlow comprises three main components: LakeFlow Connect, LakeFlow Pipelines, and LakeFlow Jobs.

Availability

LakeFlow is entering preview soon, starting with LakeFlow Connect. Customers can register to join the waitlist today.
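The three concerns LakeFlow bundles — ingestion, transformation, and orchestration — can be pictured with a framework-free sketch. This is a conceptual toy in plain Python, not LakeFlow’s API; the step functions and sample rows are invented for illustration.

```python
def ingest():
    # Stand-in for a connector pulling raw rows from a source system.
    return [{"order_id": 1, "amount": "19.50"}, {"order_id": 2, "amount": "5.25"}]

def transform(rows):
    # Stand-in for a declarative transformation: cast string amounts to floats.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(rows):
    # Stand-in for writing to a target table; here, just total the amounts.
    return sum(r["amount"] for r in rows)

def run_pipeline(steps):
    """Minimal orchestrator: run named steps in order, threading each
    step's output into the next one."""
    result = None
    for name, step in steps:
        result = step() if result is None else step(result)
        print(f"step {name!r} done")
    return result

total = run_pipeline([("ingest", ingest), ("transform", transform), ("load", load)])
print(total)  # 24.75
```

In a real pipeline the orchestrator also handles scheduling, retries, and monitoring; the sketch only shows why unifying the three stages behind one interface simplifies the workflow.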
