Snowflake Archives - gettectonic.com - Page 2



Get Ready for the Winter '25 Salesforce Release

The Salesforce Winter '25 Release Notes are here. Learn about new features and enhancements that affect your Salesforce experience overall.

August 8: Get early access by signing up for a pre-release org. Admins can sign up for a pre-release Developer Edition environment, full of Winter '25 features to explore to your heart's content. Developer environments are stand-alone environments where you can learn, build, and get comfortable with features and functionality. If you already had a pre-release org for Summer '24, you can log back into that one.

August 14: Review the Release Notes. Search the products you use for release updates in the Release Notes section of Salesforce Help. The notes go live August 14, and we will share the link here. Get help from the community: with each release, a number of community blogs break it down. Check out the Release Readiness Trailblazer Community Group, where you can continue to get updates, share your favorite features, and ask questions about the upcoming release.

August 19: Be Release Ready with Winter '25 features for admins. Starting on August 19, blog posts will begin publishing on the Admin Blog to help you be release ready, featuring Winter '25 user access highlights and more. As blog posts and other release resources become available, the Be Release Ready page will be updated with all the resources and information you need to get started with Winter '25.

August 29, before 5 p.m. PT: Refresh your sandbox. Once you've explored the pre-release org and reviewed the Release Notes for features that are important to you, it's time to try out features related to your customizations in your sandbox. This is a great time to evaluate how specific features may be useful or impact the way your organization uses Salesforce.
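The refresh-timing decision in this checklist can be sketched in Python. The cutoff date comes from the calendar above; the function name and the simplified logic are ours, not Salesforce tooling:

```python
from datetime import datetime

# Winter '25 sandbox preview cutoff from the release calendar above:
# refresh before Aug 29, 5 p.m. PT (illustrative, naive local time).
PREVIEW_CUTOFF = datetime(2024, 8, 29, 17, 0)

def sandbox_action(on_preview_instance: bool, want_preview: bool,
                   now: datetime) -> str:
    """Mirror the Sandbox Preview Guide's advice: refresh only when the
    sandbox is not already slated for the instance you want."""
    if on_preview_instance == want_preview:
        return "no action needed"
    if now < PREVIEW_CUTOFF:
        return "refresh sandbox before the cutoff"
    return "cutoff passed - wait for the release, then refresh onto the desired track"

print(sandbox_action(False, True, datetime(2024, 8, 20, 9, 0)))
```

The real guide keys off your sandbox's instance assignment, not a boolean flag; this only captures the before/after-cutoff decision.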
During each release, one group of sandboxes is slated to remain on the non-preview instance (i.e., the current release) while another group will upgrade to the preview instance. Use the Salesforce Sandbox Preview Guide to determine the plan for your sandbox instance(s): search by sandbox instance, then specify what you want to do with your sandbox, either stay on non-preview or move to preview. The tool will then instruct you to refresh your sandbox to get to the desired instance, or inform you that no action is needed because your sandbox is already slated for the desired instance. Contact Tectonic today if you need assistance getting Salesforce release ready.

Related Posts
Salesforce OEM AppExchange: Expanding its reach beyond CRM, Salesforce.com has launched a new service called AppExchange OEM Edition, aimed at non-CRM service providers. Read more
The Salesforce Story: In Marc Benioff's own words, how did salesforce.com grow from a start-up in a rented apartment into the world's... Read more
Salesforce Jigsaw: Salesforce.com, a prominent figure in cloud computing, has finalized a deal to acquire Jigsaw, a wiki-style business contact database. Read more
Service Cloud with AI-Driven Intelligence: Salesforce enhances Service Cloud with an AI-driven intelligence engine. Data science and analytics are rapidly becoming standard features in enterprise applications. Read more


Salesforce Data Snowflake and You

Unlock the Full Potential of Your Salesforce Data with Snowflake

At Tectonic, we've dedicated years to helping businesses maximize their Salesforce investment, driving growth and enhancing customer experiences. Now we're expanding those capabilities by integrating with Snowflake. Imagine the power of merging Salesforce data with other sources, gaining deeper insights, and making smarter decisions, all without the hassle of complex infrastructure. Snowflake brings this to life with a flexible, scalable solution for unifying your data ecosystem. In this insight, we'll cover why Snowflake is essential for Salesforce users, how seamlessly it integrates, and why Tectonic is the ideal partner to help you leverage its full potential.

Why Snowflake Matters for Salesforce Users

Salesforce excels at managing customer relationships, but businesses today need data from multiple sources: e-commerce, marketing platforms, ERP systems, and more. That's where Snowflake shines. With Snowflake, you can unify these data sources, enrich your Salesforce data, and turn it into actionable insights. Say goodbye to silos and blind spots. Snowflake is easy to set up, scales effortlessly, and integrates seamlessly with Salesforce, making it ideal for enhancing CRM data across various business functions.

The Power of Snowflake for Salesforce Users

Enterprise-Grade Security and Governance: Snowflake ensures that your data is secure and compliant. With top-tier security and data governance tools, your customer data remains protected and meets regulatory requirements across platforms, seamlessly integrating with Salesforce.

Cross-Cloud Data Sharing: Snowflake's Snowgrid feature makes it easy for Salesforce users to share and collaborate on data across clouds. Teams across marketing, sales, and operations can access the same up-to-date information, leading to better collaboration and faster, more informed decisions.
Real-Time Data Activation: Combine Snowflake's data platform with Salesforce Data Cloud to activate insights in real time, enabling enriched customer experiences through dynamic insights from web interactions, purchase history, and service touchpoints.

Tectonic + Snowflake: Elevating Your Salesforce Experience

Snowflake offers powerful data capabilities, but effective integration is key to realizing its full potential, and that's where Tectonic excels. Our expertise in Salesforce, now combined with Snowflake, ensures that businesses can maximize their data strategies.

How Tectonic helps:

Strategic Integration Planning: We assess your current data ecosystem and design a seamless integration between Salesforce and Snowflake to unify data without disrupting operations.
Custom Data Solutions: From real-time dashboards to data enrichment workflows, we create solutions tailored to your business needs.
Ongoing Support and Optimization: Tectonic provides continuous support, adapting your Snowflake integration to meet evolving data needs and business strategies.

Real-World Applications

Retail: Integrate in-store and e-commerce sales data with Salesforce for real-time customer insights.
Healthcare: Unify patient data from wearables, EMRs, and support interactions for a holistic customer care experience.
Financial Services: Enhance Salesforce data with third-party risk assessments, enabling quicker, more accurate underwriting.

Looking Ahead: The Tectonic Advantage

Snowflake opens up new possibilities for Salesforce-powered businesses. Effective integration, however, requires strategic planning and hands-on expertise. Tectonic has a long-standing track record of helping clients get the most out of Salesforce, and Snowflake adds an extra dimension to our toolkit. Whether you want to better manage data, unlock insights, or enhance AI initiatives, Tectonic's combined Salesforce and Snowflake expertise ensures you'll harness the best of both worlds.
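The unification described above, enriching CRM records with fields from another source, boils down to a join on a shared key, the kind of operation a warehouse like Snowflake performs at scale. A minimal sketch in plain Python, with invented data and field names:

```python
# Sketch: enrich Salesforce contact records with an e-commerce feed,
# joined on email. All records, fields, and values here are invented.
salesforce_contacts = [
    {"email": "pat@example.com", "account": "Acme", "stage": "Customer"},
    {"email": "lee@example.com", "account": "Globex", "stage": "Prospect"},
]
ecommerce_orders = [
    {"email": "pat@example.com", "lifetime_value": 1200.0},
]

# Index the external source by the join key.
orders_by_email = {o["email"]: o for o in ecommerce_orders}

# Left join: every CRM contact, enriched where a match exists.
enriched = [
    {**c, "lifetime_value": orders_by_email.get(c["email"], {}).get("lifetime_value", 0.0)}
    for c in salesforce_contacts
]

print(enriched)
```

In practice this join would be a SQL statement against synced tables rather than an in-memory loop, but the shape of the result, one enriched profile per contact, is the same.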
Stay tuned as we dive deeper into Snowflake's features, such as Interoperable Storage, Elastic Compute, and Cortex AI with Arctic, and explore how Tectonic is helping businesses unlock the future of data and AI. Ready to talk about how Snowflake and Salesforce can transform your business? Contact Tectonic today!


Snowpark Container Services

Snowflake announced on Thursday the general availability of Snowpark Container Services, enabling customers to securely deploy and manage models and applications, including generative AI, within Snowflake's environment. Initially launched in preview in June 2023, Snowpark Container Services is now a fully managed service available in all AWS commercial regions and in public preview in all Azure commercial regions.

Containers are a software method used to isolate applications for secure deployment. Snowflake's new feature allows customers to use containers to manage and deploy any type of model, particularly for generative AI applications, by securely integrating large language models (LLMs) and other generative AI tools with their data, explained Jeff Hollan, Snowflake's head of applications and developer platform.

Mike Leone, an analyst at TechTarget's Enterprise Strategy Group, noted that the launch builds on Snowflake's recent efforts to provide customers with an environment for developing generative AI models and applications. Sridhar Ramaswamy became Snowflake's CEO in February, succeeding Frank Slootman, who led the company through a record-setting IPO. Under Ramaswamy, Snowflake has aggressively added generative AI capabilities, including launching its own LLM, integrating with Mistral AI, and providing tools for creating AI chatbots.

"There has definitely been a concerted effort to enhance Snowflake's capabilities and presence in the AI and GenAI markets," Leone said. "Offerings like Snowpark help AI stakeholders like data scientists and developers use the languages they prefer." As a result, Snowpark Container Services is a significant new feature for Snowflake customers. "It's a big deal for the Snowflake ecosystem," Leone said.
"By enabling easy deployment and management of containers within the Snowflake platform, it helps customers handle complex workloads and maintain consistency across development and production stages."

Despite the secure environment Snowflake provides, it was revealed in May that the login credentials of potentially 160 customers had been stolen and used to access their data. Snowflake has stated there is no evidence that the breach resulted from a vulnerability or misconfiguration of the Snowflake platform. Prominent customers affected include AT&T and Ticketmaster, and Snowflake's investigation is ongoing.

New Capabilities

Generative AI can transform business by enabling employees to easily work with data to inform decisions and by making trained experts more efficient. Combined with an enterprise's proprietary data, generative AI allows users to interact with data using natural language, reducing the need for coding and data literacy training. Non-technical workers can query and analyze data, freeing data engineers and scientists from routine tasks.

Many data management and analytics vendors are focusing on developing generative AI-powered features, and enterprises are building models and applications trained on their proprietary data to inform business decisions. Among data platform vendors, AWS, Databricks, Google, IBM, Microsoft, and Oracle are providing environments for generative AI tool development. Snowflake, under Slootman, was less aggressive in this area but is now committed to generative AI development, though it still has ground to cover compared with its competitors.

"Snowflake has gone as far as creating their own LLM," Leone said. "But they still have a way to go to catch up to some of their top competitors." Matt Aslett, an analyst at ISG's Ventana Research, echoed that Snowflake is catching up to its rivals.
The vendor initially focused on traditional data warehouse capabilities but made a significant step forward with the late 2023 launch of Cortex, a platform for developing AI models and applications. Cortex includes access to various LLMs and vector search capabilities, marking substantial progress.

The general availability of Snowpark Container Services furthers Snowflake's effort to foster generative AI development. The feature provides users with on-demand GPUs and CPUs to run any code next to their data, enabling the deployment and management of any type of model or application without moving data out of Snowflake's platform. "It's optimized for next-generation data and AI applications by pushing that logic to the data," Hollan said. "This means customers can now easily and securely deploy everything from source code to homegrown models in Snowflake."

Beyond security, Snowpark Container Services simplifies model management and deployment while reducing associated costs. Snowflake provides a fully integrated managed service, eliminating the need to piece together services from different vendors, and includes a budget control feature to reduce operational costs and provide cost certainty. Snowpark Container Services also offers diverse storage options, observability tools like Snowflake Trail, and streamlined DevOps capabilities. It supports deploying LLMs with local volumes, memory, Snowflake stages, and configurable block storage. Integrations with observability specialists like Datadog, Grafana, and Monte Carlo are also included.

Aslett noted that the 2020 launch of the Snowpark development environment enabled users to use their preferred coding languages with their data. Snowpark Container Services takes this further by allowing the use of third-party software, including generative AI models and data science libraries. "This potentially reduces complexity and infrastructure resource requirements," Aslett said.
Snowflake spent over a year moving Snowpark Container Services from private preview to general availability, focusing on governance, networking, usability, storage, observability, development operations, scalability, and performance. One customer, Landing AI, used Snowpark Container Services during its preview phases to develop LandingLens, an application for training and deploying computer vision models. "[With Snowflake], we are increasing access to AI for more companies and use cases, especially given the rapid growth of unstructured data in our increasingly digital world," Landing AI COO Dan Maloney said in a statement Thursday.

Future Plans

With Snowpark Container Services now available on AWS, Snowflake plans to extend the feature to all cloud platforms. The vendor's roadmap includes further improvements to Snowpark Container Services with more enterprise-grade tools. "Our team is investing in making it easy for companies ranging from startups to enterprises to build, deliver, distribute, and monetize next-generation AI products across their ecosystems," Hollan said. Aslett said that making Snowpark Container Services available on Azure and Google Cloud is the logical next step: the managed service's release is significant, but the next step will be to bring Snowpark Container Services to general availability beyond AWS regions.
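Deployment in Snowpark Container Services is driven by a service specification naming the container image, the compute it runs on, and the endpoints it exposes. As a rough illustration of the moving parts the article names (on-demand GPUs/CPUs, a homegrown model image, an endpoint reachable without data leaving the platform), here is the shape of such a definition as a Python dict. The field names are our assumption for illustration, not Snowflake's exact specification schema:

```python
# Illustrative only: the rough shape of a container service definition.
# Field names are assumptions, not Snowflake's exact spec schema.
service_spec = {
    "name": "llm-inference",           # hypothetical service name
    "compute_pool": "GPU_POOL_SMALL",  # stands in for on-demand GPU/CPU capacity
    "containers": [
        {"name": "model", "image": "registry/llm:latest"}  # homegrown model image
    ],
    "endpoints": [
        {"name": "predict", "port": 8080}  # served next to the data
    ],
}

print(service_spec["name"], service_spec["endpoints"][0]["port"])
```

The real product expresses this as a YAML service specification tied to a named compute pool; consult Snowflake's documentation for the authoritative schema.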


Data Cloud – Facts and Fiction

Salesforce Data Cloud: Debunking Myths and Unveiling Facts

If you've been active on LinkedIn, attending recent Salesforce events, or even watching a myriad of sporting events, you've likely noticed that Salesforce has evolved beyond just CRM. It's now CRM + Data + AI. Although Salesforce has always incorporated these elements, with Einstein AI and data being integral to CRM, the latest innovation lies in Data Cloud.

Data Cloud, formerly known as Salesforce Genie, represents Salesforce's latest evolution, focusing on enabling organizations to scale and grow in an era where data is the new currency. It is the fastest-growing product in Salesforce's history, pushing new boundaries of innovation by providing better access to data and actionable insights. As Data Cloud rapidly develops, potential clients often have questions about its function and how it can address their challenges. Here are some common myths about Data Cloud, and the facts that debunk them.

Myth: Data Cloud requires MuleSoft.
Fact: While the MuleSoft Anypoint Platform can accelerate connecting commonly used data sources, it is not required for Data Cloud. Data Cloud can ingest data from multiple systems and platforms using several out-of-the-box (OOTB) connectors, including SFTP, Snowflake, AWS, and more. Salesforce designs its solutions to work seamlessly together, but Data Cloud also offers connector options for non-Salesforce products, ensuring flexibility and integration capabilities beyond the Salesforce ecosystem.

Myth: Data Cloud will de-duplicate your data.
Fact: Harmonizing data in Data Cloud means standardizing your data model rather than de-duplicating it. Data Cloud maps fields to a common data model and performs "Identity Resolution," using rules to match individuals based on attributes like email, address, device ID, or phone number. This process creates a Unified Individual ID without automatically de-duplicating Salesforce records.
Salesforce intentionally does not function as a Master Data Management (MDM) system.

Myth: Data Cloud will create a golden record.
Fact: Data Cloud does not create a single, updated record synchronized across all systems (a "golden record"). Instead, it retains original source information, identifies matches across systems, and uses this data to facilitate engagements, known as the Data Cloud Key Ring. For instance, it can recognize an individual across different systems and provide personalized experiences without overwriting original data.

Myth: You can't ingest custom objects from Salesforce.
Fact: During the data ingestion process, you can select which objects to ingest from your Salesforce CRM org, including custom objects. The system identifies the API names of the objects and fields from the data source. Ensuring the Data Cloud integration user has access to the necessary information (similar to assigning permission sets) allows you to ingest and map custom objects accordingly.

Myth: Data Cloud requires a data scientist and takes a long time to implement.
Fact: While implementing Data Cloud involves ingesting and mapping data, running identity resolution, and generating insights, it does not necessarily require a data scientist. Skilled Salesforce admins can often manage data integration from third-party applications. Effective Data Cloud implementation requires thorough planning and preparation, akin to prepping a room before painting: identifying use cases and understanding data sources in advance can streamline the implementation process.

Myth: Data Cloud is expensive.
Fact: Data Cloud operates on a consumption-based pricing model. Engaging in strategic conversations with Salesforce account executives can help you understand the financial implications. Emphasizing the value of a comprehensive data strategy and considering the five V's of Big Data (volume, variety, veracity, value, and velocity) ensures that your data supports meaningful business outcomes and KPIs.
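The identity-resolution behavior described above, matching records on attributes like email or phone and issuing a Unified Individual ID while leaving the source records untouched, can be sketched in a few lines. The records, field names, and the greedy matching scheme are ours for illustration; the real product applies configurable match rules and handles cluster merges that this simplification does not:

```python
# Sketch of identity resolution: link records sharing an email or phone,
# assign one unified ID per cluster, leave source records unmodified.
# Simplified: does not merge two previously separate clusters.
records = [
    {"id": "crm-1", "email": "pat@example.com", "phone": None},
    {"id": "web-7", "email": "pat@example.com", "phone": "555-0100"},
    {"id": "pos-3", "email": None,              "phone": "555-0100"},
    {"id": "crm-2", "email": "lee@example.com", "phone": None},
]

unified = {}    # record id -> unified individual id
key_owner = {}  # match key (email/phone) -> unified id

for rec in records:
    keys = [k for k in (rec["email"], rec["phone"]) if k]
    # Reuse an existing unified ID if any match key was seen before.
    uid = next((key_owner[k] for k in keys if k in key_owner),
               f"UID-{rec['id']}")
    for k in keys:
        key_owner[k] = uid
    unified[rec["id"]] = uid

print(unified)
```

Note that the source records themselves are never rewritten: only the mapping to a unified ID is produced, which matches the "no automatic de-duplication" point above.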
In Summary

Salesforce Data Cloud represents a significant evolution in managing and leveraging data within your organization. It helps break down data silos, providing actionable insights to drive organizational goals. Despite initial misconceptions, implementing Data Cloud does not require extensive coding skills or a data scientist. Instead, thorough planning and preparation can streamline the process and maximize efficiency. Understanding the value of a comprehensive data strategy is crucial, as data becomes the new currency. Addressing the five V's of Big Data ensures that your data supports meaningful business outcomes and KPIs. At Tectonic, our team of certified professionals is ready to assist you on this journey. We offer a Salesforce Implementation Solution package to help you get hands-on with the tool and explore its capabilities. Whether you need help understanding your data sources or defining use cases, our data practice can provide the expertise you need. Talk to Tectonic about Data Cloud and discover how our tailored solutions can help you harness the full potential of your data.


Cyber Group Targets SaaS Platforms

UNC3944, also known as "0ktapus" and "Scattered Spider," has shifted its focus to attacking software-as-a-service (SaaS) platforms such as Azure, Salesforce, vSphere, AWS, and Google Cloud, as reported by Google Cloud's Mandiant threat intelligence team. This hacking group, previously linked to incidents involving companies such as Snowflake and MGM Entertainment, has evolved its strategies to concentrate on data theft and extortion.

Attack Techniques

UNC3944 exploits legitimate third-party tools for remote access and leverages Okta permissions to expand its intrusion capabilities. One notable aspect of its attacks involves creating new virtual machines in VMware vSphere and Microsoft Azure, using administrative permissions linked through SSO applications for further activities. The group uses commonly available utilities to reconfigure virtual machines (VMs), disable security protocols, and download tools such as Mimikatz and ADRecon, which extract and combine various artifacts from Active Directory (AD) and Microsoft Entra ID environments.

Evolving Methods

Initially, UNC3944 employed a variety of techniques, but over time its methods have expanded to include ransomware and data theft extortion. Active since at least May 2022, the group has developed resilience mechanisms against virtualization platforms and improved its ability to move laterally by abusing SaaS permissions. The group also uses SMS phishing to reset passwords and bypass multi-factor authentication (MFA). Once inside, it conducts thorough reconnaissance of Microsoft applications like SharePoint to understand remote connection needs. According to Google Cloud's Mandiant team, UNC3944's primary activity is now data theft without the use of ransomware. The group employs expert social engineering tactics, using detailed personal information to bypass identity checks and target employees with high-level access.
Social Engineering and Threats

Attackers often pose as employees, contacting help desks to request MFA resets for setting up new phones. If help desk staff comply, attackers can easily bypass MFA and reset passwords. If social engineering fails, UNC3944 resorts to threats, including doxxing, physical threats, or releasing compromising material to coerce credentials from victims. Once access is gained, the group gathers information on tools like VPNs, virtual desktops, and remote work utilities to maintain consistent access.

Targeting SaaS and Cloud Platforms

UNC3944 targets Okta's single sign-on (SSO) tools, allowing it to create accounts that facilitate access to multiple systems. Its attacks extend to VMware's vSphere hybrid cloud management tool and Microsoft Azure, where it creates virtual machines for malicious purposes. By operating within a trusted IP address range, the group complicates detection. Additional targets include SaaS applications like VMware vCenter, CyberArk, Salesforce, CrowdStrike, Amazon Web Services (AWS), and Google Cloud. Office 365 is another focus, with attackers using Microsoft's Delve tool for quick reconnaissance and to identify valuable information. To exfiltrate data, they use synchronization utilities such as Airbyte and Fivetran to transfer information to their own cloud storage. The group also targets Active Directory Federation Services (AD FS) to extract certificates and employs Golden SAML attacks for continued access to cloud applications.

Recommendations

Mandiant advises deploying host-based certificates with MFA for VPN access, implementing stricter conditional access policies, and enhancing monitoring for SaaS applications. Consolidating logs from crucial SaaS applications and monitoring virtual machine setups can help identify potential breaches.
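One concrete form of the monitoring Mandiant recommends is correlating consolidated logs: for example, flagging VM-creation events that occur shortly after an MFA reset on the same account, since that pairing mirrors the help-desk-reset-then-new-VM pattern described above. A minimal sketch, with invented event shapes and an arbitrary one-hour window:

```python
# Sketch of a detection heuristic: flag new-VM events occurring within
# one hour of an MFA reset on the same account. Event format is invented.
from datetime import datetime, timedelta

events = [
    {"time": datetime(2024, 6, 1, 9, 0),  "user": "admin1", "action": "mfa_reset"},
    {"time": datetime(2024, 6, 1, 9, 40), "user": "admin1", "action": "create_vm"},
    {"time": datetime(2024, 6, 1, 11, 0), "user": "ops2",   "action": "create_vm"},
]

WINDOW = timedelta(hours=1)  # arbitrary threshold for the sketch

def suspicious_vm_creations(events):
    resets = [(e["user"], e["time"]) for e in events if e["action"] == "mfa_reset"]
    flagged = []
    for e in events:
        if e["action"] != "create_vm":
            continue
        # Flag when the same user had an MFA reset within the window before.
        if any(u == e["user"] and timedelta(0) <= e["time"] - t <= WINDOW
               for u, t in resets):
            flagged.append(e["user"])
    return flagged

print(suspicious_vm_creations(events))
```

A real deployment would run such correlation in a SIEM over identity-provider and cloud audit logs rather than in-memory lists, but the pairing logic is the same.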


Zero ETL

What is Zero-ETL?

Zero-ETL represents a transformative approach to data integration and analytics that bypasses the traditional ETL (extract, transform, load) pipeline. Unlike conventional ETL processes, which extract data from various sources, transform it to fit specific formats, and then load it into a data repository, Zero-ETL eliminates these steps. Instead, it enables direct querying and analysis of data from its original source, facilitating real-time insights without intermediate data storage or extensive preprocessing. This method simplifies data management, reducing latency and operational costs while enhancing the efficiency of data pipelines. As the demand for real-time analytics and the volume of data continue to grow, Zero-ETL offers a more agile and effective solution for modern data needs.

In Summary

Zero-ETL transforms data management by directly querying and leveraging data in its original format, addressing many limitations of traditional ETL processes. It enhances data quality, streamlines analytics, and boosts productivity, making it a compelling choice for modern organizations facing increasing data complexity and volume. Embracing Zero-ETL can lead to more efficient data processes and faster, more actionable insights, positioning businesses for success in a data-driven world.

Components of Zero-ETL

Zero-ETL involves various components and services tailored to specific analytics needs and resources.

Comparison: Zero-ETL vs. Traditional ETL

Data Virtualization: Zero-ETL offers seamless access through virtualization; traditional ETL may face challenges with virtualization due to its discrete stages.
Data Quality Monitoring: Zero-ETL's automated approach may lead to quality issues; traditional ETL allows better monitoring thanks to its discrete stages.
Data Type Diversity: Zero-ETL supports diverse data types with cloud-based data lakes; traditional ETL requires additional engineering for diverse data types.
Real-Time Deployment: Zero-ETL enables near real-time analysis with minimal latency; traditional ETL's batch processing limits real-time capabilities.
Cost and Maintenance: Zero-ETL is more cost-effective with fewer components; traditional ETL is more expensive due to higher computational and engineering needs.
Scale: Zero-ETL scales faster and more economically; traditional ETL scaling can be slow and costly.
Data Movement: Zero-ETL requires minimal or no data movement; traditional ETL requires moving data to the loading stage.

Conclusion

Transitioning to Zero-ETL represents a significant advancement in data engineering. While it offers increased speed, enhanced security, and scalability, it also introduces new challenges, such as the need for updated skills and cloud dependency. Zero-ETL addresses the limitations of traditional ETL and provides a more agile, cost-effective, and efficient solution for modern data needs, reshaping the landscape of data management and analytics.
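The contrast between the two approaches can be made concrete with Python's built-in sqlite3 module standing in for both systems: traditional ETL copies source rows into a warehouse table, while a zero-ETL-style query reads the source in place (here, `ATTACH` plays the role of direct, federated access). Database names and data are invented for the sketch:

```python
# Contrast sketch: traditional ETL (extract, transform, load a copy)
# vs. a zero-ETL-style query that reads the source where it lives.
import os
import sqlite3
import tempfile

# A throwaway file stands in for the operational source system.
db_path = os.path.join(tempfile.gettempdir(), "zero_etl_demo.db")
src = sqlite3.connect(db_path)
src.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
src.execute("DELETE FROM orders")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 32.5)])
src.commit()

# Traditional ETL: extract rows, (trivially) transform, load a warehouse copy.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders_copy (id INTEGER, amount REAL)")
rows = src.execute("SELECT id, amount FROM orders").fetchall()        # extract
warehouse.executemany("INSERT INTO orders_copy VALUES (?, ?)", rows)  # load
etl_total = warehouse.execute(
    "SELECT SUM(amount) FROM orders_copy").fetchone()[0]

# Zero-ETL style: query the source directly; no copy step, no staleness.
warehouse.execute(f"ATTACH DATABASE '{db_path}' AS src")
zero_etl_total = warehouse.execute(
    "SELECT SUM(amount) FROM src.orders").fetchone()[0]

print(etl_total, zero_etl_total)
```

The copied table goes stale the moment the source changes, which is exactly the latency and duplication cost the zero-ETL approach avoids.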


Snowflake with AWS, Salesforce, and Microsoft

In anticipation of its sixth annual user conference, Snowflake Summit 2024, Snowflake has unveiled the Polaris Catalog, a vendor-neutral, open catalog implementation for Apache Iceberg, an open standard widely used for implementing data lakehouses, data lakes, and other data architectures. The Polaris Catalog will be open-sourced within the next 90 days, offering enterprises like Goldman Sachs and the Iceberg community increased choice, flexibility, and control over their data. It also promises comprehensive enterprise security and compatibility with Apache Iceberg, enabling interoperability with AWS, Confluent, Dremio, Google Cloud, Microsoft Azure, Salesforce, and more.

"We are collaborating with numerous industry partners to provide our mutual customers the ability to mix and match various query engines and coordinate read and write operations without vendor lock-in, and most importantly, to do so in an open manner," said Christian Kleinerman, Snowflake's EVP of Product. Kleinerman further highlighted that this initiative can "simplify how organizations access their data across diverse systems, enhancing flexibility and control."

Apache Iceberg, which became a top-level Apache Software Foundation project in May 2020 after emerging from incubation, has quickly become a leading open-source data table format. Building on this success, Polaris Catalog offers users a centralized location for any engine to discover and access an organization's Iceberg tables with open interoperability. To ensure Polaris Catalog meets the evolving needs of the community, Snowflake is collaborating with the Iceberg ecosystem to advance the project.
Chris Grusz, MD of technology partnerships at AWS, noted AWS's commitment to working with partners on open-source solutions that enhance customer choice: "We're pleased to work with Snowflake to continue to make Apache Iceberg interoperable across our engines." Similarly, Raveendrnathan Loganathan, EVP of software engineering at Salesforce, mentioned that Apache Iceberg's popularity has established an open storage standard simplifying zero-copy data access for organizations. "We're thrilled to have Snowflake as a member of our Zero Copy Partner Network, and we're excited about how this new open catalog standard will further zero-copy access in the enterprise," he said.

This development follows the recent expansion of the partnership between Snowflake and Microsoft, supporting leading open standards for storage formats, including Apache Iceberg and Apache Parquet. With Polaris Catalog, they aim to continue their mission of enabling users to leverage their enterprise data, regardless of its location, to develop AI-powered applications at scale.

Read More
Einstein Personalization and Copilots

Salesforce launched a suite of new generative AI products at Connections in Chicago, including new Einstein Copilots for marketers and merchants, and Einstein Personalization. To gain insights into these products and Salesforce’s evolving architecture, Bobby Jania, CMO of Marketing Cloud, was interviewed. Salesforce’s Evolving Architecture Salesforce has a knack for introducing new names for its platforms and products, sometimes causing confusion about whether something is entirely new or simply rebranded. Reporters sought clarification on the Einstein 1 platform and its relationship to Salesforce Data Cloud. “Data Cloud is built on the Einstein 1 platform,” Jania explained. “Einstein 1 encompasses the entire Salesforce platform, including products like Sales Cloud and Service Cloud, continuing the original multi-tenant cloud concept.” Data Cloud, developed natively on Einstein 1, was the first product built on Hyperforce, Salesforce’s new cloud infrastructure. “From the start, Data Cloud has been able to connect to and read anything within Sales Cloud, Service Cloud, etc. Additionally, it can now handle both structured and unstructured data.” This marks significant progress from a few years ago, when Salesforce’s platform comprised various acquisitions (like ExactTarget) that didn’t seamlessly integrate. Previously, data had to be moved between products, often resulting in duplicates. Now, Data Cloud serves as the central repository, with applications like Tableau, Commerce Cloud, Service Cloud, and Marketing Cloud all accessing the same operational customer profile without duplicating data. Salesforce customers can also import their own datasets into Data Cloud. “We wanted a federated data model,” Jania said.
“If you’re using Snowflake, for example, we virtually sit on your data lake, providing value by forming comprehensive operational customer profiles.” Understanding Einstein Copilot “Copilot means having an assistant within the tool you’re using, contextually aware of your tasks and assisting you at every step,” Jania said. For marketers, this could start with a campaign brief created with Copilot’s help, identifying an audience, and developing content. “Einstein Studio is exciting because customers can create actions for Copilot that we hadn’t even envisioned.” Contrary to previous reports, there is only one Copilot, Einstein Copilot, with use cases for marketers, merchants, and shoppers. “We use these names for clarity, but there’s just one Copilot. You can build your own use cases in addition to the ones we provide.” Marketers will need time to adapt to Copilot. “Adoption takes time,” Jania acknowledged. “This Connections event offers extensive hands-on training to help people use Data Cloud and these tools, beyond just demonstrations.” What’s New with Einstein Personalization Einstein Personalization is a real-time decision engine designed to choose the next best action or offer for each customer. “What’s new is that it now runs natively on Data Cloud,” Jania explained. While many decision engines require a separate dataset, Einstein Personalization evaluates a customer holistically and recommends actions directly within Service Cloud, Sales Cloud, or Marketing Cloud. Ensuring Trust Connections presentations emphasized that while public LLMs like ChatGPT can be applied to customer data, none of this data is retained by the LLMs. This isn’t just a matter of agreements; it involves the Einstein Trust Layer. “All data passing through an LLM runs through our gateway. Personally identifiable information, such as credit card numbers or email addresses, is stripped out. The LLMs do not store the output; Salesforce retains it for auditing.
Any output that returns through our gateway is logged, checked for toxicity, and only then is PII reinserted into the response. These measures ensure data safety beyond mere handshakes,” Jania said.
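The gateway pattern Jania describes, masking PII on the way in and reinserting it only after the response passes checks on the way out, can be sketched generically. This is an illustration of the technique only, not Salesforce’s Einstein Trust Layer implementation; the patterns and function names are invented for the example:

```python
import re

# Generic sketch of the mask-then-reinsert gateway pattern described above.
# Illustrative only -- not the Einstein Trust Layer implementation.

PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def mask_pii(text):
    """Replace PII with placeholder tokens; return masked text and a lookup table."""
    vault = {}
    for label, pattern in PII_PATTERNS.items():
        for i, match in enumerate(pattern.findall(text)):
            token = f"<{label}_{i}>"
            vault[token] = match
            text = text.replace(match, token, 1)
    return text, vault

def reinsert_pii(text, vault):
    """Restore the original values after the response passes safety checks."""
    for token, value in vault.items():
        text = text.replace(token, value)
    return text

prompt = "Email jane@example.com about her order."
masked, vault = mask_pii(prompt)
# `masked` is what the external LLM would actually see -- no raw PII
llm_response = masked.replace("Email", "Drafted a note to")  # stand-in for an LLM call
final = reinsert_pii(llm_response, vault)
print(masked)   # Email <EMAIL_0> about her order.
print(final)    # Drafted a note to jane@example.com about her order.
```

The key property is that the external model only ever sees placeholder tokens; the mapping back to real values never leaves the gateway, which is also where logging and toxicity checks would sit in the flow described above.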

Read More
Growing Family of Einstein Copilots

Salesforce made several announcements this month regarding its growing family of Einstein Copilots, unveiling AI-powered Einstein Copilots for marketers and merchants. These new Copilots build on the previously announced Copilots for retailers and shoppers and are integrated into the Einstein 1 platform. They can communicate with each other, effectively bridging marketing and commerce, and have full access to Salesforce Data Cloud. “Welcome to the AI enterprise,” said Ariel Kelman, Salesforce President and CMO, during his keynote at Salesforce Connections in Chicago. Kelman outlined four waves of AI: Predictive (e.g., lead scoring), Generative, Autonomous, and AI General Intelligence. “We are starting to enter the third wave,” he stated, where AI will begin to take actions independently. Copilots are a step in that direction, although for now, a human remains in control. The Path to the AI Enterprise Kelman described five steps toward creating an AI enterprise. Regarding the last of these, new Slack AI tools were demonstrated for summarizing interactions and importing actionable data from Data Cloud into Slack. The strategy for Einstein Copilots aims to empower business users in marketing, commerce, and other functions to execute complex tasks, such as creating personalized customer journeys, using natural language prompts. Einstein Copilots for Marketing and Merchants The marketing Copilot can generate marketing briefs and content, and create email campaigns. Through Data Cloud, it can ingest and act on a brand’s datasets, including customer data from repositories like AWS, Snowflake, and Databricks. By automating routine tasks and time-consuming projects like data connection and analysis, the Copilot aims to free up marketers to engage more thoughtfully with their audiences.
The commerce Copilot, part of Salesforce’s commerce offerings, responds to natural language prompts to create online storefronts, improve product discoverability, write product descriptions, and make product recommendations.

Read More
Securing SaaS

Obsidian Security recently discussed the complexity of enforcing Single Sign-On (SSO) within Salesforce and the misconfigurations it frequently encounters. Notably, 60% of Obsidian’s customers initially have local access without Multi-Factor Authentication (MFA) configured for Salesforce, a significant security gap that Obsidian works to close. The Hidden Vulnerability Application owners who manage Salesforce daily often remain unaware of this misconfiguration. Despite their deep knowledge of Salesforce administration, local access without MFA presents an overlooked vulnerability. This situation raises concerns about the security of other SaaS applications, especially those without comparable in-house expertise. If you have concerns about your configuration, Tectonic can help. Attacker Focus and Trends Attackers have historically targeted the Identity Provider (IdP) space, focusing on providers like Okta, Microsoft Entra, and Ping. This strategy offers maximal impact, as compromising an IdP grants broad access across multiple applications. Developing expertise to breach a few IdPs is more efficient than learning the diverse local access pathways of numerous SaaS vendors. Over the past 12 months, nearly 100% of the breaches that required Obsidian’s intervention through CrowdStrike or other incident response partners were IdP-focused. Notably, 70% of these breaches involved subverting MFA, often through methods like SIM swapping. In instances where local access bypasses the IdP, 95% of the time it lacks MFA. Recent discussions around Snowflake have brought attention to “shadow authentication,” defined as unsanctioned means of authenticating a user within an application. Obsidian Security has observed an increase in brute-force attacks against SaaS applications via local access pathways over the last two weeks, indicating growing attacker awareness of this vector.
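The misconfiguration described, local (non-SSO) access left open without MFA, is in principle auditable by sweeping user authentication settings for accounts that bypass the IdP. A hypothetical sketch over a generic user export follows; the field names are invented for illustration and are not the Salesforce API:

```python
# Hypothetical audit sketch: flag users who can log in locally (outside the
# IdP/SSO flow) without MFA enrolled -- the "shadow authentication" exposure
# described above. Field names are illustrative, not Salesforce's schema.

def find_shadow_auth_risks(users):
    """Return usernames with a local password login path and no MFA."""
    return [
        u["username"]
        for u in users
        if u.get("local_login_enabled") and not u.get("mfa_enrolled")
    ]

users = [
    {"username": "sso.only@acme.com",   "local_login_enabled": False, "mfa_enrolled": False},
    {"username": "admin@acme.com",      "local_login_enabled": True,  "mfa_enrolled": True},
    {"username": "legacy.svc@acme.com", "local_login_enabled": True,  "mfa_enrolled": False},
]

at_risk = find_shadow_auth_risks(users)
print(at_risk)  # the accounts an attacker could brute-force outside the IdP
```

Note that the SSO-only user is not flagged even without MFA, because the IdP controls that path; the exposure is specifically the combination of a local login route and missing MFA.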
Future Expectations Attackers continually seek easy and efficient pathways. Over the next 12 months, local access or shadow authentication is expected to become a major attack vector. Organizations must proactively secure these pathways as attackers shift their focus. What You Can Do How Obsidian Helps Salesforce security partners offer robust solutions to address these challenges. By leveraging partner capabilities, organizations can enhance their security posture, protecting against evolving threats targeting local access and shadow authentication. The post “The Growing Importance of Securing Local Access in SaaS Applications” appeared first on Obsidian Security.

Read More
Salesforce in a Mega-Data Deal with Informatica

Since Salesforce announced its acquisition of Slack for $27.7B in late 2020, the cloud software giant has paused its acquisition strategy due to factors like rising interest rates, slowing revenue growth, and a laser focus on profitability. However, recent reports from The Wall Street Journal and other news publications suggest that Salesforce is in advanced talks to acquire Informatica in a deal worth over $11B. Informatica is a significant player in enterprise data management, boasting revenues of over $1.51B and a workforce of over 5,000 employees. The company specializes in AI-powered cloud data management, assisting companies in processing and managing large volumes of data from various sources to derive actionable, real-time insights. The synergies between Informatica and Salesforce are many, with both companies focusing on consolidating data from multiple sources to provide comprehensive business insights. This aligns well with Salesforce’s strategic shift toward AI-driven data processing and analysis, aiming to enhance generative and predictive capabilities. While Salesforce’s previous acquisition of MuleSoft in 2018 for $6.5B has proven successful in facilitating API connectivity for real-time integrations, Informatica brings expertise in ETL (Extract-Transform-Load), data quality, and data movement to and from platforms like Snowflake and Databricks. This potential mega-data deal underscores the growing importance of data in the tech industry, especially with the emergence of generative AI and large language models (LLMs) that enable deeper analysis of vast datasets. Salesforce’s recent rebranding of its platform to “Einstein 1” underscores the convergence of AI and data within its product suite. The company’s emphasis on “AI + Data + CRM” reflects its commitment to leveraging data analytics for CRM enhancement, exemplified by the growth of its Data Cloud product.
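The ETL specialty attributed to Informatica, extracting records from a source system, transforming them into a consistent shape with data-quality rules applied, and loading them into a target, can be illustrated with a toy pipeline. This is a generic sketch of the pattern only, not Informatica’s product, and all names and fields are invented:

```python
# Toy illustration of the Extract-Transform-Load pattern mentioned above.
# Generic sketch only -- not Informatica's implementation.

def extract(source):
    """Pull raw records from a source system."""
    return list(source)

def transform(records):
    """Normalize field names and clean values for the target schema."""
    return [
        {"email": r["Email"].strip().lower(), "revenue_usd": float(r["Rev"])}
        for r in records
        if r.get("Email")  # data-quality rule: drop records missing the key field
    ]

def load(target, records):
    """Append cleaned records to the target store; return the row count."""
    target.extend(records)
    return len(records)

crm_export = [
    {"Email": " Jane@Example.com ", "Rev": "1200.50"},
    {"Email": "", "Rev": "99"},            # rejected by the quality rule
    {"Email": "bob@example.com", "Rev": "800"},
]

warehouse = []
loaded = load(warehouse, transform(extract(crm_export)))
print(loaded, warehouse)
```

The data-quality filter in `transform` is the step that distinguishes ETL tooling from a bare copy: malformed records are caught before they reach the warehouse rather than after.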
Partnering with industry leaders like Snowflake, Databricks, AWS, and Google, Salesforce aims to offer comprehensive data solutions that integrate seamlessly with existing systems. Informatica’s capabilities in ETL and Master Data Management (MDM) align with this vision, particularly in streamlining data integration and ensuring data quality across disparate systems. While the Informatica acquisition has yet to be finalized, it would represent a strategic move by Salesforce to strengthen its position in the AI- and data-driven CRM market. As Salesforce continues to evolve its product ecosystem, the deal would signal its commitment to innovation and leadership in the era of AI-powered data analytics.

Read More