Author at gettectonic.com - Page 35
AI Senate Bill 1047

California's new AI bill has sparked intense debate, with proponents viewing it as necessary regulation and critics warning it could stifle innovation, particularly for small businesses. Senate Bill 1047, known as the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, mandates that developers of advanced AI systems costing at least $100 million to train must test their models for potential harm and put safeguards in place. It also offers whistleblower protections for employees at large AI firms and establishes CalCompute, a public cloud computing resource aimed at startups and researchers. The bill is awaiting Governor Gavin Newsom's signature by Sept. 30 to become law.

Prominent AI experts, including Geoffrey Hinton and Yoshua Bengio, support the bill. However, it has met resistance from various quarters, including Rep. Nancy Pelosi and OpenAI, who argue it could hinder innovation and the startup ecosystem. Pelosi and others have expressed concerns that the bill's requirements might burden smaller businesses and harm California's leadership in tech innovation.

Gartner analyst Avivah Litan acknowledged the dilemma, stating that while regulation is critical for AI, the bill's requirements might negatively impact small businesses. "Some regulation is better than none," she said, but added that its thresholds could be challenging for smaller firms.

Steve Carlin, CEO of AiFi, criticized the bill for its vague language and complex demands on AI developers, including unclear guidance on enforcing the rules. He suggested that instead of focusing on AI models, legislation should address the risks and applications of AI, as seen with the EU AI Act.

Despite these concerns, some experts, like Forrester Research's Alla Valente, support the bill's safety testing and whistleblower protections. Valente argued that safeguarding AI models is essential across industries, though she acknowledged that the costs of compliance could be higher for small businesses. Still, she emphasized that the long-term costs of not implementing safeguards could be greater, with risks including customer lawsuits and regulatory penalties.

California's approach to AI regulation adds to the growing patchwork of state-level AI laws in the U.S. Colorado and Connecticut have also introduced AI legislation, and cities like New York have tackled issues like algorithmic bias. Carlin warned that a fragmented state-by-state regulatory framework could create a costly and complex environment for developers, calling for a unified federal standard instead. While federal legislation has been proposed, none has passed, and Valente pointed out that relying on Congress for action is a slow process. In the meantime, states like California are pushing ahead with their own AI regulations, creating both opportunities and challenges for the AI industry.

Related Posts

- Salesforce OEM AppExchange: Expanding its reach beyond CRM, Salesforce.com has launched a new service called AppExchange OEM Edition, aimed at non-CRM service providers. Read more
- The Salesforce Story: In Marc Benioff's own words, how did salesforce.com grow from a start-up in a rented apartment into the world's... Read more
- Salesforce Jigsaw: Salesforce.com, a prominent figure in cloud computing, has finalized a deal to acquire Jigsaw, a wiki-style business contact database, for... Read more
- Service Cloud with AI-Driven Intelligence: Salesforce Enhances Service Cloud with AI-Driven Intelligence Engine. Data science and analytics are rapidly becoming standard features in enterprise applications... Read more

Read More
Data Quality Critical

Data quality has never been more critical, and it's only set to grow in importance with each passing year. The reason? The rise of AI, particularly generative AI.

Generative AI offers transformative benefits, from vastly improved efficiency to the broader application of data in decision-making. But these advantages hinge on the quality of data feeding the AI. For enterprises to fully capitalize on generative AI, the data driving models and applications must be accurate. If the data is flawed, so are the AI's outputs.

Generative AI models require vast amounts of data to produce accurate responses. Their outputs aren't based on isolated data points but on aggregated data. Even if the data is high-quality, an insufficient volume could result in an incorrect output, known as an AI hallucination.

With so much data needed, automating data pipelines is essential. However, with automation comes a challenge: humans can't monitor every data point along the pipeline. That makes it imperative to ensure data quality from the outset and to implement output checks along the way, as noted by David Menninger, an analyst at ISG's Ventana Research. Ignoring data quality when deploying generative AI can lead to not just inaccuracies but biased or even offensive outcomes. "As we're deploying more and more generative AI, if you're not paying attention to data quality, you run the risks of toxicity, of bias," Menninger warns. "You've got to curate your data before training the models and do some post-processing to ensure the quality of the results."

Enterprises are increasingly recognizing this, with leaders like Saurabh Abhyankar, chief product officer at MicroStrategy, and Madhukar Kumar, chief marketing officer at SingleStore, noting the heightened emphasis on data quality, not just in terms of accuracy but also security and transparency. The rise of generative AI is driving this urgency.
Generative AI's potential to lower barriers to analytics and broaden access to data has made it a game-changer. Traditional analytics tools have been difficult to master, often requiring coding skills and data literacy training. Despite efforts to simplify these tools, widespread adoption has been limited. Generative AI, however, changes the game by enabling natural language interactions, making it easier for employees to engage with data and derive insights.

With AI-powered tools, the efficiency gains are undeniable. Generative AI can take on repetitive tasks, generate code, create data pipelines, and even document processes, allowing human workers to focus on higher-level tasks. Abhyankar notes that this could be as transformational for knowledge workers as the industrial revolution was for manual labor. However, this potential is only achievable with high-quality data. Without it, AI-driven decision-making at scale could lead to ethical issues, misinformed actions, and significant consequences, especially when it comes to individual-level decisions like credit approvals or healthcare outcomes.

Ensuring data quality is challenging, but necessary. Organizations can use AI-powered tools to monitor data quality, detect irregularities, and alert users to potential issues. However, as advanced as AI becomes, human oversight remains critical. A hybrid approach, where technology augments human expertise, is essential for ensuring that AI models and applications deliver reliable outputs. As Kumar of SingleStore emphasizes, "Hybrid means human plus AI. There are things AI is really good at, like repetition and automation, but when it comes to quality, humans are still better because they have more context."

Ultimately, while AI offers unprecedented opportunities, it's clear that data quality is the foundation. Without it, the risks are too great, and the potential benefits could turn into unintended consequences.
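To make the idea of automated pipeline checks concrete, here is a minimal sketch of a completeness-and-validity gate a pipeline might apply before records ever reach a model. The field names and rules are hypothetical, chosen only for illustration:

```python
# Hypothetical schema: which fields every record must carry.
REQUIRED_FIELDS = {"customer_id", "region", "revenue"}

def check_record(record):
    """Return a list of data-quality problems found in one record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    revenue = record.get("revenue")
    if revenue is not None and revenue < 0:
        problems.append("negative revenue")
    return problems

def gate(records):
    """Split records into clean rows and rejects-with-reasons."""
    clean, rejects = [], []
    for rec in records:
        problems = check_record(rec)
        if problems:
            rejects.append((rec, problems))
        else:
            clean.append(rec)
    return clean, rejects

clean, rejects = gate([
    {"customer_id": 1, "region": "NA", "revenue": 125.0},
    {"customer_id": 2, "revenue": -10.0},   # missing region, bad revenue
])
print(len(clean), len(rejects))  # 1 1
```

A rejects queue like this gives humans a manageable place to exercise the oversight Menninger and Kumar describe, instead of reviewing every record by hand.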

Read More
Exploring Large Action Models

Exploring Large Action Models (LAMs) for Automated Workflow Processes

While large language models (LLMs) are effective in generating text and media, Large Action Models (LAMs) push beyond simple generation: they perform complex tasks autonomously. Imagine an AI that not only generates content but also takes direct actions in workflows, such as managing customer relationship management (CRM) tasks, sending emails, or making real-time decisions. LAMs are engineered to execute tasks across various environments by seamlessly integrating with tools, data, and systems. They adapt to user commands, making them ideal for applications in industries like marketing, customer service, and beyond.

Key Capabilities of LAMs

A standout feature of LAMs is their ability to perform function-calling tasks, such as selecting the appropriate APIs to meet user requirements. Salesforce's xLAM models are designed to optimize these tasks, achieving high performance with lower resource demands, ideal for both mobile applications and high-performance environments. The fc series models are specifically tuned for function-calling, enabling fast, precise, and structured responses by selecting the best APIs based on input queries.

Practical Examples Using Salesforce LAMs

In this insight, we'll walk through setting up an xLAM model, defining callable APIs, exposing business logic with Flask, and testing the model's function-calling output.

Implementation: Setting Up the Model and API

Start by installing the necessary libraries (the leading `!` is for notebook environments):

```python
!pip install transformers==4.41.0 datasets==2.19.1 tokenizers==0.19.1 flask==2.2.5
```

Next, load the xLAM model and tokenizer:

```python
import json
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Salesforce/xLAM-7b-fc-r"
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```

Now, define instructions and available functions. Task instructions: the model will use function calls where applicable, based on user questions and available tools.
Format example:

```json
{
  "tool_calls": [
    {"name": "func_name1", "arguments": {"argument1": "value1", "argument2": "value2"}}
  ]
}
```

Define available APIs:

```python
get_weather_api = {
    "name": "get_weather",
    "description": "Retrieve weather details",
    "parameters": {"location": "string", "unit": "string"},
}

search_api = {
    "name": "search",
    "description": "Search for online information",
    "parameters": {"query": "string"},
}
```

Creating Flask APIs for Business Logic

We can use Flask to create APIs that replicate business processes:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/customer", methods=["GET"])
def get_customer():
    customer_id = request.args.get("customer_id")
    # Return dummy customer data
    return jsonify({"customer_id": customer_id, "status": "active"})

@app.route("/send_email", methods=["GET"])
def send_email():
    email = request.args.get("email")
    # Return dummy response for email send status
    return jsonify({"status": "sent"})
```

Testing the LAM Model and Flask APIs

Define queries to test the LAM's function-calling helper (here `custom_func_def` is the helper that prompts the xLAM model and returns its JSON tool-call output):

```python
query = "What's the weather like in New York in fahrenheit?"
print(custom_func_def(query))
# Expected: {"tool_calls": [{"name": "get_weather", "arguments": {"location": "New York", "unit": "fahrenheit"}}]}
```

Function-Calling Models in Action

Using `base_call_api`, LAMs can determine the correct API to call and manage workflow processes autonomously:

```python
import requests

def base_call_api(query):
    """Calls APIs based on LAM recommendations."""
    base_url = "http://localhost:5000/"
    json_response = json.loads(custom_func_def(query))
    api_url = json_response["tool_calls"][0]["name"]
    params = json_response["tool_calls"][0]["arguments"]
    response = requests.get(base_url + api_url, params=params)
    return response.json()
```

With LAMs, businesses can automate and streamline tasks in complex workflows, maximizing efficiency and empowering teams to focus on strategic initiatives.
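The snippets above call `custom_func_def` without showing its definition. As a rough sketch of how the pieces fit together, the helper can be stubbed with a canned xLAM-style response so the parsing and dispatch logic can be exercised without loading the model; the stubbed output below is illustrative, not real model output:

```python
import json

def custom_func_def(query):
    """Stub standing in for the real helper, which would prompt the xLAM
    model and decode its generated tool-call JSON (illustration only)."""
    return json.dumps({
        "tool_calls": [
            {"name": "get_weather",
             "arguments": {"location": "New York", "unit": "fahrenheit"}}
        ]
    })

def parse_tool_call(raw):
    """Extract the first tool call's endpoint name and arguments."""
    call = json.loads(raw)["tool_calls"][0]
    return call["name"], call["arguments"]

endpoint, params = parse_tool_call(custom_func_def("What's the weather in New York?"))
print(endpoint, params)  # get_weather {'location': 'New York', 'unit': 'fahrenheit'}
```

Swapping the stub for a real model call changes nothing downstream, which is the point of keeping the tool-call format as the contract between model and dispatcher.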

Read More
Machine Learning on Kubernetes

How and Why to Run Machine Learning Workloads on Kubernetes

Running machine learning (ML) model development and deployment on Kubernetes has become essential for optimizing resources and managing costs. As AI and ML tools gain mainstream acceptance, business and IT professionals are increasingly familiar with these technologies. With the growing buzz around AI, engineering needs in ML and AI have expanded, particularly in managing the complexities and costs associated with these workloads.

The Need for Kubernetes in ML

As ML use cases become more complex, training models has become increasingly resource-intensive and costly. This has driven up demand and costs for GPUs, a key resource for ML tasks. Containerizing ML workloads offers a solution to these challenges by improving scalability, automation, and infrastructure efficiency. Kubernetes, a leading tool for container orchestration, is particularly effective for managing ML processes. By decoupling workloads into manageable containers, Kubernetes helps streamline ML operations and reduce costs.

Understanding Kubernetes

The evolution of engineering priorities has consistently focused on minimizing application footprints. From mainframes to modern servers and virtualization, the trend has been towards reducing operational overhead. Containers emerged as a solution to this trend, offering a way to isolate application stacks while maintaining performance. Initially, containers used Linux cgroups and namespaces, but their popularity surged with Docker. However, Docker containers had limitations in scaling and automatic recovery. Kubernetes was developed to address these issues. As an open-source orchestration platform, Kubernetes manages containerized workloads by ensuring containers are always running and properly scaled. Containers run inside resources called pods, which include everything needed to run the application.
Kubernetes has also expanded its capabilities to orchestrate other resources like virtual machines.

Running ML Workloads on Kubernetes

ML systems demand significant computing power, including CPU, memory, and GPU resources. Traditionally, this required multiple servers, which was inefficient and costly. Kubernetes addresses this challenge by orchestrating containers and decoupling workloads, allowing multiple pods to run models simultaneously and share resources like CPU, memory, and GPU power. Using Kubernetes for ML can enhance a range of MLOps practices.

Challenges of ML on Kubernetes

Despite its advantages, running ML workloads on Kubernetes comes with its own challenges.

Key Tools for ML on Kubernetes

Kubernetes requires specific tools to manage ML workloads effectively. These tools integrate with Kubernetes to address the unique needs of ML tasks. TensorFlow is another option, but it lacks the dedicated integration and optimization of Kubernetes-specific tools like Kubeflow. For those new to running ML workloads on Kubernetes, Kubeflow is often the best starting point. It is the most advanced and mature tool in terms of capabilities, ease of use, community support, and functionality.
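To illustrate how Kubernetes expresses an ML workload's resource needs, a pod spec can request CPU, memory, and GPU explicitly; the scheduler then places the pod on a node with free capacity. This is a generic sketch — the name and image are placeholders, and the `nvidia.com/gpu` resource requires the NVIDIA device plugin on the node:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: train-job                        # placeholder name
spec:
  restartPolicy: Never
  containers:
  - name: trainer
    image: example.com/ml/train:latest   # placeholder training image
    resources:
      requests:
        cpu: "4"
        memory: 16Gi
      limits:
        nvidia.com/gpu: 1                # one GPU via the device plugin
```

Because the GPU is declared as a schedulable resource rather than hard-wired to a machine, multiple training pods can share a pool of GPU nodes instead of each claiming a dedicated server.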

Read More
Salesforce Data Quality Challenges and AI Integration

Salesforce is an incredibly powerful CRM tool, but like any system, it's vulnerable to data quality issues if not properly managed. As organizations race to unlock the power of AI to improve sales and service experiences, they are finding that great AI requires great data. Let's explore some of the most common Salesforce data quality challenges and how resolving them is key to succeeding in the AI era.

1. Duplicate Records

Duplicate data can clutter your Salesforce system, leading to reporting inaccuracies and confusing AI-driven insights. Use Salesforce's built-in deduplication tools or third-party apps that specialize in identifying and merging duplicate records. Implement validation rules to prevent duplicates from entering the system in the first place, ensuring cleaner data that supports accurate AI outputs.

2. Incomplete Data

Incomplete data often results in missed opportunities and poor customer insights. This becomes especially problematic in AI applications, where missing data could skew results or lead to incomplete recommendations. Use Salesforce validation rules to make certain fields mandatory, ensuring critical information is captured during data entry. Regularly audit your system to identify missing data and assign tasks to fill in gaps. This ensures that both structured and unstructured data can be effectively leveraged by AI models.

3. Outdated Information

Over time, data in Salesforce can become outdated, particularly customer contact details or preferences. Regularly cleanse and update your data using enrichment services that automatically refresh records with current information. For AI to deliver relevant, real-time insights, your data needs to be fresh and up to date. This is especially important when AI systems analyze both structured data (e.g., CRM entries) and unstructured data (e.g., emails or transcripts).

4. Inconsistent Data Formatting

Inconsistent data formatting complicates analysis and weakens AI performance. Standardize data entry using picklists, drop-down menus, and validation rules to enforce proper formatting across all fields. A clean, consistent data set helps AI models more effectively interpret and integrate structured and unstructured data, delivering more relevant insights to both customers and employees.

5. Lack of Data Governance

Without clear guidelines, it's easy for Salesforce data quality to degrade, especially when unstructured data is added to the mix. Establish a data governance framework that includes policies for data entry, updates, and regular cleansing. Good data governance ensures that both structured and unstructured data are properly managed, making them usable by AI technologies like Large Language Models (LLMs) and Retrieval Augmented Generation (RAG).

The Role of AI in Enhancing Data Management

This year, every organization is racing to understand and unlock the power of AI, especially to improve sales and service experiences. However, great AI requires great data. While traditional CRM systems deal primarily with structured data like rows and columns, every business also holds a treasure trove of unstructured data in documents, emails, transcripts, and other formats. Unstructured data offers invaluable AI-driven insights, leading to more comprehensive, customer-specific interactions. For example, when a customer contacts support, AI-powered chatbots can deliver better service by pulling data from both structured (purchase history) and unstructured sources (warranty contracts or past chats). To ensure AI-generated responses are accurate and contextual, companies must integrate both structured and unstructured data into a unified 360-degree customer view.

AI Frameworks for Better Data Utilization

An effective way to ensure accuracy in AI is with frameworks like Retrieval Augmented Generation (RAG).
RAG enhances AI by augmenting Large Language Models with proprietary, real-time data from both structured and unstructured sources. This method allows companies to deliver contextual, trusted, and relevant AI-driven interactions with customers, boosting overall satisfaction and operational efficiency.

Tectonic's Role in Optimizing Salesforce Data for AI

To truly unlock the power of AI, companies must ensure that their data is of high quality and accessible to AI systems. Experts like Tectonic provide tailored Salesforce consulting services to help businesses manage and optimize their data. By ensuring data accuracy, completeness, and governance, Tectonic can support companies in preparing their structured and unstructured data for the AI era.

Conclusion: The Intersection of Data Quality and AI

In the modern era, data quality isn't just about ensuring clean CRM records; it's also about preparing your data for advanced AI applications. Whether it's eliminating duplicates, filling in missing information, or governing data across touchpoints, maintaining high data quality is essential for leveraging AI effectively. For organizations ready to embrace AI, the first step is understanding where all their data resides and ensuring it's suitable for their generative AI models. With the right data strategy, businesses can unlock the full potential of AI, transforming sales, service, and customer experiences across the board.
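As a toy illustration of the RAG pattern discussed above, the retrieval step can be sketched with naive keyword overlap standing in for a real vector search; the documents and scoring below are purely illustrative:

```python
import re

def tokens(text):
    """Lowercased word set for naive matching."""
    return set(re.findall(r"\w+", text.lower()))

def score(query, doc):
    """Naive relevance: number of shared words."""
    return len(tokens(query) & tokens(doc))

def retrieve(query, docs, k=2):
    """Return the k most query-relevant documents."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, docs):
    """Ground the question in retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Warranty contract: coverage lasts 24 months from purchase.",
    "Quarterly revenue figures for the hardware division.",
    "Support chat transcript: customer asked about warranty coverage.",
]
print(build_prompt("What does the warranty coverage include?", docs))
```

The augmented prompt pulls in the warranty contract (structured source) and the past chat (unstructured source) while leaving the irrelevant revenue document behind, which is exactly the grounding behavior RAG is meant to provide; production systems replace the word-overlap scorer with embedding similarity over a vector index.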

Read More
Salesforce Nonprofit Cloud Focuses Resources

The Pancreatic Cancer Action Network (PanCAN) is currently exploring how to harness the AI capabilities within Salesforce's Nonprofit Cloud to enhance its mission-driven services. Salesforce Nonprofit Cloud focuses resources, allowing PanCAN to focus on cancer.

Pancreatic cancer is among the most aggressive and deadly forms of cancer, with a five-year survival rate of just 12.8% in the U.S. Despite accounting for only 3.3% of new cancer cases, it is responsible for 8.5% of all cancer-related deaths, making it the third leading cause of cancer mortality after lung and colon cancer. Founded in 1999, PanCAN is dedicated to researching this devastating disease and advocating for patients nationwide. The organization also provides critical information and resources to help patients make informed decisions, thereby supporting a community of patients and their families.

One of PanCAN's key services is connecting patients with specialists in their area. Julie Fleshman, President, CEO, and PanCAN's first employee, emphasizes the importance of this program: "Our patient service program, particularly our call center, is the cornerstone of what we offer. When someone is diagnosed, they are understandably scared. Our trained case managers provide support and recommend that each patient sees a specialist. We maintain a database of specialists across the U.S. and can provide a list of surgeons or oncologists based on how far patients are willing to travel."

This case management system allows patients and their families to work with the same case manager consistently, ensuring a seamless, free-of-charge service and simplifying the information-gathering process. PanCAN also helps patients find clinical trials in their area, an essential service given the current state of treatment for pancreatic cancer. Nonprofit Cloud focuses on technical resources, allowing PanCAN to focus on patient services.
Fleshman explains: "Clinical trials often offer the most cutting-edge treatments, which is why we recommend patients consider them. We maintain a database of trials and can quickly inform patients of their options during phone consultations. This allows them to discuss potential trials with their physician and determine the best course of action."

Evolving Technology to Better Serve Patients

PanCAN's initial case management system was developed in-house about a decade ago. As it neared the end of its life, searches could take up to two hours, prompting the organization to seek a more efficient solution. PanCAN enlisted a Salesforce consulting partner to evaluate options and develop a technology strategy aligned with its goals. In June, a new system based on Salesforce's Nonprofit Cloud Person Accounts module was launched. The new system has significantly reduced the time required to search for information, enabling case managers to assist more patients daily, a crucial improvement given the projected 66,440 new pancreatic cancer diagnoses in the U.S. this year alone. Additionally, the system's user-friendliness has led to higher job satisfaction among employees.

Fleshman stresses the importance of involving a multifunctional team in the implementation process: "It's essential to be clear about your objectives from the start, but it's equally important to include the right people. If we had only involved the patient services team and not the tech team responsible for maintenance and security, or the finance team whose system needed to integrate with ours, the project would have been siloed and incomplete."

Resources and Solutions for Nonprofits

The updated system includes advanced features like the OmniStudio process automation tool, which has streamlined the patient questionnaire process, and an integrated data processing engine capable of saving multiple records simultaneously.
Leveraging AI to Enhance Impact

Looking ahead, PanCAN is assessing how to leverage the AI capabilities within Salesforce's Nonprofit Cloud to further enhance its services. Fleshman outlines the next steps: "Providing information and resources to patients is crucial, but we also need to use data to optimize our programs and allocate our resources effectively. We hope AI will help us analyze data to better understand our impact and patient experiences. For instance, AI could reveal trends in how often we refer patients to specific doctors or studies, identify gaps in our services, or highlight areas where we should focus more of our efforts. Understanding these factors will help us allocate our time, energy, and resources more efficiently."

Despite the benefits, managing change has been key to addressing employee concerns about AI potentially threatening their jobs. Fleshman notes: "While there was excitement about getting a faster, more efficient tool, there was also anxiety about job security. Our focus was on demonstrating how AI could enhance our ability to provide better reports and insights rather than replacing jobs. We believe that even the best tools are useless if people aren't trained to use them effectively."

A Vision for the Future

PanCAN has developed a five-year technology roadmap that includes upgrading its grant management and financial systems, as well as introducing marketing applications to better understand its target audience, improve outreach, and personalize interactions. As Fleshman concludes: "Our executive team recognizes that without cutting-edge technology, we won't achieve our ambitious goals. In our sector, technology often gets deprioritized, but updating systems allows us to deliver our mission more productively and efficiently, ultimately better serving those we're here to help."

Our Take

According to Salesforce's sixth Nonprofit Trends Report, many charities view AI with a mix of optimism, curiosity, and caution.
PanCAN's approach to adopting AI, focusing on its potential to optimize resources and better support patients, demonstrates the organization's forward-thinking and commitment to its mission.

Read More
Data Integration with AWS Glue

The rapid rise of Software as a Service (SaaS) solutions has led to data silos across different platforms, making it challenging to consolidate insights. Effective data analytics depends on the ability to seamlessly integrate data from various systems by identifying, gathering, cleansing, and combining it into a unified format. AWS Glue, a serverless data integration service, simplifies this process with scalable, efficient, and cost-effective solutions for unifying data from multiple sources. By using AWS Glue, organizations can streamline data integration, minimize silos, and enhance agility in managing data pipelines, unlocking the full potential of their data for analytics, decision-making, and innovation. This insight explores the new Salesforce connector for AWS Glue and demonstrates how to build a modern Extract, Transform, and Load (ETL) pipeline using AWS Glue ETL scripts.

Introducing the Salesforce Connector for AWS Glue

To meet diverse data integration needs, AWS Glue now supports SaaS connectivity for Salesforce. This enables users to quickly preview, transfer, and query customer relationship management (CRM) data, while dynamically fetching the schema. With the Salesforce connector, users can ingest and transform CRM data and load it into any AWS Glue-supported destination, such as Amazon S3, in preferred formats like Apache Iceberg, Apache Hudi, and Delta Lake. It also supports reverse ETL use cases, enabling data to be written back to Salesforce.

Key Benefits:

Solution Overview

For this use case, we retrieve the full load of a Salesforce account object into a data lake on Amazon S3 and capture incremental changes. The solution also enables updates to certain fields in the data lake and synchronizes them back to Salesforce. The process involves creating two ETL jobs using AWS Glue with the Salesforce connector. The first job ingests the Salesforce account object into an Apache Iceberg-format data lake on Amazon S3.
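A rough sketch of what the first job's script might contain. The connection-option key names below are assumptions for illustration (check the AWS Glue Salesforce connector documentation for the exact names), and the `GlueContext` call runs only inside a Glue job environment:

```python
def salesforce_options(object_name, connection_name):
    """Connection options for the Glue Salesforce source (key names assumed)."""
    return {
        "connectionName": connection_name,  # Glue connection holding credentials
        "entityName": object_name,          # Salesforce object to read, e.g. Account
    }

def ingest_account(glue_context, spark):
    """Read the Account object and upsert it into an Iceberg table on S3."""
    # glue_context is a GlueContext; awsglue is available only in the Glue runtime.
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="salesforce",
        connection_options=salesforce_options("Account", "sf-connection"),
    )
    df = dyf.toDF()
    df.createOrReplaceTempView("incoming")
    if spark.catalog.tableExists("glue_etl_salesforce_db.account"):
        # Upsert into the existing Iceberg table, matching on the record id.
        spark.sql(
            "MERGE INTO glue_etl_salesforce_db.account t "
            "USING incoming s ON t.id = s.id "
            "WHEN MATCHED THEN UPDATE SET * "
            "WHEN NOT MATCHED THEN INSERT *"
        )
    else:
        # First run: create the Iceberg table from the full load.
        df.writeTo("glue_etl_salesforce_db.account").using("iceberg").create()
```

The exists-then-merge-or-create structure mirrors the behavior the article describes for the first job; a real script would also handle the Glue job arguments and commit the job bookmark.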
The second job captures updates and pushes them back to Salesforce.

Creating the ETL Pipeline

Step 1: Ingest the Salesforce Account Object

Using the AWS Glue console, create a new job to transfer the Salesforce account object into an Apache Iceberg-format transactional data lake in Amazon S3. The script checks whether the account table exists, performs an upsert if it does, or creates a new table if not.

Step 2: Push Changes Back to Salesforce

Create a second ETL job to update Salesforce with changes made in the data lake. This job writes the updated account records from Amazon S3 back to Salesforce.

Example Query

SELECT id, name, type, active__c, upsellopportunity__c, lastmodifieddate
FROM "glue_etl_salesforce_db"."account";

Additional Considerations

You can schedule the ETL jobs using AWS Glue job triggers or integrate them with other AWS services like AWS Lambda and Amazon EventBridge for advanced workflows. Additionally, AWS Glue supports importing deleted Salesforce records by configuring the IMPORT_DELETED_RECORDS option.

Clean Up

After completing the process, clean up the resources used in AWS Glue, including jobs, connections, Secrets Manager secrets, IAM roles, and the S3 bucket, to avoid incurring unnecessary charges.

Conclusion

The AWS Glue connector for Salesforce simplifies the analytics pipeline, accelerates insights, and supports data-driven decision-making. Its serverless architecture eliminates the need for infrastructure management, offering a cost-effective and agile approach to data integration and empowering organizations to efficiently meet their analytics needs.


Linus Torvalds Insights

Linus Torvalds Shares Insights on the Future of Programming with AI

Linus Torvalds, the mastermind behind Linux and Git—two cornerstones of modern software development—recently shared his perspective on how artificial intelligence (AI) is reshaping the world of programming. His candid insights offer a balanced view of AI’s capabilities and limitations, coming from one of the industry’s most influential voices. If you prefer a quick breakdown over watching a full interview, here are the key takeaways from Torvalds’ conversation.

AI in Programming: Evolution, Not Revolution

Torvalds describes AI, particularly large language models (LLMs), as “autocorrect on steroids.” These tools excel at predicting the next word or line of code based on established patterns but aren’t “intelligent” in the human sense. Rather than a seismic shift, AI represents the next step in a long history of automation in coding. From the days of machine language to today’s high-level languages like Python and Rust, tools have continuously evolved to make developers’ lives easier. AI is just another link in this chain—helping write, refine, and debug code while boosting productivity.

AI as a Developer’s Supercharged Assistant

Far from being a replacement for human programmers, Torvalds sees AI as a powerful assistant. Tools like GitHub Copilot are already enhancing the coding process by suggesting fixes, spotting bugs, and speeding up routine tasks. The vision? A future where programmers can abstract tasks even further, possibly instructing AI in plain English. Imagine simply saying, “Build me a tool to manage my expenses,” and watching it happen. However, for now, AI is an incremental improvement, not a groundbreaking leap.

The Shift Toward AI-Generated Code

One of Torvalds’ more intriguing predictions is that AI may eventually write code in ways incomprehensible to human programmers. Since AI doesn’t require human-readable syntax, it could optimize code in ways that only it understands.
In this scenario, developers might transition from writing code to managing AI systems that generate and refine it—shifting from hands-on creators to overseers of automated processes.

AI in Code Review: Smarter Intern or Future Partner?

When it comes to code review, AI’s potential is clear. Torvalds notes that AI could efficiently catch simple errors—like typos or syntax mistakes—freeing up human reviewers to focus on more complex logic and functionality. While AI might streamline tedious tasks, it’s far from perfect. Issues like “hallucinations,” where AI confidently produces incorrect results, highlight the need for human oversight. AI can assist, but it still requires developers to verify its output.

A Balanced Take on AI and Jobs

Torvalds dismisses fears of AI taking over programming jobs, pointing out that technological advancements historically create new opportunities rather than eliminate roles. AI, in his view, is less about replacing humans and more about augmenting their abilities. It’s a tool to make developers more efficient—not a harbinger of obsolescence.

Final Thoughts: Embrace AI, But Stay Grounded

Linus Torvalds envisions AI as a valuable, evolving tool for programmers, not a threat to their livelihood. While it’s set to change how we code, the shift will be gradual rather than revolutionary. Whether you’re a seasoned developer or a newcomer, now is the time to explore AI-powered tools, embrace their potential, and adapt to this new era of programming. Instead of fearing change, we can use AI to push the boundaries of what’s possible.

Five9 Cautious Forward Looking

Recently, Five9 reported its second-quarter FY24 results, revealing a strong performance for the period. However, the company’s cautious forward-looking guidance led to a significant drop in its stock price, which fell by over 25%. In response to queries about the conservative outlook, a Five9 spokesperson attributed the reduced 2024 revenue guidance—a 3.8% decrease—to macroeconomic headwinds. This cautious forecast stands in contrast to the more optimistic outlooks of Five9’s publicly traded peers.

Economic factors such as global issues, talent shortages, AI uncertainty, and the upcoming election are influencing customers’ decisions on IT investments, which likely contributed to the reduced guidance. Additionally, sales execution challenges have prompted the company to take corrective measures. While Five9 might face unique challenges that other CCaaS providers do not, the full impact will become clearer in the next quarter.

In response to these challenges, Five9 has taken steps to stabilize its operations, including promoting Matt Tuckness from VP of Global Customer Success to EVP of Sales and Customer Success. This move, described by leadership as promoting a “dedicated sales leader” with a decade of experience at Five9, aims to enhance sales execution. Scott Berg from Needham questioned the timing of the promotion, suggesting it might be a reaction to a single quarter’s results. Dan Burkland, Five9’s President, defended the decision, emphasizing that having a dedicated EVP of Sales is crucial for focusing on enterprise deals, especially given Five9’s efforts to grow its enterprise base.

Five9 has also announced a 7% workforce reduction, affecting approximately 185 employees. This marks the first layoff in the company’s history, which is notable given its record of growth through acquisitions, such as the recently planned acquisition of Acqueon, a real-time revenue execution platform.
Typically, acquisitions lead to headcount adjustments, but Five9 had managed to avoid such cuts until now. The company stated that the reduction was necessary to focus on profitable growth and long-term business resilience while continuing to serve global customers and innovate. Although layoffs are challenging, they are sometimes necessary for business adaptation. Many UCaaS and CCaaS providers expanded their workforces during the pandemic and later faced the need to trim excess staff as the market softened. Five9’s adjustment in headcount reflects changing market conditions.

The acquisition of Acqueon is expected to accelerate Five9’s vision by integrating expertise in inbound and outbound communications to enhance personalized customer experiences across marketing, sales, and service. Acqueon will operate as a separate business unit within Five9, with plans to eventually integrate its brand into the larger Five9 brand.

Overall, despite the quarter’s challenges, Five9 had a strong performance. It achieved a $1 billion ARR run rate for the first time, with total subscription revenue growing by 17%. The company maintains a robust balance sheet with over $1 billion in cash. The recent organizational changes, including new leadership and headcount adjustments, are indicative of Five9’s maturation and aim to return the company to its pattern of strong performance and growth.

Salesforce Data Migration

In today’s era of rapid digital transformation, efficient data migration has become increasingly important as cloud adoption gains momentum. Foundry’s research indicates that 63% of IT leaders have accelerated their cloud migrations, but 90% encounter challenges, often related to budget constraints. This emphasizes the need for meticulous planning and strategic execution.

This insight focuses on Salesforce data migration, outlining why it’s essential and providing a nine-step plan for a successful migration. Additionally, we look into data preparation solutions and highlight Salesforce data migration tools, turning potential challenges into growth opportunities.

Salesforce Data Migration Checklist

Why is Data Migration Important?

In 2011, we faced the challenge of transferring data from an old phone to our first smartphone. The contacts were especially important, but the outdated phone lacked any data transfer capabilities. Unwilling to manually re-enter everything, we researched extensively and discovered a method to extract the data into a CSV file. Converting it into vCard format, we successfully migrated all contacts. This personal experience illustrates the significance of data migration, not just for businesses but for everyday scenarios as well.

For organizations, having a structured data migration plan is critical when transitioning from legacy systems to modern platforms like Salesforce. It enhances efficiency, scalability, and accessibility, supporting business growth through better data management, cost savings, and improved decision-making. Data migration also ensures integrity and security, aligning IT capabilities with evolving business needs and driving innovation in a fast-changing technological landscape. Learn how we helped Cresa migrate over 8,000 records to Salesforce with 100% accuracy.

What is Salesforce Data Migration?
Salesforce data migration refers to the process of transferring information from external systems—such as legacy CRM platforms or local databases—into Salesforce. This process not only preserves data integrity but also supports better decision-making, enhances customer service, and enables business growth. A well-planned Salesforce data migration strategy is critical for unlocking the full benefits of the platform and ensuring a seamless transition.

Salesforce Data Migration Plan: 9 Key Steps

Need Help with Data Migration to Salesforce? We offer consulting services to help you navigate your data migration challenges, from auditing to strategy execution. Contact Tectonic today.

Practical Salesforce Data Migration Example

Using Data Loader, here’s a step-by-step guide to migrating a list of companies: after logging into Salesforce and selecting the Accounts object, you map fields from your CSV file, execute the migration, and review the logs to ensure accuracy.
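The field-mapping step in the Data Loader walkthrough can be sketched as: read rows from a legacy CSV export and rename the source columns to Salesforce Account API field names before loading. The column names and the mapping below are assumptions chosen for illustration, not an actual customer mapping.

```python
import csv
import io

# Hypothetical mapping from legacy CSV headers to Salesforce Account fields.
FIELD_MAP = {
    "Company": "Name",
    "Category": "Type",
    "Phone Number": "Phone",
}

def map_rows(csv_text, field_map):
    """Yield dicts keyed by Salesforce field names, dropping unmapped columns."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        yield {field_map[k]: v for k, v in row.items() if k in field_map}

legacy = "Company,Category,Phone Number\nAcme Corp,Customer,555-0100\n"
records = list(map_rows(legacy, FIELD_MAP))
# records is now ready to be written out as a Data Loader-compatible CSV
# or passed to a bulk-load step.
```

Validating the mapped records against a log of successes and errors, as Data Loader does, is the natural next step after a dry run like this.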

Healthcare IT and CrowdStrike

Learning from the CrowdStrike Outage: Enhancing Resilience and Incident Response

In the wake of the CrowdStrike outage, businesses around the globe are focusing on restoring business continuity and bolstering their resilience for future incidents. On Friday, July 19, 2024, a faulty content update triggered crashes across approximately 8.5 million Windows devices, displaying the infamous blue screen of death. This affected a range of sectors, including hospitals and airlines.

Although less than 1% of all Windows machines were impacted, the outage caused significant disruptions, particularly in healthcare. For instance, Mass General Brigham hospitals and clinics canceled all non-urgent visits on the day of the outage. Other major healthcare providers, such as Memorial Sloan Kettering Cancer Center, Cleveland Clinic, and Mount Sinai, also faced operational challenges.

This incident was not a result of a cyberattack but rather a defective content configuration update to CrowdStrike’s Falcon threat detection platform. According to the company’s preliminary post-incident review, a bug in the content validator allowed the faulty update to pass through validation despite containing errors.

“What we’re hearing is that the recovery is well underway. Most healthcare organizations I’ve been talking to are back up and running,” said David Finn, Executive Vice President of Governance, Risk, and Compliance at First Health Advisory, in an interview with TechTarget Editorial. “The scope was much smaller than some of the other issues we’ve seen in the recent past in healthcare, but the response was healthy. Still, I think there are a lot of lessons learned.”

Health IT security experts suggest that this incident can serve as a valuable learning opportunity for improving future response and recovery strategies.

Planning for the Inevitable

“The bad thing is always going to happen,” Finn stated, drawing on his 40 years of experience in health IT security and privacy.
“The trick is to plan for it, be prepared, and ensure your ability to recover and remain resilient.” Whether it’s a large-scale cyberattack, like the one at Change Healthcare in February 2024, or a global IT outage without malicious origins, healthcare organizations of all sizes must be ready to respond to a variety of incidents that could disrupt critical systems.

Finn emphasized the importance of proactive due diligence and thorough incident response planning, particularly in identifying and addressing single points of failure. Preparing for potential operational challenges in advance can make all the difference when an incident actually occurs. “We have to change the way we think about deploying this stuff,” Finn added. “Software, fortunately or not, is written by human beings, and human beings will always make mistakes. It’s our job to protect against those kinds of mistakes.”

The Importance of Resilience

Cyber-resilience is essential for enabling organizations to quickly recover and restore operations. By understanding that incidents like the CrowdStrike outage are bound to occur, organizations can focus on building resilience to effectively manage such events. Finn highlighted the need for resilience and redundancy in response to incidents like the CrowdStrike outage. “I still trust CrowdStrike, but that trust doesn’t mean they’re going to be perfect every time,” Finn noted.

Healthcare organizations responded quickly to the incident, despite the disruptions it caused. For instance, Mass General Brigham activated its incident command to manage its response, keeping clinics and emergency departments open for urgent cases. By Monday, July 22, they had resumed scheduled appointments and procedures.

According to Erik Weinick, co-head of the privacy and cybersecurity practice at New York-based law firm Otterbourg, the CrowdStrike incident underscores the need for organizations to reassess their legal and technical risk protocols.
“Although initial reports indicate that the incident was an accident, not an attack, organizations should use this incident as motivation to conduct information audits, penetration testing, update system mapping and software, including security patches, and remind users about best security practices like multifactor authentication and frequently changing difficult-to-guess passwords,” Weinick said. Essentially, organizations can leverage incidents like the CrowdStrike outage to strengthen their risk management strategies and enhance their cyber-resilience.

Third-Party Risk Management Challenges

Even with strict security controls in place, organizations are still vulnerable to risks from third-party vendors. As the interconnectedness of healthcare systems grows, so does the potential for third-party risks. The global IT outage highlighted the importance of third-party risk management and the associated challenges. In 2022 and 2023, some of the largest healthcare data breaches were caused by third-party vendors.

“People probably did a lot of risk analysis around CrowdStrike, but I’ll bet no one ever asked what tools they use to produce their software,” Finn speculated. “Until we get standards in place for software development and certifications for software sold to critical infrastructure sectors, we’re going to have to dig a little deeper.”

In response to the incident, CrowdStrike announced plans to enhance its software resilience and testing processes, including adding more validation checks to its Content Validator for Rapid Response Content to prevent the deployment of faulty content. The company also plans to conduct multiple independent third-party security code reviews to prevent similar incidents in the future.
“On the legal front, organizations should review their vendor agreements to understand their obligations regarding privacy and data security, who their partners are working with, and what limitations exist on liability for incidents like the CrowdStrike outage,” Weinick advised. He also recommended checking business disruption insurance coverage and conducting tabletop exercises to rehearse business continuity and recovery procedures in the event of a systems outage.

Key Takeaways

The CrowdStrike outage reinforced essential IT and security considerations for organizations worldwide, particularly in the areas of resilience, third-party risk management, and incident response and recovery. By learning from this event, organizations can better prepare for future challenges and improve their overall cyber-resilience.

2024 AI Glossary

Artificial intelligence (AI) has moved from an emerging technology to a mainstream business imperative, making it essential for leaders across industries to understand and communicate its concepts. To help you unlock the full potential of AI in your organization, this 2024 AI Glossary outlines key terms and phrases that are critical for discussing and implementing AI solutions.

Tectonic 2024 AI Glossary

Active Learning: A blend of supervised and unsupervised learning, active learning allows AI models to identify patterns, determine the next step in learning, and only seek human intervention when necessary. This makes it an efficient approach to developing specialized AI models with greater speed and precision, which is ideal for businesses aiming for reliability and efficiency in AI adoption.

AI Alignment: This subfield focuses on aligning the objectives of AI systems with the goals of their designers or users. It ensures that AI achieves intended outcomes while also integrating ethical standards and values when making decisions.

AI Hallucinations: These occur when an AI system generates incorrect or misleading outputs. Hallucinations often stem from biased or insufficient training data or incorrect model assumptions.

AI-Powered Automation: Also known as “intelligent automation,” this refers to the integration of AI with rules-based automation tools like robotic process automation (RPA). By incorporating AI technologies such as machine learning (ML), natural language processing (NLP), and computer vision (CV), AI-powered automation expands the scope of tasks that can be automated, enhancing productivity and customer experience.

AI Usage Auditing: An AI usage audit is a comprehensive review that ensures your AI program meets its goals, complies with legal requirements, and adheres to organizational standards. This process helps confirm the ethical and accurate performance of AI systems.
Artificial General Intelligence (AGI): AGI refers to a theoretical AI system that matches human cognitive abilities and adaptability. While it remains a future concept, experts predict it may take decades or even centuries to develop true AGI.

Artificial Intelligence (AI): AI encompasses computer systems that can perform complex tasks traditionally requiring human intelligence, such as reasoning, decision-making, and problem-solving.

Bias: Bias in AI refers to skewed outcomes that unfairly disadvantage certain ideas, objectives, or groups of people. This often results from insufficient or unrepresentative training data.

Confidence Score: A confidence score is a probability measure indicating how certain an AI model is that it has performed its assigned task correctly.

Conversational AI: A type of AI designed to simulate human conversation using techniques like NLP and generative AI. It can be further enhanced with capabilities like image recognition.

Cost Control: This is the process of monitoring project progress in real-time, tracking resource usage, analyzing performance metrics, and addressing potential budget issues before they escalate, ensuring projects stay on track.

Data Annotation (Data Labeling): The process of labeling data with specific features to help AI models learn and recognize patterns during training.

Deep Learning: A subset of machine learning that uses multi-layered neural networks to simulate complex human decision-making processes.

Enterprise AI: AI technology designed specifically to meet organizational needs, including governance, compliance, and security requirements.

Foundational Models: These models learn from large datasets and can be fine-tuned for specific tasks. Their adaptability makes them cost-effective, reducing the need for separate models for each task.

Generative AI: A type of AI capable of creating new content such as text, images, audio, and synthetic data.
It learns from vast datasets and generates new outputs that resemble but do not replicate the original data.

Generative AI Feature Governance: A set of principles and policies ensuring the responsible use of generative AI technologies throughout an organization, aligning with company values and societal norms.

Human in the Loop (HITL): A feedback process where human intervention ensures the accuracy and ethical standards of AI outputs, essential for improving AI training and decision-making.

Intelligent Document Processing (IDP): IDP extracts data from a variety of document types using AI techniques like NLP and CV to automate and analyze document-based tasks.

Large Language Model (LLM): An AI technology trained on massive datasets to understand and generate text. LLMs are key in language understanding and generation and utilize transformer models for processing sequential data.

Machine Learning (ML): A branch of AI that allows systems to learn from data and improve accuracy over time through algorithms.

Model Accuracy: A measure of how often an AI model performs tasks correctly, typically evaluated using metrics such as the F1 score, which combines precision and recall.

Natural Language Processing (NLP): An AI technique that enables machines to understand, interpret, and generate human language through a combination of linguistic and statistical models.

Retrieval Augmented Generation (RAG): This technique enhances the reliability of generative AI by incorporating external data to improve the accuracy of generated content.

Supervised Learning: A machine learning approach that uses labeled datasets to train AI models to make accurate predictions.

Unsupervised Learning: A type of machine learning that analyzes and groups unlabeled data without human input, often used to discover hidden patterns.

By understanding these terms, you can better navigate the world of AI implementation and apply its transformative power to drive innovation and efficiency across your organization.
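The F1 score referenced under Model Accuracy is the harmonic mean of precision and recall. A minimal sketch computing it from raw prediction counts:

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision (tp / (tp + fp)) and
    recall (tp / (tp + fn)), from true/false positive and
    false negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 8 true positives, 2 false positives, 2 false negatives:
# precision = 0.8, recall = 0.8, so F1 = 0.8
score = f1_score(8, 2, 2)
```

Because the harmonic mean penalizes imbalance, a model with high precision but poor recall (or vice versa) scores noticeably lower than its simple average would suggest.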
Einstein Knowledge Edits

Get Quick Revisions on Knowledge Articles with Einstein Knowledge Edits (Beta)

Enhance your Knowledge articles quickly using Einstein generative AI with predefined revision styles. These styles can help improve grammar, conciseness, and readability. You can also customize these styles using the Prompt Builder to tailor the revisions to your business needs. This allows you to specify what information Einstein includes, how the content is formatted, and adjust the voice and tone.

Where: This feature is available in Unlimited and Enterprise editions with the Einstein for Service add-on in Lightning Experience.

Important: Einstein Knowledge Edits is currently in beta and is subject to Salesforce’s Beta Services Terms or a written Unified Pilot Agreement if executed by the Customer. Participation in this beta service is at the Customer’s discretion.

Who: To access Knowledge Edits, you must have the required features enabled. Agents also need the Prompt Template User and Einstein Knowledge Creation permission sets.

How: Quickly and effectively refine your Knowledge articles to meet your business standards with Einstein Knowledge Edits!

Winter 25 Permission Set Groups

Salesforce’s Winter ’25 release introduces a host of updates across the platform, with a particular emphasis on security and user management improvements. Among these, the enhancements to Permission Set Groups stand out, offering more efficiency in managing user access and permissions. Let’s take a closer look at these updates and how they can benefit your Salesforce environment.

What Are Permission Set Groups?

Before diving into the new enhancements, it’s essential to understand Permission Set Groups. Salesforce created these groups to simplify the assignment of permissions to users. Instead of assigning multiple individual permission sets, administrators can bundle them into a Permission Set Group. This approach streamlines the process, making it easier to manage permissions for users with complex roles requiring access to multiple features and objects.

What’s New in Winter ’25?

The Winter ’25 release brings several key updates to Permission Set Groups, making them more robust and flexible.

Key Benefits of the Winter ’25 Enhancements

The Winter ’25 updates to Permission Set Groups offer several advantages for Salesforce admins and organizations.

Getting Started

To begin utilizing these new features, head to the Permission Set Group settings in Salesforce Setup. Review your current permission sets and explore how these new features can streamline your processes. The expiration date feature, in particular, will be valuable if you manage temporary roles or frequently changing project teams.

The Winter ’25 Salesforce release delivers significant improvements to Permission Set Groups, equipping admins with enhanced tools to manage user permissions securely and efficiently. By incorporating these features into your Salesforce environment, you can strengthen security, optimize user access management, and ensure your organization operates smoothly.
For a deeper dive into these updates, check the Salesforce Winter ’25 release notes or join discussions in Salesforce communities and forums.

Transition from Salesforce CPQ to Revenue Cloud

As organizations look to optimize their revenue processes, Salesforce has been encouraging customers to transition from Salesforce CPQ (Configure, Price, Quote) to Revenue Cloud (Rev Cloud). However, while the advantages of Revenue Cloud are often highlighted, clear, actionable steps to make the migration worthwhile are not always readily available. After consulting with Salesforce teams and partners, it’s evident that many customers remain hesitant due to concerns about cost, disruption, and customization complexities.
