File Directory for Large Document RAG Systems
By using a Knowledge Table to categorize and tag documents, and then converting that information into a Knowledge Graph, enterprises can streamline the search process in RAG systems.
MOIRAI-MoE represents a groundbreaking advancement in time series forecasting by introducing a flexible, data-driven approach that addresses the limitations of traditional models. Its sparse mixture of experts architecture achieves token-level specialization, offering significant performance improvements and computational efficiency. By dynamically adapting to the unique characteristics of time series data, MOIRAI-MoE sets a new standard for foundation models, paving the way for future innovations and expanding the potential of zero-shot forecasting across diverse industries.
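The token-level specialization described above comes from sparse gating: each token is routed to a small subset of expert networks rather than the whole model. The following is a minimal, generic sketch of top-k mixture-of-experts routing in PyTorch; it illustrates the general technique only, not MOIRAI-MoE's actual architecture, dimensions, or routing rules.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Generic top-k mixture-of-experts layer (illustrative, not MOIRAI-MoE)."""

    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))
             for _ in range(num_experts)]
        )
        self.gate = nn.Linear(dim, num_experts)  # router: one score per expert, per token
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, dim)
        scores = self.gate(x)
        weights, idx = scores.topk(self.top_k, dim=-1)  # each token picks k experts
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * self.experts[e](x[mask])
        return out

tokens = torch.randn(10, 64)           # ten time-series tokens, embedding dim 64
print(SparseMoELayer()(tokens).shape)  # torch.Size([10, 64])
```

Because only k of the experts run for any given token, compute grows with k rather than with the total number of experts, which is the source of the efficiency gains noted above.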
Introduction to Visual Language Models (VLMs): The Future of Computer Vision

Achieving a 28% Boost in Multimodal Image Search Accuracy with VLMs

Until recently, artificial intelligence models were narrowly focused: they excelled at understanding text or at interpreting images, but rarely both. This siloed approach limited their potential to process and connect different types of data. The development of general-purpose language models, like GPTs, marked a significant leap forward, transitioning AI from task-specific systems to powerful, versatile tools capable of handling a wide range of language-driven tasks. Yet, despite these advancements, language models and computer vision systems evolved separately, like having the ability to hear without seeing, or vice versa. Visual Language Models (VLMs) bridge this gap, combining the capabilities of language and vision to create multimodal systems. In this article, we’ll explore VLMs’ architecture, training methods, challenges, and transformative potential in fields like image search. We’ll also examine how implementing VLMs revolutionized an AI-powered search engine.

What Are Visual Language Models (VLMs)?

VLMs represent the next step in the evolution of AI: multimodal models capable of processing multiple data types, including text, images, audio, and video. In recent years, such general-purpose approaches have outpaced narrow, specialized systems. At their core, VLMs receive input in the form of images and text (called “instructs”) and produce textual responses. Their applications include image classification, description, and interpretation, covering zero-shot and one-shot scenarios where minimal prior training is required.

How Do VLMs Work?

VLMs typically consist of three main components: an image encoder, an adapter, and a large language model (LLM). Adapters are the key to multimodality: they facilitate the interaction between the image encoder and the LLM, and come in two primary types.

Training VLMs

VLMs are built on pre-trained LLMs and image encoders. Pre-training focuses on linking the text and image modalities, as well as embedding world knowledge from visual data. Alignment then fine-tunes the model for high-quality responses, and quality evaluation measures how well the resulting model performs.

Revolutionizing Image Search with VLMs

Incorporating VLMs into search engines transforms the user experience by integrating text and image inputs. With VLMs, image and text inputs are processed together, creating a streamlined, accurate system that outperforms the traditional pipeline by 28%.

Conclusion

Visual Language Models are a game-changer for AI, breaking down barriers between language and vision. From multimodal search engines to advanced problem-solving, VLMs unlock new possibilities in computer vision and beyond. The future of AI is multimodal, and VLMs are leading the way.
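As a concrete, minimal illustration of a VLM consuming an image plus a text instruct and returning text, here is a sketch using the open-source BLIP captioning model via Hugging Face transformers. This is a generic example, not the specific model behind the search engine discussed above, and the image path is a placeholder.

```python
from PIL import Image
from transformers import BlipProcessor, BlipForConditionalGeneration

# Load a small open-source vision-language model
processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

# Image input plus a short text prompt; the file path is illustrative
image = Image.open("product_photo.jpg").convert("RGB")
inputs = processor(images=image, text="a photo of", return_tensors="pt")

# The model answers in text, conditioned on both modalities
out = model.generate(**inputs, max_new_tokens=30)
print(processor.decode(out[0], skip_special_tokens=True))
```

The same encode-adapt-generate pattern underlies larger VLMs; only the encoder, adapter, and LLM components scale up.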
How to Deploy Metadata to Salesforce Orgs: A Step-by-Step Guide

Deploying metadata effectively is critical for building Salesforce orgs that align with your business goals. However, deployments often hit roadblocks: missed dependencies, overwritten changes, or failed deployments at the end of development cycles. As your team grows and business requirements become more complex, finding a deployment solution that works for everyone can feel overwhelming. While Salesforce’s clicks-not-code philosophy simplifies development, deployments can create bottlenecks if your process isn’t robust. Whether you’re working with custom objects, Flows, or Apex code, this guide will help you streamline metadata deployments for the whole team.

Understanding Salesforce Metadata Deployment Options

Salesforce offers several methods for deploying metadata, most of which leverage the Metadata API. This API facilitates metadata transfers between Salesforce orgs, or between orgs and version control systems like Git. Although most metadata types are supported by the API, a few exceptions exist. Fortunately, these unsupported types are rarely critical, so the API remains a dependable foundation for most deployment needs.

Change Sets: The Starting Point

Salesforce’s change sets are often the default tool for new users. They provide a clicks-not-code approach and are easy to enable within the Salesforce UI. However, their limitations can hinder long-term scalability and DevOps maturity, and for teams seeking a scalable deployment solution, change sets often fall short.

DevOps Center: Salesforce’s Modern Alternative

Salesforce’s DevOps Center offers an improved approach to metadata deployments, especially for teams adopting collaborative, source-driven development workflows. While DevOps Center is a significant improvement over change sets, it may not fully meet the needs of growing teams striving for DevOps maturity.

Salesforce CLI: A Developer-Focused Tool

For teams with development expertise, Salesforce’s Command Line Interface (CLI) enables metadata deployments through scripting. As part of Salesforce DX, the CLI facilitates package-based development and automation (a minimal CLI workflow is sketched at the end of this guide).

Gearset: A Complete Metadata Deployment Solution

For teams seeking a streamlined, all-in-one deployment solution, Gearset provides a balance of simplicity, collaboration, and DevOps maturity.

Deploy with Confidence

Whether you’re working with simple change sets or advanced DevOps processes, the right tools can transform your Salesforce deployments. Gearset ensures seamless, reliable metadata transfers with a 99% deployment success rate, empowering teams to focus on innovation rather than troubleshooting. Ready to simplify your metadata deployments? Start your free trial of Gearset today and experience hassle-free releases!
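Following up the CLI section above, here is a minimal sketch of a source-driven deployment with the Salesforce CLI (sf v2). The org alias "my-sandbox" and the standard force-app source directory are assumptions for illustration; adapt them to your project.

```bash
# Authenticate to the target org (opens a browser login)
sf org login web --alias my-sandbox --instance-url https://test.salesforce.com

# Validate the deployment first, without saving any changes (check-only)
sf project deploy validate --source-dir force-app --target-org my-sandbox

# Deploy the metadata and run local Apex tests
sf project deploy start --source-dir force-app --target-org my-sandbox --test-level RunLocalTests
```

Scripting these commands in CI is what turns the CLI from a manual tool into the automation backbone the article describes.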
From smarter search engines to AI-powered creative workflows, vectors are the bridge between unstructured data and actionable intelligence. By fully leveraging this technology, businesses can transform their approach to data and secure their place in a rapidly evolving, data-centric world.
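To make that bridge concrete, here is a minimal sketch of the core mechanic: embedding text into vectors and ranking by similarity. It assumes the sentence-transformers library and the publicly available all-MiniLM-L6-v2 model; the sample documents and query are hypothetical, and any embedding model would work the same way.

```python
from sentence_transformers import SentenceTransformer, util

# Embed a small "document store" and a query into the same vector space
model = SentenceTransformer("all-MiniLM-L6-v2")
docs = [
    "Invoice payment terms are net 30 days.",
    "Our support line is open weekdays 9-5.",
    "The quarterly report shows revenue growth.",
]
doc_vectors = model.encode(docs)
query_vector = model.encode("When do customers have to pay?")

# Cosine similarity turns unstructured text into a ranked, actionable result
scores = util.cos_sim(query_vector, doc_vectors)[0]
best = scores.argmax().item()
print(docs[best], float(scores[best]))
```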
Salesforce Data Governance Best Practices

Salesforce provides a centralized platform for managing customer relationships, but without proper data governance the system can quickly become unmanageable. Data governance ensures the accuracy, security, and usability of the vast amounts of information collected, helping teams make better decisions and maximizing the value of Salesforce investments. By establishing robust processes and policies, organizations can maintain clean, compliant, and reliable data. Here’s an overview of data governance in Salesforce, its importance, and strategies to implement it effectively.

What Is Data Governance in Salesforce?

Data governance in Salesforce refers to the practices that monitor and manage data accuracy, security, and compliance. Proper governance keeps your Salesforce data trustworthy and actionable, avoiding issues like errors, duplicates, and regulatory violations. Strong governance enables organizations to make informed decisions and unlock Salesforce’s full potential.

The Impact of Data Governance on Decision-Making

Accurate and well-governed data empowers leaders to make strategic, data-driven decisions. Good governance ensures data integrity, leading to smarter decisions and improved business performance.

Principles of Effective Salesforce Data Governance

Building a strong data governance framework starts with these core principles:

1. Data Ownership: Assign clear ownership of datasets to specific individuals, teams, or departments. Owners are accountable for maintaining data quality, ensuring compliance, and resolving issues efficiently.

2. Monitoring and Compliance: Conduct regular audits to ensure data accuracy, detect unauthorized access, and maintain compliance with regulations. Tools like Salesforce’s built-in monitoring features or third-party solutions (e.g., Validity DemandTools) can streamline this process. Consistent monitoring safeguards sensitive data and avoids costly fines, particularly in heavily regulated industries like healthcare and finance.

Techniques for Maintaining High-Quality Data

High-quality data is the backbone of Salesforce governance. Data standardization ensures consistency across Salesforce records, improving analysis and operational efficiency, and data management tools help maintain data integrity while enhancing governance. By integrating these tools into your Salesforce processes, you can establish a solid foundation for data governance while boosting operational efficiency (a small sketch of automated standardization and deduplication follows at the end of this overview).

Final Thoughts

Effective data governance in Salesforce is critical for maintaining data quality, ensuring compliance, and empowering teams to make strategic decisions. By following best practices and leveraging the right tools, organizations can maximize the value of their Salesforce investment and drive long-term success.
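As a concrete illustration of the standardization and deduplication techniques above, here is a minimal sketch that cleans an exported Salesforce contact list with pandas. The column names, formats, and state mapping are assumptions for illustration, not a Salesforce API.

```python
import pandas as pd

# Hypothetical export of Salesforce contacts; column names are illustrative
contacts = pd.DataFrame({
    "Email": ["ana@acme.com", "ANA@ACME.COM ", "li@globex.io"],
    "Phone": ["(555) 123-4567", "555.123.4567", "555-987-6543"],
    "State": ["California", "CA", "N.Y."],
})

# Standardize: lowercase emails, digits-only phones, two-letter state codes
contacts["Email"] = contacts["Email"].str.strip().str.lower()
contacts["Phone"] = contacts["Phone"].str.replace(r"\D", "", regex=True)
contacts["State"] = contacts["State"].replace({"California": "CA", "N.Y.": "NY"})

# Deduplicate on the standardized email, keeping the first record
clean = contacts.drop_duplicates(subset="Email", keep="first")
print(clean)
```

Standardizing before deduplicating matters: the two "ana" rows above only collapse into one once their emails are normalized.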
In the current age of digital transformation, effective data migration is essential, especially as cloud adoption accelerates. Research from Foundry reveals that 63% of IT leaders have increased their cloud migrations, but 90% face challenges, primarily budgetary constraints. This highlights the importance of thoughtful planning and strategic execution. In this context, we’ll explore the significance of successful Salesforce data migration and present a nine-step roadmap to ensure a seamless transition. We’ll also cover solutions for data preparation and the top five Salesforce data migration tools that can help turn migration challenges into growth opportunities.

Salesforce Data Migration Checklist

Download our e-book to quickly and efficiently migrate data from Excel spreadsheets and CRM systems to Salesforce.

Why is Data Migration Important?

In 2010, I bought my first smartphone and struggled to transfer data from my outdated phone. My contacts were vital, but the old phone lacked proper data transfer options. Determined not to re-enter everything manually, I searched for a solution. Eventually, I found a method to extract the data into a CSV file, which I converted to vCard format to transfer successfully (a sketch of this kind of conversion appears at the end of this insight). This experience reinforced how essential data migration is, not only for businesses but also for everyday situations.

For organizations looking to modernize, data migration is a crucial step in upgrading IT infrastructure. It enables smooth transitions from legacy systems to modern platforms like Salesforce, enhancing efficiency, scalability, and data accessibility. Effective data migration improves data management, reduces costs tied to outdated systems, and supports better decision-making through improved analytics. It also ensures data integrity and security, aligning IT capabilities with evolving business needs, fostering innovation, and keeping a competitive edge.

What is Data Migration in Salesforce?

Whether you are already using Salesforce or considering adoption, one common question arises: “How do I migrate my data to Salesforce?” Salesforce data migration involves moving information from external systems, like legacy CRMs or local databases, into Salesforce. This process is critical not only for protecting data integrity but also for enabling better decision-making, improving customer service, and promoting organizational growth. A well-planned data migration strategy ensures a smooth transition to Salesforce, maximizing its potential and enhancing business efficiency.

9-Step Salesforce Data Migration Plan

Preparing for a Salesforce data migration? Follow these nine essential steps for a seamless process.

Need Help with Data Migration to Salesforce?

We offer consulting services to guide you through the data migration process, from auditing data sources to executing the migration strategy. Tectonic is here to help.

Top 5 Salesforce Data Migration Tools

Here’s a quick comparison of five Salesforce data migration tools to help you choose the right solution. For hassle-free data migration, reach out to Tectonic for a tailored plan that minimizes downtime and maximizes operational efficiency.
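The phone-contacts anecdote above maps to just a few lines of code. Here is a minimal sketch of a CSV-to-vCard conversion using only Python’s standard library; the column names and file paths are assumptions for illustration.

```python
import csv

# Hypothetical input: contacts.csv with "Name" and "Phone" columns
with open("contacts.csv", newline="", encoding="utf-8") as src, \
        open("contacts.vcf", "w", encoding="utf-8") as dst:
    for row in csv.DictReader(src):
        # One vCard 3.0 entry per contact
        dst.write("BEGIN:VCARD\n")
        dst.write("VERSION:3.0\n")
        dst.write(f"FN:{row['Name']}\n")
        dst.write(f"TEL;TYPE=CELL:{row['Phone']}\n")
        dst.write("END:VCARD\n")
```

Enterprise migrations add validation, deduplication, and field mapping on top, but the shape of the problem, extract, transform, load, is the same.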
Snowflake Unveils AI Development and Enhanced Security Features

At its annual Build virtual developer conference, Snowflake introduced a suite of new capabilities focused on AI development and strengthened security measures. These enhancements aim to simplify the creation of conversational AI tools, improve collaboration, and address data security challenges following a significant breach earlier this year.

AI Development Updates

Snowflake announced updates to its Cortex AI suite to streamline the development of conversational AI applications. The new tools focus on enabling faster, more efficient development while ensuring data integrity and trust. They address enterprise demands for generative AI tools that boost productivity while maintaining governance over proprietary data. Snowflake aims to eliminate barriers to data-driven decision-making by enabling natural language queries and easy integration of structured and unstructured data into AI models. According to Christian Kleinerman, Snowflake’s EVP of Product, the goal is to reduce the time it takes for developers to build reliable, cost-effective AI applications: “We want to help customers build conversational applications for structured and unstructured data faster and more efficiently.”

Security Enhancements

Following a breach last May, in which hackers accessed customer data via stolen login credentials, Snowflake has implemented new security features. These additions come alongside existing tools like the Horizon Catalog for data governance. Kleinerman noted that while Snowflake’s previous security measures were effective at preventing unauthorized access, the company recognizes the need to improve user adoption of these tools: “It’s on us to ensure our customers can fully leverage the security capabilities we offer. That’s why we’re adding more monitoring, insights, and recommendations.”

Collaboration Features

Snowflake is also enhancing collaboration through its new Internal Marketplace, which enables organizations to share data, AI tools, and applications across business units. The Native App Framework now integrates with Snowpark Container Services to simplify the distribution and monetization of analytics and AI products.

AI Governance and Competitive Position

Industry analysts highlight the growing importance of AI governance as enterprises increasingly adopt generative AI tools. David Menninger of ISG’s Ventana Research emphasized that Snowflake’s governance-focused features, such as LLM observability, fill a critical gap in AI tooling: “Trustworthy AI enhancements like model explainability and observability are vital as enterprises scale their use of AI.” With these updates, Snowflake continues to compete with Databricks and other vendors. Its strategy offers both API-based flexibility for developers and built-in tools for users seeking simpler solutions. By combining innovative AI development tools with robust security and collaboration features, Snowflake aims to meet the evolving needs of enterprises while positioning itself as a leader in the data platform and AI space.
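For a sense of what LLM capabilities inside Snowflake look like in practice, here is a minimal sketch using the existing Cortex COMPLETE function in SQL. The support_tickets table and its columns are hypothetical, and the conference features described above may expose different interfaces.

```sql
-- Ask an LLM a question grounded in table data, entirely inside Snowflake.
-- SNOWFLAKE.CORTEX.COMPLETE is an existing Cortex function;
-- the table and columns below are illustrative.
SELECT
    ticket_id,
    SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        CONCAT('Summarize this support ticket in one sentence: ', description)
    ) AS summary
FROM support_tickets
LIMIT 10;
```

Running the model where the data lives is the governance argument in miniature: the proprietary text never leaves the platform’s access controls.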
Have you ever wondered how AI agents understand tabular data, such as that in CSV or Excel files? Or how a file loaded into a platform like ChatGPT can be instantly understood and processed? This insight explores the creation of a custom AI agent capable of achieving these tasks from scratch.

Context

Jen, an AI Engineer at AI United, leads a team developing an AI agent within a 30-day timeline. The agent will generate tailored interactive charts based on uploaded data files, enabling users to better visualize and interpret the data. To achieve this, the team needed to ensure the AI agent could analyze the file’s data context and autonomously recommend the most appropriate chart types. The development was broken down into four main steps. Here’s a look at how the team built the AI system to ingest CSV data and aggregate it into an actionable format.

Setup

The development began by configuring package installations and defining the language model to be used:

```python
%pip install langchain_openai
%pip install langchain_core
%pip install langchain_community
%pip install langchain_experimental

import os

from langchain_openai.chat_models import ChatOpenAI

openai_key = os.environ.get("OPENAI_API")
gpt4o = ChatOpenAI(temperature=0.0, model="gpt-4o", openai_api_key=openai_key)
```

Step 1: Context Creation

Before generating the code to process raw data, the team created context around the dataset to enhance the AI’s response accuracy. For demonstration, a wine reviews dataset was used, and metadata (column count, schema, data types, and a sample row) was extracted as follows:

```python
import pandas as pd

def extract_metadata(df):
    # Collect lightweight context about the dataset for the LLM
    metadata = {
        "Number of Columns": df.shape[1],
        "Schema": df.columns.tolist(),
        "Data Types": str(df.dtypes),
        "Sample": df.head(1).to_dict(orient="records"),
    }
    return metadata

df = pd.read_csv("wine_reviews.csv")
metadata = extract_metadata(df)
```

Step 2: Prompt Augmentation

To help the AI model interpret the dataset, prompts were augmented with the extracted metadata using a template:

```python
prompt_template = '''
Assistant is an AI model that suggests charts to visualize data based on the following metadata.

SCHEMA:
--------
{schema}

DATA TYPES:
--------
{data_types}

SAMPLE:
--------
{sample}
'''.format(
    schema=metadata["Schema"],
    data_types=metadata["Data Types"],
    sample=metadata["Sample"],
)

gpt4o.invoke(prompt_template)
```

Step 3: Simple Agent Code Generation & Execution

With the prompt augmented, the model was able to suggest suitable charts. For data transformation, an agentic workflow with a Python REPL tool was used: the AI generated code for aggregating data, then executed it to produce the structure needed for plotting. A REPL instance was created to pass data into Python functions, enabling the AI to perform the aggregation.

````python
from langchain_experimental.utilities import PythonREPL
from langchain_core.tools import tool

repl = PythonREPL()
repl.globals["df"] = df  # expose the DataFrame to generated code

@tool
def python_repl(code: str):
    """Execute Python code and return its stdout, or the error on failure."""
    try:
        result = repl.run(code)
    except BaseException as e:
        return f"Failed to execute. Error: {repr(e)}"
    result_str = f"Successfully executed:\n```python\n{code}\n```\nStdout: {result}"
    return result_str

tools = [python_repl]
````

Step 4: Final Data Aggregation and Charting

Finally, the AI suggested a bar chart for plotting the top 10 wineries by average points, and the REPL instance executed the code to transform the data for the chart:

```python
# Aggregate and convert the data into dictionary format
top_wineries = (
    df.groupby("winery")["points"].mean().sort_values(ascending=False).head(10)
)
top_wineries_dict = top_wineries.to_dict()
print(top_wineries_dict)
```

The aggregated data was output as:

```json
{"Macauley": 96.0, "Heitz": 95.5, "Bodega Carmen Rodríguez": 95.5, "Maurodos": 95.0, "Blue Farm": 95.0, "Numanthia": 95.0, "Château Lagrézette": 95.0, "Patricia Green Cellars": 95.0, "Ponzi": 95.0, "Muga": 95.0}
```

With this approach, the AI agent was capable of not only understanding the data in the uploaded file but also generating interactive visualizations, making complex datasets more accessible and insightful.
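To close the loop from aggregated data to the bar chart the agent recommended, here is a minimal plotting sketch with matplotlib. The styling choices are assumptions, since the article does not specify which charting library the team used.

```python
import matplotlib.pyplot as plt

top_wineries_dict = {
    "Macauley": 96.0, "Heitz": 95.5, "Bodega Carmen Rodríguez": 95.5,
    "Maurodos": 95.0, "Blue Farm": 95.0, "Numanthia": 95.0,
    "Château Lagrézette": 95.0, "Patricia Green Cellars": 95.0,
    "Ponzi": 95.0, "Muga": 95.0,
}

# Horizontal bars keep the long winery names readable
fig, ax = plt.subplots(figsize=(8, 5))
ax.barh(list(top_wineries_dict), list(top_wineries_dict.values()))
ax.invert_yaxis()  # highest-rated winery on top
ax.set_xlabel("Average points")
ax.set_title("Top 10 wineries by average review points")
plt.tight_layout()
plt.show()
```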
Generative AI has emerged as the most dominant trend in data management and analytics, overshadowing all other technologies. This prominence began with the launch of ChatGPT by OpenAI in November 2022, which significantly advanced the capabilities of large language models (LLMs) and demonstrated the transformative potential of generative AI (GenAI) for enterprises.

Generative AI’s impact is profound, particularly in making advanced business intelligence tools accessible to a broader range of employees, not just data scientists and analysts. Before GenAI, complex data management and analytics platforms required computer science skills, statistical expertise, and extensive data literacy. Generative AI has lowered these barriers, enabling more people to leverage data insights for decision-making. Another key advantage is efficiency: it can automate time-consuming, repetitive tasks previously performed manually by data engineers and experts, acting as an independent agent in managing data processes.

The landscape of generative AI has evolved rapidly. Following the launch of ChatGPT, a wave of competing LLMs emerged. Initially, the transformative potential of these technologies was theoretical, but it is now becoming tangible. Companies like Google are developing tools to help customers build and deploy their own generative AI models and applications, and enterprises are increasingly moving from pilot testing to developing and implementing production models.

Generative AI does not operate in isolation. Enterprises are also focusing on complementary aspects such as data quality and governance, since the data feeding and training generative AI must be reliable. Real-time data and automation are likewise essential for making generative AI a proactive technology rather than a reactive one. Generative AI has highlighted the need for a robust data foundation: the main challenge now is ensuring that enterprise data is trusted, governed, and ready for AI applications. With the rise of multimodal data, enterprises require a unified approach to manage and govern diverse data types effectively.

Beyond generative AI, other significant trends in data management and analytics include real-time data processing and automation. Integrating generative AI with real-time data streams and automated systems is expected to drive substantial business transformation; by enabling real-time insights and actions, businesses can achieve a level of operational efficiency previously unattainable. The convergence of these technologies is transforming business operations. Unified and simplified technology stacks, integrating foundational technologies, LLMs, and real-time data platforms, are essential for driving this transformation, and the industry is making strides toward integrated solutions that support comprehensive data management and analytics.
Leveraging Website Activity Data in Salesforce Marketing Cloud

Understanding how users interact with your website is essential for delivering personalized customer experiences. Salesforce Marketing Cloud (SFMC) offers robust tools to capture website activity and transform this data into actionable insights, enhancing your marketing strategies. This guide walks you through the process of collecting website activity data in SFMC.

Set Up Marketing Cloud Website Activity Collection

Step 1: Install the Salesforce Marketing Cloud Tracking Code. To begin collecting website activity, install the Salesforce Marketing Cloud tracking code on your website. Known as the “Web Collect” code, this script captures visitor behavior data and sends it to SFMC (a sketch of the snippet appears at the end of this guide).

Step 2: Configure Data Extensions. After installing the tracking code, set up data extensions in SFMC to store the website activity data you collect.

Step 3: Set Up Behavioral Triggers. To maximize the value of your data, set up behavioral triggers in SFMC. These triggers can automatically send personalized communications based on specific website actions.

Step 4: Leverage Advertising Studio for Retargeting. To further enhance your marketing efforts, use Advertising Studio to create retargeting campaigns based on website activity data.

Step 5: Monitor and Optimize. After setting up website activity tracking, regularly monitor the performance of your campaigns and the quality of your collected data.

Final Thoughts

Collecting website activity data in Salesforce Marketing Cloud enables you to understand customer behavior better and deliver more personalized experiences. By following these steps, from installing the tracking code and configuring data extensions to setting up behavioral triggers and retargeting, you can effectively harness website activity data to elevate your marketing efforts. Start implementing these strategies today to unlock the full potential of Salesforce Marketing Cloud and drive deeper engagement and conversions.
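For orientation, here is what a minimal Web Collect installation typically looks like on a page. The MID placeholder (your Marketing Cloud Member ID) and the page placement are assumptions; use the exact snippet provided in your own SFMC account.

```html
<!-- Minimal Web Collect sketch; replace MID with your Marketing Cloud Member ID -->
<script type="text/javascript" src="https://MID.collect.igodigital.com/collect.js"></script>
<script type="text/javascript">
  _etmc.push(["setOrgId", "MID"]);  // identify your SFMC account
  _etmc.push(["trackPageView"]);    // record a page view for this visitor
</script>
```

Additional calls in the same push-style API can record carts and conversions, which is what feeds the behavioral triggers described in Step 3.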
Key Dates for the Winter ’25 Release

Winter ’25 is approaching! Can you believe it’s that time of year again? Salesforce is gearing up for its most anticipated release of the year, and Demand Chain is here to assist you in getting ready. There’s a lot to explore, so grab your coffee (or your preferred caffeinated beverage) and let’s dive in!

Starting August 8, 2024, admins can sign up for a pre-release Developer Edition environment to access new features. Be sure to check out the Release Notes, available from August 14, 2024, to stay updated on the latest enhancements for the products you use. Each release includes a Sandbox Preview Guide to help you determine the best next steps for your sandbox environment. The Sandbox previews, which began on August 30, 2024, allow you to test, identify, and report bugs before the major release hits your production instance. Let’s work together to ensure we’re “Release Ready”!

Admin Insights

User Access Summary: To streamline the process of determining user permissions, administrators can now view all permissions granted to a specific user in just a few clicks. The “Access Granted By” feature, introduced in Spring ’24, is enhanced in Winter ’25 to show the specific sources of a user’s access.

Inline Editing with Enhanced User List View: With the Winter ’25 release, you can configure user list views just like any other list view in Salesforce. You can now view, sort, filter, and modify user records directly within the list view without navigating away.

Conditional Formatting for Fields: Conditional formatting can now be applied to fields on pages with Dynamic Forms enabled. Using rulesets, you can customize icons and colors based on specific criteria related to field values.

Enhanced Custom Report Setup (Beta): Enhance your custom report types with the redesigned Custom Report page in Setup. The new summary page offers greater flexibility and visibility into report details, with features like drag-and-drop sections, field search, name customization, and the Lookup Fields button for adding up to 1,000 fields to the report type layout.

Sales Cloud Updates

Optimize Strategic Planning with Account Plans: Empower your sales team to take a strategic approach to account planning and growth. Account plans let you view opportunity details, conduct a SWOT analysis, capture customer needs, set measurable objectives, and visualize key stakeholders.

Seller Home Available for All Apps: Seller Home now provides your sales team with a comprehensive view of their business and can be added to any standard or custom app with the Winter ’25 release.

Service Cloud Innovations

My Service Journey (Beta): Explore Service Cloud with My Service Journey, where you can discover relevant capabilities and filter by business area, Salesforce edition, and new features. If you find something of interest but lack access, you can contact your Salesforce Account Executive directly from the platform.

Enhanced Messaging Channels and Capabilities: Salesforce has broadened its messaging options to include additional channels like LINE Messaging, Bring Your Own Channel (BYOC), BYOC for Contact Center as a Service (CCaaS), and Unified Messaging for SMS.

Flow Updates

Flow Action Buttons Now Generally Available: With the removal of the beta flag, Flow Action Buttons can now be disabled based on specific criteria.

Flow Repeater Enhancements: Improve user experience by prepopulating data within screen flows, allowing users to easily change a collection of records. You can also control whether users can add new items.

Transform Element Enhancements: New data types (e.g., Boolean, Date, Picklist, Text) are now available for use as Target Data values.

Developer Insights

Real-Time Preview of Lightning Web Components: Developers can now preview Lightning Web Components in real time by enabling Local Dev (Beta) in their developer sandbox. Changes to the source code automatically update the preview, eliminating manual refreshes.

Manufacturing Updates

Inventory Search and Transfer: Empower inventory managers to maintain optimal levels, fulfill product demands, and enhance inventory processes by searching and tracking stock across locations.

Consider Decimal Values for Forecasts: Accurately forecast quantities with new decimal fields on Sales Agreements, supported by a new node in the Data Processing Engine templates.

AI/Data Cloud Highlights

The Winter ’25 release introduces a variety of new and enhanced Einstein AI features, transforming how businesses leverage artificial intelligence. Salesforce Einstein is leading the way with tools that boost productivity, enhance customer engagement, and facilitate data-driven decision-making. The release notes highlight key Einstein AI features across various clouds, such as Communications, Education, Field Service, Industries, Marketing, Sales, and Service; features like Einstein Quick Quote, Alumni Metrics, and AI-driven insights are designed to enhance productivity and decision-making.

Einstein Personalization works with Data Cloud to deliver personalized experiences across Salesforce clouds. You can now tailor how Einstein drafts Work Summaries in Copilot by setting formatting rules or restrictions for the prompt template. Sales managers and teams gain enhanced Sales Signals, including relevant metrics, key themes, and featured mentions; the revamped Signals page matches the updated ECI interface, providing more metrics and relevant data. AI-driven improvements are also being introduced in areas of Salesforce where emails are generated, and the release includes several AI enhancements to Flow Builder, improving how flows are constructed and optimized. Finally, use AI Accelerator to obtain predictions across multiple Industry clouds, enabling the creation of generic propensity models without coding by leveraging the Scoring Framework.
Cortex Framework: Integration with Salesforce (SFDC)

This insight outlines the process of integrating Salesforce (SFDC) operational workloads into the Cortex Framework Data Foundation. By integrating Salesforce data through Dataflow pipelines into BigQuery, Cloud Composer can schedule and monitor these pipelines, allowing you to gain insights from your Salesforce data.

Prerequisite: Before configuring any workload integration, ensure that the Cortex Framework Data Foundation is deployed.

Configuration File

The config.json file in the Cortex Framework Data Foundation repository manages settings for transferring data from various sources, including Salesforce. Below is an example of how Salesforce workloads are configured:

```json
"SFDC": {
    "deployCDC": true,
    "createMappingViews": true,
    "createPlaceholders": true,
    "datasets": {
        "cdc": "",
        "raw": "",
        "reporting": "REPORTING_SFDC"
    }
}
```

Explanation of parameters:

- SFDC.deployCDC (default: true): generates Change Data Capture (CDC) processing scripts to run as DAGs in Cloud Composer.
- SFDC.createMappingViews (default: true): creates views in the CDC processed dataset to show the “latest version of the truth” from the raw dataset.
- SFDC.createPlaceholders (default: true): creates empty placeholder tables if they aren’t generated during ingestion, ensuring smooth downstream reporting deployment.
- SFDC.datasets.raw (user-defined): the raw landing dataset where replication tools land data from Salesforce.
- SFDC.datasets.cdc (user-defined): the CDC processed dataset, which serves as the source for reporting views and the target for records processed by DAGs.
- SFDC.datasets.reporting (default: “REPORTING_SFDC”): the dataset accessible for end-user reporting, where views and user-facing tables are deployed.

Loading SFDC Data into BigQuery

The Cortex Framework offers several methods for loading Salesforce data into BigQuery.

CDC Processing

The CDC scripts rely on two key fields in each record. You can adjust the CDC processing to handle different field names or add custom fields to suit your data schema.

Configuration of API Integration and CDC

To configure Salesforce data integration into BigQuery, Cortex provides several methods. An example configuration (settings.yaml):

```yaml
salesforce_to_raw_tables:
  - base_table: accounts
    raw_table: Accounts
    api_name: Account
    load_frequency: "@daily"
```

Data Mapping and Polymorphic Fields

Cortex Framework supports mapping data fields to the expected format. For example, a field named unicornId in your source system would be mapped to AccountId in Cortex with the string data type. Polymorphic fields (fields whose names vary but share the same structure) can be mapped in Cortex using [Field Name]_Type, such as Who_Type for the Who.Type field in the Task object.

Modifying DAG Templates

You can customize DAG templates as needed for CDC or raw data processing. To disable CDC or raw data processing from API calls, set deployCDC=false in the configuration file.

Setting Up the Extraction Module

Several steps are required to set up the Salesforce to BigQuery extraction module, including the Cloud Composer setup below.

Cloud Composer Setup

To run Python scripts for replication, install the necessary Python packages depending on your Airflow version. For Airflow 2.x:

```bash
gcloud composer environments update my-composer-instance \
    --location us-central1 \
    --update-pypi-package "apache-airflow-providers-salesforce>=5.2.0"
```

Security and Permissions

Ensure Cloud Composer has access to Google Secret Manager for retrieving stored secrets, enhancing the security of sensitive data like passwords and API keys.

Conclusion

By following these steps, you can integrate Salesforce workloads into Cortex Framework, ensuring a seamless data flow from Salesforce into BigQuery for reporting and analytics.
In today’s era of rapid digital transformation, efficient data migration has become increasingly important as cloud adoption gains momentum. Foundry’s research indicates that 63% of IT leaders have accelerated their cloud migrations, but 90% encounter challenges, often related to budget constraints. This emphasizes the need for meticulous planning and strategic execution. This insight focuses on Salesforce data migration, outlining why it’s essential and providing a nine-step plan for a successful migration. We also look into data preparation solutions and highlight Salesforce data migration tools, turning potential challenges into growth opportunities.

Salesforce Data Migration Checklist

Why is Data Migration Important?

In 2011, we faced the challenge of transferring data from an old phone to a first smartphone. The contacts were especially important, but the outdated phone lacked any data transfer capabilities. Unwilling to manually re-enter everything, we researched extensively and discovered a method to extract the data into a CSV file. Converting it into vCard format, we successfully migrated all the contacts. This personal experience illustrates the significance of data migration, not just for businesses but for everyday scenarios as well.

For organizations, a structured data migration plan is critical when transitioning from legacy systems to modern platforms like Salesforce. It enhances efficiency, scalability, and accessibility, supporting business growth through better data management, cost savings, and improved decision-making. Data migration also ensures integrity and security, aligning IT capabilities with evolving business needs and driving innovation in a fast-changing technological landscape. Learn how we helped Cresa migrate over 8,000 records to Salesforce with 100% accuracy.

What is Salesforce Data Migration?

Salesforce data migration refers to the process of transferring information from external systems, such as legacy CRM platforms or local databases, into Salesforce. This process not only preserves data integrity but also supports better decision-making, enhances customer service, and enables business growth. A well-planned Salesforce data migration strategy is critical for unlocking the full benefits of the platform and ensuring a seamless transition.

Salesforce Data Migration Plan: 9 Key Steps

Need Help with Data Migration to Salesforce?

We offer consulting services to help you navigate your data migration challenges, from auditing to strategy execution. Contact Tectonic today.

Practical Salesforce Data Migration Example

Using Data Loader, here’s how to migrate a list of companies: after logging into Salesforce and selecting the Accounts object, you map fields from your CSV file to Salesforce fields, execute the migration, and review the logs to ensure accuracy. A sample field mapping is sketched below.
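To make the field-mapping step concrete, here is a minimal sketch of a Data Loader mapping file (.sdl), which pairs CSV column headers with Salesforce field API names. The CSV column names and filename here are hypothetical.

```
# accountmap.sdl - hypothetical mapping for an Accounts import
# CSV column = Salesforce field API name
COMPANY=Name
PHONE=Phone
WEBSITE=Website
BILLING_CITY=BillingCity
```

Saving a mapping like this lets you rerun the same import repeatably instead of re-dragging fields in the UI each time.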
The first notable change in the field of language models is the significant expansion of context window sizes and a reduction in token costs. For instance, Anthropic’s largest model, Claude, has a context window exceeding 200,000 tokens, while recent reports indicate that Gemini’s context window can reach up to 10 million tokens. Under such circumstances, Retrieval-Augmented Generation (RAG) may no longer be necessary for many tasks, as all required data can fit within the expanded context window. Several financial and analytical projects have already demonstrated that tasks can be solved without a vector database as intermediate storage. This trend of falling token costs and growing context windows is likely to continue, potentially decreasing the need for external retrieval mechanisms, although they remain relevant for now.

If the context window remains insufficient, methods for summarization and context compression can help. LangChain, for example, offers a ConversationSummaryMemory class to address this challenge:

```python
from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationSummaryMemory

llm = OpenAI(temperature=0)
conversation_with_summary = ConversationChain(
    llm=llm,
    # Summarize earlier turns so the running context stays compact
    memory=ConversationSummaryMemory(llm=OpenAI()),
    verbose=True,
)
conversation_with_summary.predict(input="Hi, what's up?")
```

Knowledge Graphs

As the volume of data continues to grow, navigating it efficiently becomes increasingly critical. In certain cases, understanding the structure and attributes of data is essential for effective use. For example, if the data source is a company’s wiki, an LLM might not recognize a phone number unless the structure or metadata indicates that it’s the company’s contact information. Humans can infer meaning from conventions, such as the subdirectory “Company Information,” but standard RAG may miss such connections. This challenge can be addressed by Knowledge Graphs, also known as Knowledge Maps, which provide both raw data and metadata that illustrates how different entities are interconnected. This method is referred to as Graph Retrieval-Augmented Generation (GraphRAG). Graphs are excellent for representing and managing structured, interconnected information; unlike vector databases, they excel at capturing complex relationships and attributes among diverse data types.

Creating a Knowledge Graph

The process of creating a knowledge graph typically involves collecting and structuring data, which requires expertise in both the subject matter and graph modeling. However, LLMs can automate a significant portion of this process by analyzing textual data, identifying entities, and recognizing their relationships, which can then be represented in a graph structure. In many cases, an ensemble of vector databases and knowledge graphs improves accuracy, as discussed previously. For example, search functionality might combine keyword search through a regular database (e.g., Elasticsearch) with graph-based queries.
Frameworks can also assist in extracting structured data from entities and building the graph, as demonstrated in this code example (note the classes shown are from LlamaIndex rather than LangChain):

```python
from llama_index.core import KnowledgeGraphIndex, StorageContext
from llama_index.graph_stores.nebula import NebulaGraphStore

# parse_and_load_data_from_wiki_including_metadata() is a placeholder for
# whatever ingestion pipeline produces the Document objects
documents = parse_and_load_data_from_wiki_including_metadata()

space_name = "Company Wiki"
graph_store = NebulaGraphStore(space_name=space_name, tags=["entity"])
storage_context = StorageContext.from_defaults(graph_store=graph_store)

index = KnowledgeGraphIndex.from_documents(
    documents,
    max_triplets_per_chunk=2,
    storage_context=storage_context,
    space_name=space_name,
    tags=["entity"],
)

query_engine = index.as_query_engine()
response = query_engine.query("Tell me more about our Company")
```

Here, searching is conducted based on attributes and related entities instead of similar vectors. If set up correctly, metadata from the company’s wiki, such as its phone number, would be accessible through the graph.

Access Control

One challenge with this system is that data access may not be uniform. In a wiki, for instance, access can depend on roles and permissions. Similar issues exist in vector databases, leading to the need for access management mechanisms such as Role-Based Access Control (RBAC), Attribute-Based Access Control (ABAC), and Relationship-Based Access Control (ReBAC). These access control methods work by evaluating paths between users and resources within graphs, as in systems like Active Directory. To preserve data integrity during the ingestion phase, permission-related metadata must be kept in both the knowledge graph and the vector database. Some commercial vector databases already have this functionality built in.

Ingestion and Parsing

Data needs to be ingested into both graphs and vector databases, but for graphs, formatting is especially critical since it reflects the data’s structure and serves as metadata. One particular challenge is handling complex formats like PDFs, which can contain diverse elements such as tables, images, and text. Extracting structured data from such formats can be difficult, and while frameworks like LLama Parse exist, they are not always foolproof. In some cases, Optical Character Recognition (OCR) may be more effective than parsing.

Enhancing Answer Quality

Several new approaches are emerging to improve the quality of LLM-generated answers. While these advancements in knowledge graphs, access control, and retrieval mechanisms are promising, challenges remain, particularly around data formatting and parsing. These methods continue to evolve, enhancing LLM capabilities and efficiency.