JSON Archives - gettectonic.com - Page 2

Cross Cloud Zero-Copy Data

Simplifying Secure Data Access Across Clouds

In today's data-driven world, secure and prompt access to information is crucial. However, with critical analytics data spread across various cloud vendors, achieving this can be challenging. Cross-cloud zero-copy data sharing doesn't have to be complex. By leveraging your Autonomous Database, you can establish secure data sharing with your Salesforce CRM Data Stream in seconds. This guide walks through the process of connecting your Salesforce CRM data to your Autonomous Database using the Salesforce CRM data connector type.

Requirements for Salesforce Integration

To connect Salesforce CRM data with your Autonomous Database, work through the following steps.

1. Confirm Data Stream Configuration

On the Data Streams dashboard, verify the Data Stream Name, Data Connector Type, and Data Stream Status.

2. Set Up Your Autonomous Database

Create your credentials:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '<your credential name>',
    username        => '<your salesforce log-in id>',
    password        => '<your salesforce password>');
END;
/

Create your database link:

BEGIN
  DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK(
    db_link_name       => '<your database link name>',
    hostname           => '<your host>.my.salesforce.com',
    port               => '19937',
    service_name       => 'salesforce',
    ssl_server_cert_dn => NULL,
    credential_name    => '<your credential name>',
    gateway_params     => JSON_OBJECT(
      'db_type'        value 'salesforce',
      'security_token' value '<your security token>'));
END;
/

3. Check Connectivity Details

The HETEROGENEOUS_CONNECTIVITY_INFO view provides information on credential and database link requirements for external databases. For example:

SELECT database_type, required_port, sample_usage
FROM heterogeneous_connectivity_info
WHERE database_type = 'salesforce';

4. Demonstration: Connecting to Salesforce Data

Connect to your Salesforce CRM organization using the Salesforce Data Cloud Sales synthetic data in the Account_Home Data Stream.

5. Set Up Connectivity

Using DBMS_CLOUD.CREATE_CREDENTIAL, create the necessary credentials to connect to Salesforce. Then use DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK to establish the database link. Once configured, execute a SELECT statement against the ACCOUNT data over the link to verify a successful connection.

6. Utilize Zero-Copy Data Sharing

With zero-copy access to the ACCOUNT object in the Salesforce CRM data lake, you can query and analyze CRM data in place, without replicating it into your database.

Conclusion

As demonstrated, secure and efficient cross-cloud zero-copy data access can be straightforward. By following these steps, you can bypass cumbersome ETL operations and gain immediate, secure access to your Salesforce CRM data. This approach eliminates the overhead of complex data pipelines and provides real-time access to critical business data.
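As a supplementary illustration, here is a minimal Python sketch (not part of the original walkthrough) of running that verification query from a client application with the python-oracledb driver. The connection details and the database link name are placeholders, and a configured wallet or TLS connect string for your Autonomous Database is assumed.

import oracledb

# Placeholder connection details for your Autonomous Database
conn = oracledb.connect(
    user="ADMIN",
    password="<your adb password>",
    dsn="<your adb connect string>",
)

with conn.cursor() as cur:
    # Query the Salesforce ACCOUNT object over the database link created above
    cur.execute(
        "SELECT Id, Name FROM ACCOUNT@<your database link name> FETCH FIRST 5 ROWS ONLY"
    )
    for row in cur.fetchall():
        print(row)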


Exploring Large Action Models

Exploring Large Action Models (LAMs) for Automated Workflow Processes

While large language models (LLMs) are effective at generating text and media, Large Action Models (LAMs) push beyond simple generation; they perform complex tasks autonomously. Imagine an AI that not only generates content but also takes direct actions in workflows, such as managing customer relationship management (CRM) tasks, sending emails, or making real-time decisions. LAMs are engineered to execute tasks across various environments by integrating seamlessly with tools, data, and systems. They adapt to user commands, making them well suited to industries like marketing, customer service, and beyond.

Key Capabilities of LAMs

A standout feature of LAMs is their ability to perform function-calling tasks, such as selecting the appropriate APIs to meet user requirements. Salesforce's xLAM models are designed to optimize these tasks, achieving high performance with lower resource demands, which makes them suitable for both mobile applications and high-performance environments. The fc-series models are specifically tuned for function calling, enabling fast, precise, and structured responses by selecting the best APIs based on input queries.

Practical Examples Using Salesforce LAMs

This walkthrough covers setting up the model, defining tools, exposing simple Flask APIs, and testing function calls end to end.

Implementation: Setting Up the Model and API

Start by installing the necessary libraries:

!pip install transformers==4.41.0 datasets==2.19.1 tokenizers==0.19.1 flask==2.2.5

Next, load the xLAM model and tokenizer:

import json
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Salesforce/xLAM-7b-fc-r"
model = AutoModelForCausalLM.from_pretrained(
    model_name, device_map="auto", torch_dtype="auto", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

Now, define instructions and available functions.

Task instructions: the model should use function calls where applicable, based on user questions and available tools.

Format example:

{
  "tool_calls": [
    {"name": "func_name1", "arguments": {"argument1": "value1", "argument2": "value2"}}
  ]
}

Define available APIs:

get_weather_api = {
    "name": "get_weather",
    "description": "Retrieve weather details",
    "parameters": {"location": "string", "unit": "string"}
}

search_api = {
    "name": "search",
    "description": "Search for online information",
    "parameters": {"query": "string"}
}

Creating Flask APIs for Business Logic

We can use Flask to create APIs that replicate business processes.

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/customer", methods=['GET'])
def get_customer():
    customer_id = request.args.get('customer_id')
    # Return dummy customer data
    return jsonify({"customer_id": customer_id, "status": "active"})

@app.route("/send_email", methods=['GET'])
def send_email():
    email = request.args.get('email')
    # Return dummy response for email send status
    return jsonify({"status": "sent"})

Testing the LAM Model and Flask APIs

Define queries to test the LAM's function-calling capabilities:

query = "What's the weather like in New York in fahrenheit?"
print(custom_func_def(query))
# Expected: {"tool_calls": [{"name": "get_weather", "arguments": {"location": "New York", "unit": "fahrenheit"}}]}
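The test above calls a helper, custom_func_def, that the original post does not show. A minimal sketch of what such a helper might look like follows; the prompt format used here is an illustrative assumption, not the model's official template.

def custom_func_def(query):
    """Sketch: build a prompt from the task instructions, the tool definitions, and the
    user query, run the xLAM model, and return its JSON tool-call string."""
    tools = [get_weather_api, search_api]
    prompt = (
        "You are a helpful assistant that responds only with a JSON object of the form "
        '{"tool_calls": [{"name": ..., "arguments": {...}}]}.\n'
        f"Available tools: {json.dumps(tools)}\n"
        f"User question: {query}\n"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens (the model's tool-call JSON)
    generated = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(generated, skip_special_tokens=True)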
Function-Calling Models in Action

Using base_call_api, LAMs can determine the correct API to call and manage workflow processes autonomously.

import requests

def base_call_api(query):
    """Calls APIs based on LAM recommendations."""
    base_url = "http://localhost:5000/"
    json_response = json.loads(custom_func_def(query))
    api_url = json_response["tool_calls"][0]["name"]
    params = json_response["tool_calls"][0]["arguments"]
    response = requests.get(base_url + api_url, params=params)
    return response.json()

With LAMs, businesses can automate and streamline tasks in complex workflows, maximizing efficiency and empowering teams to focus on strategic initiatives.


LLMs Turn CSVs into Knowledge Graphs

Neo4j Runway and Healthcare Knowledge Graphs

Neo4j Runway was recently introduced as a tool to simplify the migration of relational data into graph structures. According to its GitHub page, "Neo4j Runway is a Python library that simplifies the process of migrating your relational data into a graph. It provides tools that abstract communication with OpenAI to run discovery on your data and generate a data model, as well as tools to generate ingestion code and load your data into a Neo4j instance." In essence, you upload a CSV file, and the LLM identifies the nodes and relationships and automatically generates a knowledge graph.

Knowledge graphs in healthcare are powerful tools for organizing and analyzing complex medical data. They structure information to elucidate relationships between entities such as diseases, treatments, patients, and healthcare providers.

Applications of Knowledge Graphs in Healthcare

Integration of diverse data sources: Knowledge graphs can integrate data from sources such as electronic health records (EHRs), medical research papers, clinical trial results, genomic data, and patient histories.

Improving clinical decision support: By linking symptoms, diagnoses, treatments, and outcomes, knowledge graphs can enhance clinical decision support systems (CDSS). They provide a comprehensive view of interconnected medical knowledge, potentially improving diagnostic accuracy and treatment effectiveness.

Personalized medicine: Knowledge graphs enable personalized treatment plans by correlating patient-specific data with broader medical knowledge, including relationships between genetic information, disease mechanisms, and therapeutic responses.

Drug discovery and development: In pharmaceutical research, knowledge graphs can accelerate drug discovery by identifying potential drug targets and clarifying the biological pathways involved in diseases.

Public health and epidemiology: Knowledge graphs are useful for tracking disease outbreaks, understanding epidemiological trends, and planning interventions. They integrate data from public health databases, social media, and other sources to provide real-time insight into public health threats.

The Neo4j Runway Library

Neo4j Runway is an open-source library created by Alex Gilmore; its GitHub repository and an accompanying blog post describe its features and capabilities. The library currently supports OpenAI LLMs for parsing CSVs and provides tooling for data discovery, graph data modeling, and ingestion-code generation. It eliminates the need to write Cypher queries manually, as the LLM handles the CSV-to-knowledge-graph conversion. Additionally, LangChain's GraphCypherQAChain can generate Cypher queries from natural-language prompts, allowing you to query the graph without writing a single line of Cypher.

Practical Implementation in Healthcare

To test Neo4j Runway in a healthcare context, a simple dataset from Kaggle (the Disease Symptoms and Patient Profile Dataset) was used. This dataset includes the columns Disease, Fever, Cough, Fatigue, Difficulty Breathing, Age, Gender, Blood Pressure, Cholesterol Level, and Outcome Variable. The goal was to provide a medical report to the LLM and get diagnostic hypotheses.
Libraries and Environment Setup

# Install necessary packages
sudo apt install python3-pydot graphviz
pip install neo4j-runway

# Import necessary libraries
import os
import numpy as np
import pandas as pd
from dotenv import load_dotenv
from neo4j_runway import Discovery, GraphDataModeler, IngestionGenerator, LLM, PyIngest
from IPython.display import display, Markdown, Image

Load Environment Variables

# Note: os.getenv() takes the name of an environment variable; the literal values shown
# in the original post (API key, URI, password) belong in your .env file.
load_dotenv()
OPENAI_API_KEY = os.getenv('OPENAI_API_KEY')    # e.g. sk-...
NEO4J_URL = os.getenv('NEO4J_URL')              # e.g. neo4j+s://your.databases.neo4j.io
NEO4J_PASSWORD = os.getenv('NEO4J_PASSWORD')

Load and Prepare Medical Data

disease_df = pd.read_csv('/home/user/Disease_symptom.csv')
disease_df.columns = disease_df.columns.str.strip()
for i in disease_df.columns:
    disease_df[i] = disease_df[i].astype(str)
disease_df.to_csv('/home/user/disease_prepared.csv', index=False)

Data Description for the LLM

DATA_DESCRIPTION = {
    'Disease': 'The name of the disease or medical condition.',
    'Fever': 'Indicates whether the patient has a fever (Yes/No).',
    'Cough': 'Indicates whether the patient has a cough (Yes/No).',
    'Fatigue': 'Indicates whether the patient experiences fatigue (Yes/No).',
    'Difficulty Breathing': 'Indicates whether the patient has difficulty breathing (Yes/No).',
    'Age': 'The age of the patient in years.',
    'Gender': 'The gender of the patient (Male/Female).',
    'Blood Pressure': 'The blood pressure level of the patient (Normal/High).',
    'Cholesterol Level': 'The cholesterol level of the patient (Normal/High).',
    'Outcome Variable': 'The outcome variable indicating the result of the diagnosis or assessment for the specific disease (Positive/Negative).'
}

Data Analysis and Model Creation

# The LLM instance is not shown in the original post; an OpenAI-backed
# neo4j_runway LLM object with default settings is assumed here.
llm = LLM()

disc = Discovery(llm=llm, user_input=DATA_DESCRIPTION, data=disease_df)
disc.run()

# Instantiate and create the initial graph data model
gdm = GraphDataModeler(llm=llm, discovery=disc)
gdm.create_initial_model()
gdm.current_model.visualize()

Adjust Relationships

gdm.iterate_model(user_corrections='''
Let's think step by step. Please make the following updates to the data model:
1. Remove the relationships between Patient and Disease, between Patient and Symptom and between Patient and Outcome.
2. Change the Patient node into Demographics.
3. Create a relationship HAS_DEMOGRAPHICS from Disease to Demographics.
4. Create a relationship HAS_SYMPTOM from Disease to Symptom. If the Symptom value is No, remove this relationship.
5. Create a relationship HAS_LAB from Disease to HealthIndicator.
6. Create a relationship HAS_OUTCOME from Disease to Outcome.
''')

# Visualize the updated model
gdm.current_model.visualize().render('output', format='png')
img = Image('output.png', width=1200)
display(img)

Generate Cypher Code and YAML File

# Instantiate the ingestion generator
gen = IngestionGenerator(data_model=gdm.current_model,
                         username="neo4j",
                         password='yourneo4jpasswordhere',
                         uri='neo4j+s://123654888.databases.neo4j.io',
                         database="neo4j",
                         csv_dir="/home/user/",
                         csv_name="disease_prepared.csv")

# Create the ingestion YAML
pyingest_yaml = gen.generate_pyingest_yaml_string()
gen.generate_pyingest_yaml_file(file_name="disease_prepared")

# Load the data into the Neo4j instance
PyIngest(yaml_string=pyingest_yaml, dataframe=disease_df)

Querying the Graph Database

MATCH (n)
WHERE n:Demographics OR n:Disease OR n:Symptom OR n:Outcome OR n:HealthIndicator
OPTIONAL MATCH (n)-[r]->(m)
RETURN n, r, m

Visualizing Specific Nodes and Relationships

MATCH (n:Disease {name: 'Diabetes'})
WHERE n:Demographics OR n:Disease OR n:Symptom OR n:Outcome OR n:HealthIndicator
OPTIONAL MATCH (n)-[r]->(m)
RETURN n, r, m

MATCH (d:Disease)
MATCH (d)-[r:HAS_LAB]->(l)
MATCH (d)-[r2:HAS_OUTCOME]->(o)
WHERE l.bloodPressure = 'High' AND o.result = 'Positive'
RETURN d, properties(d) AS disease_properties,
       r, properties(r) AS relationship_properties,
       l, properties(l) AS lab_properties

Automated Cypher Query Generation with Gemini-1.5-Flash

To automatically generate a Cypher query via LangChain (GraphCypherQAChain) and retrieve possible diseases based on a patient's symptoms and health indicators, the following setup was used.

Connect to the Neo4j Knowledge Graph

import warnings
import json
import textwrap
from langchain_community.graphs import Neo4jGraph

with warnings.catch_warnings():
    warnings.simplefilter('ignore')

NEO4J_USERNAME = "neo4j"
NEO4J_DATABASE = 'neo4j'
NEO4J_URI = 'neo4j+s://1236547.databases.neo4j.io'
NEO4J_PASSWORD = 'yourneo4jdatabasepasswordhere'

# Get the knowledge graph from the instance and inspect its schema
kg = Neo4jGraph(
    url=NEO4J_URI, username=NEO4J_USERNAME, password=NEO4J_PASSWORD, database=NEO4J_DATABASE
)
kg.refresh_schema()
print(textwrap.fill(kg.schema, 60))
schema = kg.schema

Initialize Vertex AI

import vertexai
from langchain.prompts.prompt import PromptTemplate
from langchain.chains import GraphCypherQAChain
from langchain.llms import VertexAI

vertexai.init(project="your-project", location="us-west4")
llm = VertexAI(model="gemini-1.5-flash")

Create the Prompt Template

prompt_template = """ Let's think step by
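The original post is cut off in the middle of the prompt template. As a rough sketch of how such a setup is typically completed with LangChain's GraphCypherQAChain (an assumption based on the standard pattern, not the author's exact code; the prompt text and sample question are illustrative):

# Hypothetical completion of the truncated prompt template
prompt_template = """Let's think step by step.
Given the graph schema below, write a Cypher query that answers the user's question.
Schema: {schema}
Question: {question}
Cypher query:"""

cypher_prompt = PromptTemplate(
    input_variables=["schema", "question"], template=prompt_template
)

chain = GraphCypherQAChain.from_llm(
    llm=llm,                  # the VertexAI Gemini model initialized above
    graph=kg,                 # the Neo4jGraph connection
    cypher_prompt=cypher_prompt,
    verbose=True,
)

# Illustrative question against the disease graph
result = chain.invoke({"query": "Which diseases are associated with high blood pressure and a positive outcome?"})
print(result)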


Salesforce API Gen

Function-calling agent models, a significant advancement within large language models (LLMs), face a persistent challenge: they require high-quality, diverse, and verifiable datasets. These models interpret natural language instructions to execute API calls, which is crucial for real-time interactions with various digital services. However, existing datasets often lack comprehensive verification and diversity, resulting in inaccuracies and inefficiencies. Overcoming these challenges is critical for deploying function-calling agents reliably in real-world applications, such as retrieving stock market data or managing social media interactions.

Current approaches to training these agents rely on static datasets that lack thorough verification, hampering adaptability and performance when models encounter new or unseen APIs. For example, models trained on restaurant booking APIs may struggle with tasks like stock market data retrieval due to insufficient relevant training data.

To address these limitations, researchers from Salesforce AI Research propose APIGen, an automated pipeline designed to generate diverse and verifiable function-calling datasets. APIGen integrates a multi-stage verification process to ensure data reliability and correctness: format checking, actual function execution, and semantic verification, with each data point rigorously verified to produce high-quality datasets.

APIGen begins by sampling APIs and query-answer pairs from a library and formatting them into a standardized JSON format. The pipeline then moves through a series of verification stages: format checking to validate the JSON structure, function-call execution to verify operational correctness, and semantic checking to align function calls, execution results, and query objectives. This process results in a comprehensive dataset of 60,000 entries covering 3,673 APIs across 21 categories, accessible via Hugging Face.

The datasets generated by APIGen significantly enhance model performance, achieving state-of-the-art results on the Berkeley Function-Calling Benchmark. Models trained on these datasets outperform multiple GPT-4 models, demonstrating substantial improvements in accuracy and efficiency. For instance, a model with 7 billion parameters achieves an accuracy of 87.5%, surpassing previous benchmarks by a notable margin. These outcomes underscore the robustness and reliability of APIGen-generated datasets in advancing the capabilities of function-calling agents.

In conclusion, APIGen presents a novel framework for generating high-quality, diverse datasets for function-calling agents, addressing critical challenges in AI research. Its multi-stage verification process ensures data reliability, empowering even smaller models to achieve competitive results. APIGen opens avenues for developing efficient and powerful language models, emphasizing the pivotal role of high-quality data in AI advancements.
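To make the three verification stages concrete, here is a minimal Python sketch of how such a pipeline could be structured. This is an illustration of the idea, not APIGen's actual implementation; the function names and the simplified semantic check are assumptions.

import json

def format_check(sample, required_keys=("query", "tool_calls")):
    """Stage 1: the sample must be valid JSON with the expected keys."""
    try:
        data = json.loads(sample)
    except json.JSONDecodeError:
        return None
    if all(key in data for key in required_keys):
        return data
    return None

def execution_check(data, api_registry):
    """Stage 2: actually execute each proposed call against a registry of callables."""
    results = []
    for call in data["tool_calls"]:
        func = api_registry.get(call["name"])
        if func is None:
            return None
        try:
            results.append(func(**call["arguments"]))
        except Exception:
            return None
    return results

def semantic_check(data, results):
    """Stage 3 (simplified): keep the sample only if every call produced a usable result.
    APIGen's real semantic check compares calls and results against the query intent."""
    return all(result is not None for result in results)

def verify(sample, api_registry):
    data = format_check(sample)
    if data is None:
        return False
    results = execution_check(data, api_registry)
    if results is None:
        return False
    return semantic_check(data, results)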


What is a Data Lake?

A data lake is a centralized repository that stores vast amounts of data, both structured and unstructured, in its native format, enabling organizations to store and analyze diverse data sources for applications such as analytics, machine learning, and business intelligence.


Slack for Manufacturing and Automotive

Enhance productivity, reduce costs, and provide exceptional experiences with a consolidated view of your customer data. Streamline diverse systems, teams, and processes through automation, and connect with external partners to integrate your entire ecosystem. As manufacturing, automotive, and energy organizations transition to developing new powertrains and digital products for innovative service and revenue models, Slack offers an efficient platform for innovation.

Pioneering Enterprise Security

Setting the standard in enterprise security, Slack encrypts data in transit and at rest. It offers comprehensive compliance and assurance programs, along with features such as audit logs, data loss prevention, and single sign-on. As the productivity platform for manufacturing, automotive, and energy, Slack provides a secure environment.

Utilizing Slack AI for Smarter Work

Engage with a Slack sales representative or join the waitlist to experience Slack AI across your organization. Use AI-powered search for swift answers, summarize conversations effortlessly, and rely on secure data handling by Slack AI. Explore Slack's role in accelerating innovation across the manufacturing, automotive, and energy sectors.

Empowering Software Developers

Discover how Slack empowers teams to introduce new digital products and services, driving revenue and transforming customer experiences. For software developers, Slack accelerates the delivery of high-quality code, making it a preferred choice for the world's leading producers of software, hardware, and services. Explore Slack's webinar to uncover its potential for your team.

Revolutionizing Fleet Management with Automile

Challenges abound for businesses managing fleets, particularly in integrating solutions seamlessly with existing toolsets to increase productivity. Automile aims to disrupt the multibillion-dollar fleet management market with a mobile-first, API-centric solution. With REST-based JSON APIs and SDKs for PHP, Java, and C#/.NET, Automile simplifies fleet management and offers both web and mobile apps.

Slack Integration with Automile

Automile is set to release new features in March, including integrations such as Slack. By submitting the app to Slack's App Directory, Automile aims to give businesses a streamlined fleet management experience within Slack. The upcoming Slack app supports Slash Commands, Interactive Messages, and Incoming Webhooks.

Security-First Approach

Automile prioritizes security with the new Slack app, ensuring that only authorized Slack team members have access. The app supports Slash Commands that let users carry out specific tasks, such as checking out drivers and locating vehicles, and admins can control which users have access to these commands.

Fleet Management Commands

Automile's Slack app introduces Slash Commands for drivers and vehicles. The Driver command allows fleet managers to search for drivers, check their status, and interact with them directly from Slack. Similarly, the Vehicle command provides information on vehicle location and status and enables task assignment. A sketch of how such a slash command is typically handled server-side appears at the end of this section.

Driving Field Service Efficiency with Slack and Salesforce Service Cloud

See how manufacturers harness the combined capabilities of Slack and Salesforce Service Cloud to empower field employees and enhance customer satisfaction.
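As referenced above, here is a minimal sketch of how a Slack Slash Command such as a vehicle lookup is typically handled server-side, using Flask. The endpoint path, the lookup function, and the response text are hypothetical; only the request fields (text, user_id) and the response shape follow Slack's standard slash-command contract.

from flask import Flask, request, jsonify

app = Flask(__name__)

def find_vehicle(search_text):
    # Hypothetical lookup against a fleet-management backend
    return {"name": search_text or "Vehicle 1", "location": "Depot A", "status": "parked"}

@app.route("/slack/vehicle", methods=["POST"])
def vehicle_command():
    # Slack posts slash-command payloads as form-encoded fields
    text = request.form.get("text", "")
    user_id = request.form.get("user_id")
    vehicle = find_vehicle(text)
    # Respond with a message visible only to the requesting user
    return jsonify({
        "response_type": "ephemeral",
        "text": f"<@{user_id}> {vehicle['name']} is {vehicle['status']} at {vehicle['location']}."
    })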
Slack's Continued Impact

Slack continues to thrive globally, supporting businesses of all sizes in achieving growth and skyrocketing productivity. Acquired by Salesforce in 2021, Slack remains an influential force in the business communication and collaboration landscape.


Einstein Web Recommendations

Einstein Web Recommendations leverage Einstein's capabilities to analyze user behavior, construct preference profiles, and deliver personalized content tailored to each website visitor. Application scenarios let you fine-tune recommendations according to your specific business rules.

Web recommendations are delivered through two methods: a JSON response or HTML/JS. The JSON response is the recommended delivery method because of its flexibility; HTML/JS can be used if the web team is unable to work with JSON. Because the JSON method allows for greater flexibility, you are responsible for parsing and styling the recommendations within your web environment.

Marketing Cloud Einstein Recommendations enable the creation of product or content recommendations for display on your website. The Einstein recommendation engine requires a minimum of three active items in your product catalog. You can incorporate any catalog field into the web recommendation call, so a clear understanding of the data driving recommendations is important during catalog setup.

A unique web recommendation call is generated for each page type. As a best practice, configure Home, Product, Category, Cart, and Conversion pages, although there is no restriction on the number of pages that can be configured in the UI.

Einstein Recommendations use page templates, treating a Product page, for example, as a template for building personalized recommendations specific to product pages. Different page templates may have distinct scenarios and contexts. For instance, recommendations on a product page may be based on the viewed product, adding context, whereas homepage recommendations rely on overall user affinity and lack specific context.

Integrate the Einstein Web Recommendations code into the designated page's code, incorporating both the JavaScript for the recommendation call and the HTML recommendation zone placeholder provided. Then select and configure the content to be included in your web recommendations.
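Because the JSON delivery method leaves parsing and rendering to you, here is a minimal sketch of consuming a recommendations response server-side in Python. The endpoint URL and the response shape (items with name, link, and image_link fields) are hypothetical placeholders, since the actual payload depends on your catalog configuration.

import requests

# Hypothetical recommendation-call URL for a product page
RECS_URL = "https://<your-recommendations-endpoint>/product.json?item=<product_code>"

def fetch_recommendations(url):
    response = requests.get(url, timeout=5)
    response.raise_for_status()
    return response.json()

def render_recommendations(payload):
    """Turn the (assumed) list of recommended items into simple HTML for the page."""
    items = payload if isinstance(payload, list) else payload.get("items", [])
    html = ["<ul class='einstein-recs'>"]
    for item in items:
        html.append(
            f"<li><a href='{item.get('link', '#')}'>"
            f"<img src='{item.get('image_link', '')}' alt=''/> {item.get('name', '')}</a></li>"
        )
    html.append("</ul>")
    return "\n".join(html)

if __name__ == "__main__":
    recs = fetch_recommendations(RECS_URL)
    print(render_recommendations(recs))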


Summer 24 Salesforce Customization Release Notes

Manage users more easily with the user access, public group, permission set, and permission set group summaries. Give record page users more of what they need, where and when they need it, with Lightning record page enhancements such as blank space support and visibility rules on individual tabs.


Salesforce JSON

Today we are diving into JSON (JavaScript Object Notation) and exploring why it's a crucial concept to understand. JSON is a data representation format widely used across the internet for APIs, configuration files, and various applications.

The JSON Class

The System.JSON class contains methods for serializing Apex objects into JSON format and deserializing JSON content that was serialized using the serialize method in this class.

Usage

Use the methods in the System.JSON class to perform round-trip JSON serialization and deserialization of Apex objects. These methods enable you to serialize objects into JSON-formatted strings and to deserialize JSON strings back into objects.

What does JSON.serialize do in Salesforce?

JSON.serialize() accepts both Apex collections and objects, in any combination that's convertible to legal JSON, for example: String jsonString = JSON.serialize(myObject);

What is the difference between JSON parse and JSON deserialize?

Parsing converts JSON data into a data structure that can be easily processed by the programming language. Deserialization, on the other hand, converts JSON data into an object in a programming language.

What is the difference between JSON and XML in Salesforce?

JSON supports numbers, strings, Booleans, arrays, and objects. XML can represent those types plus additional ones such as dates, images, and namespaces. JSON produces smaller files and faster data transmission, while XML's tag structure is more complex to write and read and results in bulkier files.

Which is more secure, XML or JSON?

Generally speaking, JSON is more suitable for simple and small data, more readable and maintainable for web developers, faster and more efficient for web applications and APIs, and more compatible with web technologies, but it lacks a standard schema language and is generally considered less secure than XML.

What is the Salesforce JSON heap size limit?

Salesforce enforces an Apex heap size limit of 6 MB for synchronous transactions and 12 MB for asynchronous transactions.

How do you store JSON data in a Salesforce object?

If you need to store the actual JSON payload in Salesforce for audit purposes, Tectonic recommends using a Long Text Area field to store the JSON content. This has no performance impact when interacting with records, and if required you can add the field to the layout of the child object storing this data.
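As a general illustration of the round-trip serialize/deserialize pattern described above (shown here with Python's standard json module rather than Apex; the record fields are made up for the example):

import json

# Serialize: convert an in-memory object to a JSON-formatted string
account = {"Name": "Acme", "AnnualRevenue": 1000000, "Active": True}
json_string = json.dumps(account)
print(json_string)        # {"Name": "Acme", "AnnualRevenue": 1000000, "Active": true}

# Deserialize: convert the JSON string back into an object
restored = json.loads(json_string)
print(restored["Name"])   # Acme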


Announcing Tableau Pulse and Tableau GPT

Many are familiar with ChatGPT, the groundbreaking large language model from OpenAI that has transformed how we work and interact with AI. At Tableau Conference 2023, Tableau announced a new tool called Tableau GPT. But what exactly is Tableau GPT, and how does it fit into Tableau's suite of products?

Tableau GPT

Tableau GPT is an assistant that leverages generative AI to simplify and democratize data analysis. Built from Einstein GPT, a Salesforce product developed in collaboration with OpenAI, Tableau GPT integrates generative AI into Tableau's user experience. The integration aims to help users work smarter, learn faster, and communicate more effectively. During the Devs on Stage segment of the conference keynote, Matthew Miller, Senior Director of Product Management, showcased Tableau GPT's ability to generate calculations. For example, given a prompt like "Extract email addresses from JSON," Tableau GPT quickly produces a calculation that users can copy into the calculation window.

Tableau Pulse

Tableau GPT also powers a new tool called Tableau Pulse, designed to surface powerful insights swiftly. Tableau Pulse provides "data digests" on a personalized metrics homepage, offering a curated, newsfeed-like experience of key KPIs. As users interact with Pulse, it learns to deliver more personalized results based on their interests. For example, Tableau Pulse highlights metrics that require attention, derived from recent data trends identified by Tableau GPT. The tool provides the latest metric values, visual trends, and AI-generated insights for user-selected KPIs.

Tableau Pulse also lets users ask questions about their data in natural language. For instance, when asked, "What is driving change in Appliance Sales?" Tableau Pulse responded with a brief answer and a visualization. Further questions, such as "What else should I know about air fryers?" revealed that the inventory fill rate for air fryers is forecast to fall below a set threshold, providing actionable insight that users can share across their organization.

Future Impact and Availability

Tableau GPT and Tableau Pulse promise to change how people interact with Tableau products, enabling quicker visualization creation and making data accessible to non-technical users. Salesforce announced that Tableau Pulse and Tableau GPT would enter pilot testing later this year; when they do, we'll be ready to share new insights. Follow us on LinkedIn to stay updated on the latest developments and features in Tableau.
