When using CRM Analytics, keep these limits in mind.

API Call Limits

These limits apply to all supported editions.

Maximum concurrent CRM Analytics API calls per org: 100
Maximum CRM Analytics API calls per user per hour: 10,000

Dataset Row Storage Allocations per License

In a Salesforce org, your total row storage limit for all registered datasets combined depends on your license combination. Each license allocates a different number of rows.

CRM Analytics Plus: 10 billion rows
CRM Analytics Growth: 100 million rows
Sales Analytics: 25 million rows
Service Analytics: 25 million rows
Event Monitoring Analytics: 50 million rows
B2B Marketing Analytics: 25 million rows
CRM Analytics for Financial Services Cloud: 25 million rows
CRM Analytics for Health Cloud: 25 million rows
Extra Data Rows license: 100 million rows

Your total row storage limit is a combination of your active licenses. For example, because the CRM Analytics Plus license includes the Sales Analytics and Service Analytics licenses, your total row allocation remains 10 billion. Similarly, the CRM Analytics Growth license includes the Sales Analytics and Service Analytics licenses, so your total row allocation remains 100 million. However, if you obtain another Sales Analytics or Service Analytics license, your row limit increases by 25 million for each added license.

Dataset Row Limits

Each dataset supports up to 2 billion rows. If your Salesforce org has fewer than 2 billion allocated rows, each dataset supports up to your org’s allocated rows.

Dataset Field Limits

Maximum number of fields in a dataset: 5,000 (including up to 1,000 date fields)
Maximum number of digits for each value in a numeric field in a dataset (overflow limit): 17 digits
Maximum value for each numeric field in a dataset, including decimal places: 36,028,797,018,963,967 (for example, if three decimal places are used, the maximum value is 36,028,797,018,963.967)
Minimum value for each numeric field in a dataset, including decimal places: -36,028,797,018,963,968 (for example, if five decimal places are used, the minimum value is -360,287,970,189.63968)
Maximum number of characters in a field: 32,000

When a value exceeds the maximum number of digits, it overflows. Both 100,000,000,000,000,000 and 10,000,000,000,000,000.0 overflow because they use more than 17 digits. A number also overflows if it’s greater than the maximum supported value or less than the minimum supported value: 36,028,797,018,963,968 overflows because it’s greater than 36,028,797,018,963,967, and -36,028,797,018,963,969 overflows because it’s less than -36,028,797,018,963,968.

When a number overflows, the resulting behavior in CRM Analytics is unpredictable. Sometimes CRM Analytics throws an error. Sometimes it replaces a numeric value with a null value. And sometimes mathematical calculations, such as sums or averages, return incorrect results. Occasionally, CRM Analytics handles numbers up to 19 digits without overflowing because they are within the maximum value for a 64-bit signed integer (2^63 - 1). But numbers of these lengths aren’t guaranteed to process.

As a best practice, stick with numbers that are 17 digits or fewer. If numbers that would overflow are necessary, setting lower precision and scale on the dataset containing the large numbers sometimes prevents overflow (the sketch after this table illustrates how scale changes the representable range). If your org doesn’t have Null Measure Handling enabled, the maximum number of digits for each value in a numeric field in a dataset is 16. All orgs created after Spring ’17 have Null Measure Handling enabled.
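To make the relationship between the raw digit limit and scale concrete, here is a minimal Python sketch. It is our own illustration, not a CRM Analytics API: the constants come from the table above, and the helper names (field_range, overflows) are hypothetical.

    from decimal import Decimal

    # Raw storage bounds for a dataset numeric field (from the table above).
    # Scale shifts the decimal point; the overall digit budget stays fixed.
    MAX_RAW = Decimal("36028797018963967")
    MIN_RAW = Decimal("-36028797018963968")

    def field_range(scale: int) -> tuple[Decimal, Decimal]:
        """Hypothetical helper: the (min, max) values representable in a
        numeric field with `scale` decimal places."""
        factor = Decimal(10) ** scale
        return MIN_RAW / factor, MAX_RAW / factor

    def overflows(value: Decimal, scale: int) -> bool:
        """Hypothetical helper: True if `value` can't be stored at this scale."""
        lo, hi = field_range(scale)
        return not (lo <= value <= hi)

    lo, hi = field_range(3)
    print(f"scale 3: min={lo:,.3f} max={hi:,.3f}")   # max is 36,028,797,018,963.967
    print(overflows(Decimal("36028797018963968"), 0))  # True: above the raw maximum
    print(overflows(Decimal("123.456"), 3))            # False: well within range

The sketch uses Decimal rather than float because values near the 17-digit ceiling exceed the exact range of 64-bit floating point, which would make the boundary comparisons unreliable.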
Data Sync Limits

If you extract more than 100 objects in your dataflows, contact Salesforce Customer Support before you enable data sync.

Maximum number of concurrent data sync runs: 3
Maximum number of objects that can be enabled for data sync, including local and remote objects: 100
Maximum amount of time each data sync job can run for local objects: 24 hours
Maximum amount of time each data sync job can run for remote objects: 12 hours

Data sync limits for each job, up to 100,000 rows or 500 MB per object, whichever limit is reached first:
- Marketo Connector (Beta)
- NetSuite Connector
- Zendesk Connector

Data sync limits for each job, up to 10 million rows or 5 GB per object, whichever limit is reached first:
- Amazon Athena Connector
- AWS RDS Oracle Connector
- Databricks Connector
- Google Analytics Connector
- Google Analytics Core Reporting V4 Connector
- Oracle Eloqua Connector
- SAP HANA Cloud Connector
- SAP HANA Connector

Data sync limits for each job*, up to 20 million rows or 10 GB per object, whichever limit is reached first:
- AWS RDS Aurora MySQL Connector
- AWS RDS Aurora PostgreSQL Connector
- AWS RDS MariaDB Connector
- AWS RDS MySQL Connector
- AWS RDS PostgreSQL Connector
- AWS RDS SQL Server Connector
- Google Cloud Spanner Connector
- Microsoft Azure Synapse Analytics Connector
- Microsoft Dynamics CRM Connector
- Salesforce External Connector
- Salesforce Contacts Connector for Marketing Cloud Engagement
- Salesforce OAuth 2.0 Connector for Marketing Cloud Engagement

Data sync limits for each job*, up to 100 million rows or 50 GB per object, whichever limit is reached first:
- Amazon Redshift Connector
- Amazon S3 Connector
- Customer 360 Global Profile Data Connector (Beta)
- Google BigQuery for Legacy SQL Connector
- Google BigQuery Standard SQL Connector
- Heroku Postgres Connector
- Microsoft Azure SQL Database Connector
- Snowflake Input Connector

*When using these connectors, Salesforce Government Cloud org data is protected in transit with advanced encryption and can sync up to 10 million rows or 5 GB for each connected object, whichever limit is reached first.

Note: When using a Salesforce local input connection, CRM Analytics bulk API usage doesn’t count towards Salesforce bulk API limits. Use of the external Salesforce connection and output connection impacts your limits. The dataflow submits a separate bulk API call to extract data from each Salesforce object. The dataflow uses a batch size of 100,000–250,000, depending on whether the dataflow or the bulk API chunks the data. As a result, to extract 1 million rows from an object, the dataflow creates 4–10 batches (the sketch at the end of this topic works through the arithmetic).

Recipe and Dataflow Limits

Important: In Winter ’24, recipe runs over 2 minutes are counted against the limit. Previously, the recipe run counts weren’t correct. For more information, see Known Issue – Recipe runs are not counting towards the daily maximum run limit.

Maximum amount of time each recipe or dataflow can run: 48 hours
Maximum number of recipes: 1,000
Maximum number of dataflow definitions (with data sync enabled): 100
Maximum number of dataflow and recipe runs in a rolling
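The batch arithmetic from the data sync note above can be checked with a minimal Python sketch. This is our own illustration; batch_count is a hypothetical helper, not a Salesforce API.

    import math

    # Bulk API batch sizes quoted in the data sync note above.
    MIN_BATCH = 100_000
    MAX_BATCH = 250_000

    def batch_count(rows: int, batch_size: int) -> int:
        """Hypothetical helper: number of bulk API batches needed to
        extract `rows` rows at a given batch size."""
        return math.ceil(rows / batch_size)

    rows = 1_000_000
    print(batch_count(rows, MAX_BATCH))  # 4 batches at the 250,000-row batch size
    print(batch_count(rows, MIN_BATCH))  # 10 batches at the 100,000-row batch size
    # Matches the 4-10 batch range quoted in the note.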