Once you’ve configured, ingested and modelled your data, unified profiles for each customer, and calculated insights…
The next step, right before the final one (Activation), is to leverage Salesforce Data Cloud Segmentation based on that data.
Salesforce Data Cloud Segmentation Overview
Segmentation is the process of grouping data based on shared patterns, attributes or criteria.
In this case, customer data.
Some examples of Customer Segmentations:
- Top in-store purchasers.
- Most frequent laptop accessories buyers.
- All gym members who pay the premium membership type.
- Consumers with at least 3 Service Cloud cases resolved in the last year.
Let’s see how this works in Data Cloud, step by step.
Salesforce Data Cloud Segmentation Basics
The logical execution order for Segmentation in Data Cloud would be the following:
- Data is first ingested
- Data is then modelled into the canonical 360 Data Model
- Identity Resolution creates a Unified Profile for each customer with all their data lineage
- Insights add aggregations, measures, etc. to enrich that data.
- Segmentation can then sort all the data into actionable groups.
How does Data Cloud do this “sorting”?
Let’s imagine the final picture of a Unified Individual in a customer database for sports equipment stores.
We have 1 Unified Individual for a customer, David, and then this Unified Profile has 2 source profiles that were merged during Identity Resolution:
1. Unified Individual DMO
   1.1 Source Individual
       – Name: David
       – Email: david@personal.com
       – Hobby: Fitness
   1.2 Source Individual
       – Name: David
       – Email: david@work.com
       – Hobby: Padel
Now, we create a segment in Data Cloud on the Unified Individual DMO that says:
“select all the customers in our database whose hobby is Fitness”
Since the first source profile of David meets the criteria (Hobby: Fitness), this Unified Individual would be included in our segment. As long as at least 1 source profile meets the criteria, the Unified Individual is added to the segment.
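To make this inclusion rule concrete, here is a minimal sketch in plain Python (hypothetical data, not Data Cloud code) of the “any source profile matches” behaviour:

```python
# Hypothetical, simplified model of Unified Individuals and their source profiles.
# This is NOT Data Cloud code; it only illustrates the inclusion rule:
# if at least one source profile meets the criteria, the unified profile is in the segment.

unified_individuals = {
    "UID-001": [  # David's unified profile, merged from two sources
        {"name": "David", "email": "david@personal.com", "hobby": "Fitness"},
        {"name": "David", "email": "david@work.com", "hobby": "Padel"},
    ],
    "UID-002": [
        {"name": "Laura", "email": "laura@personal.com", "hobby": "Running"},
    ],
}

def in_segment(source_profiles, criteria):
    """Return True if ANY source profile satisfies every attribute in the criteria."""
    return any(
        all(profile.get(attr) == value for attr, value in criteria.items())
        for profile in source_profiles
    )

segment = [uid for uid, sources in unified_individuals.items()
           if in_segment(sources, {"hobby": "Fitness"})]

print(segment)  # ['UID-001'] -> David is included because one source has Hobby: Fitness
```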
What is Segmentation useful for?
- Creating useful groups of customers
- Analysing our data
- Preparing audiences for different use cases
- A/B Testing our data to learn business insights
Every use case your company wants to build will need a segment behind it, so this step is one of the most crucial ones when using Data Cloud.
Please check the following link to see the official Salesforce documentation on Data Cloud Segmentation:
https://help.salesforce.com/s/articleView?id=sf.c360_a_create_segments.htm&type=5
Creating Data Cloud Standard Segments
In Data Cloud, a Segment refers to a “grouping” of your data.
You can create Segments based on different Data Model Objects (DMO), as long as they are of Type Profile.
Examples of Profile DMOs: Account, Individual, Unified Individual, etc.
We say you “segment ON” one DMO or another. E.g: create a segment ON the Individual DMO.
If your company wants to leverage the full potential of Data Cloud, you will almost always have to Segment On the Unified Individual DMO (as it contains all the other linked data sources, attributes, etc.).
To create a Standard Segment:
- Go to Segments tab
- Click New > Use Visual Builder
- Choose Standard Segment
- Enter details: Segment On which DMO + Data Space + Name
- Select a Publish Type setting and its Publish Schedule
Here’s a table with some key considerations when using Standard Segments:
Feature | Limits | Description |
---|---|---|
Total N of Active Segments that you can have in your PROD Org | 9950 | A Segment is considered Active when it was created and has all its settings configured. |
Total N of Active Segments that you can have in your DEV Org | 25 | A Segment is considered Active when it was created and has all its settings configured. |
Total N of Standard Segment publishing occurrences at the same time | 50 | Total number of concurrent segments that can be publishing at any given time. |
Total N of months for events that are queried from standard Segments | 24 | The maximum number of months that you can query event dates in your Segment is up to 24 months. |
Maximum N of Scheduled Publishing occurrences per standard segment per day in a PROD Org | 2 | How many times within a 24-hour period a Segment can run when scheduled. |
Maximum N of Scheduled Publishing occurrences per standard segment per day in a DEV Org | 0 | In a Data Cloud Developer Org, segments cannot be scheduled. You need to Publish them manually. |
Please check the following link to see the official Salesforce documentation on Data Cloud Standard Segments:
https://help.salesforce.com/s/articleView?id=sf.c360_a_create_a_segment.htm&type=5
Segmenting On and Segment Membership DMO
As explained above, the term Segment On refers to the DMO (Data Model Object) that your Segmentation is based on.
You can only Segment On DMOs that are Profile Type (e.g: Account, Individual, Unified Individual…).
It is a best practice to always Segment On the Unified Individual DMO if Identity Resolution is in place.
Why?
- The Unified Individual DMO has all the data lineage of source profiles.
- The Unified Individual DMO contains all the linked attributes as the “master” level object.
- Segmenting On the Individual DMO can lead to duplicates.
Remember that the Category Type of an object is defined when configuring the first Data Stream. You cannot change this category after you’ve saved the Data Stream.
When a Segment is published, Data Cloud automatically creates a Data Model Object (DMO) to store the membership information: the Segment Membership DMO.
Membership: a customer profile that is included in a Segment’s population is a “member” of that segment.
These are the 2 types of Membership DMOs that are created for each object you Segment On:
DMO | Object API Name | Type | Description | Example |
---|---|---|---|---|
Objectname - History | Objectname_SMH__dlm | Segment_Membership | This DMO shows all the profile IDs that were included in the last 30 days of segment publishes. | Individual_SMH__dlm |
Objectname - Latest | Objectname_SM__dlm | Segment_Membership | This DMO shows all the profile IDs that were included in the latest publish of the segment. | Account_SM__dlm |
Please check the following link to see the official Salesforce documentation on Data Cloud Segment Membership DMO:
https://help.salesforce.com/s/articleView?id=sf.c360_a_segment_membership_data.htm&type=5
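As a rough sketch of how you might inspect these membership DMOs after a publish, here is a Python example that sends ANSI SQL to the Data Cloud Query API. The host, endpoint path, payload shape, token handling and the exact DMO names for your org are assumptions for illustration only; verify all of them against the official Query API documentation.

```python
# Rough sketch: reading the Segment Membership DMOs through the Data Cloud Query API.
# ASSUMPTIONS (illustration only): the host, the /api/v2/query path, the payload shape,
# the access token handling and the DMO names below. Verify them against the official
# Query API documentation for your org before relying on anything like this.
import requests

DC_HOST = "https://<your-data-cloud-instance>.c360a.salesforce.com"  # hypothetical placeholder
ACCESS_TOKEN = "<access token obtained through your OAuth flow>"     # hypothetical placeholder

# Latest publish membership (pattern: Objectname_SM__dlm)
SQL_LATEST = "SELECT * FROM Individual_SM__dlm LIMIT 100"
# Rolling 30 days of publish history (pattern: Objectname_SMH__dlm)
SQL_HISTORY = "SELECT * FROM Individual_SMH__dlm LIMIT 100"

def run_query(sql: str) -> dict:
    """POST an ANSI SQL statement to the (assumed) Query API endpoint and return the JSON."""
    response = requests.post(
        f"{DC_HOST}/api/v2/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"sql": sql},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print("Latest publish members:", run_query(SQL_LATEST))
    print("Last 30 days of publishes:", run_query(SQL_HISTORY))
```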
Creating Segments – Canvas Interface
Once you go to the Segments tab and create a new Segment using the Visual Builder, these are the basic layout elements:
- Segment On: what DMO the segment is being created on
- Publish Schedule: what the publish schedule of the segment is
- Segment Counts: current count of members for the segment population. The term “segment population” is used to refer to the total members of a segment.
- Attributes sidebar tab: this is where you select Direct Attributes of the Segment On DMO or Related Attributes (attributes that have a relationship with the Segment On DMO).
- Segments sidebar tab: this is where you can select other Segments to use within your new Segment, as Nested Segments.
- Segment Canvas: this is where you drag and drop the Attributes and Segments from the sidebar to create your final segment rules. The canvas has 2 main tabs: INCLUDE and EXCLUDE, that is, criteria for including profiles in the Segment and criteria for excluding them.
- Container: each “row” of criteria on the canvas, based on one Attribute that is dragged and dropped from the available attributes. The different containers can then be organised with AND/OR logic rules, up to 10 levels.
Attribute Containers – Expression Operators
When you drag and drop an Attribute onto the canvas, you create a Container.
To compare the values in Attributes, Data Cloud uses Expression Operators.
The Expression Operators are defined by the data type of the Attribute. For example, you cannot use an expression operator “is anniversary of” unless the attribute is a date type.
The Data Types available for Attributes in Segmentation are:
- Text
- Number
- Date
- DateTime
Some operators available:
- Has no value
- Has value
- Is On
- Is Before
- Is After
- Is Between
- Is True
- Is False
- Is In
- Begins With
- etc
Check this link for the official Salesforce documentation on Data Cloud Segment Expression Operators:
https://help.salesforce.com/s/articleView?id=sf.c360_a_datatype_expression_operators.htm&type=5
Please keep in mind that the available Expression Operators will be based on the Data Type of the mapped Attribute.
E.g: if you have a data source ingested with a DateTime field and, during mapping, you map it as a Text attribute, Segmentation will not allow you to use the DateTime expression operators (for instance, “is anniversary of”).
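As a simple mental model of this dependency, here is a plain-Python sketch (the operator lists below are illustrative, not the exact Data Cloud lists):

```python
# Simplified mental model (not Data Cloud code): the mapped data type of an attribute
# determines which expression operators the segment canvas offers. The operator lists
# here are illustrative examples, not the exhaustive Data Cloud lists.
OPERATORS_BY_TYPE = {
    "Text":     ["Is Equal To", "Is Not Equal To", "Begins With", "Is In", "Has Value", "Has No Value"],
    "Number":   ["Is Equal To", "Is Greater Than", "Is Less Than", "Is Between", "Has Value"],
    "Date":     ["Is On", "Is Before", "Is After", "Is Between", "Is Anniversary Of"],
    "DateTime": ["Is On", "Is Before", "Is After", "Is Between"],
}

def allowed_operators(mapped_type: str) -> list[str]:
    return OPERATORS_BY_TYPE.get(mapped_type, [])

# A source DateTime field that was mapped as Text loses the date operators:
print("Is Anniversary Of" in allowed_operators("Text"))  # False
print("Is Anniversary Of" in allowed_operators("Date"))  # True
```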
Attribute Containers – Aggregate Functions
For each Container in the Segment canvas, apart from having the dependency on Expression Operators we just mentioned, there are Aggregate Functions you can use.
Naturally, Aggregate Functions are only applicable to Number attributes (e.g: OrderItem_Quantity, TotalAmount, etc.).
These are the Aggregate Functions available in Data Cloud Segments:
- Count: the criteria is met based on the count of records of a number attribute (e.g: at least 3 purchases).
- Sum: the criteria is met based on the sum of a number attribute (e.g: total order value greater than 1,000€).
- Average: the criteria is met based on the average of a number attribute (e.g: customers whose average order value is $500).
- Minimum: the criteria is met based on the minimum value of a number attribute (e.g: customers whose cheapest order on our ecommerce was at least 50€).
- Maximum: same as the previous one, but based on the maximum value (e.g: customers whose most expensive accessories order was over $500).
Aggregate Functions are only applicable to Related Attributes (those on DMOs connected to the DMO we are segmenting on), not to Direct Attributes (the direct, level 1 attributes of the Segment On DMO).
As an example, imagine you have these attributes in the segment canvas for a Unified Individual Segment On:
– x1 Direct Attribute: Unified Individual.Phone Number
– x2 Related Attributes: SalesOrder.Purchases, SalesOrder.TotalQuantity
It makes no sense to apply an Aggregate on a direct attribute because there is only 1 record at the Level 1 Segment On DMO. However, when using Related Attributes, like Sales Order in this case, there might be a 1:N relationship, so aggregating makes sense.
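Here is a minimal plain-Python sketch (hypothetical data, not Data Cloud code) of why aggregation only makes sense on the related, 1:N side:

```python
# Illustrative sketch (plain Python, hypothetical data): Aggregate Functions only make
# sense on Related Attributes with a 1:N relationship to the Segment On DMO.

# One Unified Individual (level 1): direct attributes exist on a single record.
unified_individual = {"id": "UID-001", "phone": "+34 600 000 000"}

# ... related to N Sales Order records: related attributes span many records.
sales_orders = [
    {"unified_id": "UID-001", "total_amount": 250.0, "quantity": 1},
    {"unified_id": "UID-001", "total_amount": 400.0, "quantity": 2},
    {"unified_id": "UID-001", "total_amount": 120.0, "quantity": 1},
]

orders = [o for o in sales_orders if o["unified_id"] == unified_individual["id"]]

count_orders = len(orders)                             # Count   -> 3 purchases
sum_amount   = sum(o["total_amount"] for o in orders)  # Sum     -> 770.0 total spend
avg_amount   = sum_amount / count_orders               # Average -> ~256.67 per order
min_amount   = min(o["total_amount"] for o in orders)  # Minimum -> 120.0 cheapest order
max_amount   = max(o["total_amount"] for o in orders)  # Maximum -> 400.0 most expensive order

# Segment rule example: "customers with at least 3 purchases AND total spend over 500"
meets_criteria = count_orders >= 3 and sum_amount > 500
print(meets_criteria)  # True
```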
Attribute Containers – Container Logic Operators
You can create Segments based on multiple rules and containers, but adding the Logical Operators (AND, OR) at different levels will result in different output.
Example #1 – 2 Different Containers with AND logical operator
Container #1
Select all customers where Email has 1 Click
AND
Container #2
Select all customers where Email Subject = Christmas Promo
Total Segment Population = 200 (members who have at least 1 click on any email AND who received the Christmas Promo email; because the criteria live in separate containers, the click and the promo can be on different email records).
Example #2 – 1 Container with AND logical operator
Container #1
Select all customers where Email has 1 Click
AND
Select all customers where Email Subject = Christmas Promo
Total Segment Population = 50 (out of the total 200, only 50 members have a single email record that meets BOTH criteria: it has 1 click AND its subject is Christmas Promo).
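The following plain-Python sketch (hypothetical data, illustrative only) models the difference between the two layouts: criteria in separate containers can be satisfied by different email records, while criteria in the same container must be satisfied by the same record:

```python
# Illustrative sketch (plain Python, hypothetical data) of the two AND layouts above.
# Rule of thumb: criteria in SEPARATE containers can be satisfied by DIFFERENT related
# records; criteria in the SAME container must be satisfied by the SAME related record.

customers = {
    "C1": [  # clicked one email, received the Christmas Promo in another email
        {"subject": "Summer Sale", "clicks": 1},
        {"subject": "Christmas Promo", "clicks": 0},
    ],
    "C2": [  # clicked the Christmas Promo email itself
        {"subject": "Christmas Promo", "clicks": 1},
    ],
    "C3": [  # no clicks, no Christmas Promo
        {"subject": "Summer Sale", "clicks": 0},
    ],
}

def clicked(email):
    return email["clicks"] >= 1

def is_christmas(email):
    return email["subject"] == "Christmas Promo"

# Example #1: two containers joined with AND -> evaluated on possibly different emails
example_1 = [c for c, emails in customers.items()
             if any(clicked(e) for e in emails) and any(is_christmas(e) for e in emails)]

# Example #2: one container with both criteria -> the SAME email must meet both
example_2 = [c for c, emails in customers.items()
             if any(clicked(e) and is_christmas(e) for e in emails)]

print(example_1)  # ['C1', 'C2'] -> larger population
print(example_2)  # ['C2']       -> smaller population
```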
Attribute Containers – Container Paths
Imagine you drag and drop an Attribute related to an Order made by a customer.
An Order has multiple relationships with the Segment On DMO, in this case, Individual DMO. For example:
- An order can be a record in the Sales Order DMO, related to a purchase in a store.
- An order can be a record associated with a Case in Service Cloud.
When you have a Container where the Related Attribute has these multiple relationships, Data Cloud allows you to select which path you would like the segment to use.
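A small plain-Python sketch (hypothetical data) of why the chosen path matters: the same filter applied through two different relationship paths can return different members:

```python
# Illustrative sketch (plain Python, hypothetical data): the same "Order" attribute
# reached through two different relationship paths can yield different segment members,
# which is why the canvas asks you to pick a Container Path.

sales_orders = {"C1": [{"amount": 300}], "C2": []}   # path: Individual -> Sales Order
case_orders  = {"C1": [], "C2": [{"amount": 300}]}   # path: Individual -> Case -> Order

def members(orders_by_customer, min_amount):
    return [c for c, orders in orders_by_customer.items()
            if any(o["amount"] >= min_amount for o in orders)]

print(members(sales_orders, 100))  # ['C1'] -> path via Sales Order (store purchases)
print(members(case_orders, 100))   # ['C2'] -> path via the Service Cloud Case
```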
Calculated Insights in Segments
You may add Calculated Insights in Data Cloud Segments.
These are the requirements to use Calculated Insights in Segments:
- Calculated Insights appear in the Direct Attributes section of the Segment creation screen.
- The Calculated Insight must reference the DMO of the Segment On property.
- The Calculated Insight must have the Primary Key of the Segment On DMO as a Dimension.
Example of a business Use Case:
You use Data Cloud Insights to create a robust and reliable RFM scoring model.
RFM stands for: Recency, Frequency and Monetary value of each customer. This insight gives you a clear scoring of each of your customers, based on a lot of complex metrics and dimensions (e.g: purchase frequency, total order value, Lifetime Value, etc).
You then create marketing Segments with this RFM scoring as one of the attributes, to create very complex segmented audiences for your communications.
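As a simplified illustration of the kind of scoring such an insight could produce, here is a plain-Python sketch (hypothetical data; the real Calculated Insight would be defined inside Data Cloud), keyed by a Unified Individual Id so it could serve as a segment attribute:

```python
# Simplified RFM sketch (plain Python, hypothetical data). The real scoring would be a
# Calculated Insight defined in Data Cloud; this only shows the Recency / Frequency /
# Monetary idea, keyed by the Unified Individual Id (the Segment On primary key).
from datetime import date

orders = [
    {"unified_id": "UID-001", "order_date": date(2024, 11, 2), "amount": 250.0},
    {"unified_id": "UID-001", "order_date": date(2024, 12, 20), "amount": 400.0},
    {"unified_id": "UID-002", "order_date": date(2024, 3, 15), "amount": 60.0},
]

today = date(2025, 1, 1)
rfm = {}

for o in orders:
    entry = rfm.setdefault(o["unified_id"],
                           {"recency_days": 10**9, "frequency": 0, "monetary": 0.0})
    entry["recency_days"] = min(entry["recency_days"], (today - o["order_date"]).days)
    entry["frequency"] += 1
    entry["monetary"] += o["amount"]

for uid, scores in rfm.items():
    print(uid, scores)
# UID-001 {'recency_days': 12, 'frequency': 2, 'monetary': 650.0}
# UID-002 {'recency_days': 292, 'frequency': 1, 'monetary': 60.0}
```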
Publishing Segments
The concept of a Segment is closely linked to the Activation phase of Data Cloud.
The whole purpose of creating Segments is to send them to a target destination, that is, to “activate” them. E.g: make a segment and send that audience to a Salesforce Marketing Cloud Journey.
The process of the segment data being refreshed is called Publishing.
To change the Publish Schedule of an Active Segment:
- Go to Segments tab
- Select your Segment
- Click on Edit Properties > Next
- Choose Publish Type: Standard OR Rapid
- Select a Publish Schedule
- Save
To Publish a Segment for the first time, after creating it:
- Go to Segments tab
- Select your Segment
- Dropdown arrow menu > Publish Now
Publish Types
There are 2 Publish Types:
- Standard Publish
  – Minimum: every 12 hours
  – Maximum: every 24 hours
  – This type of Publish uses up to the last 2 years of Engagement data.
  – You may select a Start and End date and time.
- Rapid Publish
  – Minimum: every 1 hour
  – Maximum: every 4 hours
  – This type of Publish uses up to the last 7 days of Engagement data.
  – This type of Publish can only send data to Salesforce Marketing Cloud as a target.
Once a Segment is publishing, you can open the Publish History tab to check the following data:
- Activation Name
- Platform
- Segment
- Activation
- Publish Type
- Publish Status
- Last Publish Completed
Segment and Publish Status
The following tables show all the information available for both your Segment Status and Publish Status:
Segment Status
Segment Status | Description |
---|---|
Inactive | An Inactive Segment can only be deleted. |
Recounting | The count of the segment population is in progress. |
Processing | The publishing of the Segment is taking place right now. |
Error | The Segment cannot be published. |
Active | The Segment has been created and all the settings are correctly configured. |
Publish Status
Publish Status | Description |
---|---|
Publishing | The Segment Publishing is taking place at the moment. |
Skipped | The Publishing of this Segment was delayed 30 mins due to the max concurrent publishes in Data Cloud. |
Deferred | The Segment Publish time was deferred due to the max concurrent publishes in Data Cloud. |
Error | This Segment could not be published. |
Success | This Segment was published correctly to the Activation Target. |
Blank | The Segment has been created but it has not been published yet. |
Segments need an Activation in order to be scheduled; otherwise, the system will not publish them.
Also, the “Deferred” and “Skipped” statuses are very similar. The only difference is how the system handles a Segment’s publishing once the maximum number of simultaneous publishes in Data Cloud is reached: Skipped means the publishing is postponed for 30 minutes, whereas Deferred means it stays queued until there is capacity.
Publish Schedules
These are all the Publish Schedules currently available in Data Cloud:
Publish Schedule | Publish Type | Activation Platform | Data Lookback Window | Description |
---|---|---|---|---|
1 hour | Scheduled - Rapid Publish | Only SFMC | Last 7 days | The segment is published every 1 hour. |
4 hours | Scheduled - Rapid Publish | Only SFMC | Last 7 days | The segment is published every 4 hours. |
12 hours | Scheduled - Standard Publish | All | Last 2 years | The segment is published every 12 hours. |
24 hours | Scheduled - Standard Publish | All | Last 2 years | The segment is published every 24 hours. |
Don't Refresh | N/A | N/A | N/A | Segment is not published to any activation target. |
Manual | Manual Publish | All | Based on Publish Type | The segment is published by clicking Publish Now. |
Some considerations on Publishing Segments:
- Manual publishes take priority over scheduled ones.
- Available Activation Platforms include: B2C Commerce Cloud, any External Activation Platform (Ecosystem), Interaction Studio (MC Personalization), Loyalty, SFMC, Amazon S3.
- Remember to always have an Activation configured for Scheduled Publishes.
- Check the different limits explained in further sections to understand how Data Cloud manages this functionality as there are differences depending on the type of Segment.
Editing Segments
Once you have created a Segment, you may edit it in Data Cloud.
- Go to Segments tab
- Select the segment
- Click on Edit Rules to modify the segmentation rules
- Click on Edit Properties to modify the Segment Name, Description, Publish Type and Schedule
- Save
You can also Delete a Segment, but take into account that if it’s configured with an Activation, you will have to delete the Activation first (it’s a dependency).
You may also Copy or Clone a Segment.
Finally, you can also Deactivate a Segment, but keep in mind that once deactivated, it can no longer be re-enabled, edited or published again.
Instead of Deactivating or Deleting a Segment, it is a best practice to simply stop its Publishing. If you delete a Segment and later restore it from the Recycle Bin, or if you Deactivate it, you will no longer be able to re-enable, edit or publish it.
Real-Time Segments
Just as with Real-Time Insights in Data Cloud Insights, you can also create Real-Time Segments.
These are Segments that publish in milliseconds, but there are some considerations:
- You cannot use real-time segments with nested segments.
- You cannot use real-time segments with exclusion criteria.
- To have real-time segments in Data Graphs, you need to add the Segment ID and Timestamps fields from the Membership DMO to the real-time Graph.
- You cannot manually publish real-time segments.
To create a Real-time Segment:
- Go to the Segments tab
- Click New > Use a Visual Builder
- Select Real-time Segment > Next
- Provide the Data Space, Segment Name and the Real-time Graph to build the segment on
Here’s a table with important considerations and limits for Real-Time Segments:
Feature | Limit |
---|---|
Total maximum N of Real-time Segments that you can have | 35 |
Using exclusions in Real-time Segments | Not Available |
Using Nested Segments in Real-time Segments | Not Available |
Total maximum N of streaming events in a Real-time Segment | 1 |
Manual Publish Type on Real-time Segments | Not Available |
Check this link for the official Salesforce documentation on Data Cloud Real-time Segments:
https://help.salesforce.com/s/articleView?id=sf.c360_a_create_a_realtime_segment.htm&type=5
Waterfall Segments
Waterfall Segments are hierarchical segments.
Imagine you run a PC accessories online store and have a Black Friday campaign coming soon.
You want to create multi-layered, prioritised segments, as there will be 1 big, general campaign and then, based on what product categories customers purchase, multiple sub-campaigns will follow.
E.g: a generic Black Friday campaign for everyone and then, if a customer purchases a laptop, in the following days they become part of a laptop-oriented segment and series of communications (for example, with cross-sell or upsell suggestions).
- You create the Waterfall Segment based on Purchased Product Category.
- All the sub-segments are mutually exclusive, meaning each customer will be in only one segment, based on their Priority order.
- You configure the different sub-segments. E.g: Purchased Product Category = Laptop, Purchased Product Category = hard drive, etc.
- As the campaign develops and customers purchase different items, each customer falls into the corresponding sub-segment, with no duplicates or overlaps and respecting the priority order you defined (see the sketch after this list).
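Here is a minimal plain-Python sketch (hypothetical data) of the waterfall assignment logic described above, where each customer lands in the first sub-segment they qualify for, in priority order:

```python
# Illustrative sketch (plain Python, hypothetical data) of waterfall logic: each
# customer lands in the FIRST sub-segment whose criteria they meet, in priority order,
# so the resulting sub-segments never overlap.

sub_segments_by_priority = [
    ("Laptop buyers",     lambda purchases: "Laptop" in purchases),
    ("Hard drive buyers", lambda purchases: "Hard drive" in purchases),
    ("Other accessories", lambda purchases: len(purchases) > 0),
]

customers = {
    "C1": ["Laptop", "Hard drive"],  # matches two criteria, but only the highest priority wins
    "C2": ["Hard drive"],
    "C3": ["Mouse"],
    "C4": [],                        # matches nothing -> not placed in any sub-segment
}

assignments = {}
for customer, purchases in customers.items():
    for name, criteria in sub_segments_by_priority:
        if criteria(purchases):
            assignments[customer] = name
            break  # stop at the first match: sub-segments are mutually exclusive

print(assignments)
# {'C1': 'Laptop buyers', 'C2': 'Hard drive buyers', 'C3': 'Other accessories'}
```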
Follow these steps to create a Waterfall Segment:
- Go to the Segments tab
- New > Use a Visual Builder > Waterfall Segment
- Enter the Name, Segment On DMO and Data Space
- Define Publish Type (Rapid is not available) and Schedule
- Create the Waterfall hierarchical priority order by using the drag-and-drop builder, selecting segments in order.
Some considerations of Waterfall Segments:
– They do not allow the use of Rapid Publish Type
– They can only use Segments that are Active, are based on the selected DMO, and aren’t already in another Waterfall Segment.
– They do not allow the use of Nested Segments.
– One given segment can only be in 1 Waterfall segment at a time.
Check this link for the official Salesforce documentation on Data Cloud Waterfall Segments:
https://help.salesforce.com/s/articleView?id=sf.c360_a_create_a_waterfall_segment.htm&type=5
Creating Segments From Data Kits
Data Kits save you configuration time by packaging pre-defined Data Cloud components.
If you create a Segment from a Data Kit, you just need to follow these steps:
- Go to the Segments tab
- New > Create from a Data Kit > Next
- Select your Data Kit and predefined Segment
- Enter the Segment details: Name, Segment On DMO, Data Space
- Configure Publish Type and Frequency of Schedule
- Save
Check this link for the official Salesforce documentation on Data Cloud Segments from Data Kits:
https://help.salesforce.com/s/articleView?id=sf.c360_a_create_segment_from_data_kit.htm&type=5
Data Cloud Einstein Segments
You can leverage Data Cloud Einstein Generative AI capabilities to create segments.
That is, you can let the AI generate a relevant segment for you, with all its attributes and metadata ready, so you can decide to publish it right away, make some final adjustments, etc.
In order to leverage Einstein Segments, you need to follow these 2 pre-requisite steps:
- Set up your Data for Einstein Segments
- Enable Einstein Segments
1. Setting up your Data Cloud Data for Einstein Segments
This is a checklist so that your company can make the most of Einstein Segments:
- Identity Resolution is in place, with a Ruleset to create the Unified Individual DMO.
- The Unified Individual DMO has records and its output has been validated and confirmed.
- In order to be able to use data from other DMOs, these Data Model Objects must be related to the Unified Individual DMO.
- Check that the DMO and field names are accurate.
- (Optional) Enhance your metadata with Einstein Data Prism (e.g: use suggested descriptions).
This is a logical checklist: before letting generative AI create segments, make sure that your data is accurate, reliable, well organised, and adheres to your security and privacy requirements.
2. Enabling Einstein Segments in Data Cloud
Einstein Segments require the activation of Generative AI in your Org, which in turn requires having Data Cloud correctly configured.
Although a dedicated guide will be written on this topic, as Generative AI works across the entire Salesforce Org, here’s a recap summary of the configuration steps for Data Cloud Einstein Segments:
- Enable and configure Data Cloud in your Org. You may follow the guide I wrote here for a full breakdown of all the steps:
https://davidpalencia.com/salesforce-data-cloud-setup-and-provisioning/
- Activate Einstein Generative AI:
– From Setup > Quick Find box > Einstein Setup
– Once you’re in Einstein Setup, click on Turn On Einstein.
- Configure the Einstein Trust Layer:
– From Setup > Quick Find box > Einstein Trust Layer
– Configure settings according to your organizational security and privacy policies (data masking must be turned on).
- Activate Einstein Data Collection and Storage:
This activation means that you agree to store your Salesforce generative AI activity log and feedback in Data Cloud, including its corresponding associated costs.
– From Setup > Quick Find box > Einstein Feedback > Turn on “Collect and Store Einstein Generative AI Audit Data”
– Select the target Data Space for this data
Check this link for the official Salesforce documentation on Setting Up Einstein Generative AI in your Salesforce Org:
https://help.salesforce.com/s/articleView?id=sf.generative_ai_enable.htm&language=en_US&type=5
Einstein Generative AI is a much wider capability than just a pre-requisite for being able to create quick Einstein Segments in Data Cloud.
The use of Generative AI must adhere to your organizational security, privacy and governance guidelines and policies. It also involves extra costs.
Please take this into consideration when planning your Data Cloud implementation strategy.
Once you’ve finished all the pre-requisites and configuration, follow these steps to start creating Einstein Segments in Data Cloud:
- Go to the Segments tab
- New > Create with Einstein
- Select the properties: Segment On DMO and Data Space
- The Einstein Segment chat will appear. Type your segment description in simple words and press Enter.
- Einstein will translate your prompt into a full Segment, including:
– Segment description
– Suggested Attributes (most relevant ones based on your prompt)
– Additional Attributes (secondary, optional ones)
– Reasoning behind the suggested attributes
- You may click on Refine Segment to regenerate the suggested segment output
- You may click on Edit Segment Rules to manually adjust any attribute or value (note: this creates the segment, as you need to be taken to the segment canvas to edit it).
- Click on Create Segment when you’re happy with the suggested Einstein Segment.
Check this link for the official Salesforce documentation on Data Cloud Einstein Segments creation, best practices, etc:
https://help.salesforce.com/s/articleView?id=sf.c360_a_create_segment_einstein.htm&type=5