Salesforce Data Cloud Segmentation

Once you’ve configured, ingested and modelled your data, unified profiles for each customer and calculated insights on top of them…

…the next step, just before the final one (Activation), is to leverage Salesforce Data Cloud Segmentation on all of that data.

Salesforce Data Cloud Segmentation Overview

Segmentation is the process of grouping data based on shared patterns, attributes or criteria.

In this case, customer data.

Some examples of Customer Segmentations:

  • Top in-store purchasers.
  • Most frequent laptop accessories buyers.
  • All gym members who pay the premium membership type.
  • Consumers with at least 3 Service Cloud cases resolved in the last year.

Let’s see how this works in Data Cloud, step by step.

Salesforce Data Cloud Segmentation Basics

The logical execution order for Segmentation in Data Cloud would be the following:

  1. Data is first ingested
  2. Data is then modelled into the canonical 360 Data Model
  3. Identity Resolution creates a Unified Profile for each customer with all their data lineage
  4. Insights add aggregations, measures, etc to enrich that data.
  5. Segmentation can then sort all the data into actionable groups.

How does Data Cloud do this “sorting”?

Let’s imagine the final picture of a Unified Individual in a customer database for sports equipment stores.

We have 1 Unified Individual for a customer, David, and then this Unified Profile has 2 source profiles that were merged during Identity Resolution:

1. Unified Individual DMO

1.1 Source Individual
Name: David
Email: david@personal.com
Hobby: Fitness

1.2 Source Individual
Name: David
Email: david@work.com
Hobby: Padel

Now, we create a Segmentation in Data Cloud saying, from the Unified Individual DMO:

“select all the customers in our database whose hobby is Fitness”

Since David’s first source profile meets the criteria (Hobby: Fitness), this Unified Individual would be included in our segment. As long as at least one source profile meets the criteria, the Unified Individual is added to the segment.
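To make the rule concrete, here is a minimal Python sketch of the “at least one source profile matches” behaviour. The dictionaries and helper function are purely illustrative (they are not Data Cloud objects or APIs); Data Cloud applies this logic internally when the segment is evaluated.

    # Minimal sketch of the "at least one source profile matches" rule described
    # above. The data structures and helper are illustrative only.

    unified_individual = {
        "id": "UI-001",
        "source_profiles": [
            {"name": "David", "email": "david@personal.com", "hobby": "Fitness"},
            {"name": "David", "email": "david@work.com", "hobby": "Padel"},
        ],
    }

    def in_segment(unified, attribute, value):
        # The Unified Individual is a member as soon as ANY source profile matches.
        return any(profile.get(attribute) == value
                   for profile in unified["source_profiles"])

    print(in_segment(unified_individual, "hobby", "Fitness"))  # True
    print(in_segment(unified_individual, "hobby", "Running"))  # False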

What is Segmentation useful for?

  • Creating useful groups of customers
  • Analysing our data
  • Preparing audiences for different use cases
  • A/B Testing our data to learn business insights

Every use case your company wants to build will need its own Segmentation, so this step is one of the most crucial ones when using Data Cloud.

Creating Data Cloud Standard Segments

In Data Cloud, a Segment refers to a “grouping” of your data.

You can create Segments based on different Data Model Objects (DMO), as long as they are of Type Profile.

Examples of Profile DMOs: Account, Individual, Unified Individual, etc.

We say you “segment ON” one DMO or another. E.g: create a segment ON the Individual DMO.

If your company wants to leverage the full potential of Data Cloud, you will almost always have to Segment On the Unified Individual DMO (as it contains all the other linked data sources, attributes, etc).

To create a Standard Segment:

  1. Go to Segments tab
  2. Click New > Use Visual Builder
  3. Choose Standard Segment
  4. Enter details: Segment On which DMO + Data Space + Name
  5. Select a Publish Type setting and its Publish Schedule

Here’s a table with some key considerations when using Standard Segments:

Feature | Limit | Description
Total number of Active Segments you can have in your PROD Org | 9,950 | A Segment is considered Active once it has been created and all its settings are configured.
Total number of Active Segments you can have in your DEV Org | 25 | A Segment is considered Active once it has been created and all its settings are configured.
Standard Segment publishes running at the same time | 50 | Total number of segments that can be publishing concurrently at any given time.
Months of event data that standard Segments can query | 24 | You can query event dates in your Segment up to 24 months back.
Scheduled publishing occurrences per standard segment per day in a PROD Org | 2 | How many times within a 24-hour period a Segment can run when scheduled.
Scheduled publishing occurrences per standard segment per day in a DEV Org | 0 | In a Data Cloud Developer Org, segments cannot be scheduled; you need to publish them manually.

Segmenting On and Segment Membership DMO

As explained above, the term Segment On refers to the DMO (Data Model Object) that your Segmentation is based on.

You can only Segment On DMOs that are Profile Type (e.g: Account, Individual, Unified Individual…).

It is a best practice to always Segment On the Unified Individual DMO if Identity Resolution is in place.

Why?

  • The Unified Individual DMO has all the data lineage of source profiles.
  • The Unified Individual DMO contains all the linked attributes as the “master” level object.
  • Segmenting On the Individual DMO can lead to duplicates.

When a Segment is published, Data Cloud automatically creates a Data Model Object (DMO) to store the membership information: the Segment Membership DMO.

Membership = when a customer profile is included in a Segment’s population, it is a “member” of that segment.

These are the 2 types of Membership DMOs that are created for each object you Segment On:

DMO | Object API Name | Type | Description | Example
Objectname - History | Objectname_SMH__dlm | Segment_Membership | This DMO shows all the profile IDs that were included in the last 30 days of segment publishes. | Individual_SMH__dlm
Objectname - Latest | Objectname_SM__dlm | Segment_Membership | This DMO shows all the profile IDs that were included in the latest publish of the segment. | Account_SM__dlm
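As a quick illustration of the naming convention above, here is a small Python sketch that derives both Membership DMO API names from the object you Segment On. The helper is hypothetical; Data Cloud creates these DMOs for you automatically when the segment is published.

    # Hypothetical helper mirroring the Membership DMO naming convention above.
    # Data Cloud creates these DMOs automatically; this only illustrates the names.

    def membership_dmo_names(segment_on_object: str) -> dict:
        return {
            "history": f"{segment_on_object}_SMH__dlm",  # last 30 days of publishes
            "latest": f"{segment_on_object}_SM__dlm",    # latest publish only
        }

    print(membership_dmo_names("Individual"))
    # {'history': 'Individual_SMH__dlm', 'latest': 'Individual_SM__dlm'}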

Creating Segments – Canvas Interface

Once you go to the Segments tab and create a new Segment using the Visual Builder, these are the basic layout elements:

  • Segment On: what DMO the segment is being created on
  • Publish Schedule: what the publish schedule of the segment is
  • Segment Counts: current count of members for the segment population. The term “segment population” is used to refer to the total members of a segment.
  • Attributes sidebar tab: this is where you select direct attributes of the Segment On DMO or Related Attributes (attributes that have a relationship with the Segment On DMO).
  • Segments sidebar tab: this is where you can select other Segments to use within your new Segment, as Nested Segments.
  • Segment Canvas: this is where you drag and drop the Attributes and Segments from the sidebar to create your final segment rules. The canvas has 2 main tabs, INCLUDE and EXCLUDE: criteria for including profiles in the Segment and criteria for excluding them.
  • Container: each “row” of criteria on the canvas, based on one Attribute that is dragged and dropped from the available attributes. The different containers can then be organised with AND/OR logic rules, up to 10 levels.

Attribute Containers – Expression Operators

When you drag and drop an Attribute onto the canvas, you create a Container.

To compare the values in Attributes, Data Cloud uses Expression Operators.

The Expression Operators are determined by the data type of the Attribute. For example, you cannot use the expression operator “is anniversary of” unless the attribute is a date type (see the sketch after the operator list below).

The Data Types available for Attributes in Segmentation are:

  1. Text
  2. Number
  3. Date
  4. DateTime

Some operators available:

  • Has no value
  • Has value
  • Is On
  • Is Before
  • Is After
  • Is Between
  • Is True
  • Is False
  • Is In
  • Begins With
  • etc
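As a rough illustration of how operators depend on the attribute’s data type, here is a small Python mapping. The grouping is an assumption based on the operators listed above, not an official or exhaustive list; check the builder to see exactly which operators your attribute offers.

    # Assumed, non-exhaustive mapping of attribute data types to example operators.
    # The exact set offered in the builder depends on the attribute.

    operators_by_type = {
        "Text": ["Is In", "Begins With", "Has Value", "Has No Value"],
        "Number": ["Is Between", "Has Value", "Has No Value"],
        "Date": ["Is On", "Is Before", "Is After", "Is Between", "Is Anniversary Of"],
        "DateTime": ["Is On", "Is Before", "Is After", "Is Between"],
    }

    def allowed_operators(data_type: str) -> list[str]:
        return operators_by_type.get(data_type, [])

    print(allowed_operators("Date"))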

Attribute Containers – Aggregate Functions

For each Container on the Segment canvas, in addition to the Expression Operators we just mentioned, there are Aggregate Functions you can use.

Of course, Aggregate Functions are applied to Number attributes (e.g: OrderItem_Quantity, TotalAmount, etc).

These are the Aggregate Functions available in Data Cloud Segments (see the sketch after this list):

  • Count: the criteria is met based on the count of a number attribute (e.g: at least 3 purchases).
  • Sum: the criteria is met based on the sum of a number attribute (e.g: total order amount greater than $1,000).
  • Average: the criteria is met based on the average of a number attribute (e.g: customers whose average order value is $500).
  • Minimum: the criteria is met based on the minimum value of a number attribute (e.g: customers whose smallest order on our ecommerce is at least 50€).
  • Maximum: the criteria is met based on the maximum value of a number attribute (e.g: customers whose largest single accessories purchase is above $500).
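Here is a small Python sketch of what aggregate-based container criteria boil down to, using made-up order data. Data Cloud computes these aggregates for you when the segment is evaluated; the helpers below only illustrate the logic.

    # Made-up order values per Unified Individual, to illustrate aggregate criteria.
    orders = {
        "UI-001": [120.0, 450.0, 80.0],
        "UI-002": [35.0, 20.0],
    }

    def count_at_least(values, minimum):       # Count: e.g. at least 3 purchases
        return len(values) >= minimum

    def sum_greater_than(values, threshold):   # Sum: e.g. total spend greater than 500
        return sum(values) > threshold

    def average_at_least(values, target):      # Average: e.g. average order value >= 200
        return sum(values) / len(values) >= target

    segment = [uid for uid, values in orders.items()
               if count_at_least(values, 3) and sum_greater_than(values, 500)]
    print(segment)  # ['UI-001']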

Attribute Containers – Container Logic Operators

You can create Segments based on multiple rules and containers, but applying the Logical Operators (AND, OR) at different levels will produce different results.

 

Example #1 – 2 different Containers with the AND logical operator

  Container #1: Select all customers where Email has 1 Click
  AND
  Container #2: Select all customers where Email Subject = Christmas Promo

Total Segment Population = 200 (100 members who have 1 click on any email AND 100 members who received the Christmas Promo email).

 

Example #2 – 1 Container with the AND logical operator

  Container #1: Select all customers where Email has 1 Click
  AND Email Subject = Christmas Promo

Total Segment Population = 50 (out of the total 200, only 50 members meet BOTH criteria in the same container: 1 click on the email AND the Christmas Promo subject). The sketch below walks through this difference.
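Here is a hedged Python sketch of one way to read this difference, assuming (as the examples suggest) that conditions in separate containers can be satisfied by different email engagement records, while conditions inside the same container must be met by the same record. The data is made up for illustration.

    # Made-up email engagement rows per Unified Individual.
    engagements = {
        "UI-001": [
            {"subject": "Christmas Promo", "clicks": 0},
            {"subject": "Summer Sale", "clicks": 1},
        ],
        "UI-002": [
            {"subject": "Christmas Promo", "clicks": 1},
        ],
    }

    def two_containers_with_and(rows):
        # Each container is evaluated on its own: the click and the subject
        # may come from DIFFERENT email records.
        has_click = any(r["clicks"] >= 1 for r in rows)
        has_promo = any(r["subject"] == "Christmas Promo" for r in rows)
        return has_click and has_promo

    def one_container_with_and(rows):
        # Both conditions sit in the same container: the SAME email record
        # must have the click and the Christmas Promo subject.
        return any(r["clicks"] >= 1 and r["subject"] == "Christmas Promo"
                   for r in rows)

    print([uid for uid, rows in engagements.items() if two_containers_with_and(rows)])
    # ['UI-001', 'UI-002']
    print([uid for uid, rows in engagements.items() if one_container_with_and(rows)])
    # ['UI-002']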

Attribute Containers – Container Paths

Imagine you drag and drop an Attribute related to an Order made by a customer.

An Order can have multiple relationships with the Segment On DMO, in this case the Individual DMO. For example:

  • An order can come from the Sales Order DMO, related to a purchase made in a store.
  • An order can be one created in association with a Case in Service Cloud.

When you have a Container where the Related Attribute has these multiple relationships, Data Cloud allows you to select which path you would like the segment to use.

Calculated Insights in Segments

You may add Calculated Insights in Data Cloud Segments.

These are the requirements to use Calculated Insights in Segments:

  • Calculated Insights appear in the Direct Attributes section of the Segment creation screen.
  • The Calculated Insight must reference the DMO of the Segment On property.
  • The Calculated Insight must have the Primary Key of the Segment On DMO as a Dimension.

Example of a business Use Case:

You use Data Cloud Insights to create a robust and reliable RFM scoring model.

RFM stands for: Recency, Frequency and Monetary value of each customer. This insight gives you a clear scoring of each of your customers, based on a lot of complex metrics and dimensions (e.g: purchase frequency, total order value, Lifetime Value, etc).

You then use this RFM score as one of the attributes in your marketing Segments, to build finely segmented audiences for your communications.
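As a rough illustration of the idea (not a Data Cloud Calculated Insight definition), here is a minimal Python sketch that scores made-up customers on Recency, Frequency and Monetary value. The thresholds and field names are assumptions for the example; in Data Cloud you would build the equivalent as a Calculated Insight keyed on the Unified Individual ID and then filter on the score in your segment.

    from datetime import date

    # Made-up per-customer order history.
    customers = {
        "UI-001": {"last_order": date(2024, 11, 20), "orders": 12, "total_spend": 2400.0},
        "UI-002": {"last_order": date(2023, 2, 3), "orders": 1, "total_spend": 60.0},
    }

    def band(value, thresholds):
        # Returns a 1-4 score depending on how many thresholds the value reaches.
        return sum(value >= t for t in thresholds) + 1

    def rfm_score(customer, today=date(2024, 12, 1)):
        days_since_last_order = (today - customer["last_order"]).days
        recency = 5 - band(days_since_last_order, [30, 90, 365])  # recent = high
        frequency = band(customer["orders"], [2, 5, 10])
        monetary = band(customer["total_spend"], [100, 500, 1000])
        return recency, frequency, monetary

    for uid, data in customers.items():
        print(uid, rfm_score(data))
    # UI-001 (4, 4, 4)
    # UI-002 (1, 1, 1)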

Publishing Segments

The concept of a Segment is closely linked to the Activation phase of Data Cloud.

The whole purpose of creating Segments is to send them to a target destination, that is, to “activate” them. E.g: make a segment and send that audience to a Salesforce Marketing Cloud Journey.

The process of the segment data being refreshed is called Publishing.

To change the Publish Schedule of an Active Segment:

  1. Go to Segments tab
  2. Select your Segment
  3. Click on Edit Properties > Next
  4. Choose Publish Type: Standard OR Rapid
  5. Select a Publish Schedule
  6. Save

To Publish a Segment for the first time, after creating it:

  1. Go to Segments tab
  2. Select your Segment
  3. Dropdown arrow menu > Publish Now

Publish Types

There are 2 Publish Types:

  • Standard Publish
    Minimum: every 12 hours
    Maximum: every 24 hours
    This type of Publish uses up to the last 2 years of Engagement data.
    You may select a Start and End date and time.
  • Rapid Publish
    Minimum: every 1 hour
    Maximum: every 4 hours
    This type of Publish uses up to the last 7 days of Engagement data.
    This type of Publish can only send data to Salesforce Marketing Cloud as a target.

Once a Segment is publishing, you can open the Publish History tab to see the following data:

  • Activation Name
  • Platform
  • Segment
  • Activation
  • Publish Type
  • Publish Status
  • Last Publish Completed

Segment and Publish Status

The following tables show all the information you can review for both your Segment Status and Publish Status:

Segment Status

Segment Status | Description
Inactive | An Inactive Segment can only be deleted.
Recounting | The counting of the segment population is in progress.
Processing | The publishing of the Segment is taking place right now.
Error | The Segment cannot be published.
Active | The Segment has been created and all its settings are correctly configured.

Publish Status

Publish Status | Description
Publishing | The Segment publishing is taking place at the moment.
Skipped | The publishing of this Segment was delayed 30 minutes due to the maximum concurrent publishes in Data Cloud.
Deferred | The Segment publish time was deferred due to the maximum concurrent publishes in Data Cloud.
Error | This Segment could not be published.
Success | This Segment was published correctly to the Activation Target.
Blank | The Segment has been created but it has not been published yet.

Publish Schedules

These are all the Publish Schedules currently available in Data Cloud:

Publish Schedule | Publish Type | Activation Platform | Data Lookback Window | Description
1 hour | Scheduled - Rapid Publish | Only SFMC | Last 7 days | The segment is published every hour.
4 hours | Scheduled - Rapid Publish | Only SFMC | Last 7 days | The segment is published every 4 hours.
12 hours | Scheduled - Standard Publish | All | Last 2 years | The segment is published every 12 hours.
24 hours | Scheduled - Standard Publish | All | Last 2 years | The segment is published every 24 hours.
Don’t Refresh | N/A | N/A | N/A | The segment is not published to any activation target.
Manual | Manual Publish | All | Based on Publish Type | The segment is published by clicking Publish Now.

Some considerations on Publishing Segments:

  • Manual publishes take priority over scheduled ones.
  • Available Activation Platforms include: B2C Commerce Cloud, any External Activation Platform (Ecosystem), Interaction Studio (MC Personalization), Loyalty, SFMC, Amazon S3.
  • Remember to always have an Activation configured for Scheduled Publishes.
  • Check the different limits explained in further sections to understand how Data Cloud manages this functionality as there are differences depending on the type of Segment.

Editing Segments

Once you have created a Segment, you may edit it in Data Cloud.

  1. Go to Segments tab
  2. Select the segment
  3. Click on Edit Rules to modify the segmentation rules
  4. Click on Edit Properties to modify the Segment Name, Description, Publish Type and Schedule
  5. Save

You can also Delete a Segment, but take into account that if it’s configured with an Activation, you will have to delete the Activation first (it’s a dependency).

You may also Copy or Clone a Segment.

Finally, you can also Deactivate a Segment, but keep in mind that once it is deactivated you can no longer re-enable it or publish it again.

Real-Time Segments

Just like Real-time Insights in Data Cloud, you can also create Real-time Segments.

These are Segments that publish in milliseconds, but there are some considerations:

  • You cannot use real-time segments with nested segments.
  • You cannot use real-time segments with exclusion criteria.
  • To have real-time segments in Data Graphs, you need to add the Segment ID and Timestamps fields from the Membership DMO to the real-time Graph.
  • You cannot manually publish real-time segments.

To create a Real-time Segment:

  1. Go to the Segments tab
  2. Click New > Use a Visual Builder
  3. Select Real-time Segment > Next
  4. Provide the Data Space, Segment Name and the Real-time Graph to build the segment on

Here’s a table with important considerations and limits for Real-Time Segments:

Feature | Limit
Total maximum number of Real-time Segments that you can have | 35
Using exclusions in Real-time Segments | Not Available
Using Nested Segments in Real-time Segments | Not Available
Total maximum number of streaming events in a Real-time Segment | 1
Manual Publish Type on Real-time Segments | Not Available

Waterfall Segments

Waterfall Segments are hierarchical segments.

Imagine you run a PC accessories online store and have a Black Friday campaign coming soon.

You want to create multi-layered, prioritised segments, as there will be 1 big, general campaign and then, based on what product categories customers purchase, multiple sub-campaigns will follow.

E.g: a generic Black Friday campaign for everyone and then, if a customer purchases a laptop, in the following days they become part of a laptop-oriented segment and series of communications (for example, with cross-sell or upsell suggestions).

  • You create the Waterfall Segment based on Purchased Product Category.
  • All the sub-segments are mutually exclusive, meaning each customer will be in only one segment, based on their Priority order.
  • You configure the different sub-segments. E.g: Purchased Product Category = Laptop, Purchased Product Category = hard drive, etc.
  • As the campaign develops and customers purchase different items, each customer lands in the corresponding sub-segment, with no duplicates or overlaps and respecting the priority order you defined (see the sketch below).
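Here is a minimal Python sketch of that waterfall behaviour: sub-segments are checked in priority order and each customer lands in exactly one of them. The rules and data are made up for the example.

    # Sub-segments listed from highest to lowest priority; the last one is a catch-all.
    sub_segments = [
        ("Laptop buyers", lambda c: "Laptop" in c["purchased_categories"]),
        ("Hard drive buyers", lambda c: "Hard drive" in c["purchased_categories"]),
        ("Black Friday - general", lambda c: True),
    ]

    customers = {
        "UI-001": {"purchased_categories": ["Laptop", "Hard drive"]},
        "UI-002": {"purchased_categories": ["Hard drive"]},
        "UI-003": {"purchased_categories": []},
    }

    def assign_sub_segment(customer):
        # First matching rule wins, so there are no duplicates or overlaps.
        for name, rule in sub_segments:
            if rule(customer):
                return name

    for uid, data in customers.items():
        print(uid, "->", assign_sub_segment(data))
    # UI-001 -> Laptop buyers
    # UI-002 -> Hard drive buyers
    # UI-003 -> Black Friday - general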

Follow these steps to create a Waterfall Segment:

  1. Go to the Segments tab
  2. New > Use a Visual Builder > Waterfall Segment
  3. Enter the Name, Segment On DMO and Data Space
  4. Define Publish Type (Rapid is not available) and Schedule
  5. Create the Waterfall hierarchical priority order using the drag-and-drop builder, selecting segments in order.

Creating Segments From Data Kits

Data Kits save you configuration time by packaging pre-defined Data Cloud components.

If you create a Segment from a Data Kit, you just need to follow these steps:

  1. Go to the Segments tab
  2. New > Create from a Data Kit > Next
  3. Select your Data Kit and predefined Segment
  4. Enter the Segment details: Name, Segment On DMO, Data Space
  5. Configure Publish Type and Frequency of Schedule
  6. Save

Data Cloud Einstein Segments

You can leverage Data Cloud Einstein Generative AI capabilities to create segments.

That is, you can let the AI generate a relevant segment for you, with all its attributes and metadata ready, so you can decide to publish it right away, make some final adjustments, etc.

In order to leverage Einstein Segments, you need to follow these 2 pre-requisite steps:

  1. Set up your Data for Einstein Segments
  2. Enable Einstein Segments

1. Setting up your Data Cloud Data for Einstein Segments

This is a checklist so that your company can make the most of Einstein Segments:

  • Identity Resolution is in place, with a Ruleset to create the Unified Individual DMO.
  • The Unified Individual DMO has records and its output has been validated and confirmed.
  • In order to be able to use data from other DMOs, these Data Model Objects must be related to the Unified Individual DMO.
  • Check that the DMO and field names are accurate.
  • (Optional) Enhance your metadata with Einstein Data Prism (e.g: use suggested descriptions).

This is a logical checklist.

Before letting the generative AI create segments, make sure your data is accurate, reliable, well organised, and adheres to your security and privacy requirements.

2. Enabling Einstein Segments in Data Cloud

Einstein Segments require the activation of Generative AI in your Org, which in turn requires having Data Cloud correctly configured.

A dedicated guide will be written on this topic, since Generative AI works across the entire Salesforce Org, but here’s a recap of the configuration steps for Data Cloud Einstein Segments:

  1. Enable and configure Data Cloud in your Org. You may follow the guide I wrote here for a full breakdown of all the steps:
    https://davidpalencia.com/salesforce-data-cloud-setup-and-provisioning/
  2. Activate Einstein Generative AI:
    – From Setup > Quick Find box > Einstein Setup
    – Once you’re in Einstein Setup, click on Turn On Einstein.
  3. Configure the Einstein Trust Layer
    – From Setup > Quick Find box > Einstein Trust Layer
    – Configure settings according to your organizational security and privacy policies (data masking must be turned on).
  4. Activate Einstein Data Collection and Storage
    This activation means that you agree to store your Salesforce generative AI activity log and feedback in Data Cloud, including the associated costs.
    – From Setup > Quick Find box > Einstein Feedback > Turn on “Collect and Store Einstein Generative AI Audit Data”
    – Select the target Data Space for this data

Once you’ve finished all the pre-requisites and configuration, follow these steps to start creating Einstein Segments in Data Cloud:

  1. Go to the Segments tab
  2. New > Create with Einstein
  3. Select the properties: Segment On DMO and Data Space
  4. The Einstein Segment chat will appear. Type your segment description in simple words and press Enter.
  5. Einstein will translate your prompt into a full Segment, including:
    – Segment description
    – Suggested Attributes (most relevant ones based on your prompt)
    – Additional Attributes (secondary, optional ones)
    – Reasoning behind the attributes suggested
  6. You may click on Refine Segment to start over and regenerate the suggested segment output
  7. You may click on Edit Segment Rules to manually adjust any attribute or value (note: this creates the segment, since you are taken to the segment canvas to edit it).
  8. Click on Create Segment when you’re happy with the suggested Einstein Segment.