
Send Data from Salesforce to Data Cloud using Ingestion API and Flow
by: Meera R Nair

As part of this blog post, we are going to look at a Spring '24 feature: sending data to Data Cloud using Flows and the Ingestion API. The release note is available here.


As we all know, Data Cloud helps us build a unified view of customers by harmonizing data from multiple source systems and generating meaningful segmentation and insights that can be used in other platforms for additional processing.

If you are new to Data Cloud, refer to the Salesforce documentation or the videos on the Salesforce Developers YouTube channel.

What is the Ingestion API?

As per the Salesforce documentation, the Ingestion API is a REST API that offers two interaction patterns: bulk and streaming. The streaming pattern accepts incremental updates to a dataset as those changes are captured, while the bulk pattern accepts CSV files in cases where data syncs occur periodically. The same data stream can accept data from both the streaming and bulk interaction patterns.
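For the bulk pattern, the uploaded CSV's header row must match the field names defined in the connector's schema. A file for this donation use case might look like the following (the field names here are assumptions chosen to match the rest of this walkthrough):

```
donation_id,category,donation_amount,event_time
D-0001,Education,250.00,2024-02-06T19:03:00Z
D-0002,Health,75.50,2024-02-06T19:10:00Z
```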

The Ingestion API is useful for sending interaction-based data to Data Cloud in near real time; streamed data is processed roughly every 15 minutes.
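To make the streaming pattern concrete, here is a minimal Python sketch that builds a streaming ingest request. The tenant endpoint, connector API name, object name, field names, and access token are all placeholders for this use case, not values from the original post; the real values come from your Data Cloud setup and OAuth token exchange.

```python
import json
import urllib.request

# Placeholder values -- replace with your Data Cloud tenant endpoint,
# connector API name, object name, and a valid Data Cloud access token.
TENANT_ENDPOINT = "https://<tenant>.c360a.salesforce.com"
CONNECTOR = "Donation_Ingestion"
OBJECT_NAME = "donation"
ACCESS_TOKEN = "<data-cloud-access-token>"


def build_streaming_request(records):
    """Build a POST request for the Ingestion API streaming pattern.

    The streaming endpoint accepts a JSON body of the form
    {"data": [ {...record...}, ... ]}; accepted records are then
    processed in Data Cloud in roughly 15-minute cycles.
    """
    url = f"{TENANT_ENDPOINT}/api/v1/ingest/sources/{CONNECTOR}/{OBJECT_NAME}"
    body = json.dumps({"data": records}).encode("utf-8")
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")


# Example: one donation record matching the schema fields used later.
req = build_streaming_request([{
    "donation_id": "D-0001",
    "category": "Education",
    "donation_amount": 250.0,
    "event_time": "2024-02-06T19:03:00Z",
}])
# urllib.request.urlopen(req) would actually send it; omitted here so the
# sketch stays offline.
print(req.full_url)
```

Using the stdlib here keeps the sketch self-contained; in practice a library like `requests` and a proper OAuth flow would be used.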

Use Case

When external users submit donations from a public site, the donation details need to be sent to Data Cloud for segmentation and insight creation.

Implementation Details

Let us see the details:

1. Create a Donation custom Object

We can create a custom object in Salesforce with minimal fields such as:
  • Name 
  • Category
  • Donation Amount

2. Create a screen flow to input donation details

Add screen elements to capture the three fields above and create a record once the values are entered.

3. Add this flow component to a Public site/ Salesforce page

Normally, donations are submitted from an Experience Cloud site, but for testing purposes let us add this flow to the internal home page.

4. Create Ingestion API Connector and upload the Schema

From the Data Cloud Setup screen, create a new Ingestion API connector:

  • Click the New button under Data Cloud Setup → Ingestion API Connector

  • Input the API name

  • Upload the schema for the Ingestion API. The schema file must be in YAML format following the OpenAPI specification.

  • Click on Upload Schema file

  • Select the file

  • Preview schema and save it
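The original post's sample schema file is not reproduced here; a minimal OpenAPI-style YAML schema for this Donation use case might look like the following. The object and field names are assumptions chosen to match the fields mapped later in this walkthrough:

```yaml
openapi: 3.0.3
components:
  schemas:
    donation:
      type: object
      properties:
        donation_id:
          type: string
        category:
          type: string
        donation_amount:
          type: number
        event_time:
          type: string
          format: date-time
```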

5. Create Data Stream from Ingestion API

Data streams in Data Cloud allow you to ingest data into Data Cloud from Salesforce and other data sources. To ingest data using the Donation Ingestion API connector that we just created, a data stream needs to be defined as shown below:

  • Create a new Data Stream
Go to Data Cloud from the App Launcher and open the Data Streams tab. Click the New button:

  • Select the source for the Data Stream. In our case, it is Ingestion API

  • Select the newly created Donation Ingestion API

  • Configure the selected object and field details as shown below:
Here, choose "Engagement" as the Data Category and donation_id as the primary key (it maps to the standard Name field of Donation, which is configured as an auto number). Map the Event Time field to the created date.

  • Deploy the Data Stream

  • The final output of the Data Stream is a new Data Lake Object.

6. Do Data mapping and create a Data Model Object

For additional processing, the ingested data needs to be mapped to Data Model Objects in Data Cloud. A single Data Lake Object can be mapped to one or more standard or custom Data Model Objects. After that, all additional processing can be applied to the mapped data.

In our case, we are going to create a custom Data Model Object and do the mapping.

  • Start Data Mapping and click on Select Objects
  • Select the Custom Data Model Tab and click on the new Custom Object

  • Select the fields from the Data Lake Object to be created in the new Data Model Object

  • Review the mapping and save it

  • In the Data Model Objects tab, we can see the newly created object

7. Modify the flow to send the new Donation record to Data Cloud

As per the Spring '24 feature, once an Ingestion API connector has been successfully created, it is available by default as an action in screen flows and record-triggered flows.

In a record-triggered flow, this action can be added only on a scheduled path, since it performs a callout.

  • Modify the flow to retrieve the new Donation record's name and created date

  • Add a new action in the screen flow and select "Send to Data Cloud" as the category

  • The newly created Ingestion API action, Donation Ingestion, is available by default

  • Map all the input variables

  • Once the screen flow is updated, save and activate it

8. Test the flow Action

  • Open the screen flow and enter the details

  • The record is created successfully

  • In the Data Cloud app, go to the Data Explorer tab and check the Data Lake Object, Donation. We can see that the new record created in Salesforce is now synced with Data Cloud.

  • Similarly, check the corresponding Data Model Object; the data is synced there as well.

Thus, with this Spring '24 capability, syncing data from a Salesforce flow to Data Cloud has been made very easy.

February 06, 2024 at 07:03PM

The original post is available on Meera R Nair - Salesforce Insights.