Marketing Cloud Send Time Optimization

Marketing Cloud Einstein Send Time Optimization (STO) determines the best time to send a message. Using machine learning, Einstein predicts optimal send times so that the message is likely to be opened.

Einstein Send Time Optimization helps marketers increase email and push notification engagement. Using automated analysis, Einstein determines the best time to send a message to each individual contact. Each week, Einstein rebuilds its sending model using each contact's latest engagement data from sends with the Commercial send classification. Email sends with the Transactional send classification are not used for modeling.

The Einstein STO Journey Builder activity sends messages to each contact at the time when that contact is most likely to open the message. When Einstein doesn’t have enough data to create a model for a contact, the contact receives messages according to a generalized model. The generalized model is derived from sending data in your Enterprise. The Einstein Send Time Optimization dashboard displays analytics that predict future message engagement.

To use this activity, activate Einstein Send Time Optimization for your business unit.

NOTE: Einstein Send Time Optimization is included with Corporate and Enterprise Edition or available as an add-on purchase to Pro Edition. Einstein Send Time Optimization is also available as an add-on to customers with Pro Edition who have Journey Builder. For customers who have purchased a Corporate or Enterprise Edition prior to 10/27/2017, Einstein Send Time Optimization is available at no additional charge upon execution of an Order Form with Additional Terms. Contact your account representative for details.

Send Time Optimization Summary Dashboard

The Send Time Optimization summary provides visual insight into how your subscriber base engages with your marketing during certain hours of the day and days of the week.


Note: Results appear about 72 hours after activation, once the data transfer and analysis required to optimize sends through Journey Builder are complete.

Send Time Optimization in Journey Builder

Select a journey and segment off a small percentage (for example, 10%) of the traffic as a campaign holdout using a Random Split. Then drop the Send Time Optimization tile onto only that small branch. See the screenshot below:

Einstein Send Time Optimization in Journey Builder 


Salesforce Marketing Cloud GroupConnect OOTB Reports

There is a set of reports in Analytics Builder available for GroupConnect.

Navigation -> Reports -> GroupConnect:

  • LINE Outbound Message Report to display a summary of Outbound Messages sent to LINE followers within a specified date range.
  • Mobile Chat Messaging Journey Builder Activity Summary Report containing all Journey Builder activity-related message sends.
  • Mobile Chat Messaging Summary Report to view all chat messaging API send activities.
  • LINE Triggered Sends Summary Report with a summary of Follow and Response Messages sent to users within a specified date range.

Data Views:

  • _MobileLINEAddressContactSubscriptionView to find active LINE followers and users who have blocked your brand (data view detail).
  • _MobileLINEOrphanContactView containing unfollowed contacts (data view detail).


Salesforce Marketing Cloud GroupConnect with Journey Builder and GroupConnect API

GroupConnect is a tool inside Mobile Studio capable of delivering personalized Facebook and LINE messages to your customers.

GroupConnect offers you highly targeted messaging for a specific group of contacts. Because GroupConnect interacts solely with the LINE messaging app and Facebook Messenger (via REST API), you already know a lot about your audience based on their profiles. In this unit, you learn the best ways to make use of this information and bring your GroupConnect strategy into Journey Builder.

How to set up triggered messages for both LINE and Facebook Messenger using GroupConnect API

GroupConnect already includes automated messages for the LINE app, and you can use the GroupConnect API to set up triggered messages for both LINE and Facebook Messenger. Automation Studio also helps you set up recurring messages in GroupConnect. All of these features help you maintain regular activity with your contacts without overwhelming them with irrelevant or too-frequent messages.

Send LINE Messages with Journey Builder

LINE messages can include several different types of content.

  1. Text
  2. Images
  3. Videos (from a URL)
  4. Audio (from a URL)
  5. Imagemaps
  6. Carousels
  7. LINE stickers

And get this: A LINE message can include five of these elements, and the larger carousel messages can include up to 10 messages in a single send! Simply put, you can send a lot of content. Once you’ve crafted your messages, you can include them in Journey Builder activities. For example, you can send a message to a new contact, then follow up with a random split to test messages and see which gets a better response.
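As a rough sketch of how such a multi-element send could be assembled, the snippet below bundles several content elements into one message. The payload field names are illustrative assumptions modeled on LINE-style message objects, not the exact GroupConnect schema:

```python
# Illustrative sketch only: field names are assumptions, not the exact
# GroupConnect API schema. It shows the idea of bundling several content
# elements (text, image, sticker, ...) into a single LINE send.

def build_line_messages(elements, limit=5):
    """Bundle content elements into one send, enforcing a per-send limit."""
    if len(elements) > limit:
        raise ValueError(f"A single LINE send can include at most {limit} elements")
    return {"messages": elements}

send = build_line_messages([
    {"type": "text", "text": "Spring deals are here!"},
    {"type": "image", "originalContentUrl": "https://example.com/deal.jpg"},
    {"type": "sticker", "packageId": "1", "stickerId": "2"},
])
print(len(send["messages"]))  # 3
```

The limit of five elements per send mirrors the figure quoted above; carousel messages have their own, larger limit.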

Journeys tab in Journey Builder showing the message configuration for a GroupConnect message named NTO Spring Deals.
Source: Trailhead

GroupConnect API

The GroupConnect API helps you interface with both LINE and Facebook Messenger accounts. In fact, the only way to use GroupConnect with Facebook Messenger is with API calls, so you’ll want to be comfortable with a little coding to make the most of the integration.

Follow these steps to integrate with the GroupConnect API (or, if this isn’t your thing, have your preferred software developer lend a hand).

1. Register on the LINE Developers website. (Once you log in to your LINE Developers console, you create a Provider and a Messaging API channel.)

2. Integration with SFMC (In the main setup in SFMC, search for LINE to register the channel.)

Fill in the LINE Channel ID and LINE Channel Secret provided by LINE in the previous step.

The destination Webhook allows SFMC to reach LINE to deliver the messages you send, and the Marketing Cloud Endpoint allows LINE to reach your Marketing Cloud instance with events coming from LINE, such as Follows and Unfollows. These two fields are automatically populated based on the information you provide.

Once you have the webhook URL of your Marketing Cloud Endpoint populated, add it to the LINE Developers console, where you can finish the configuration.
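The channel registration above happens in the UI; subsequent GroupConnect API calls follow Marketing Cloud's standard REST pattern of requesting an OAuth token first. A minimal sketch of building that token request (the subdomain and credentials are placeholders that come from an installed package in your SFMC account):

```python
import json

# Sketch of the first step of any Marketing Cloud REST integration:
# building the OAuth token request. The subdomain, client_id, and
# client_secret values below are placeholder assumptions.

def build_token_request(subdomain, client_id, client_secret):
    """Return the token endpoint URL and JSON body for client_credentials auth."""
    url = f"https://{subdomain}.auth.marketingcloudapis.com/v2/token"
    body = json.dumps({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return url, body

url, body = build_token_request("mc-example", "MY_ID", "MY_SECRET")
print(url)  # https://mc-example.auth.marketingcloudapis.com/v2/token
```

POSTing that body to the URL returns a bearer token, which is then used on the GroupConnect REST endpoints.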

Campaign Tips

📌 For one-time sends, create a LINE Outbound Message directly in Mobile Studio -> GroupConnect, where you can define the content, select contacts, and review the message before sending. Single sends can be performed only to GroupConnect Lists.

📌 For recurring sends, and to use data extensions as a source, use Journey Builder. Find the "LINE Message" activity and drag and drop it onto your canvas. Once dropped, click the activity and select from the existing messages in Content Builder, or create a new one directly in Journey Builder.

📌 For advanced personalization, include AMPScript. Before the final campaign execution, do not forget to preview and test the message first.

GroupConnect Chat Messaging API:


Salesforce Marketing Cloud SSO implementation and troubleshooting tips

This blog walks you through setting up single sign-on (SSO) with an identity provider, using Salesforce Marketing Cloud as the service provider.

Marketing Cloud supports identity providers that use the SAML 2.0 specification, such as Salesforce Identity, Shibboleth, PingFederate, and Active Directory Federation Services (ADFS). The configuration for the identity provider must trust the Marketing Cloud product as a service provider, sometimes called a relying party.

1. Enable SSO

a) Enable SSO on your Salesforce Marketing Cloud account. SSO may already be enabled on the Enterprise account. To verify, log in to the main Enterprise account on your MC instance and go to Setup > Administration > Data Management > Key Management.
b) If SSO is enabled, the SSO Metadata radio button appears. If the radio button doesn't appear in the UI, either SSO isn't enabled or you are within a business unit. If you are on the Enterprise-level business unit and SSO isn't enabled, raise a case to have Support enable SSO for your account.

NOTE: You can have only one SSO Metadata key active at a time.

2. Retrieve SAML Metadata

After SSO has been enabled, retrieve your SAML metadata from the MC account. It's located under Setup > Settings > Security > Security Settings > Single Sign-On Settings > SSO SAML Metadata (button). The resulting URL looks similar to the following:

NOTE: If you have the option to select a certificate version, choose the one with the latest expiration date, for example, Jan 2021 (expires February 5, 2022).

3. Apply to your IDP

You now must apply the SFMC Metadata to your IDP.

4. Create Key

After the SFMC metadata has been applied, take the metadata from your IDP and input it into the Key Management section of SFMC. Within your org, go to Setup > Administration > Data Management > Key Management. Click the Create button, then select SAML_SSO Metadata. A <NameIDFormat> value is required in the IDP metadata entered into the SFMC configuration; if you receive an error saying the <NameIDFormat> is missing or invalid, add one of the following lines to the metadata. If the <NameIDFormat> is in the wrong location, it also errors.


The <NameIDFormat> element must be placed between the </KeyDescriptor> closing tag and the <SingleSignOnService> opening tag.

<md:SingleSignOnService ... >

NOTE: the md: prefix is an XML namespace prefix; if your IDP metadata doesn't use it, or uses a different one, remove or change it accordingly. The <NameIDFormat> opening and closing tags must match the namespace prefix used in the closing </KeyDescriptor> and opening <SingleSignOnService> elements.
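As a sketch, the placement described above looks like this. The persistent NameID format and the example Location are assumptions; use the values your own IDP metadata actually emits:

```xml
</md:KeyDescriptor>
<!-- Hypothetical example: one common NameID format; use the one your IDP emits -->
<md:NameIDFormat>urn:oasis:names:tc:SAML:2.0:nameid-format:persistent</md:NameIDFormat>
<md:SingleSignOnService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
                        Location="https://idp.example.com/sso"/>
```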

5. Save Key

Now click Save. If the key is accepted, a green banner appears confirming the key was saved successfully. If an error occurs that you can't resolve, open a Support case.

6. Enable SSO Setting

After you have a green banner and a key in place, enable SSO for your MC account under Setup > Settings > Security > Security Settings > Edit > Single Sign-On Settings > Enable SSO by checking the setting, then selecting Save.

NOTE: SFMC requires MFA to be enabled on SSO connections by the 2022 deadline. We recommend introducing it earlier for a more secure experience.

NOTE: Certificate version may vary or there may be multiple versions listed. 

7. Configure User SSO Settings

Next, go to Setup > Users > Users, then click a user. Select the Enable SSO option and add the Federation ID that was configured on the IDP side. If the value is unknown, verify with your IDP or IT team to gather that information before continuing.

NOTE: A common situation with SSO enablement is that the end user attempting to log in can't reset their user password or log in via the URL. The username-and-password login route is ignored when SSO is enabled. This is functioning as designed: an SSO-enabled end user can only log in via the SP-initiated link provided under Setup > Settings > Security > Security Settings > Edit > Single Sign-On Settings > Marketing Cloud SP Initiated Link, or via an IDP-initiated connection (a dashboard or another method that starts the conversation). All login requests are received but not processed for any user that has the SSO Enabled box checked. If a user is not SSO enabled, this issue does not present, and they can request a password reset normally.

8. Test the SSO Configuration

Test the newly enabled SSO user, either in an incognito window or a browser with a freshly purged cache. If you receive an error, open a case with Support. If you can log in without issue, go ahead and enable further SSO users.

Best practice: Leave at least one admin user off SSO so you can recover the account and log in to SFMC to correct any SSO configuration issue.

Troubleshooting tips

SSO No FederationID


Go to the Marketing Cloud account –> log in with your user details –> click Setup –> Manage Users –> select the user who is facing this issue –> click the user –> select the Enable SSO checkbox –> enter the Federation ID (employee ID or email ID, as per your IDP setup).

SSO Fatal Profile Error


The root cause of the error shown above was an improper value in the ACS URL on the Okta side of the prebuilt configuration. The SLO link was used in place of the ACS link. This caused the SP to validate a logout request when the request was actually a login request, throwing the error.

The second error was caused by using the RequestInitiator link in the ACS field. That should have been caught by me, but it was not.

The proper link to use would look like this in the metadata:

<md:AssertionConsumerService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST" Location="" index="1"/>

This value is what corrected the issue.

Single Sign on Certificate


SFMC is integrated with Azure for SSO, which has been working fine. Recently when I try to log in, SFMC prompts a message that I need to update the SSO certificate, which I have to take from the SFMC metadata. The problem is that the SFMC metadata simply gives the key in .cert format, but our Azure expects a .pfx file. Also, converting .cert to .pfx needs a private key and password as well, so where do I get them?

SFMC Notice

Steps to update

Invalid Assertion

It looks like the Fed ID was correct. The problem ended up being that SAML Type was set to "Assertion contains the User's Salesforce username". Changing it to the second option, "Assertion contains the Federation ID from the User object", resolved the issue.


Marketing Data Sharing Rules in Pardot Business Units

Marketing Data Sharing Rules are what determine which records should sync to Pardot from Salesforce. Use these rules to restrict which data is sent to Pardot, perfect for organisations who want only a subset of their Salesforce Leads or Contacts to exist in Pardot, or who want to route records to specific Pardot Business Units.

Fortunately, Pardot admins can build these Marketing Data Sharing Rules using point-and-click tools, and the rules are much more appealing than the trickier Salesforce connector user permissions (field-level security, sharing rules, etc.) that were previously the way to go. Yes, now in one place you can see the record criteria that are allowed to sync!

Marketing Data Sharing is available in the Pardot Lightning app for Advanced and Premium Pardot editions. Your Pardot instance also needs to be using the V2 Connector. The V2 Connector is the default for any Pardot accounts purchased after February 2019, but if you purchased before then you can learn how to upgrade your connector here

How does Marketing Data Sharing work?

Marketing Data Sharing relies on one rule per object; records that match the rule's criteria sync to Pardot. For example, say your Leads and Contacts have a checkbox field called "Product Interest". If "Product Interest" is checked (true), the Lead or Contact is eligible to sync to Pardot. If it is unchecked (false), the Lead or Contact is ineligible to sync. If an existing Pardot Prospect's "Product Interest" field changes from checked to unchecked, the prospect is archived.

How To Setup Pardot Marketing Data Sharing

When you are ready to turn on Marketing Data Sharing rules, open Pardot in the Lightning app (you cannot set this up in Classic).

  1. In the Lightning app, select Pardot Settings, and then Connectors. Click the gear icon next to the Salesforce connector and select Edit Settings. Select Marketing Data Sharing.
  2. Open a rule for editing.
  3. Configure the rule.
    • Each object can have only one rule. Each rule can be based on one Salesforce field and uses the equals operator.
    • Rules can be based only on Salesforce fields that aren’t mapped to a Pardot field. The field must belong to the rule’s object and the connector user must have read and edit access to it. If needed, change the field-level security to give the connector user access to the field.
    • The Default setting for an object uses the connector user's permissions to control which records sync. When you create a rule for an object, both the rule and the connector user's permissions control which records sync.
  4. Save the rule. When an object has a rule, the details appear in the Criteria column on the Marketing Data Sharing tab.
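The rule behavior described above can be sketched as simple boolean logic: one field compared with equals, with the connector user's permissions still applying on top. Field and value names below are examples:

```python
# Minimal sketch of Marketing Data Sharing evaluation: one rule per object,
# a single field compared with the equals operator, and the connector user's
# permissions still applying on top. "Region__c" and "EMEA" are example values.

def syncs_to_pardot(record, rule_field, rule_value, connector_can_read):
    """A record syncs only if it matches the rule AND the connector user can see it."""
    return connector_can_read and record.get(rule_field) == rule_value

lead = {"Region__c": "EMEA", "Email": "ada@example.com"}
print(syncs_to_pardot(lead, "Region__c", "EMEA", connector_can_read=True))   # True

# A field update that breaks the match makes the prospect eligible for recycling
lead["Region__c"] = "LATAM"
print(syncs_to_pardot(lead, "Region__c", "EMEA", connector_can_read=True))   # False
```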

Step by step screenshots given below:

    Considerations When Using Marketing Data Sharing Rules

    There are considerations you need to take into account when planning and building Marketing Data Sharing Rules, including some limitations.

    One rule:

    You can only create one rule per object, for example, region is ‘EMEA’.

    Salesforce fields:

    Rule criteria must be Salesforce fields (and be editable)

    What if there are no rules?

    If we don't define a rule for an object, for example opportunities, then the connector will fall back on the connector user's object-level permissions (defined by their profile, and extended by permission sets if necessary).

    Leads and contacts:

    If you create a rule for leads or contacts, the criteria used in that rule must be applied to both objects. The criteria used in a rule to define lead sync will be applied to contacts too, and vice versa: the criteria used in a rule to define contact sync will be applied to leads too.

    Multiple Pardot Business Units:

    If you create a rule referencing leads/contacts for one business unit, all other Pardot BUs must have a Marketing Data Sharing Rule for leads/contacts too.

    Records no longer matching criteria:

    If a field update on a syncing record means that it no longer matches the criteria in your rule, Pardot will automatically send that prospect record to the Pardot recycle bin. For example, if a rule is syncing leads with region ‘EMEA’, and a Salesforce user updates the region field to ‘LATAM’, that prospect record will be recycled.

    Importing into Pardot:

    If you import directly into Pardot and the newly created prospect records do not match the Marketing Data Sharing Rule criteria, Pardot will automatically send those prospect records to the Pardot recycle bin.

    These are the most important considerations – read the full list here.


    Marketing Data Sharing is a valuable tool, especially if only a subset of your leads and contacts are marketable, or if you are using multiple Pardot Business Units. Please drop your comments if you have any.

    How do we control permissions in a Pardot Business Unit?

    When you provision a business unit, the Salesforce-Pardot connector is automatically created in a paused state. The connector user has View All Data and Modify All Data permissions for Salesforce objects that sync between Pardot and Salesforce. To prevent duplicate records and skewed reporting, limit what syncs to each business unit before you unpause the connector for the first time. For example, if you have just provisioned a Business Unit for your AUS team, ensure that the connector’s permissions only allow it to see AUS data in Salesforce.

    The table below outlines how to control the connector’s ability to sync different Salesforce records:

    Object         | How to control which business unit can sync the object's records
    Leads          | Marketing Data Sharing rules / Salesforce connector permissions
    Contacts       | Marketing Data Sharing rules / Salesforce connector permissions
    Opportunities  | Marketing Data Sharing rules / Salesforce connector permissions
    Custom Objects | Marketing Data Sharing rules / Salesforce connector permissions
    Users          | Profiles in Salesforce User Sync
    Campaigns      | Campaign Record Types in Connected Campaign Configuration
    • A lead or contact record can sync with a prospect in only one business unit. Giving multiple business units access to a lead or contact causes data conflicts and can affect sync speeds. To have a prospect in multiple business units, create duplicate records that represent the same individual and sync each record with a separate lead or contact.
    • A connected campaign can sync with only one business unit.
    • Pardot and Salesforce have their own data authorization and sharing models. The Salesforce data that's available to users in the Pardot Lightning app is determined by their Salesforce permissions. The Pardot data that users see in Salesforce is determined by their business unit. For example, let's say you have two business units, North America and Europe, and a user has access only to the North America business unit. The user's Salesforce permissions give access to all Salesforce leads, but they have access only to the prospects in the North America business unit.

    To know more, read on: What are Pardot Business Units?

    What are Pardot Business Units?

    Business Units allow Pardot customers to separate or partition their data by region, product, or business area. This allows your marketing teams to personalise their messaging to their relevant data lists. The end goal has always been to send the best leads to the right salespeople, which, with the help of Business Units, will lead to an efficient and productive sales process.

    Pardot Business Units are separate databases within a Pardot account that allow Pardot customers to partition their prospects, campaigns and assets by regions, products or services. The BUs will remove the necessity to connect multiple instances of Pardot to Salesforce in order to restrict data syncing to each Pardot account. They are available for Pardot Advanced & Premium edition customers only

    Can I use Pardot Business Units?

    There are some requirements before you can start using Business Units:

    • Pardot Lightning App needs to be enabled within Salesforce Lightning
    • Business Units are only available to Pardot customers who purchased the Advanced edition from 11th February 2019, or those who upgraded to Advanced after 11th February 2019.
    • Business Units will only be accessible in the Pardot Lightning App in Salesforce Lightning.
    • Your Salesforce edition will need to be Advanced or above

    The Detail

    Each Business Unit has its own partitioned connector, settings and configurations, segmentations and automations, assets and Prospects.

    Things to be aware of when you want to implement Business Units:

    • Once a Business Unit has been created and enabled, you can’t delete it!
    • After the Business Unit has been named in Salesforce, you can’t change it!

    Some of the Key Concepts:

    Restricts Access to Prospect Data

    Before Pardot BUs, access to prospects could not be restricted. Anyone who had access to Pardot (with a user role greater than Sales) had access to all the prospect records.

    Now, Pardot admins can restrict access based on products, services and regions by partitioning data in BUs and assigning users to each BU.

    Eliminates Selective Sharing Rules

    The new feature also eliminates the need for complex selective sharing rules by introducing the Marketing Data Sharing criteria. A single Salesforce field can now determine which Pardot Business Unit your prospect data will sync with, and any prospect that no longer matches the criteria will automatically be suppressed; for example, when country equals Germany, sync the Lead with the EMEA BU.

    Before Marketing Data Sharing, Consultants and Pardot Admins had to set the Salesforce Organisation-wide Sharing settings (Org-wide Default) to private, just so they could then include Sharing Rules that would open up particular records to the Connector User. In a lot of instances, companies do not want to have an org-wide default of private, but thankfully Marketing Data Sharing and the Pardot Connector v2 are well on their way to simplifying record access for Pardot Admins.

    Business Unit Account Switcher

    If the Pardot account was purchased after 25th April 2019, users will also have the ability to switch between BUs using the new Account Switcher feature. No longer will users need to log in and out of individual Pardot accounts. This feature can be accessed from Salesforce and will only impact Pardot product elements.

    Multiple Email Sending Domains

    Finally, Pardot BUs will allow customers to use multiple email sending domains and tracker domains across their BUs. Whilst it will still be required to set a primary CNAME, additional CNAMEs will be available as an option when creating new assets, allowing sub-divisions and regions to have their own unique branding.


    A lot of the new features released as part of the Pardot BU functionality are irreversible, for example:

    • A BU cannot be deleted once enabled
    • The BU name in Salesforce cannot be changed
    • You cannot switch the Salesforce account used for the connector after installing

    The irreversible nature of Pardot BUs means it is important to plan their structure carefully, with consideration of all their nuances.

    Ref material:

    ATTENTION: Return Path End of Renewal

    Return Path products will no longer be made available for renewal to Salesforce customers beginning May 1, 2021. Additionally, Salesforce can no longer be certain that Return Path’s third party product functionality will be available after your next order from the anniversary date. Given this change, it is critical that customers build a plan with their Salesforce representative immediately to manage their email deliverability and inbox placement reporting needs.

    While Salesforce Marketing Cloud continues to offer email deliverability and analytics functionality, customers can also work with SparkPost, a strategic partner, to leverage their email deliverability platform. To speak with SparkPost and learn more of their offerings, please reach out to the team here.

    Please reach out to your Salesforce representative to discuss plans to manage your future email deliverability and analytics needs.

    Quip for Customer 360 Data Flow & Troubleshooting Pointers

    As part of Salesforce’s latest Spring ’19 release, you can now embed Quip’s collaborative documents, spreadsheets, slides, and chat rooms directly within Salesforce objects and records. This powerful addition is the first of its kind and fundamentally changes the way Sales and Service teams get work done together inside the Salesforce Platform.

    There are two Quip for Customer 360 experiences:
    1. Salesforce in Quip (Salesforce live apps, Log Activity, and others)
    2. Quip in Salesforce (Quip Lightning components)
    Learn more: Quip & Salesforce Integration Summary

    1. Salesforce in Quip

    Bring live Salesforce data directly into your Quip documents with Salesforce live apps and live Salesforce reports. It's as easy as @mentioning:

    • Salesforce Record
    • Salesforce List
    • Salesforce Report
    • Einstein Analytics (Beta)

    The Record and List live apps follow similar data flow patterns and use the Salesforce UI API, while live Salesforce reports use the Salesforce Reports and Dashboards REST API.
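A minimal sketch of the two Salesforce REST endpoints mentioned above may help; the API version, instance URL, and record/report IDs below are placeholder assumptions:

```python
# Sketch of the two Salesforce REST endpoints the live apps lean on.
# API version, instance URL, and IDs are placeholder assumptions.
API_VERSION = "v52.0"

def ui_api_record_url(instance, record_id):
    # Used by the Record and List live apps (Salesforce UI API)
    return f"{instance}/services/data/{API_VERSION}/ui-api/records/{record_id}"

def reports_api_url(instance, report_id):
    # Used by live Salesforce reports (Reports and Dashboards REST API)
    return f"{instance}/services/data/{API_VERSION}/analytics/reports/{report_id}"

print(ui_api_record_url("https://acme.my.salesforce.com", "001xx000003DGb2AAG"))
```

Both calls run under the querying user's permissions, which is why the UI API's field-level security checks matter here.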

    The Einstein Analytics (Beta) live app data flow is one-way and goes from Salesforce Analytics to Quip via the Analytics API. Dashboards are displayed using the Einstein Analytics Lightning component via Lightning Out.
    2. Quip in Salesforce

    Use the Quip Lightning components to link Quip documents to a record, create a document from a template, and see Quip notifications in Salesforce.

    Quip Lightning Components

    • Quip Associated Documents Component
    • Quip Document Component
    • Quip Notifications Component

    Salesforce Record and List Live Apps

    salesforce list
    This diagram illustrates the basic pattern of integration and data flow of Salesforce live apps in a Quip document. To bi-directionally sync data to or from the Quip UI, live apps leverage Salesforce’s UI API. This is a modern REST API that abstracts layout complexity by:

    • Checking field-level security settings, sharing settings, and permissions
    • Making SOQL queries to get record data
    • Getting object metadata and theme information
    • Getting layout information

    This diagram outlines the basic data flow between the Salesforce Record live app and the Salesforce UI API. The Salesforce List live app is analogous to this.

    ui api


    Live Salesforce Report:

    The basic integration pattern for Salesforce Report leverages the Salesforce Reports and Dashboards REST API, as opposed to the Salesforce UI API used by the Salesforce Record and List live apps.

    The Quip integration with Salesforce leverages Quip’s Automation API which is REST-based. The APIs are used to:

    • Search and display Quip documents directly inside Salesforce
    • Create new Quip documents directly from within the Quip Lightning component in Salesforce
    • Quip Lightning components are embedded as iFrames inside of a Salesforce Object page. To use Quip inside of Salesforce, you need to first connect your Quip site to your Salesforce org, and then authenticate.
    • Data stored in Quip is encrypted in transit and at rest.
    • The Quip TLS/HTTPS transport layer uses TLS 1.2, encrypted, and authenticated with AES-128 and using an SHA 256 key exchange/signature algorithm.
    • Authentication to Quip uses OAuth 2.0.
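As a rough illustration of the Automation API interaction described above, the snippet below builds a request to create a new document. The endpoint path and parameter names reflect my reading of Quip's public Automation API and should be verified against the current docs; the token is a placeholder:

```python
from urllib.parse import urlencode

# Hedged sketch of calling Quip's REST-based Automation API with an OAuth 2.0
# bearer token. Endpoint and parameters are assumptions to verify against the
# official Quip Automation API documentation.
QUIP_BASE = "https://platform.quip.com/1"

def build_new_document_request(access_token, content, fmt="markdown"):
    """Return the URL, headers, and form body for a new-document request."""
    url = f"{QUIP_BASE}/threads/new-document"
    headers = {"Authorization": f"Bearer {access_token}"}
    body = urlencode({"content": content, "format": fmt})
    return url, headers, body

url, headers, body = build_new_document_request("TOKEN", "# Account Plan")
print(url)  # https://platform.quip.com/1/threads/new-document
```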

    Einstein Analytics Live App (Beta):
    The data flow of the Einstein Analytics live app is one-way and goes from Salesforce Analytics to Quip via the Analytics API. The source of truth is always the underlying Dashboard on the Analytics Platform. A Quip user can’t modify the data in the Einstein Analytics live app. Einstein Analytics users can modify filters. The Analytics APIs are modern REST APIs that abstract layout complexity by:

    • Checking field-level security settings, sharing settings, and permissions
    • Sending queries directly to the Analytics platform
    • Getting object metadata and theme information


    Troubleshooting questions and responses from Salesforce Support:

    Question from my Quip consultant (Prashanth Vardhan Polavarapu):

    I'm unable to fetch the image from a rich text area field in Salesforce using merge fields in my Quip template.

    I'm trying to update the document version whenever a record is updated in Salesforce through merge fields. But the record in Quip is not updating even though the Salesforce record is updated.

    Salesforce Support Agent answer:

    Unfortunately, our merge fields aren’t dynamic objects yet on Quip pages.

    When data is updated in a Salesforce Record, the only type of object we support that syncs automatically is the Salesforce Record LiveApp.

    You’ll need to regenerate the template to reflect the new information in your updated record. (Note that generating a template creates a new Quip document in your folder.)


    Einstein Analytics certified… finally!

    June has been really exciting so far as I finally managed to complete my Einstein Analytics and Discovery Consultant certification!

    Preparing for the Einstein Analytics and Discovery Consultant Certification

    Like every Salesforce consultant certification, the “Einstein Analytics and Discovery” certification is a mixture of scenarios you have to solve, some pseudo-“debugging”, and a good number of questions that simply test your knowledge. Read the exam guide and make sure that you complete the Trailhead Superbadges for Data Preparation and Analytics & Discovery Insights. Kelsey Shannon has blogged very comprehensively on her certification journey and Charlie Prinsloo has written the definitive preparation guide (some links require partner community access, though).

    Get an EA org (either trial or developer)

    If you don’t have experience with Einstein Analytics, then your starting point is getting an org. There is a free (and more or less perpetual) developer org available, and if you want to have a look at the fully configured “real thing”, Salesforce now offers a fully functional 30-day trial packed with sample data, sample dashboards, and apps.

    Watch the academy video training

    If you can’t attend an in-person “Einstein Analytics Academy” class, the EA team has a great alternative for you: Ziad Fayed has recorded a full training as a series of free webinars. It is your number one resource if you want to pass the Analytics certification and I recommend watching *and building* the solutions Ziad is presenting.

    Use the Templated Apps

    It might sound strange, but it’s highly recommended to use your developer org to create at least two essential apps from the App Templates Salesforce provides:

    • The Learning App has examples for essential techniques such as bindings.
    • The Sales Analytics App has functional examples of a sync scenario, complex dataflows, and dashboards designed according to best practices.

    Speaking of templates: you can score some easy points in the exam if you know the available App Templates and what they provide.

    Know Dataflows & Recipes inside out

    Though it’s not a universal truth, for the sake of the exam stick to the best practice that Sync, uploads, and dataflows ingest all data into Analytics, while recipes work off datasets. You’ll see that once you set up synced sources, you can use a synced source straightaway in a recipe to prepare a dataset – yet this is not what the certification exam is about.

    Know the limitations of synced external sources (such as an Azure DB) compared to synced Salesforce objects. It’s a good idea to know the limits in this area: How many Salesforce objects can you sync? How many dataflows can you have?

    For dataflows (aka “the purple nodes”), you should know each node type and understand what you use it for:

    • Dataset Builder (aka “the blue nodes”) is exclusively for Salesforce objects. It helps you find the finest grain and all related objects, and lets you select fields and relationships.
    • sfdcDigest reads from a synced Salesforce object
    • edgemart reads from a dataset in Analytics (read: re-use an existing dataset)
    • sfdcRegister saves a dataset
    • append works like the “union” command – it adds the rows of a second dataset to the existing dataset.
    • augment “joins” one dataset to another by adding fields to existing rows, not new rows. In simple words: you choose the key on the “left hand side” (the data you already have), choose the dataset you want to join, select the field that matches the key (also decide whether you expect single or multiple matches), and pick which fields you want to add. The outcome is the same number of rows with more columns/fields.
    • computeExpression lets you create new fields or recalculate values based on the fields in the same row. If row 10 has a “Quantity” of 10 and a “ProductName” of “Cherry Cake”, you can create a formula for a new field “Line Item Label” with the value ‘Quantity’ + ‘ProductName’ + “s”, which builds “10 Cherry Cakes”.
    • computeRelative allows you to compare or summarize a row with previous ones (row over row, or based on a grouping that is used as a “partition”).
    • dim2mea is a handy tool to convert a dimension to a measure if you need to do that. Unfortunately, there’s no mea2dim (in case you accidentally read a numeric product number as a measure). If you need that, you’ll have to use computeExpression to generate a String field and copy the value into it as text.
    • flatten converts a hierarchy into a directory- or path-like representation that can be used to control access to a row. You can decide whether or not to include what Analytics calls the self_id, and the difference it makes is team visibility. Should a team always share their records, you’d need to set “include_self_id” to false. Imagine two records that include self_ids: one has “me/myteam/mymanager/ourboss”, the other has “mycolleague/myteam/mymanager/ourboss” – they won’t be able to see each other’s records. If you set “include_self_id” to false, both get “myteam/mymanager/ourboss” as their hierarchy path and are thereby eligible to be shared among all members of “myteam”.
    • prediction allows you to run a prediction from a Discovery model on your dataset’s rows (only available for Einstein Analytics Plus).
    • filter does what the name says: it filters records that either match or don’t match a criterion.
    • sliceDataset acts like a filter, but for columns. You can choose whether you want to specify the columns/fields to drop or keep.
    • digest reads from any connected source and object (read: synced external data)
    • update does what the name says: it updates a dataset with the changes you made. It’s basically a digest node that writes to the same dataset.
    • export *was* used to push a dataset to Discovery. Nowadays you can do that in the UI with a button on the dataset. By default it only works with Discovery, and Discovery is only available with Einstein Analytics Plus licenses.
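    The node types above are tied together in a dataflow definition. Below is a hedged sketch, in Python, of the JSON-like node structure; the node names and field lists are made up, and the parameter shapes should be verified against the current dataflow reference. The last lines simulate in plain Python what the computeExpression example does to one row.

```python
# Illustrative dataflow sketch: sfdcDigest -> computeExpression -> sfdcRegister.
# Node names, object, and fields are hypothetical; parameter shapes are
# assumptions modelled on the documented dataflow node types.
dataflow = {
    "extract_items": {
        "action": "sfdcDigest",          # read a synced Salesforce object
        "parameters": {"object": "OpportunityLineItem",
                       "fields": [{"name": "Quantity"}, {"name": "ProductName"}]},
    },
    "label_items": {
        "action": "computeExpression",   # new field from fields in the same row
        "parameters": {
            "source": "extract_items",
            "mergeWithSource": True,
            "computedFields": [{
                "name": "LineItemLabel", "type": "Text",
                "saqlExpression": "Quantity + \" \" + ProductName + \"s\"",
            }],
        },
    },
    "save": {
        "action": "sfdcRegister",        # persist the result as a dataset
        "parameters": {"source": "label_items",
                       "alias": "ItemLabels", "name": "Item Labels"},
    },
}

# What the computeExpression above does to one row, simulated in Python:
row = {"Quantity": 10, "ProductName": "Cherry Cake"}
label = f'{row["Quantity"]} {row["ProductName"]}s'
print(label)  # 10 Cherry Cakes
```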

    Know that the finest grain of your dataset is always determined by what you want to analyse. If your grain is not fine enough (say you only loaded Opportunities, but not Opportunity Line Items), there’s no way to get to the product level with this dataset. You can load the Line Items into a separate dataset and augment it with the existing Opportunity data, but in this case, rebuilding the dataset from scratch would be better.

    On the other hand, you can’t run aggregations in Dataflows, so you can’t reduce the grain either. Groupings will help you there.

    Exploration, Visualization & Dashboard Design

    The exam parts that focus on Exploration and Visualization seem to be quite straightforward. If you know how to navigate the application, know key principles (progressive disclosure) for Dashboard design and know how to review (Dashboard inspector) and improve dashboard performance (e.g. pages, global filters, combine steps and such), you should be able to ace this section. Don’t forget to look into Actions and remember the C-A-S-E-S formula for good data analysis!

    A particular focus should be on bindings – there are only a few questions on bindings, but you really need to know them to score these points. Consider building each binding type at least once and make sure that you understand what “results binding” vs. “selection binding” means. Look up what a “nested binding” is (not a separate type, but a specific way to use a binding), and make sure you understand the functional blocks of binding syntax. One top resource for that is Rikke Hovgaard’s blog (start here) – hint, hint: Rikke authored *some* questions for the exam (guess which ones…).
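    To make the distinction concrete, here is a hedged sketch of the two binding sources written out as Python strings. The step names are made up; the cell()/asString() syntax follows the documented binding functions, but check it against the binding reference.

```python
# Selection binding: reacts to what the viewer clicks in step "pick_region".
selection_binding = '{{cell(pick_region.selection, 0, "Region").asString()}}'

# Results binding: reacts to the query *results* of step "top_account",
# regardless of any selection.
results_binding = '{{cell(top_account.result, 0, "Name").asString()}}'

# Either would typically sit inside another step's query or filter, e.g.
# (hypothetical step fragment):
step_fragment = {"query": {"filters": [["Account", ["Region"], "in",
                                        [selection_binding]]]}}
print(selection_binding)
print(results_binding)
```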

    Security and Access

    Another topic that is both straightforward and tricky at the same time.

    • Review how to get people access to both Einstein Analytics and Apps that you’ve built.
    • Understand the roles (they’re different from “Roles” in Salesforce).
    • Again, “Inherited Sharing” vs. “Security Predicate” is a marginal topic, but you can score some precious points there. Make sure you know the limitations of inherited sharing, and how you can leverage security predicates for cases where you hit a wall with inherited sharing.

    Einstein Discovery

    For Einstein Discovery, it’s crucial to know a bit about how data gets into Discovery, and how to analyze and improve the model quality. The discovery part of the exam is too large to be neglected, but still small enough that it won’t blow up your test immediately if you fail some questions here.

    Data can be pushed from Einstein Analytics and other sources, including CSV. Click through the import path for both EA and CSV, review the imported data and select the data types, review the columns, the outcome variable (a single one), and the predictors / actionable variables (up to three). You will see that some columns are closely related, and Discovery may prompt you to review whether they really represent the same thing (such as Product Number and Product Name) or whether there is just a very high correlation. You typically want to drop data only if you really know that the columns mean the same thing; when in doubt, don’t make assumptions.

    Understand the impact of outliers / extreme values: typically these should NOT be in your analysis, because you don’t want edge cases to drive your prediction. Don’t be shy to trim at least everything beyond the 6th standard deviation.
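    Discovery offers this kind of trimming in its own UI; as an illustration of the underlying idea only, here is the concept in plain Python with made-up numbers:

```python
from statistics import mean, stdev

def trim_outliers(values, k=6):
    """Drop values lying more than k standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) <= k * s]

data = [10, 12, 11, 9, 13, 10, 11, 500]   # 500 is an extreme value
print(trim_outliers(data, k=2))           # -> [10, 12, 11, 9, 13, 10, 11]
```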

    Finally, you should know how to read and understand the charts used by a Discovery story and the quality metrics. While everyone knows bar charts, Waterfall charts are lesser known, so it might be a good idea to review whether you really understand how Discovery uses both types to present data to you.

    At the time of writing, there are just a few flashcard sets available to memorize the material. You can find the handful of them by searching for Einstein Analytics combined with any EA-specific term. While it helps massively to memorize terms, limits, etc., the one thing that will drastically improve your chances is reading the exam guide closely, getting hands-on experience, and/or actively following the academy training videos. You can still use the old Advanced Accreditation form to test your knowledge. It will give you an idea of what the Analytics team thinks you should focus on, even though it is only for self-assessment and will neither be scored nor give you an accreditation.

    General Guidance

    The general tips for all Salesforce exams apply here as well:

    • know the pass score and what it means in numbers of questions. There will be 60 questions and the pass score is 68%, so 41 correct answers will let you narrowly pass, and there are up to 19 questions that you can miss.
    • use the “mark for review” checkbox whenever you’re not 100% sure about your answer (it will give you a good overview later). Immediately after the last question, you will get the chance to review your checked questions – if your number is 15 or above, it’s a good idea to review all checked questions. Remember that there are probably some wrong answers among those questions that you DIDN’T check for review.
    • Read questions AND answers closely. Really, really! There’s a lot of information that you will only recognize on the second or third read. And you will be more successful to separate bogus answers from the correct ones if you scrutinize every single word.
    • There aren’t just “correct” and “wrong” answers – there are also items that are called “distractors” that could be correct… or almost correct. Scan each question and answers thoroughly for tiny deviations from Salesforce terminology, such as “computeField” (the real term for a dataflow function to compute a field is “computeExpression”). Scan for plural vs. singular, scan for the wrong order of steps.
    • If you don’t know the answer, try to rule out wrong answers.
    • If you still have no clue, check the “mark for review” checkbox and don’t waste more time on this item.

    I hope this helped you a bit. Good luck with the exam, and let me know how you did!