
Wednesday, June 13, 2018

Einstein Analytics: Multi-Select Picklist to Text

As of the Summer '18 release, Einstein Analytics is not really friendly with multi-select picklist fields. One of the issues: when a record contains multiple values, only the 1st value is shown when you display the data as a values table in a lens or with the Table wizard in a dashboard.

Service Type contains multi-values in Salesforce


Service Type only shows the 1st value in EA


As per this document, we can customize the dataflow JSON to treat multi-select values as text. Once you have the dataflow built:

  • Download the JSON file
  • Edit the file, add ,"isMultiValue": false after the field name, and save it (back up the original JSON file first)
  • Upload back the edited JSON file to Dataflow
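As a sketch, the edited sfdcDigest node might look like this, with the attribute added after the multi-select field's name (the node, object, and field names here are made up for illustration):

```json
"Extract_Case": {
  "action": "sfdcDigest",
  "parameters": {
    "object": "Case",
    "fields": [
      {"name": "Id"},
      {"name": "Service_Type__c", "isMultiValue": false}
    ]
  }
}
```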



Here is the result: the multi-select values now show as text in Einstein Analytics.



Note: if you do not update the dataflow JSON as above, the data will flow into EA as multi-value. It will still work in a list filter, where each value appears as an independent entry, but record selection will behave as if the record holds all of its multi-values.


Reference: Limitations of Multi-Value Fields in Einstein Analytics



Sunday, June 10, 2018

Einstein Analytics: Getting started with cogroup

You can combine data from two or more data streams into a single data stream using cogroup. The data streams must have at least one common field. Only data that exists in both groups appears in the results.

Example:
qs = cogroup qsd by 'State', qsp by 'State';
In this sample, data stream qsd contains the field State and data stream qsp also contains the field State, so we can use it for grouping.

q = cogroup ops by 'Account', meetings by 'Company';
Here, Account in the ops data stream has the same values as Company in the meetings data stream.


Use case: show the death percentage per state from 2 datasets.



Let's use cogroup to combine the datasets:
dsd = load "StateDeath2";
dsp = load "StatePopulation2";
ds = cogroup dsd by 'State', dsp by 'State';
ds = foreach ds generate dsp.'State' as 'State', sum(dsp.'Count') as 'Population', sum(dsd.'Count') as 'Death', (sum(dsd.'Count')/sum(dsp.'Count')*100) as 'Death (%)';

The Result


Let's try to use Opportunity and User datasets from our previous blog.
dsu = load "user1";
dso = load "opportunity";
ds = cogroup dsu by 'Id', dso by 'OwnerId';
ds = foreach ds generate first(dsu.'Name') as 'Name', sum(dso.'Amount') as 'Sum_Amount';
ds = order ds by 'Name';


If you notice, Angela is not shown on that chart because she does not have any Opportunity records. Remember that only data that exists in both groups appears in the results.






Saturday, June 9, 2018

Einstein Analytics: Opportunity Dashboard with SAQL Union to show all User

Let's go straight to the business requirement: "show all sales reps with their total sales amounts; if a sales rep owns no opportunities, the rep's name must still be shown with $0."

High-level solution:
1. Load User master data
2. Load Opportunity data
3. Use UNION to combine the datasets

User data


Opportunity data


Build a User master lens grouped by Id and Name
q = load "user1";
q = group q by ('Id', 'Name');
q = foreach q generate 'Id' as 'Id', 'Name' as 'Name', count() as 'count';
q = order q by ('Id' asc, 'Name' asc);
q = limit q 2000;

Let's make the necessary SAQL modifications:
1. Rename the q data stream to dsu -- for an easier and unique identifier
2. Rename the projected 'Id' to 'User_Id', and 'Name' to 'User_Name' -- I'll tell you the reason later
3. Remove 'count' as we do not need it -- User_Id is unique
4. Add 'sum_Amount' with 0 in foreach -- I'll tell you the reason later
5. Remove the limit

Here is the result
dsu = load "user1";
dsu = group dsu by ('Id', 'Name');
dsu = foreach dsu generate 'Id' as 'User_Id', 'Name' as 'User_Name', 0 as 'sum_Amount';
dsu = order dsu by 'User_Id';


Build an Opportunity lens grouped by OwnerId
q = load "opportunity";
q = group q by 'OwnerId';
q = foreach q generate 'OwnerId' as 'OwnerId', sum('Amount') as 'sum_Amount';
q = order q by 'OwnerId' asc;
q = limit q 2000;

Let's make the necessary SAQL modifications:
6. Rename the q data stream to dso -- for an easier and unique identifier
7. Rename the projected 'OwnerId' to 'User_Id' -- I'll tell you the reason later
8. Add 'User_Name' with "-" in foreach -- I'll tell you the reason later
9. Remove the limit

Here is the result
dso = load "opportunity";
dso = group dso by 'OwnerId';
dso = foreach dso generate 'OwnerId' as 'User_Id', "-" as 'User_Name', sum('Amount') as 'sum_Amount';
dso = order dso by 'User_Id';


Combine the datasets with UNION
final = union dsu,dso;
final = group final by ('User_Id');
final = foreach final generate first('User_Name') as 'User_Name', sum('sum_Amount') as 'sum_Amount';


The Complete SAQL
dsu = load "user1";
dsu = group dsu by ('Id', 'Name');
dsu = foreach dsu generate 'Id' as 'User_Id', 'Name' as 'User_Name', 0 as 'sum_Amount';
dsu = order dsu by 'User_Id';

dso = load "opportunity";
dso = group dso by 'OwnerId';
dso = foreach dso generate 'OwnerId' as 'User_Id', "-" as 'User_Name', sum('Amount') as 'sum_Amount';
dso = order dso by 'User_Id';

final = union dsu,dso;
final = group final by ('User_Id');
final = foreach final generate first('User_Name') as 'User_Name', sum('sum_Amount') as 'sum_Amount';


The Moment of Truth



Explanation
  • we rename Id and Name in step (2) to have the same column names as steps (7) and (8)
  • we add 'sum_Amount' in step (4) to have the same column name as the dso data stream
  • for our use case, we align both data streams to the same columns by adding dummy columns before using UNION
  • in the last row, we use the aggregate function first() to return the first user name, as our union starts with dsu, which contains the user name, while dso in the second/last position always contains "-" for the user name, see step (8)
  • in the last row, we also sum 'sum_Amount' again; practically this sums 'sum_Amount' with 0, remember we added 0 as a dummy value in step (4)

Make it simple
Since sum_Amount is always 0 in dsu, and User_Name is always "-" in dso, we do not actually need to add them to the data streams; we will still get the same result. Let's remove the unnecessary statements.

dsu = load "user1";
dsu = group dsu by ('Id', 'Name');
dsu = foreach dsu generate 'Id' as 'User_Id', 'Name' as 'Name';

dso = load "opportunity";
dso = group dso by 'OwnerId';
dso = foreach dso generate 'OwnerId' as 'User_Id', sum('Amount') as 'sum_Amount';

final = union dsu,dso;
final = group final by ('User_Id');
final = foreach final generate first('Name') as 'Name', sum('sum_Amount') as 'sum_Amount';
final = order final by 'Name';






Monday, June 4, 2018

Einstein Analytics: Filter Conditions in sfdcDigest

When you do not need to bring the whole data from a Salesforce object to Einstein Analytics (EA), you can filter the data when retrieving it in sfdcDigest.

By filtering out unnecessary data before it flows to EA, dashboard builders no longer need to keep filtering out unused data, such as inactive users from the User object. Filtering data also improves dataflow performance and reduces the total number of records stored in EA.

In this sample, I want to bring in all active users, excluding Chatter users. So, there are 2 filters to add in sfdcDigest:
- isActive = true
- userType = "Standard"

If you come from a Salesforce background, you might expect to simply enter isActive = true && userType = Standard, but this is different in EA. As of the Summer '18 release, you need to manually enter these filter conditions in JSON format.

Add this filter in sfdcDigest under Filter Conditions:
[
  {"field": "isActive", "isQuoted": true, "value": "true", "operator": "="},
  {"field": "UserType", "value": "Standard", "operator": "="}
]

Looking at the first filter: isActive is a boolean field, so passing just true without "" would cause an error when running the dataflow; therefore we need to add "isQuoted": true. This also applies to numeric and date fields.
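For illustration, the same pattern for a numeric field would look like this (the field and threshold here are just an example; per the above, numeric fields also need "isQuoted": true):

```json
[
  {"field": "NumberOfEmployees", "isQuoted": true, "value": "100", "operator": ">="}
]
```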

Let's see this in JSON Dataflow:



For the complete reference, check out this documentation Structured Filter in sfdcDigest Transformation.




Thursday, May 31, 2018

Salesforce: Files sharing Query

More than 3 years ago, I wrote the blog Content Architecture sharing how to query content, and it is still valid now. This blog is the continuation: how to query whom or what a File is shared with, and the level of visibility.

There is an object called ContentDocumentLink (key prefix 06A), a child of ContentDocument (prefix 069). It represents the link between a Salesforce CRM Content document or Salesforce file and where it's shared. A file can be shared with other users, groups, records, and Salesforce CRM Content libraries. This object is available in API versions 21.0 and later for Salesforce CRM Content documents and Salesforce Files.

To query this object, it must be filtered on Id, ContentDocumentId, or LinkedEntityId. Let's see some samples:

SELECT Id, ContentDocumentId, LinkedEntityId FROM ContentDocumentLink 
This query throws an error because it does not filter on anything:
MALFORMED_QUERY: Implementation restriction: ContentDocumentLink requires a filter by a single Id on ContentDocumentId or LinkedEntityId using the equals operator or multiple Id's using the IN operator.


To query whom a file is shared with
SELECT Id, ContentDocumentId, ContentDocument.Title, LinkedEntityId, LinkedEntity.Name, LinkedEntity.Type, ShareType, Visibility FROM ContentDocumentLink WHERE ContentDocumentId = '0690k000000TzTMAA0'



To query all files shared with a record
SELECT Id, ContentDocumentId, ContentDocument.Title, LinkedEntityId, LinkedEntity.Name, LinkedEntity.Type, ShareType, Visibility FROM ContentDocumentLink WHERE LinkedEntityId = '0010k00000IKS3YAAX'



To query all files shared with records of an object
SELECT Id, ContentDocumentId, LinkedEntityId FROM ContentDocumentLink WHERE LinkedEntityId in (SELECT Id FROM Account)




Reference: ContentDocumentLink



Tuesday, May 15, 2018

Einstein Analytics: Setting Initial Selection

Adding a List Selector or Date Selector in Einstein Analytics is pretty simple: just a few clicks and you get the result. However, if you look at the bottom of the Step Properties, there is a textbox for Initial Selections; this sets the initial value for the list when running the dashboard, but we cannot set it from this panel.

When you add a widget, by default, it will be "No Selections".



To set a value for Initial Selections, edit the dashboard JSON, look for the step, and add "start": with the value. Make sure the value is available in the list.
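As an illustrative sketch (the step name and values are made up, and the exact shape of "start" can vary by release), a static step with an initial selection might look like:

```json
"static_1": {
  "type": "staticflex",
  "label": "Stage",
  "selectMode": "single",
  "start": ["Closed Won"],
  "values": [
    {"display": "Closed Won", "value": "Closed Won"},
    {"display": "Closed Lost", "value": "Closed Lost"}
  ]
}
```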


Now, back to the UI, and check the Initial Selections



Preview the dashboard and check that the initial value is the same as the value we put in the JSON.



We can do the same for the Date selector. Edit the JSON and look for the related step. The sample below sets the date range to the current quarter.



This is the result when you run the dashboard: as set in the JSON, the initial selection is from current quarter to current quarter. You can change it to year, month, week, or day, or use an absolute date.



However, if you hate editing JSON for this, you can set the initial value from the UI too: click the ... button next to the save icon and select Pick Initial Selections, then select the date, range, or list selector to set the initial value. This edits the same JSON in the back end.




Monday, May 14, 2018

Einstein Analytics: The quest for Binding in Dashboard

This blog is written in the Summer '18 release, so it may become invalid when Salesforce makes "binding" more user-friendly in the future.

Until Summer '18 release, if you would like to implement binding in the dashboard, you need to manually edit the dashboard JSON.

Use case: you would like to give users the flexibility to change the chart grouping, for example: group by Region, by Country, or by Status. In another use case, the user would like the flexibility to change the chart type without editing the dashboard, or to implement both in the same dashboard. This makes sense: when you change the grouping, the chart type probably needs to change for better visualization.

This blog will not cover how to create a binding or a static step with a toggle; you can watch the awesome Peter Lyon's video Binding Basic, and Rikke Hovgaard's blog The power of static steps.

A. Binding to Chart
In this sample, I have one chart and 3 toggles:
- Chart
   Step Id: step_1
- Field Names for grouping
   Step Id: static_1
- Chart Type to change visualization
   Step Id: static_2
- Order by ascending or descending
   Step Id: sort_1



This is how the dashboard looks


To make the binding work, I am going to edit the JSON with Ctrl-E.

Binding for Grouping
1. Change "groups" query under "step_1"
change from
"groups": ["field_name"]
change to
"groups": ["{{coalesce(cell(static_1.selection, 0, \"value\"), cell(static_1.result, 0, \"value\")).asString()}}"]

When previewing the dashboard, if the field selected in the toggle differs from the field used as the initial value for grouping, the chart will show an error:



2. Change "columnMap" in "widgets"
"columnMap": null OR delete the whole columnMap


Binding for change Chart Type
1. Change "columnMap" in "widgets" (if you have not done it)
"columnMap": null OR delete the whole columnMap

2. Change "visualizationType" under "parameters" in "widgets" 
change from
"visualizationType": "hbar"
change to
"visualizationType": "{{coalesce(cell(static_2.selection, 0, \"value\"), cell(static_2.result, 0, \"value\")).asString()}}"



Binding for Order
1. Edit the static step under "sort_1"
change from 
"values": [{"display": "Asc","value": "true"},
           {"display": "Desc","value": "false"}
          ]
change to 
"values": [{"display": "Asc","value": [-1,
                            {"ascending": true}
                        ]},
           {"display": "Desc","value": [-1,
                            {"ascending": false}
                        ]}
          ]

2. add "order" query under "step_1"
"query": { "measures": [["count","*"]],
           "groups": ["{{coalesce(cell(static_1.selection, 0, \"value\"), cell(static_1.result, 0, \"value\")).asString()}}"],
           "order": "{{column(sort_1.selection,[\"value\"]).asObject()}}"
                }

B. Binding table
We can also use a toggle to sort a table widget, but the out-of-the-box feature of sorting the table by clicking a column header will no longer work. So, this is particularly needed when the SAQL has been edited.

Here is another sample with JSON snippet from steps:
  "query": {  
           "values": [  
             "Region__c",
             "Skill__c",
             "Name",
             "Id",
             "LastModifiedDate"  
           ],  
           "order": [  
             [  
               "{{ value(selection(static_2)) }}",  
               {  
                 "ascending": "{{ value(selection(sort_2)) }}"  
               }  
             ]  
           ]  
         }  

This is the JSON step for the fields selection:
      "static_2": {  
         "broadcastFacet": true,  
         "label": "static 2",  
         "selectMode": "singlerequired",  
         "type": "staticflex",  
         "values": [             {  
             "display": "Region",  
             "value": "Region__c"  
           },  
           {  
             "display": "Status",  
             "value": "Status__c"  
           }  
         ]  
       }  

This is the JSON step for the fields order:
     "sort_2": {  
         "type": "staticflex",  
         "broadcastFacet": true,  
         "selectMode": "singlerequired",  
         "label": "sort 2",  
         "values": [  
           {  
             "display": "Ascending",  
             "value": true  
           },  
           {  
             "display": "Descending",  
             "value": false  
           }  
         ]  
       }  
Notice that true and false are not in double quotes ("").

Table result:


Here is the full JSON for the dashboard.


Sunday, May 6, 2018

Einstein Analytics: License Assignment

So your company purchased X licenses of Einstein Analytics, perhaps with Event Monitoring too. To check the licenses you have acquired, go to Company Information in the Setup menu.

In Company Information page, scroll down to Permission Set Licenses section.


From above screenshot:
  • Analytics Platform (yellow highlight): this is the license for Einstein Analytics; I have a total of 2 licenses with 1 used, so 1 license remains.
  • Event Monitoring Analytics Apps (green highlight): this is the license for Event Monitoring; I have a total of 2 licenses with 1 used, so 1 license remains.

To assign licenses to the user, go to the user detail, you will notice 
  1. Permission Set Assignments (PS)
  2. Permission Set Assignments: Activation Required
  3. Permission Set License Assignments (PSL)
When assigning licenses, you just need to pay attention to PS (1) and PSL (3). I am borrowing a good analogy from Trailhead Assign Permissions:
  •  A PSL is like a passport. It grants you the right to travel, but you can’t visit the great land of Analytics without the right visa. 
  • A PS is like a visa. You can get a 3-day tourist visa, a work visa, or a student visa. Each visa type lets you do certain things.
  • Just like a traveler needs both a passport and a visa, your Analytics users need at least one PSL and a PS.
Back to license counts: once you assign a user a PSL, it counts as one license used. Imagine that your country has the right to issue 100 passports; once a user, Mr. X, gets a passport, you as a country can only issue another 99 passports for your citizens, no matter whether Mr. X applies for a visa to the USA, the UK, etc. But for Mr. X to travel to the USA, he needs to obtain a visa, which is the Permission Set.

In the real world, a user ideally needs to get a passport (PSL) first, before applying for a visa (PS). But Salesforce makes our lives as admins easier: we can grant the PS (visa) Einstein Analytics Platform User or Einstein Analytics Platform Admin directly to the user, and at the same time the passport (PSL) Analytics Platform will be issued too.


Now if you check back your license usage, it will show that 2 Analytics Platform licenses have been used.

Because the PSL is required for the PS Einstein Analytics Platform User or Einstein Analytics Platform Admin, you cannot remove the PSL Analytics Platform (passport) before removing the PS Einstein Analytics Platform User or Einstein Analytics Platform Admin (visa).


But when you delete the PS from the user detail, the PSL stays and still counts toward your license usage.


Query License Usage
To see who is assigned the PSL, you can run a simple query:
SELECT Id, PermissionSetLicense.MasterLabel, PermissionSetLicense.TotalLicenses, PermissionSetLicense.UsedLicenses, Assignee.Name FROM PermissionSetLicenseAssign



PS: Einstein Analytics Platform User or Einstein Analytics Platform Admin
Einstein Analytics Platform User is designed to be assigned to users who need to explore datasets with lenses and build dashboards.
Einstein Analytics Platform Admin is designed to be assigned to admins; they will be able to create and customize Apps, Dashboards, Datasets, Dataflows, and Recipes, including Monitor in Data Manager. These users can view all apps in Einstein Analytics, except items stored in My Private App.

                    | Einstein Analytics Platform User     | Einstein Analytics Platform Admin
Access EA           | Analytics tab, Analytics Studio app  | Analytics tab, Analytics Studio app
Allow creating      | Dashboard                            | App, Dashboard, Dataset
View all Apps/data  | No                                   | Yes
Explore Lens        | Yes                                  | Yes
Access Data Manager | No                                   | Yes (incl. Dataflow & Recipe)




Saturday, May 5, 2018

Salesforce: Picklist Default Value

Since the Summer '17 release, Salesforce supports Default Value at the field level. This means we can define different default values based on the user, for example: when front-end support creates a new case, the Priority default value is "high", while all other users get the Priority default value "low", although they can change it manually.

But a picklist field also has its own default value, so how does this work with the default value introduced in the Summer '17 release? In summary, there can be up to 3 places where a default value is defined for a picklist field.

Below is the simple logic of how this works; I agree it would be easier to read as a flowchart.

If Default Value in General Options (field level) is defined
    if Record Type* for the object is defined
        if Formula General Option resolve** to an active item & available in Record Type
            --> use resolve Value from General Options
        else if there is Default Value selected in Record Type
            --> use selected Default Value in Record Type
        else
            --> no default value
        endif
    else
        if Formula General Option resolve** to an active item
            --> use resolve Value from General Options
        else if there is Default Value selected in Field Picklist
            --> use selected Default Value in Field Picklist
        else
            --> no default value
        endif
    endif
else
    if Record Type for the object is defined
        if there is Default Value selected in Record Type
            --> use selected Default Value in Record Type
        else if there is Default Value selected in Field Picklist
            --> use selected Default Value in Field Picklist
        else
            --> no default value
        endif
    else
        if there is Default Value selected in Field Picklist
            --> use selected Default Value in Field Picklist
        else
            --> no default value
        endif
    endif
endif

* even if there is only 1 record type defined and active
** the formula result is matched case-sensitively against the item API name

Note: from the above logic, the field-level default value is processed first; if it is not defined, or fails to match an item API name, the system checks whether a record type with a default value exists, and the last resort is the default value on the picklist field. The record type default is treated at the same level as the picklist default; therefore, for an object with record types, when a field-level default is defined but no record type default is selected, the system will not check further for a default value selected on the picklist field.
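The decision logic above can be transcribed directly into code. This is an illustrative sketch only (the function and parameter names are my own, and the boolean inputs stand in for the "active item" / "available in record type" checks), following the flowchart as written:

```python
def resolve_picklist_default(
    field_default,        # value the field-level default formula resolves to, or None
    field_default_valid,  # formula resolves to an active (and, with a record type,
                          # available) item -- the match is case-sensitive
    has_record_type,      # a record type is defined and active for the object
    record_type_default,  # default value selected on the record type, or None
    picklist_default,     # default value selected on the picklist field, or None
):
    # Field-level default is processed first.
    if field_default is not None:
        if has_record_type:
            if field_default_valid:
                return field_default
            return record_type_default       # may be None -> no default
        if field_default_valid:
            return field_default
        return picklist_default
    # No field-level default: record type default, then picklist default.
    if has_record_type:
        if record_type_default is not None:
            return record_type_default
        return picklist_default
    return picklist_default

# Example: the formula resolves to "High1" but the item API name is "High",
# so the case-sensitive match fails and the record type default wins.
print(resolve_picklist_default("High1", False, True, "Low", "Medium"))  # Low
```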


Here are a few samples:
In this first sample, there is no default value defined at the field level, so all users get the default value from the selected picklist item.


Here, the field-level default value resolves to High1, while the picklist item API name is High, so the field-level default will not apply - the match is case-sensitive.






Friday, May 4, 2018

Einstein Analytics: Convert DateTime field to Date field or Text field

This is a simple tip for those starting out with Einstein Analytics.

Use case: convert a DateTime field (copied from Salesforce) to a Date field or a Text field in Einstein Analytics.

In this blog, I'll add new fields in Dataflow using computeExpression. We will use a DateTime field with API name: Submitted_Date_Time__c

Text
formula: substr(Submitted_Date_Time__c,1,10)
This formula will take the first 10 characters.
Original: 2017-05-03T09:43:28.000Z --> 2017-05-03

Date
formula: toDate(Submitted_Date_Time__c_sec_epoch)
Remember to enter "Date Format" (you need to scroll down to find it), otherwise, you can't upload the Dataflow.

Side note: toDate() is case-sensitive.
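As a sketch, the two computed fields above might look like this as a computeExpression node in the dataflow JSON (the node and source names are made up for illustration):

```json
"Add_Date_Fields": {
  "action": "computeExpression",
  "parameters": {
    "source": "Extract_Data",
    "mergeWithSource": true,
    "computedFields": [
      {
        "name": "Submitted_Date_Text",
        "type": "Text",
        "saqlExpression": "substr(Submitted_Date_Time__c, 1, 10)"
      },
      {
        "name": "Submitted_Date",
        "type": "Date",
        "format": "yyyy-MM-dd",
        "saqlExpression": "toDate(Submitted_Date_Time__c_sec_epoch)"
      }
    ]
  }
}
```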





Thursday, April 26, 2018

Salesforce Email Logs

Few details about Salesforce email logs:
  • Email logs should be available within 30 minutes of your request.
  • Email logs are available for messages sent within the past 30 days before your request. 
  • Each email log can span a maximum of 7 days. To see email log data for a duration longer than seven days, create multiple requests. 
  • Email logs include emails sent through email actions, list email, and mass email, as long as the emails are sent through Salesforce.

In this blog, I'll not cover how to request an email log; you can refer to the documentation Request an Email Log, and check the documentation Email Log Reference to understand the format and field values of email log files. I'll share the columns that need attention when you are tracing email sent out from Salesforce:

Date Time
The date and time here are always in GMT, so make sure to convert to your timezone if you are checking for a specific email at a certain time. But when you request email logs, enter the time in your local timezone.
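As a quick sketch of that conversion (the timestamp format and the Singapore timezone here are assumptions for illustration):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# An email log timestamp is in GMT; mark it as UTC, then convert
# to the local timezone (Asia/Singapore, UTC+8, as an example).
gmt = datetime.strptime("4/26/2018 3:12", "%m/%d/%Y %H:%M").replace(tzinfo=timezone.utc)
local = gmt.astimezone(ZoneInfo("Asia/Singapore"))
print(local.strftime("%m/%d/%Y %H:%M"))  # 04/26/2018 11:12
```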

Mail Event
D - Delivery: the email was successfully sent to the recipients.
R - Reception: the email was successfully received by the Salesforce email server.

Recipient
The email address of the person to whom the email is sent.

Sender
The email address of the person who sent the email. In many cases, this will not show the exact email address, but the mailbox name before the @ will be there, example: jjohan=mydomain.com__1l6m60naoh7y3pvy.7zbt7fbfwio41y1c@castgfakbl6ndktj.e2g41.2a-z8jeae.cs47.bnc.sandbox.salesforce.com

Sample:
  • There are 3 color highlights in the above screenshot, for 3 different emails.
  • Each email was received by the Salesforce email server (R), then sent out a few seconds later (D).
  • The yellow highlight means there are 3 recipients in that one email; row 7 corresponds to row 3, row 6 to row 4, and row 5 to row 2. Each pair has the same Internal Message ID.


When you open the sent email in Gmail, you will notice the email sender contains a portion of the Sender value from the email logs (see the above and below screenshots in pink).


Let's see more columns:

Remote Host
This is the IP address of the email server that received the sent email.

Bytes Transferred
The size of the email in bytes, not KB or MB.

Salesforce.com User
The Salesforce ID of the user who sent the email.

Message ID Header
If you are using Gmail, you can check this by clicking "Show Original".



The screenshot below shows the message header in Gmail, see the highlights in green.





Wednesday, April 25, 2018

Salesforce: Opportunity Team Report

Two years ago, I wrote a blog about reporting for the Account Team; this blog is for the Opportunity Team. When you have an Opportunity report, you can get the results based on a few options:



Let's see the difference between each option. In this sample, I'll log in as "Maria Ann", who has 2 subordinates: "Song Lee" and "Free Man".

My opportunities
All opportunities owned by me, period!



My team-selling opportunities
All opportunities where I am listed in the Opportunity Team, no matter if I am the opportunity owner or not.



My team-selling and my opportunities 
This is the combination of My opportunities and My team-selling opportunities, although an opportunity will not be listed twice if it fits both criteria, so you can't just sum the number of opportunities from My opportunities and My team-selling opportunities.



My team's opportunities
All opportunities owned by me and all opportunities owned by users below my role hierarchy, including the indirect reports.



My team-selling and their opportunities
All opportunities where I, or any user below me in the role hierarchy, am listed in the Opportunity Team, no matter who owns the opportunity. This is similar to My team-selling opportunities but extended to my subordinates.



My territories' opportunities
Only available when Territory Management is enabled.



All Opportunities
All opportunities that you have visibility to.





Tuesday, April 24, 2018

Salesforce: User License Reporting

We can check the number of licenses purchased and remaining from Company Information.



But can we get the same info using a report? By making it a report, we can schedule it to deliver that information directly to stakeholders' inboxes.


1. Custom Report Type
Salesforce provides a standard User report type out of the box; this is good enough for reporting user information, including login history. But that report type is not able to tell the number of licenses purchased.

Create a custom report type and select Users as the primary object. Under Fields Available for Reports, edit the layout, add fields via lookup from Profile, and select all fields under Profile, even though in this case we just need the Name and Total Licenses information.



You may want to change the Total Licenses label by double-clicking the field and removing "Users: Profile: User License:".



Make sure to set the new Custom Report Type as deployed.


2. Create report
Create a new report using the new report type. To make it look nicer, set the report as a Summary Report, group by User License, and filter only for active users.
You need to create 2 custom summary formulas:
1. Active Users: to count all active users; the formula: RowCount
2. License Remain: calculated as Total Licenses - Active Users; the formula: User.Profile.UserLicense.TotalLicenses:SUM - RowCount


You can add Conditional Highlighting to get user attention when remaining licenses reach a certain threshold.


3. The Moment of Truth



How does the report look in Lightning? Except for the missing Conditional Highlighting, the report still shows nicely.