Monday, October 28, 2019

Einstein Analytics: Work with Logged in User Detail

Set the Initial Value

You can set the initial value of a List widget based on the Salesforce logged-in user tokens: User.Id, User.Name, User.RoleName, and User.RoleId. At runtime, Analytics retrieves the values of these tokens from Salesforce environment variables; these values do not come from datasets.

Sample:
"Owner_Name_1": {
                "datasets": [
                    {
                        "id": "0FbB00000000pNNKAY",
                        "label": "Opportunities",
                        "name": "opportunity",
                        "url": "/services/data/v38.0/wave/datasets/0FbB00000000pNNKAY"
                    }
                ],
                "isFacet": true,
                "isGlobal": false,
                "query": {
                    "measures": [
                        [
                            "count",
                            "*"
                        ]
                    ],
                    "groups": [
                        "Owner.Name"
                    ]
                },
                "selectMode": "multi",
                "start": [
                    "!{User.Name}"
                ],
                "type": "aggregateflex",
                "useGlobal": false,
                "visualizationParameters": {
                    "options": {}
                }
            }

Instead of selecting the default "All" value, the dashboard auto-selects the logged-in user's name in the Owner Name list widget. However, if the logged-in user's name does not exist in the query result, the widget shows "All".
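
The same approach works with the other tokens. For example, if the step groups by the owner Id instead of Owner.Name, the User.Id token can be used as the initial value. A partial sketch, reusing the format of the sample above and assuming the dataset has an OwnerId dimension (only the changed keys are shown):

                "query": {
                    "measures": [
                        [
                            "count",
                            "*"
                        ]
                    ],
                    "groups": [
                        "OwnerId"
                    ]
                },
                "start": [
                    "!{User.Id}"
                ],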


Filter Query Result

Auto-selection by the logged-in user (as above) does not stop the user from selecting other users in the list. In some scenarios, we do not want to show the widget at all and just need to filter the query by the logged-in user:

SOQL query to get user attributes
"QueryLoginUser_1": {
                "groups": [],
                "numbers": [],
                "query": "SELECT Id, Name FROM User Where Name = '!{User.Name}'",
                "selectMode": "single",
                "strings": [],
                "type": "soql"
            }

The query above is equivalent to "query": "SELECT Id, Name FROM User WHERE Id = '!{User.Id}'".
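
As a side note, the same SOQL step can pull additional attributes of the logged-in user in one query, for example the role, which you can then reference in other bindings. A sketch, assuming you also want the role (UserRoleId and UserRole.Name are standard fields on User):

"QueryLoginUser_1": {
                "groups": [],
                "numbers": [],
                "query": "SELECT Id, Name, UserRoleId, UserRole.Name FROM User WHERE Id = '!{User.Id}'",
                "selectMode": "single",
                "strings": [],
                "type": "soql"
            }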

Use the QueryLoginUser_1 result as a filter
"query": {
                    "measures": [
                        [
                            "count",
                            "*"
                        ]
                    ],
                    "groups": [
                        "Owner.Name"
                    ],
                    "filters": [
                        [
                            "OwnerId",
                            "{{cell(QueryLoginUser_1.result,0,\"Id\").asString()}}",
                            "in"
                        ]
                    ]

                }
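
If the step feeding the filter could return more than one row, the binding can pass the whole column instead of a single cell. A sketch using the column() and asObject() binding functions with the same step name as above:

                    "filters": [
                        [
                            "OwnerId",
                            "{{column(QueryLoginUser_1.result, [\"Id\"]).asObject()}}",
                            "in"
                        ]
                    ]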


Display User Detail

The same method can be used to show the user's name in the dashboard: use a text widget and bind it to the query/step.

Sample:
"text_1": {
                "parameters": {
                    "content": {
                        "displayTemplate": "Hello [Name]",
                        "values": {
                            "Name": {
                                "field": "Name",
                                "sourceType": "result",
                                "step": "QueryLoginUser_1"
                            }
                        }
                    },
                    "fontSize": 16,
                    "showActionMenu": true,
                    "textAlignment": "left",
                    "textColor": "#091A3E"
                },
                "type": "text"
            }
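
To display more than one attribute, add extra entries under "values" and reference them in the display template. A sketch reusing the same QueryLoginUser_1 step (only the content block is shown):

                    "content": {
                        "displayTemplate": "Hello [Name] ([Id])",
                        "values": {
                            "Name": {
                                "field": "Name",
                                "sourceType": "result",
                                "step": "QueryLoginUser_1"
                            },
                            "Id": {
                                "field": "Id",
                                "sourceType": "result",
                                "step": "QueryLoginUser_1"
                            }
                        }
                    }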

Friday, October 11, 2019

Einstein Analytics: Transpose data from rows to columns

You know, in life you face many odd things, and it is the same when you work as an Einstein Analytics consultant. In the previous blog, we shared how to transpose data from columns to rows.



In this blog, the requirement is the other way round.

Background: we need to show child records at the parent level, and the number of rows in the table should follow the number of parents. The good news is that there is a limit on the number of children per parent; in this case, the maximum is 5.
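
To make it concrete, here is a made-up illustration (not the actual data from the screenshots below). The source has one row per child (City) under each parent (Country); the result has one row per parent with the children spread across the Ke_1 to Ke_5 columns:

Source:
Country | City
USA     | Chicago
USA     | New York
USA     | Seattle
Japan   | Osaka
Japan   | Tokyo

Result:
Country | Ke_1    | Ke_2     | Ke_3    | Ke_4 | Ke_5
USA     | Chicago | New York | Seattle |      |
Japan   | Osaka   | Tokyo    |         |      |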

Solution: use a lot of computeRelative transformation nodes.



Let us go through each computeRelative node:

Node-1
- Partition By = Country
- Order By = City ascending

Add 3 fields here (a dataflow JSON sketch of this node follows the list):
- IsFirst with SAQL: case when current(City)==first(City) then "Yes" else "No" end
- Ke_1 with Source Field = City and Offset Function = First
- Ke_2_temp with SAQL: case when previous(City)==first(City) then current(City) else "" end
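
Below is a sketch of how Node-1 might look in the dataflow JSON. The node and source names are assumptions; the computed fields mirror the list above (offset-based fields use sourceField/offset, SAQL-based fields use saqlExpression):

"Node_1": {
    "action": "computeRelative",
    "parameters": {
        "source": "extract_Source",
        "partitionBy": ["Country"],
        "orderBy": [
            { "name": "City", "direction": "asc" }
        ],
        "computedFields": [
            {
                "name": "IsFirst",
                "expression": {
                    "saqlExpression": "case when current(City)==first(City) then \"Yes\" else \"No\" end",
                    "type": "Text"
                }
            },
            {
                "name": "Ke_1",
                "expression": {
                    "sourceField": "City",
                    "offset": "first()",
                    "default": ""
                }
            },
            {
                "name": "Ke_2_temp",
                "expression": {
                    "saqlExpression": "case when previous(City)==first(City) then current(City) else \"\" end",
                    "type": "Text"
                }
            }
        ]
    }
}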


Node-2
- Partition By = Country
- Order By = Ke_2_temp descending

Add a field here:
- Ke_2 with Source Field = Ke_2_temp and Offset Function = First


Node-3
- Partition By = Country
- Order By = City ascending

Add a field here:
- Ke_3_temp with SAQL: case when current(City)==Ke_2 then next(City) else "" end


Node-4
- Partition By = Country
- Order By = Ke_3_temp descending

Add a field here:
- Ke_3 with Source Field = Ke_3_temp and Offset Function = First


Repeat the same pattern until Node-8. Only the fields highlighted in yellow will be used; the ones ending with _temp will be dropped, as we only use them as helpers.

Use a filter node to drop all rows where IsFirst is not "Yes", and a slice node to drop all fields except Country and the ones in yellow.
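
For reference, the final cleanup might look like this in the dataflow JSON. A sketch only: node names are assumptions, the filter uses the Field:EQ:Value syntax, and the slice node uses the sliceDataset transformation in "select" mode:

"Filter_IsFirst": {
    "action": "filter",
    "parameters": {
        "source": "Node_8",
        "filter": "IsFirst:EQ:Yes"
    }
},
"Slice_Fields": {
    "action": "sliceDataset",
    "parameters": {
        "source": "Filter_IsFirst",
        "mode": "select",
        "fields": [
            { "name": "Country" },
            { "name": "Ke_1" },
            { "name": "Ke_2" },
            { "name": "Ke_3" },
            { "name": "Ke_4" },
            { "name": "Ke_5" }
        ]
    }
}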

Here is the source


Here is the result






Wednesday, October 9, 2019

Salesforce Einstein – Where to start to experiment and understand Machine Learning?


by Jean-Michel Mougeolle, Salesforce MVP hall of fame, Salesforce Einstein Champion, SharinPix CEO.


What is the Einstein Champion program?
Let me start this blog with the Einstein Champions Program. The Einstein Champions Program is for Trailblazers who are passionate about the Einstein Platform and want to share their advanced knowledge with peers and evangelize the power of Einstein.

I have the chance to be part of it, certainly due to the various presentations I have made around Einstein Vision at Dreamforce and at many Dreamin' events. I'm convinced that Einstein Vision is a great way to start learning about Machine Learning in Salesforce.


Why start with Einstein Vision?
First, it will help you easily understand the benefits of Machine Learning and the approach it requires.

Second, you can easily play with it, FOR FREE!
For Free? You mean you don’t need any licenses?
No, you just have to install the Einstein Vision and Language Model Builder by Salesforce Labs, to start playing with it. The creation of models is free, and to test them you have up to 2000 predictions per month for free as well.




So where should we start to create our first model?
I would go with Einstein Vision Image Classification. It only takes a zip file with a few images organized by label in folders to get started. Of course, you may have to gather enough images per label to get something working, and take care of image format, size, and resolution. But if you only plan to create a model for testing, extracting images from a Google search should be sufficient to get nice results.


Can you explain the basis of Image Classification?
Yes, for sure. Image Classification makes predictions to identify a picture based on the examples on which the model has been trained. For example, the model can tell a cat from a dog if it has been well trained with enough dog and cat pictures.


For our demo jam with SharinPix, we used images from Google to create models that classify food pictures. The model can recognize hot dogs, pizza, burgers, drinks, desserts, BBQ meat, and more. That's a good example of how to classify menu line items from images so they can be sorted automatically.



You mean that you can train a model that easily?
Yes, you just have to catalog enough images per label (100), build a zip file with those, and create a dataset with it. The UI from the Salesforce Labs package allows you to easily create a dataset from a zip file. Once you have a dataset, you can train a model from the same package. The model is the « engine » that creates predictions.
Once you have a model, you can present a picture to it and the model will make a prediction.


What can we expect to learn from that?
The limits of a poor dataset. As an example, if you upload only white cats and only black dogs in a dataset, you will get a bad-quality dataset. If you then present a black cat to it, it will certainly predict it as a dog.

Getting a good dataset is key, and it's really easy to understand from examples that do not work. As Image Classification is very visual, you can easily learn about the right and wrong approaches to Machine Learning.


What about Object Detection?
It's much the same principle as Image Classification, but it can detect many objects in a picture and return the position, the count, and of course the probability associated with each recognition. The main usage for this is to automate retail execution from Shelf Display pictures.




Is that as easy as for the Image Classification?
Yes and no.
It doesn't require a different technology and it's the same approach: create a dataset with pictures and train a model to get predictions. But this time you need to label the pictures with bounding boxes representing all the objects you want to recognize.

So, in the example of retail execution, you may have to make it learn from shelf display images, where you have to draw boxes around each object you want to recognize, along with its name. And this time you don't need 100 pictures per label, but 200 bounding boxes per label across all the pictures used in the dataset. The boxes also need to be drawn precisely to get good predictions.


What are the main problems that can make you have a bad dataset?
The first is poor labeling quality. AI bases its logic on the examples you feed it; if you give it wrong examples, it will produce bad predictions. When you label hundreds of images, it's easy to make mistakes, so QA is mandatory to avoid any errors in the labeling.
The second: the diversity, frequency, and quality of the images are key. You should not use images that are too angled or have too much light. You may also need the same frequency for each object to recognize across all the images in the dataset.


You seem to be very experienced with that; does it come from what you have done with SharinPix?
Yes, we have provided the services to create tons of models for various big retail customers, but also for companies in other industries. We have labeled datasets that can recognize several hundred objects and that contain several thousand images.

A quality-driven approach is key in that kind of project: getting organized, having the right level of QA, and understanding the risk of each problem encountered are really important.
We have built an app to help teams that want to be serious about model making, model optimization, and model maintenance. We use it internally and provide the service worldwide to many different companies.


Is that available on the AppExchange?
Yes, it’s part of the SharinPix App, but you can reach me for any question about Machine Learning and the app whenever you need!


So, can you recap the best thing to start with if you want to learn about Einstein?
Yes, the first one is of course Trailhead; there is an incredible TrailMix that will teach you a lot: https://sfdc.co/einsteinchampionstrailmix

Then you can install the Model Builder provided by Salesforce Labs from the AppExchange:
https://appexchange.salesforce.com/appxListingDetail?listingId=a0N3A00000Ed1V8UAJ

And of course, if you want some help and get serious about Image Recognition you can rely on SharinPix App and Labelling Services: http://bit.ly/SharinPixAppExchange



Friday, October 4, 2019

Salesforce: SOQL Picklist Values & API Name

As you are aware, picklist Values and API Names can differ in Salesforce; see this screenshot:



When users enter data or run a report, they will only see the Values, not the API Names.



When an admin or developer runs a SOQL query, the result shows the API Name:
SELECT Id, Name, AccountSource FROM Account WHERE AccountSource <> ''



If you need to get the Values from SOQL, use the toLabel() function. Here is the updated query:
SELECT Id, Name, toLabel(AccountSource) FROM Account WHERE AccountSource <> ''
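
Note that toLabel() only changes what the query returns; filters in the WHERE clause still compare against the API Name. For example (assuming 'Web' is one of the picklist API Names in your org):

SELECT Id, Name, toLabel(AccountSource) FROM Account WHERE AccountSource = 'Web'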




Reference: Translating Results


