Dynamics Customer Service Solution 2.0

A while back I released a Customer Service solution to get a DEMO or simple production system up and running within an hour.

Due to recent updates to email-to-case and templates, the solution I had created failed on every installation. After a few weeks with Microsoft Support we sorted it out, and the solution is working again! 😀😀🙏🎉🎉🥂🍾

Remember to get the latest version (18 or above) of the solution from the GitHub folder.

The main change in the setup is that email-to-case is now part of Power Automate, and no longer part of our good friend Workflow.

A while back I wrote a blog post about how to create Email -> Case the new way.

Quick guide for email-to-case using the new solution import

After importing the solution you will find “Email 2 Case” in the Automatic Record Creation area. Open this via the Customer Service Hub.

Make sure you select the Queue that you added earlier (ADD QUEUE).

Open the Email 2 Case Flow to see the structure that Microsoft now has created for Email to Case.

If you want to add the Team as owner of the cases (optional), you have to add a line like the sketch below to the Owner field in the Flow.

NB! You have to retrieve the GUID of the Team from CRM.
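As a sketch (not the exact line from the original Flow), the Owner value typically follows the Dataverse OData bind format for teams. The GUID below is just a placeholder for your own Team's GUID, and depending on the connector version you may need the /teams(...) form instead:

teams(00000000-0000-0000-0000-000000000000)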

Finding the Team GUID

The last step is to activate the email-to-case record creation. At this point you should be able to see emails entering CRM as Cases.

AI Builder + Lobe

During the recent Arctic Cloud Developer Challenge hackathon I was playing around with AI Builder for the first time. The scenario we were building was Good guy / Bad guy detection.

The idea was that the citizens would be able to take pictures of suspicious behavior. Once the picture was taken, the classification of the picture would let them know if it was safe or not.

For this exercise I used my nephew's trusted BadBoy/GoodBoy toys 🙂

Lobe

To start off I downloaded a free tool called Lobe (www.lobe.ai). Microsoft acquired this tool recently, and it’s a great way to learn more about object recognition in pictures. The really cool thing about the software is that the calculations for the AI model are done on your local computer, so you don’t need to set up any Azure services to try out a recognition model.

Another great feature is that it integrates seamlessly with Power Platform. Once you train your model with the correct data, you just export it to Power Platform! 👏

The first thing you need to do is gather enough pictures of the object. Take at least 15 pictures from different angles so the model can understand the object you want to detect.

Tag all of the images with the correct tags.

The next step is to train the model. This is done using your local PC resources. When the training is complete you can export the model to Power Platform.

It’s actually that simple!!! This was really surprising to me:)

Power App

Next up was the Power App the citizens were going to use for the pictures. The idea, of course, was that everyone had this app on their phones and licensing wasn’t an issue 😂

I just added a camera control and used a button to call a Power Automate Cloud Flow, but this is also where the tricky parts began.

An image is not just an image!!!!! 😤🤦‍♂️🤦‍♀️

How on earth is anyone supposed to understand that you need to convert the picture you take so that you can send it to Flow, only to convert it there into something else before it makes sense???!??!

Base64 and Power Automate – What a shit show

After asking a few friends and googling tons of different tips/tricks, I was able to make the lines below work. I am sure there are other ways of doing this, but it’s not blatantly obvious to me.

// Capture the photo from the camera control
Set(WebcamPhoto, Camera1.Photo);

// JSON() serializes the image as a quoted data URI ("data:image/png;base64,....");
// the two Substitute() calls strip the prefix and the surrounding quotes, leaving the raw Base64 string
Set(PictureFormat, Substitute(Substitute(JSON(WebcamPhoto, JSONFormat.IncludeBinaryData), "data:image/png;base64,", ""), """", ""));

// Pass the Base64 string to the Cloud Flow
'PowerAppV2->Compose'.Run(PictureFormat);

The receiving Power Automate Cloud Flow looked like this:

I tried receiving the image as an image type, but I couldn’t figure it out. Therefore I sent it to the Flow as a Base64 string, and in the Flow I converted it back to a binary representation of the image before sending it to the prediction.
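As a rough sketch of that conversion step, the Compose action can use the base64ToBinary() expression on the incoming text. Here I assume the Power Apps (V2) trigger input was named PictureFormat; adjust it to whatever your trigger input is actually called:

base64ToBinary(triggerBody()?['PictureFormat'])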

The prediction, on the other hand, worked out really nicely!! I found the model that I had imported from Lobe, and used the outputs from the Compose 3 action (Base64 to binary). I have no idea what the differences are, but I just acknowledge that this is how it has to be done. I am sure there are other ways, but someone will have to teach me those in the future.

All in all it actually worked really well. As you can see here I added all types of outputs to learn from the data, but it was exactly as expected when taking a picture of Winnie the Pooh 😊 The bear was categorized as good, and my model was working.

Why Lobe?

One might wonder why I chose to use Lobe for this when AI Builder has the training functionality included within the Power Platform. For my simple case it wouldn’t really matter what I chose; I just wanted to test out the newest tech.

When working with a larger number of images, Lobe seems to be a lot easier for importing/tagging/training the model. Everything runs locally, so the speed of training and testing is a lot faster too. It’s also simple to retrain the model and upload it again. This being a hackathon, it was important to try new things out 😊

More about AI builder

I talked to Joe Fernandez from the AI Builder team, and he pointed me to some resources that are worth checking out on this topic.

https://myignite.microsoft.com/sessions/a5da5404-6a25-4428-b4d0-9aba67076a08 <- forward to 11:50 for info regarding the AI Builder

https://youtube.com/watch?v=MQQmDUCufS8 <- Lobe

Business Unit Name Change

Something small, but still useful. Thank you Tanguy for the tip 🤗

As you have probably figured out, the default Business Unit name is something you can’t change OOTB after you have created a CRM/Dataverse environment.

If you are not careful when creating a new environment, the org name will be set for you, and that is also when the Business Unit name is set.

This can result in a main Business Unit like the following, and it just doesn’t make any sense. You also can’t change it, because a Parent Business unit needs to be entered, and the problem is that there is no parent to the parent in this case 🤣

XrmToolBox – Bulk Data Update to the rescue😎

Open the view for active Business Units and find your record.

⛔NB!! Make sure your search only returns one business unit
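If you want to double-check that the single result really is the root Business Unit, a Dataverse Web API query like this sketch should return only that record, since the root is the only Business Unit without a parent (the org URL below is a placeholder):

https://yourorg.crm4.dynamics.com/api/data/v9.2/businessunits?$select=name&$filter=_parentbusinessunitid_value eq null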

The next thing you do is set a fixed value, add the attribute to the bulk job, and then update the record.

Power Apps Pay As you GO!!💸

When Microsoft introduced Azure to the Microsoft public, it was a new way of thinking. We were suddenly paying for what we needed, when we needed it. Amazon had been there for a long while, but for Microsoft customers this was a new way of thinking. After a skeptical start, this model has really become somewhat of a standard.

As of today, Power Platform will be available on an Azure subscription! It is being introduced as a “Pay as you go” model. It is important that you don’t mistake this for being the same as Azure. In Azure you typically only pay for the compute time used (in most cases), but here you pay for a license once you use an application.

WOW THIS IS SOOOO COOL … Well, is it really?

Let’s just think about the following first. Just a few weeks ago Microsoft dropped the prices to half of what they used to be. They are now only $5 and $20 for the different plans. When you think about the value you get from Dataverse OOTB, that is a BARGAIN already.

So why am I not overly excited about the “Pay as you Go” (PAYGO) model? Well, I don’t really see the big impact yet. Most of my customers are on a CSP agreement and can flex as much as they like. Planning ahead for apps is also hard, and is counterintuitive for innovation. With PAYGO you essentially need to budget for all users that might use an app, while you silently hope that not all users actually use the app that month. For every user that didn’t use the app, you save some money.

I am sure that the plan makes sense for many scenarios, but I just don’t really see them yet. The good thing is that the “limitations/possibilities” of the new plan will be monitored closely in the beginning to find the correct levels for all types of use cases. Remember to voice your opinion if you see a great opportunity. Microsoft will be listening 😀

Pricing comparison

Standard Pricing App and User Plan

Standard Pricing Storage

PAYGO Pricing app

https://docs.microsoft.com/en-us/power-platform/admin/powerapps-flow-licensing-faq#add-ons

PAYGO Storage

https://docs.microsoft.com/en-us/power-platform/admin/powerapps-flow-licensing-faq#add-ons

Personal Thoughts

The only thing we know for sure is that licensing will always be a situation where we as consumers want changes. We want more, more, more, and want to pay less, less, less. Microsoft will continuously find new license models to adapt to our wishes while finding ways to keep profits. Don’t get me wrong, I am all for Microsoft being able to charge what they want. After all, it’s a great product!!! I’m just saying that you need to look behind the shiny stuff before you automatically assume that everything new is better.

What you need to do as a customer is get help to assess your licensing situation. Not only is licensing complex from a rules perspective, but the applications can also be modified to adapt to licensing changes. I am not saying PAYGO is bad, but I’m not jumping on the PAYGO train quite yet. Most of my customers are CSP customers and have a lot of freedom with licensing (up and down). I’m just going to see what happens first 😁

I might also have misunderstood quite a lot regarding the benefits of this model, and if so I would love feedback to learn new ways of thinking! 👌

Dynamics 365 App for Outlook button

A tiny blog for a tiny button 😂 This is only relevant if you have users that work in Outlook Web. Every now and then I encounter a few Apple users that prefer Outlook Web, even though the app works well with Outlook for Mac.

Outlook client

If you use the Outlook client you know the button from the ribbon. Click the button to load the app.

Outlook Web

OOTB the Dynamics 365 app is hidden once it is deployed for the user. The only way to find it is to open the actual email and choose the ellipsis.

The great thing is that we can change the order of the buttons 🙂

Solution

Open the Outlook Web settings and choose “View All Outlook Settings”.

Find the Dynamics 365 button under Customize actions and click Save.

You now have the button easily accessible 🤗

The rise of the Galleries🌌

By now I hope most people know about https://pcf.gallery (run by Guido Preite). A great page for sharing community components (PCF) and exposing awesome contributions to the rest of the world. 🌎

What I like about the PCF Gallery is the simplicity of the site being only about PCF components. This is why I asked Guido if we could create a similar site for other components related to the Power Platform. He was so kind as to share his code for the project, so Matt Beard and I decided to give it a go. 🤗

Connector Gallery

First out in the list of future galleries is the Connector Gallery. This site will contain all sorts of custom connectors for Power Platform that you can share with the community. If you want to contribute to this gallery, you only have to share the custom connector file you have on GitHub, and we will post it!

Dynamics / Dataverse Environment Variables in Power Automate

There have been several posts on environment variables, but I think Microsoft just released a pretty GREAT update recently. I honestly don’t have a clue when this function was released, but it made my day a LOT easier 🙂 This is how we used to refer to environment variables:

Using Environment Variables in Dynamics 365 CRM – Part 1 | Microsoft Dynamics 365 CRM Tips and Tricks (inogic.com)

Using Environment Variables in Dynamics 365 CRM – Part 2 | Microsoft Dynamics 365 CRM Tips and Tricks (inogic.com)

Basics

When moving solutions from system A to system B we often have hardcoded values in workflows/flows/JavaScript etc. These cause problems because the fields often have different values in each environment.

Sure, you can migrate variables/records between systems and be careful to never overwrite them, but it can be time consuming to maintain.

Microsoft has introduced environment variables to Dynamics / Dataverse to fix this exact problem. You create a new environment variable in your solution, and you can set the value for each environment. Within CRM you can then refer to the environment variable instead of hardcoded values 🙂

Within the solution explorer you can add environment variables that can be used across the different systems. In this test I am using a variable to define what environment I am working in: “Test or Production”.

I won’t be writing about environment variables in detail, as you can read more about them in other posts, but I wanted to cover the new way of using them within Power Automate!

Accessing Environment Variables in Power Automate Flow

Start by creating a new Power Automate flow and add a Compose action.
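The new part (assuming your flow lives in the same solution as the environment variable) is that the variable now shows up directly in the dynamic content picker for the Compose action. Behind the scenes it resolves to the parameters() function, roughly like the sketch below, where “Environment Type” and the cr123_EnvironmentType schema name are made-up examples:

parameters('Environment Type (cr123_EnvironmentType)')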

Check out that bad boy! 🤗😎 No more need for Get record calls against the environment variable tables etc. A nice improvement that will save a few minutes here and there.

Power Automate – finding Dataverse environment URL

There are many ways of getting the current environment URL, but this is the quick and dirty version of doing just that 😏 Next week I will post about environment variables, as I know that is a possible approach that is better.

The intention of this blog is to show how to use Parse JSON to extract just a small part of an action's body and get what you want exposed.

Power Automate

So, this could be a typical flow in Power Automate for Dataverse. We have a trigger on top and an action below it to get more data about the record that triggered the flow.

The BODY of the trigger doesn’t contain environment information, only Opportunity data:

But the action contains a lot more interesting data for this purpose.

In order to get this data we need to parse the JSON returned here to retrieve the “@odata.id” property, which includes the URL of our environment.

{
    "type": "object",
    "properties": {
        "@@odata.id": {
            "type": "string"
        }
    }
}

Now you can store the string returned in a String variable.
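Assuming the Parse JSON action is named “Parse JSON” (adjust to match your own flow), the Set variable action can pull the property out with an expression like:

body('Parse_JSON')?['@odata.id']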

Running the Power Automate

When you run the Power Automate, the variable Environment will now include the URL of the system running. From here you can use the string variable in a formula like:

IF Environment contains “org.crm4” etc.
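In a Condition that check can be written as an expression; a small sketch, where “crm4” is just the region identifier from my example URL:

contains(variables('Environment'), 'crm4')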

In my next post I will show how the environment variables functionality in Dataverse / Dynamics works, and how we can use it for the same purpose 😀

Dynamics 365 Teams Document Locations – Where art thou?🤷‍♂️

Even though we all wish it wasn’t so, Document Locations still rule the integration between Dynamics 365 and SharePoint. I’m not saying that I have a better idea of what would be a smarter way to solve it, but it all seems a bit “2011”-ish.

Last week I encountered a problem with the Document Locations for Teams, and I was surprised when I couldn’t find them in the Document Locations list at first. The list only contained the SharePoint sites that the standard SharePoint connector uses.

In this list I was missing all of the Teams locations. It turns out that the view only shows active SHAREPOINT locations... hehe

All you have to do is add “MS Teams” to the search, and you should see all of the document locations for Teams as well.

Dynamics 365 + Teams integration error

I recently ran into a problem with the OOTB Teams integration, where the integration continuously threw an error after connecting the Dynamics record to Teams.

This is a pretty vanilla environment, so I couldn’t quite figure out what was wrong. I could obviously see that the URL was wrong, but I didn’t understand WHY it was wrong.

https://**.sharepoint.com/sites/SuperCards/Shared%20Documents/General <- Nothing really wrong with this URL at first sight.

After a lot of painful digging I finally found the issue. Someone had decided to install SharePoint in Norwegian when they first set up the tenant!!! hehe. This meant that the default document library has a localized URL, so the SharePoint URL generated by the integration was wrong.

Wrong URL ⛔

Correct URL ✅

Solution?

Microsoft Support didn’t see a fix for language support in the near future, so I guess it’s time for a small workaround 🙂 It’s not a really exciting fix, but you need to create a Workflow or Power Automate on create to change the name of the document location to your local language.

Why a workflow, you may ask? When was the last time a workflow failed you, I answer 😎