A while back I released a Customer Service solution to get a demo or simple production system up and running within an hour.
Due to recent updates to email to case and templates, the solution I had created failed every time on installation. After a few weeks with Microsoft Support we sorted it out, and the solution is back up and working again!😀😀🙏🎉🎉🥂🍾
Remember to get the latest version (18 or above) of the solution from the GitHub folder.
The main change in the setup is that email to case is now part of Power Automate, and no longer part of our good friend Workflow.
The idea was that the citizens would be able to take pictures of suspicious behavior. Once the picture was taken, the classification of the picture would let them know if it was safe or not.
For this exercise I would use my trusted BadBoy/GoodBoy nephew toys:)
To start it off I downloaded a free tool called Lobe (www.lobe.ai). Microsoft acquired this tool recently, and it’s a great way to learn more about object recognition in pictures. The really cool thing about the software is that the calculations for the AI model are done on your local computer, so you don’t need to set up any Azure services to try out a recognition model.
Another great feature is that it integrates seamlessly with Power Platform. Once you train your model with the correct data, you just export it to Power Platform!👏
The first thing you need is enough pictures of the object. Take at least 15 pictures from different angles so the model can learn the object you want to detect.
Tag all of the images with the correct tags.
The next step is to train the model, which is done using your local PC resources. When the training is complete you can export to Power Platform.
It’s actually that simple!!! This was really surprising to me:)
Next up was the Power App the citizens were going to use for the pictures. The idea, of course, was that everyone had this app on their phones and licensing wasn’t an issue 😂
I just added a camera control, and used a button to call a Power Automate Cloud Flow, but this is also where the tricky parts began.
An image is not just an image!!!!! 😤🤦‍♂️🤦‍♀️
How on earth is anyone supposed to understand that you need to convert the picture you take so you can send it to Flow, only to convert it there into something else before it makes sense???!??!
Base64 and Power Automate – What a shit show
After asking a few friends and googling tons of different tips/tricks I was able to make this line here work. I am sure there are other ways of doing this, but it wasn’t blatantly obvious to me.
The receiving Power Automate Cloud Flow looked like this:
I tried receiving the image as type image, but I couldn’t figure it out. Therefore I converted it to Base64 when sending it to Flow. In the Flow I converted it back to a binary representation of the image before sending it to the prediction.
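To make the conversion less mysterious, here is a small Python sketch of what the Flow’s base64-to-binary step is doing conceptually. The byte string is a made-up stand-in for the camera image; `b64encode` plays the role of the app sending Base64, and `b64decode` plays the role of the Flow’s `base64ToBinary()` expression:

```python
import base64

# Stand-in for the raw camera image bytes (not a real image).
image_bytes = b"\x89PNG fake image bytes"

# What the Power App effectively sends to the Flow: a Base64 string.
as_base64 = base64.b64encode(image_bytes).decode("ascii")

# What the Flow recovers with base64ToBinary() before calling the prediction.
recovered = base64.b64decode(as_base64)

print(recovered == image_bytes)  # the round trip is lossless
```

The point is simply that Base64 is a text-safe encoding of the same bytes, so nothing is lost going back and forth.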
The prediction, on the other hand, worked out really nicely!! I found the model I had imported from Lobe, and used the outputs from the Compose 3 action (base64 to binary). I have no idea what the differences are, but I just acknowledge that this is how it has to be done. I am sure there are other ways, but someone will have to teach me those in the future.
All in all it actually worked really well. As you can see here I added all types of outputs to learn from the data, but it was exactly as expected when taking a picture of Winnie the Pooh 😊 The bear was categorized as good, and my model was working.
One can wonder why I chose to use Lobe for this, when AI Builder has the training functionality included within the Power Platform. For my simple case it wouldn’t really matter which one I used; I just wanted to test out the newest tech.
When working with a larger number of images, Lobe seems to be a lot easier for importing/tagging/training the model. Everything runs locally, so training and testing is a lot faster too. It’s also simple to retrain the model and upload it again. This being a hackathon, it was important to try new things out 😊
More about AI builder
I talked to Joe Fernandez from the AI builder team, and he pointed me to some resources that are nice to checkout regarding this topic.
I know, I know… This post is nothing near Power Platform or Dynamics, but it’s just a friendly reminder that you need to check your backup of the Microsoft Authenticator app on your phone.
I don’t really feel the need to explain what the app is, because I expect everyone reading my blog to be aware of the application in some shape or form (private and business). As a consultant your application might look something like mine (pages and pages of entries):
This is literally pages and pages of logins for customers that have two-factor authentication set up. Losing access to this will be a serious pain in the A**!! Well, that is exactly what happened the other day, without me having any idea why. 🤬🤬🤬
Of course I headed on over to my backup just to see that THE BACKUP WAS NOT THERE!!
Well, the backup was there, but it was a stale backup from a year ago. For some reason the application had stopped doing the automatic backups. 🤦‍♂️🤦‍♂️
Just make sure backup is turned ON, AND make sure that it is backing up automatically 😂
1. Open Settings
2. Turn backup on
3. If it is already on, click Details to check the date of the latest backup
Something small, but yet useful. Thank you Tanguy for the tip🤗
As you probably figured out, the default Business Unit name is something you can’t change OOTB after you have created a CRM/Dataverse environment.
If you are not careful when creating a new environment, the org name will be set for you, and that is also when the Business Unit name is set.
This can result in a main Business Unit like the following, and it just doesn’t make any sense. You also can’t change it, because Parent Business is a required field, and the root Business Unit has no parent 🤣
XrmToolBox – Bulk Data Update to the rescue😎
Open the view for active Business Units and find your result.
⛔NB!!Make sure your search only returns one business unit⛔
Next, set a fixed value, add the attribute to the bulk job, and then update the record.
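Under the hood, this bulk update is just a single-column update of `name` on the `businessunit` row. As a hedged illustration (the environment URL, GUID, and new name below are all placeholders I invented; XrmToolBox builds and sends this for you), the equivalent Dataverse Web API request would be shaped roughly like this:

```python
import json

# Placeholder values - not a real environment or record id.
env_url = "https://yourorg.crm.dynamics.com"
bu_id = "00000000-0000-0000-0000-000000000001"

# The rename is a PATCH updating only the "name" column on businessunit.
request = {
    "method": "PATCH",
    "url": f"{env_url}/api/data/v9.2/businessunits({bu_id})",
    "headers": {"Content-Type": "application/json"},
    "body": json.dumps({"name": "Contoso"}),
}

print(request["url"])
```

This is just to show how small the actual change is; the tool’s value is finding the right record safely and doing the update for you.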
Step 2.. Break down the URL from the first picture like this
Step three… If you don’t really know how to do this, ask a friend!! 🙂
Step three again.. Enter the security settings. If you provide anything other than blank here, you will be prompted for credentials the first time you create a connection to the connector.
I broke the URL down further with the “api_key” as a query parameter, so that it would show in the URL like the example in the first picture.
Step four.. Create a search tag like the one I had in the URL from the first picture
Step five.. Get the URL from the first picture with your API key, and add this to the import sample
Choose the GET in this case, and add the full URL
Your request should look something like this:
Step 6.. Add a connection to the connector and test with a tag. It should return some info like this:
When you are done, you have a custom connector you can reuse from Power Apps, Flow or any other tool that can use custom connectors.
If you are using email to case, SLAs or any Automatic Record Creation rules in the classic UI, you really need to migrate your rules to the new UI ASAP.
I can’t seem to find the message in make.powerapps.com, but it’s one of the first messages that appears when opening the classic editor in solutions.
This post is not going to comment on the pros/cons of the old vs the new; I will have to come back to you on that one. I am simply stating that you have to do this, because it’s not only being deprecated, it is being shut down. Microsoft has actually created a migration path for the rules, and documented the process fairly well.
I use these rules mostly in the email to case scenario. First come some conditions, and then the actual creation of the case record.
When running through the migration, it does in fact migrate the conditions correctly, but not the owner that I have in the create statement later. This has to be added manually to the flow created by MS.
After the migration is done, I had the following:
A record rule in the new UI
The conditions also got migrated without issues
And a new flow that Microsoft auto-creates for the email to case record creation. The red box marks where I had to add the “owner” field for the team ownership that I normally use.
All of these steps are automatically added by Microsoft. This is also the case if you do a NEW Email to Case from the UI. It feels a bit odd to put my faith in the hands of something automatically created, but for now I have to go with the flow.
NB!!! REMEMBER TO CLICK THE SAVE BUTTON IN FLOW BEFORE ACTIVATING THE RULE
So far it seems to be working OK, but I have to do some more heavy testing before I can conclude that the Flow is as stable as the old rules within Dynamics. We are running it in production, so I will update if I see any problems 🤞
By now I hope most people know the https://pcf.gallery (run by Guido Preite). A great page for sharing community components (PCF) and exposing awesome contributions to the rest of the world. 🌎
What I like about the PCF Gallery is the simplicity of the site only being about PCF components. This is why I asked Guido if we could create a similar site for other components regarding the Power Platform. He was so kind to share his code for the project, so Matt Beard and I decided to give it a go. 🤗
First out in the list of future galleries is the CONNECTOR gallery. This site will contain all sorts of custom connectors for the Power Platform that you can share with the community. If you want to contribute to this gallery, you only have to share your custom connector file on GitHub, and we will post it!
There have been several posts on environment variables, but I think Microsoft just released a pretty GREAT update recently. I honestly don’t have a clue when this function was released, but it made my day a LOT easier :) This is how we used to refer to environment variables:
Sure, you can migrate variables/records between systems and be careful to never overwrite anything, but it can be time consuming to maintain.
Microsoft introduced environment variables in Dynamics / Dataverse to fix this exact problem. You create a new environment variable in your solution, and you can set the value for each environment. Within CRM you can then refer to the environment variable instead of hardcoded values :)
Within the solution explorer you can add environment variables that can be used across the different systems. In this test I am using a variable to define which environment I am working in, “Test” or “Production”.
I won’t be writing about environment variables in detail, as you can read more about them in other posts, but I wanted to cover the new way of using them within Power Automate!
Accessing Environment Variables in Power Automate Flow
Start by creating a new Power Automate flow, and add a Compose action.
Check out that bad boy! 🤗😎 No longer any need for Get Record calls from entities etc. A nice improvement that will save a few minutes here and there.
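Conceptually, the new direct reference just resolves the variable’s value for whichever environment the flow runs in: the environment-specific value if one is set, otherwise the default. A small Python sketch of that resolution logic (the variable and its values are invented for the example):

```python
# Invented example: an environment variable holding the environment type.
environment_variable = {
    "default_value": "Test",
    "current_value": "Production",  # value set in this particular environment
}

def resolve(variable: dict) -> str:
    """Use the environment's own value if set, else fall back to the default -
    which is what the direct environment-variable reference does for you."""
    return variable.get("current_value") or variable["default_value"]

print(resolve(environment_variable))
```

So the same solution deployed to Test and Production picks up different values without any Get Record steps or hardcoding.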
There are many ways of getting the current environment URL, but this is the quick and dirty version 😏 Next week I will post about environment variables, which I know is a better approach for this.
The intention of this blog post is to show how to use Parse JSON to parse just a small part of an action body and get exactly what you want exposed.
So this could be a typical flow in Power Automate for Dataverse: a trigger on top, and an action below to get more data about the record that triggered it.
The BODY of the trigger doesn’t contain environment information, but only Opportunity data:
But the Action contains a lot more interesting data for this.
In order to get this data we need to parse the JSON returned here to retrieve the “@odata.id” property, which includes the URL of our environment.
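To show the idea outside of Power Automate, here is a Python sketch of the same extraction. The action body below is a trimmed, made-up example (the org URL and record id are placeholders), but the shape of `@odata.id` matches what the action returns: a full record URL with the environment URL in front of the `api/data` path:

```python
import json
from urllib.parse import urlsplit

# Trimmed, made-up version of the action body returned by the flow.
action_body = json.dumps({
    "@odata.id": "https://yourorg.crm4.dynamics.com/api/data/v9.1/"
                 "opportunities(00000000-0000-0000-0000-000000000001)"
})

parsed = json.loads(action_body)
odata_id = parsed["@odata.id"]

# Keep only scheme + host: that is the environment URL.
parts = urlsplit(odata_id)
environment_url = f"{parts.scheme}://{parts.netloc}"
print(environment_url)  # -> https://yourorg.crm4.dynamics.com
```

In the flow itself, the Parse JSON action plays the role of `json.loads` here, and a small expression over the `@odata.id` output gives you the environment URL.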