The idea was that the citizens would be able to take pictures of suspicious behavior. Once the picture was taken, the classification of the picture would let them know if it was safe or not.
For this exercise I used my nephew's trusted BadBoy/GoodBoy toys :)
To start it off I downloaded a free tool called Lobe (www.lobe.ai). Microsoft acquired this tool recently, and it's a great tool to learn more about object recognition in pictures. The really cool thing about the software is that the calculations for the AI model are done on your local computer, so you don't need to set up any Azure services to try out a recognition model.
Another great feature is that it integrates seamlessly with Power Platform. Once you train your model with the correct data, you just export it to Power Platform! 👏
The first thing you need is enough pictures of the object. Take at least 15 pictures from different angles so the model can understand the object you want to detect.
Tag all of the images with the correct tags.
The next step is to train the model. This is done using your local PC's resources. When the training is complete, you can export the model to Power Platform.
It’s actually that simple!!! This was really surprising to me:)
Next up was the Power App the citizens were going to use for the pictures. The idea, of course, was that everyone had this app on their phone and licensing wasn't an issue 😂
I just added a camera control, and used a button to call a Power Automate Cloud Flow, but this is also where the tricky parts began.
An image is not just an image!!!!! 😤🤦♂️🤦♀️
How on earth is anyone supposed to understand that you need to convert the picture you take so that you can send it to Flow, only to convert it there into something else that then makes sense???!??!
Base64 and Power Automate – What a shit show
After asking a few friends and googling tons of different tips/tricks, I was able to make this line here work. I am sure there are other ways of doing this, but it's not blatantly obvious to me.
The receiving Power Automate Cloud Flow looked like this:
I tried receiving the image as an image type, but I couldn't figure it out. Therefore I converted it to Base64, I believe, when sending it to Flow. In the Flow I converted this back to a binary representation of the image before sending it to the prediction.
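To make the conversion dance a bit more concrete, here is a small Python sketch of what happens between the app and the Flow. It is only an illustration under my assumptions: the app sends the photo as a Base64 data URI (in Power Apps this is typically done with `JSON(Camera1.Photo, JSONFormat.IncludeBinaryData)` — your control names will differ), and the Flow's Compose action runs `base64ToBinary()` on the string part.

```python
import base64

# Hypothetical payload from the app: the camera photo serialised
# as a Base64 data URI ("fake jpeg bytes" stands in for a real image).
data_uri = "data:image/jpeg;base64," + base64.b64encode(b"fake jpeg bytes").decode("ascii")

# In the Flow: strip the "data:...;base64," prefix, then decode.
# This is what the base64ToBinary() expression in my Compose action does.
b64_part = data_uri.split("base64,", 1)[1]
binary = base64.b64decode(b64_part)

print(binary == b"fake jpeg bytes")  # True: the round trip is lossless
```

The raw `binary` is what the prediction action expects, which is why sending the Base64 string straight in doesn't work.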
The prediction, on the other hand, worked out really nicely!! I found the model I had imported from Lobe and used the outputs from the Compose 3 action (Base64 to binary). I have no idea what the differences are, but I just acknowledge that this is how it has to be done.. I am sure there are other ways, but someone will have to teach me those in the future.
All in all it actually worked really well. As you can see here, I added all types of outputs to learn from the data, but it was exactly as expected when taking a picture of Winnie the Pooh 😊 The bear was categorized as good, and my model was working.
One might wonder why I chose Lobe for this when AI Builder has the training functionality included within Power Platform. For my simple case it wouldn't really matter which I chose; I just wanted to test out the newest tech.
When working with a larger number of images, Lobe seems to be a lot easier for importing/tagging/training the model. Everything runs locally, so training and testing are also a lot faster. It's also simple to retrain the model and upload it again. This being a hackathon, it was important to try new things out 😊
More about AI builder
I talked to Joe Fernandez from the AI Builder team, and he pointed me to some resources that are nice to check out regarding this topic.
I just participated in ACDC 2022 (arcticclouddeveloperchallenge.net), a 3-day hackathon, and it was one of the biggest emotional rollercoasters I have had in MANY years.
Let me just paint the picture first.. ACDC is a yearly hackathon where the best of the best in Norway gather to explore the Dynamics/Power Platform/M365/Azure platforms, creating stellar products. What makes it different from other hackathons?
Mandatory physical attendance
Sleepover at the hotel required, even if you live in the city
Surprise challenges with rewards (head to head)
Lunch and Dinner every day at the hotel, mandatory attendance
Mandatory Socializing activities outside of the teams
Amazing judges every time
End of the hackathon dinner and party
Great swag 👊
Lots of energy drinks 🔋
Little to no sleep.. Yea.. I felt that one personally 💤💤💤
I know, I know.. Many of the hackathons out there have similar setups, but often they only include parts of what we have to offer. I am of course extremely biased, because I am part of the committee. Today I am writing as a participant from Team Pizza Time. 🍕
Most of the teams participating were senior teams with tons of Dynamics experience and many years in the workforce. We did however have a few new teams with us this year, and that is always quite the challenge. The rules can be complex to follow the first time, and most struggle to keep up.
The judges this year did an extraordinary job keeping everyone in line and helping all of the new teams understand what was going on.
After the initial rigging of computers, each team was introduced. Every team presented their initial ideas and small hints about the technology they were planning to use over the next 3 days.
Build a Turtles HQ
Pizza Ordering Store
An ambitious plan involving the following keywords for technology:
Power Apps Portals
Model Driven App
IoT (proximity, heat)
Motion Sensor Camera
Hue Color Lights
Intranet in Teams
Our team had a pre-meeting to decide what tech we wanted to work on and the scenarios we wanted to solve. Our scenario was to build a Turtles HQ with security notifications and control center functionality. Then we were going to migrate this story into a Pizza Shop working with pizza orders and deliveries.
After our first meeting regarding the solution, I think we all had an idea of what we wanted to solve, but not necessarily the same one. It is not uncommon for creative people to think differently about a topic even when they believe they are on the same page. This is one of the challenges when working with technology, and later on it would prove to be a huge one for our team.
After day one we had picked up a solid amount of points. We had our Teams intranet, Portals for ordering, Power Apps for ordering, a Power BI report, a Raspberry Pi, IoT sensors (2 heat, 2 motion sensors, 1 Hue light bulb), a Google Nest Hub, a native React app for pizza inspo etc etc.. We were on fire, and far ahead of the other teams in technology!! (personal opinion) 🔥🔥💯💯
We even won a head-to-head challenge against the other teams, where the first one to finish received extra points. In the challenge we had to embed a Power App within a Power BI report and read/edit data in this Power App. The idea was to update Power BI directly via the embedded Power App. This scored us a solid number of extra points and a new badge for the collection 🥇
At this point in time we were seriously kicking some ass and went to bed as potential winners.
Day 2 – WTF happened?
Where to begin…. I woke up happy and proud of all the achievements from day 1. Everything seemed to be going as I had planned. We were geeking BIG TIME and having so much fun putting different technical things together. We were also gathering lots of the extra bonus points for doing the occasional odd “side quest”.
By this day we had started automating processes: sensors were triggering events, the Power App was connected to feedback surveys, and the ordering portal was combining a weather API with maps to give estimated delivery times etc etc.
At the end of the day, every team had to deliver a blog post explaining what we had done since day one. We were 9 teams onsite, so it was important for the judges to have something to read through to be able to cover everyone’s updates. We had made some great progress with our technologies and almost all gadgets were functioning in automation as we had planned. We were feeling quite confident in the next round of points.
This is where the rollercoaster of emotions started! 🎢🎢🎢
When the points from day 2 were announced, we had moved from point winners to point losers. We were almost dead last in every category we had been winning the day before. This was not only the case for Pizza Time; it was also the case for a few of the other senior teams. What had happened, we asked each other. The junior team with almost no experience at all was getting all of the points. This surely must have been some type of error.. Right!??!? Of course we had a lot of meetings with the judges trying to figure out what the f*** had happened, but their answers were quite simple.
“Thomas, did you answer how you had added more value to the main categories from day one?”.
I was baffled..
"Answer: I wrote about all of the amazing things we put together, tech-wise **check blog day 2**. What we have done is really cool!!"
“But how does that relate to the categories where you present business value, user experience etc?”
I had to think about that one. In my mind, the business value was obvious. We had put together so much technology, and it was pretty impressive (given the amount of time). After about 30 minutes of not saying much, just looking at my screen in despair, I realized they were right. We were not presenting the solution with a value proposition. It even made us wonder whether the initial value proposition was good enough.
After dinner and some "bad vibes", what I personally felt was an extraordinary journey began. A journey that made me extremely proud to be part of the team.
We sat down for almost 2 hours straight, breaking down every piece of our solution and trying to figure out what the business value was. We compared it to other deliveries that had lots of points, and that's when we noticed a few key elements. They were better at selling business value, with the technology only secondary. It was so simple and obvious that it pissed me off that I hadn't thought of it before.
The principle applies to every real-life scenario. If I can't convince my customer that my technology adds value to their business, they will never buy my services.
So the seniors put their heads together and pulled an “all nighter”. We completely ripped our business case apart, and revitalized every aspect of our technology. Our mission was no longer about the Turtles HQ and keeping the city safe from monsters, but it was about the city being in a bad state and helping out those in need.
I don't really know when I started day 3, because I simply didn't sleep. I was up all night making adjustments to the tech, having to say yes/no to a few components. It hurt having to trash parts of a solution I had been working on for 1.5 days, but that's the name of the game!
Our pitch had moved from a crime-fighting city with Turtles to a city in need of help after covid. Unemployment was up, and the poverty gap was bigger than ever before. People living on the street needed food, and we had a service that could provide food for the needy. Our mission statement went from being bold and covering a lot of workloads to the simpler "We make pizza for the people, no matter what social status you have".
You can read more about our final delivery here, and you can even compare it to the first post if you're curious.
The youngsters made us realize what we should have been focusing on all along: what is the problem, and how can we solve it? We were so focused on being geeks and having fun that we lost track of a key element of delivering IT solutions. I am a little angry that I didn't think of this earlier, but at the same time, thinking of it earlier would not have given us the chance to turn around and prove the value of senior consultants. When we got hit in the face with reality, we could have just quit.. Instead we pushed through the night and delivered a phenomenal presentation (personal opinion) that we were really proud of.
We eventually finished in 2nd place behind the kids, but I am extremely happy with how the team managed to work together and push each other to the limit. We ended up feeling like we won that 2nd place, and next year you'd better believe I am coming for 1st!!! 🏆
I know, I know.. This post is nothing near Power Platform or Dynamics, but it's just a friendly reminder that you need to check your backup of the Microsoft Authenticator app on your phone.
I don't really feel the need to explain what the app is, because I expect everyone reading my blog to be aware of it in some shape or form (private and business). As a consultant, your app might look something like mine (pages and pages of entries):
This is literally pages and pages of logins for customers that have two-factor authentication set up. Losing access to this would be a serious pain in the A**!! Well, that is exactly what happened the other day, without me having any idea why. 🤬🤬🤬
Of course I headed on over to my backup just to see that THE BACKUP WAS NOT THERE!!
Well, the backup was there, but it was a stale backup from a year ago. For some reason the application had stopped doing the automatic backups. 🤦♂️🤦♂️
Just make sure backup is turned ON, AND make sure that it is backing up automatically 😂
1. Open Settings
2. Turn backup on
3. If it is already on, click Details to check the date of the latest backup
Something small, but yet useful. Thank you Tanguy for the tip🤗
As you probably figured out, the default Business Unit name is something you can't change OOTB after you have created a CRM/Dataverse environment.
If you are not careful when creating a new environment, the org name will be set for you, and that is also when the Business Unit name is set.
This can result in a main Business Unit like the following, and it just doesn't make any sense. You also can't change it, because Parent Business is required, and there is no parent to the parent in this case 🤣
XrmToolBox – Bulk Data Update to the rescue😎
Open the view for active Business Units and find your result.
⛔NB!!Make sure your search only returns one business unit⛔
Next, set a fixed value, add the attribute to the bulk job, and then update the record.
When Microsoft introduced Azure to its customers, it was a new way of thinking. We were suddenly paying for what we needed, when we needed it. Amazon had been there for a long while, but for Microsoft customers this was a new way of thinking. After a skeptical start, this model has become somewhat of an industry standard.
As of today, Power Platform is available on an Azure subscription! It is being introduced as a "pay as you go" model. It is important that you don't mistake this for the Azure model: in Azure you actually only pay for the compute used (in most cases), but here you pay for a license once a user uses an application.
WOW THIS IS SOOOO COOL … Well, is it really?
Let's just think about the following first. Just a few weeks ago Microsoft cut prices to half of what they used to be. The plans are now only $5 and $20. When you think about the value you get from Dataverse OOTB, that is a BARGAIN already.
So why am I not overly excited about the "Pay as you go" (PAYGO) model? Well, I don't really see the big impact yet. Most of my customers are on the CSP agreement and can flex as much as they like. Planning ahead for apps is also hard, and counterintuitive for innovation. With a PAYGO plan, you essentially need to budget for all users that might use an app, while silently hoping that not all users actually use the app that month. For every user that didn't use the app, you save some money.
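To make the budgeting argument concrete, here is a tiny break-even sketch. The numbers are illustrative assumptions, not official pricing: a per-app plan where every licensed user costs $5/month, versus PAYGO where only users who actually open the app cost $10/month.

```python
# Illustrative break-even sketch (assumed prices, not official):
# per-app plan: every licensed user pays, active or not.
# PAYGO: only users who actually used the app that month pay, but more.
licensed_users = 100
per_app_price = 5    # $/licensed user/month (assumption)
paygo_price = 10     # $/active user/month (assumption)

def monthly_cost(active_users):
    """Return (per-app plan cost, PAYGO cost) for one month."""
    return licensed_users * per_app_price, active_users * paygo_price

for active in (20, 50, 80):
    plan, paygo = monthly_cost(active)
    print(f"{active} active users: plan ${plan} vs PAYGO ${paygo}")
```

With these made-up numbers, PAYGO only wins when fewer than half of the licensed users are active in a month, which is exactly the "silently hoping users stay away" situation described above.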
I am sure that the plan makes sense for many scenarios, but I just don’t really see them yet. The good thing is that “limitations/possibilities” for the new plan will be monitored closely in the beginning to find the correct levels for all types of use cases. Remember to voice your opinion if you see some great opportunity. Microsoft will be listening😀
Standard Pricing App and User Plan
Standard Pricing Storage
PAYGO Pricing app
The only thing we know for sure is that licensing will always be a situation where we as consumers want changes. We want more more more, and want to pay less less less. Microsoft will continuously find new license models that adapt to our wishes while finding ways to keep profits. Don't get me wrong, I am all for Microsoft being able to charge what they want. After all, it's a great product!!! I'm just saying that you need to look behind the shiny stuff before you automatically assume that everything new is better.
What you need to do as a customer is get help to assess your licensing situation. Not only is licensing complex from a rules perspective, but the applications can also be modified to adapt to licensing changes. I am not saying PAYGO is bad, but I'm not jumping on the PAYGO train quite yet. Most of my customers are CSP customers and have a lot of freedom with licensing (up and down). I'm just going to see what happens first 😁
I might also have misunderstood quite a lot regarding the benefits of this model, and if so I would love feedback to learn new ways of thinking! 👌
Step 2.. Break down the URL from the first picture like this
Step 3.. If you don't really know how to do this, ask a friend!! 🙂
Step 3 again.. Enter the security settings. When you enter the security settings and provide something more than blank, you will be prompted for the credentials the first time you create a connection to the connector.
I broke the URL down further, with the "api_key" as a query parameter, so that it would show in the URL like the example in the first picture.
Step 4.. Create a search tag like the one I had in the URL from the first picture
Step 5.. Get the URL from the first picture with your API key, and add it to the import sample
Choose GET in this case, and add the full URL
Your request should look something like this:
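In plain HTTP terms, the connector is just issuing a GET with the api_key in the query string. Here is a Python sketch of that request URL; the host, path, and parameter names are assumptions based on the screenshots, not the exact connector definition.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Assumed endpoint and parameters for illustration.
base = "https://api.giphy.com/v1/gifs/random"
params = {"api_key": "YOUR_API_KEY", "tag": "win"}
url = f"{base}?{urlencode(params)}"
print(url)

# The api_key travels as a query parameter, which is why the connector's
# security settings prompt you for it when you first create a connection.
query = parse_qs(urlsplit(url).query)
print(query["tag"])  # ['win']
```

The "search tag" from the earlier step is just the `tag` parameter here; the connector fills in the api_key for you on every call.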
Step 6.. Add a connection to the connector and test with a tag. It should return some info like this:
When you are done, you have a custom connector you can reuse from Power Apps, Flow or any other tool that can use custom connectors.
It doesn't matter how many times you tell a salesperson that data input is important; they will always ask, "What's the output I can look forward to?" A few years back I remember gamification being important, but you normally need a high volume of sales plus many sales reps for it to be any fun. So, in the constant pursuit of having some fun with sales, I managed to find a new way of spreading some joy 😉
If you are not familiar with Adaptive Cards, you are not alone. Adaptive Cards are configurable messages that can be delivered either within a chat or channel (plus many other services). The card is basically just a “custom page” that you create and send to the chat in Teams.
Adaptive Card Configuration
Learning how to create Adaptive Cards isn't all that hard. Head on over to https://adaptivecards.io/designer/ and begin designing your personal Adaptive Card. Because we are working with Teams, we choose the following.
Start by choosing Microsoft Teams – Light or Dark as the host app. Other configurations might not work.
The middle part is your actual card.
On the left side are the components you can drag/drop onto the card.
At the bottom is the JSON we are going to copy-paste once we are done with the visuals.
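If you are curious what that JSON pane roughly produces, here is a minimal card built in Python and dumped as JSON. The text and image URL are placeholders for illustration, not my actual card.

```python
import json

# A minimal Adaptive Card payload, similar in shape to what the
# designer's JSON pane shows. Text and URL are placeholders.
card = {
    "type": "AdaptiveCard",
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.4",
    "body": [
        {"type": "TextBlock", "text": "Opportunity WON!",
         "size": "Large", "weight": "Bolder"},
        {"type": "Image", "url": "https://example.com/win.gif",
         "altText": "Celebration GIF"},
    ],
}

# This JSON string is what you paste into the Flow's adaptive card step.
print(json.dumps(card, indent=2))
```

The `body` array is where the drag-and-dropped components end up; everything else is boilerplate the designer adds for you.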
Adaptive Card Samples
If you are as terrible as I am at being creative, you can find tons of samples here.
Gif’s ‘R’ us 🙌
For the next step we want to get a random GIF from a GIF site. I have found an API for Giphy that seems to be working great.
Head on over to the Giphy API and create a new account. Then create a new app to obtain an API key.
Create a Team for the sales users, or just use the one you already have. Once you have found the right place for the adaptive card we can start the next process.
Open Power Automate and setup a flow to trigger once the Opportunity Closes as Won.
The first steps are pretty basic. When the Oppty is closed as won, get the following pieces of data. These are standard Get Item by ID actions, so no need to dive into details here.
This will send a request to Giphy asking for a random GIF with the WIN tag.
Opening the URL in a web browser, you will receive a LONG JSON response back. The only part we need is the URL of the original image.
Therefore we have to parse the JSON to get the URL of the original image. If you don't know how to do this, search for "Power Automate Parse JSON". Basically, just copy the text above from the web browser and paste it into the "Generate from sample" part. This will automatically create the following JSON payload.
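The parsing itself can be sketched like this. The sample below is a trimmed-down, assumed version of what the Giphy random endpoint returns; the `data → images → original → url` path follows the Giphy docs, but most fields are omitted here.

```python
import json

# Trimmed-down sample response (structure per the Giphy docs,
# most fields omitted; the URL is a placeholder).
sample = """
{
  "data": {
    "images": {
      "original": { "url": "https://media.giphy.com/media/abc123/giphy.gif" }
    }
  }
}
"""

# The Parse JSON action in Flow does the same job: turn raw text
# into an object so you can pick out just the original image URL.
payload = json.loads(sample)
gif_url = payload["data"]["images"]["original"]["url"]
print(gif_url)
```

That `gif_url` is the single value the Adaptive Card needs for its Image element; everything else in the response can be ignored.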
Last step to the process is adding the Adaptive Card to the Flow
The text in the adaptive card is what I copied from the adaptive card configurator on top 🔝
I could add the complete Json here, but all you need is your personal adaptive card inserted to this step.
Result – CHECK OUT THIS BAD BOY😎
It's just a simple message in a chat, but this stuff really gets the sales team going.. Now it's a competition to get the funniest GIF and the most likes :) It's not really gamification, but it sure is a lot of fun 🎉😁🎈
A tiny blog for a tiny button 😂 This is only relevant if you have users that work in Outlook Web. Every now and then I encounter a few Apple users that prefer Outlook Web, even though it works well with Outlook for Mac.
If you use the Outlook client, you know the button from the ribbon. Click the button to load the client.
OOTB, the Dynamics button is hidden once it is deployed for the user. The only way to find it is to open the actual email and choose the ellipsis.
The great thing is that we can change the order of the buttons :)
Open the Outlook Web settings and choose “View All Outlook Settings”.
Find the Dynamics 365 button in the Customize Actions and click save.
If you are using email-to-case, SLA, or any Automatic Record Creation rules in classic, you really need to update your rules to the new UI ASAP.
I can’t seem to find the message in the make.powerapps.com, but it’s one of the first messages that appears when opening the classic editor in solutions.
This post is not going to comment on the pros/cons of the old vs the new; I will have to come back to you on that one. I am simply stating that you have to do this, because the old rules are not only being deprecated but shut down. Microsoft has actually created a migration path for the rules and documented the process fairly well.
I mostly use these rules in the email-to-case scenario. First the rule has some conditions, and then the actual creation of the case record.
When running through the migration, the conditions do in fact migrate correctly, but not the owner I have in the create statement later. That has to be added manually to the flow created by MS.
After the migration is done, I had the following:
A record rule in the new UI
The conditions also got migrated without issues
And a new flow that Microsoft auto-creates for the email-to-case record creation. The red box marks where I had to add the "owner" field for the Teams ownership I normally use.
All of these steps are added automatically by Microsoft. This is also the case if you create a NEW email-to-case rule from the UI. It feels a bit odd to put my faith in something automatically created, but for now I have to go with the flow.
NB!!! REMEMBER TO CLICK THE SAVE BUTTON IN FLOW BEFORE ACTIVATING THE RULE
So far it seems to be working OK, but I have to do some heavier testing before I can conclude that the Flow is as stable as the old rules within Dynamics. We are running it in production, so I will update if I see any problems 🤞
I wrote about this theme because I still think it is highly important, in some business scenarios, to merge data from a Dataverse table into DV4T. DV4T is essentially built for quick and agile projects, but these projects may need a little data from time to time to hit the ground running. Exporting/importing data via Power Automate is, in my book, not a very solid option, and feels like a "workaround".
In their recent update, KingsWaySoft released an Interactive Login experience.
Normally, best practice for SSIS and KingsWaySoft would be an Application User. The only problem is that Dataverse for Teams doesn't support application users. With the new connection type you manually add the username/password, and it retrieves a new token for each run.
How to set it up
Open your preferred SSIS tool and start a new connection manager. Notice that I have chosen the Interactive Login for OAuth Type.
Add the username and open “Select Organization”
I am not completely sure what "Use SDK App" is for, but we have two-factor authentication, and I had to activate this option to bypass the need for an App ID.
Click on Retrieve and now we can see all organizations. Dataverse AND DV4T. The BlueberryAPI is a Dataverse for Teams database.
WAIT, BEFORE YOU TEST THE CONNECTION! Click OK.
Right click on the connector and open properties
On the right side, make sure you set the API version to 9.1.
NOW you can test the connection 🤗
This is what Dataverse for Teams looks like: one table, called Events.