There have been several posts about environment variables, but I think Microsoft just released a pretty GREAT update recently. I honestly don’t have a clue when this feature was released, but it made my day a LOT easier 🙂
Picture a system A and a system B with a value that differs per system, but is referenced by something in both systems.
Microsoft has introduced environment variables in Dynamics / Dataverse to fix this exact problem.
Within the solution explorer you can add environment variables that can be used across the different systems. In this test I am using a variable to define which environment I am working in (“Test” or “Production”).
I won’t be writing about environment variables in detail, as you can read more about them in other posts, but I wanted to cover the new way of using them within Power Automate!
There are many ways of getting the current environment URL, but this is the quick and dirty version of doing just that 😏 Next week I will post about environment variables, as I know they are probably a better approach.
The intention of this post is to show how to parse JSON to extract just the small part of an action’s body that you want exposed.
So this could be a typical flow in Power Automate for Dataverse. We have a trigger on top, and an action below it to get more data about the record that fired the trigger.
The BODY of the trigger doesn’t contain environment information, only Opportunity data:
The action, however, contains a lot more interesting data for this.
In order to get this data we need to parse the JSON returned here to retrieve the “@odata.id” property, which includes the URL of our environment.
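To make the idea concrete, here is a minimal Python sketch of the same parsing step. The sample payload below is illustrative (shortened, made-up org name and record ID), but the “@odata.id” property is where the environment URL actually lives in a Dataverse response:

```python
import json
from urllib.parse import urlparse

# Illustrative, shortened example of a Dataverse action body.
# Real responses contain many more fields; the org name here is made up.
body = json.dumps({
    "@odata.id": "https://org12345.crm4.dynamics.com/api/data/v9.1/opportunities(11111111-2222-3333-4444-555555555555)",
    "name": "Big deal",
})

def environment_url(action_body: str) -> str:
    """Parse the JSON body and return only the scheme + host of the @odata.id URL."""
    data = json.loads(action_body)
    parsed = urlparse(data["@odata.id"])
    return f"{parsed.scheme}://{parsed.netloc}"

print(environment_url(body))  # https://org12345.crm4.dynamics.com
```

Inside Power Automate itself, the equivalent is a Parse JSON action on the body followed by an expression such as `uriHost(...)` on the parsed “@odata.id” value.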
Even though we all wish it wasn’t so, Document Locations still rule the integrations between Dynamics 365 and SharePoint. I’m not saying that I have a better idea for a smarter way of solving it, but it all seems a bit “2011”-ish.
Last week I encountered a problem with the Document Locations for Teams, and I was surprised when I couldn’t find them in the Document Locations at first. The list only contained the SharePoint sites that the standard SharePoint connector uses.
In this list I was missing all of the Teams locations. Turns out the view only shows active SharePoint locations… hehe
All you have to do is add “MS Teams” to the search, and you should see all of the document locations for Teams as well.
After a lot of painful digging I finally found the issue. Someone had decided to install SharePoint in Norwegian when they first set up the tenant!!! hehe. This meant that the SharePoint URL was wrong.
Wrong URL ⛔
Correct URL ✅
Microsoft Support didn’t see a fix for language support in the near future, so I guess it’s time for a small workaround 🙂 Not a very exciting fix, but you need to create a Workflow or a Power Automate flow that runs on create and changes the name of the document location to your local language.
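The rename logic itself is simple. A minimal Python sketch of the idea, where the English-to-Norwegian name mapping is my own assumption (adjust it to whatever your tenant language produces):

```python
# Hypothetical mapping from the English default name to the local-language one.
NAME_MAP = {"Documents": "Dokumenter"}

def localized_location_name(name: str) -> str:
    """Replace the English default document-location name with its local equivalent, if known."""
    for english, local in NAME_MAP.items():
        if english in name:
            return name.replace(english, local)
    return name  # no known mapping: leave the name untouched

print(localized_location_name("Documents on Default Site 1"))  # Dokumenter on Default Site 1
```

In the workflow you would apply the same substitution to the document location’s name field when the record is created.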
Why a workflow, you may ask? When was the last time a workflow failed you, I answer 😎
Launching a new app or a new CRM system always leaves the users with the same question: where do I find the application? At first I didn’t really understand the question, because I thought it was natural to bookmark the URL of your application, e.g. https://www.company.crm4.dynamics.com/***** etc.
Eventually I realized that most users actually use the waffle menu in Office 365 when navigating to applications that they don’t use continuously.
They were expecting to see the application in the list when they clicked the waffle menu, because this would save them time.
Luckily this is not a problem 😊 Open “All apps”, and locate the app you are looking for.
And just like that you now have a quick navigation to your CRM or Power App application in the Microsoft 365 app launcher👊
This post is almost not relevant anymore, since we will all be pushed into the make.powerapps experience “in the fullness of time”, but I will still be using the classic viewer for the foreseeable future 🙂
From time to time the images can get distorted for unknown reasons. This is not a big deal, as the buttons still work, but it can be annoying for the users. You might have seen this in these 2 places:
There seem to be several ways to solve this issue, but not all seem to work the same. I have seen some options describing a cache clear in the console of the browser, but for some reason none of those worked for me. Lately I have had to open the developer tools and delete things a bit more manually.
Press F12 in the browser where you see the error
Open the Application tab in the dev tools
Clear or delete anything within the storage related to dynamics.
Try restarting the browser, and hopefully that should do the trick 🙂
My last post described the ACDC hackathon and what it’s all about, but this post is about what our company ended up delivering for the Arctic Cloud Developer Challenge after 2 1/2 days of pure geeking 💪
Meet the Team
Area: Microsoft 365, SharePoint, Developer, Power Apps
Areas: Dynamics 365, Power Apps, Developer
Areas: Microsoft 365, Azure, Developer
Areas: Power BI, Data Platform, Power Apps
Areas: Dynamics 365, Power Apps, Power Platform
Welcome to LEGO CITY
Our idea was to create an interconnected city with bleeding-edge technology for monitoring the status quo of the citizens’ safety. The city is built next to a mountain that has recently been showing signs of activity. Geologists say it could blow at any time. Luckily the city has invested heavily in technology that could help save lives ❤
The city has a train track surrounding a lake. The train track is crucial for transporting LEGO bricks around and providing transportation for the factories producing parts. In the city you have a Mayor’s office, a LEGO factory, houses, and THE MOUNTAIN OF DOOM!! 🔥 Based on this drawing, our mission was to create the technology needed to make the city a safe place to live.
On the day of the event we started building right away, and this is what we created within a few hours😅
But since this is a technical blog, I will get into the details of the making of the Connected LEGO City, which made this project one of the coolest city concepts I have seen in a while!
Mountain of DOOM!!🔥
Let’s start at the top, near the mountain. We placed 2 IoT sensors for monitoring temperature and movement, using the Azure IoT DevKit sensors for the purpose. Both IoT sensors were pushing data to Azure IoT Hub, and then over to Stream Analytics. The constant changes were reported to the Mayor’s office. The impressive part here was…
Train Track Switch
At the top left we connected an Arduino device with a servo switch attached. This was used to change the train track from the long track to the short track.
By sending a boolean to the device, it would mechanically change the train’s direction. You can barely see the device hidden beneath the track, with a wire for power under the blue “water”. To see it in action, just watch the video on top.
Mayor’s Office – Command Central
In the bottom left we had the mayor’s office (aka command central), where the mayor could monitor all things happening in his city. This is where we were looking at the output from the city sensors, reported via Stream Analytics to Power BI.
We included Dynamics 365 information on the left to give a status on all ongoing city work orders (fixing problems), and on the right side we had live data from the mountain. One of the charts was showing the constant temperature, and others were measuring movement. Below, we connected to the Norwegian weather API so that we could understand what conditions might affect our emergency response over the next few days.
We also created a webserver running the live feed of the camera with the controls. The conductor could then log in and power the train in any direction needed 🚆🚂
Citizen Self Service
When there was an issue in the city, the citizens could report it via a portal to Dynamics 365 Customer Service. We used The Portal Connector for this scenario.
Submitting the case would deliver the case to Dynamics 365 Customer Service, where we used a few PCF components from pcf.gallery for picture rendering in CRM.
LEGO Factory – Field Service
Once the case had been evaluated, we would create a work order booking our trusted technician for the job. For this we used Field Service OOTB, and the technician would use the new mobile app for service delivery. We also connected the warehouse for picking parts from the factory, based on the initial pictures the technician would have seen.
Payment – Vipps
Once the technician had fixed the broken houses, he would send an invoice via the Norwegian payment service Vipps. The awesome thing was that it was all done using a Power Automate flow! Once the work order was complete, we simply created a payment via the Vipps API, and received our money.
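To give a feel for the payment step, here is a hedged Python sketch of building the payment request from a completed work order. The field names and amount convention below are illustrative, not the real Vipps schema; the flow in our solution did the equivalent with an HTTP action:

```python
def vipps_payment_payload(work_order_id: str, amount_nok: float, phone: str) -> dict:
    """Build a payment request body. Field names are illustrative, not the actual Vipps schema."""
    return {
        "orderId": work_order_id,
        # Payment APIs commonly expect the amount in the smallest unit (1/100 NOK)
        "amount": int(round(amount_nok * 100)),
        "customerPhoneNumber": phone,
        "transactionText": f"Repair work order {work_order_id}",
    }

payload = vipps_payment_payload("WO-1042", 499.50, "4712345678")
print(payload["amount"])  # 49950
```

The flow would then POST this payload to the payment endpoint once the Work Order status changed to complete.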
If the IoT sensors detected a crisis, they would start a flow that the mayor would have to approve.
From this flow we would also trigger sending SMS notifications to the citizens stored in Dataverse.
WOW… simply WOW. That’s the best way to summarize this year’s hackathon.
Normally this 3-day hackathon takes place in the beautiful hills of Holmenkollen, but this year we were forced to go online for obvious reasons. One small change we made was to let teams gather locally at their companies’ offices. This way we could add a social factor within the safety regulations, without having to keep everyone 100% at home office.
We decided to have a participation fee for the event, and everything extra would be donated to the children’s cancer charity www.barnekreftforeningen.no ❤
This project has consumed most of my time the last few months being a part of the committee, and getting it all together has not been easy.
We were 5 teams in total and almost 40 participants that dedicated 3 whole days to fun, where this year’s topic was LEGO.
I have already posted about the Customer lookup, but there is a bug at the moment that will prevent you from setting the Customer lookup correctly.
Luckily Microsoft has provided us with a temporary workaround while they figure out how they are going to support polymorphic lookups like Customer in dataflows 👌
When trying to map the Customer lookup, you won’t be able to see the correct lookup in the dataflow. What you see may differ, but it could be a different combination of fields like this:
All of the fields named like “Fieldsomething.Fieldsomethingelse” are lookups. The Customer lookup does not work as intended here, even though “ParentCustomerID” is the correct field to use. In a previous post I showed how to get around this issue, but Microsoft removed the project’s configuration options. After talking to support, they showed me how to make it work again!! 🙏💪
In our billing system we have the following structure: a debitor (aka Account) can potentially have lots of projects connected. The only way to connect data in a dataflow is via keys in the Dynamics configuration. In our case we have a key “AccountVAT”. This ID is a unique organization number used in Norway.
Both tables know the AccountID, but only the Account table holds the AccountVAT number. As you see in the image above, the Project doesn’t know which AccountVAT to connect to. Let’s fix that.
So I load up my 2 tables in dataflow. If you wonder how this is done, check my other posts on DataFlow.
Select the MERGE QUERIES option
Here we match the 2 tables based on AccountID. As you can see, there are lots of options for matching, and you also get a nice visual of the matching status at the bottom.
The next step is choosing what data to “join” into the Projects table.
So I select the AccountVAT number as the field to join.
As you see in the picture below, the Debitor.Foretaksnr is now visible. This column is now joined from the Debitor table to the Project table.
The selected row below is a special field. Debitor.Foretaksnr is a field from another table, and the destination field is special too: it is of type Lookup, and expects “AccountNumber” as the matching key.
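The merge the dataflow performs above can be sketched in plain Python. The table contents below are made up for illustration; the point is that the join copies AccountVAT from the Account (Debitor) side onto each project row via the shared AccountID:

```python
# Toy versions of the two tables; AccountVAT (Foretaksnr) only lives on the Account/Debitor side.
accounts = [
    {"AccountID": "A1", "AccountVAT": "987654321"},
    {"AccountID": "A2", "AccountVAT": "123456789"},
]
projects = [
    {"ProjectID": "P1", "AccountID": "A1"},
    {"ProjectID": "P2", "AccountID": "A2"},
    {"ProjectID": "P3", "AccountID": "A1"},
]

def merge_queries(projects, accounts):
    """Left-join AccountVAT onto each project row by AccountID, like Merge Queries in a dataflow."""
    vat_by_account = {a["AccountID"]: a["AccountVAT"] for a in accounts}
    return [
        {**p, "Debitor.Foretaksnr": vat_by_account.get(p["AccountID"])}
        for p in projects
    ]

merged = merge_queries(projects, accounts)
print(merged[0]["Debitor.Foretaksnr"])  # 987654321
```

The joined column is then what the dataflow maps to the Lookup destination field, which resolves it against the “AccountNumber” key in CRM.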
The result is project records linked to the correct accounts in CRM 👍💪