Launching a new app or a new CRM system always leaves users with the same question: where do I find the application? At first I didn’t really understand the question, because I thought it was natural to bookmark the URL to your application, e.g. https://www.company.crm4.dynamics.com/***** etc.
Eventually I realized that most users actually use the waffle menu in Office 365 when navigating to applications they don’t use continuously.
They expected to see the application in the list when they clicked the waffle menu, because this would save them time.
Luckily this is not a problem 😊 Open All apps, and locate the app you are looking for
And just like that you now have a quick navigation to your CRM or Power App application in the Microsoft 365 app launcher👊
This post is almost irrelevant by now, since we will all be pushed into the make.powerapps experience in “the fullness of time”, but I will still be using the classic viewer for the foreseeable future 🙂
From time to time the images can get distorted for unknown reasons. This is not a big deal, as the buttons still work, but it can be annoying for the users. You might have seen this in these 2 places:
There seem to be several ways to solve this issue, but not all of them work equally well. I have seen some options describing a cache clear in the console of the browser, but for some reason none of those worked for me. Lately I have had to open the developer tools and delete things a bit more manually.
Press F12 in the browser where you see the error
Open the Application tab on the dev tools
Clear or delete anything within the storage related to Dynamics.
Try to restart the browser, and hopefully that should do the trick:)
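If you prefer the console over clicking through the Application tab, a small snippet can do the same cleanup. This is only a sketch under the assumption that the cached entries have “dynamics” somewhere in their key — inspect your own storage keys first and adjust the filter if they are named differently:

```javascript
// Removes every entry whose key mentions "dynamics" from a Web Storage
// object (localStorage or sessionStorage). The "dynamics" key filter is
// an assumption - check your own keys in the Application tab first.
function clearDynamicsEntries(store) {
  Object.keys(store)
    .filter((key) => key.toLowerCase().includes("dynamics"))
    .forEach((key) => store.removeItem(key));
}

// In the dev tools console (F12) on the affected page:
// clearDynamicsEntries(localStorage);
// clearDynamicsEntries(sessionStorage);
```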
My last post described the ACDC hackathon and what it’s all about, but this post is about what our company ended up delivering for the Arctic Cloud Developer Challenge after 2 1/2 days of pure geeking 💪
Meet the Team
Areas: Microsoft 365, SharePoint, Developer, Power Apps
Areas: Dynamics 365, Power Apps, Developer
Areas: Microsoft 365, Azure, Developer
Areas: Power BI, Data Platform, Power Apps
Areas: Dynamics 365, Power Apps, Power Platform
Welcome to LEGO CITY
Our idea was to create an interconnected city with bleeding-edge technology for monitoring the status of the citizens’ safety. The city is built next to a mountain that has recently been showing signs of activity. Geologists say that it could blow at any time. Luckily the city has invested heavily in technology that could help them save lives ❤
The city has a train track surrounding a lake. The train track is crucial for transporting the LEGOs around, and provides transportation for factories producing parts. In the city you have a Mayor’s office, a LEGO factory, houses, and THE MOUNTAIN OF DOOM!! 🔥 Based on this drawing, our mission was to create the technology needed to make the city a safe place to live.
On the day of the event we started building right away, and this is what we created within a few hours😅
But since this is a technical blog, I will get into the details of the making of the Connected LEGO City, which made this project one of the coolest city concepts I have seen in a while!
Mountain of DOOM!!🔥
Let’s start at the top, near the mountain. We placed 2 IoT sensors for monitoring temperature and movement, using the Azure IoT DevKit sensors for this purpose. Both IoT sensors pushed data to Azure IoT Hub, and from there to Stream Analytics. The constant changes were reported to the Mayor’s office. The impressive part here was
Train Track Switch
In the top left we connected an Arduino device with a servo switch attached. This was used to change the train from the long track to the short track.
By sending a boolean to the device, it would mechanically change the train’s direction. You can barely see the device hidden beneath the track, with a wire for power under the blue “water”. To see it in action, just watch the video on top.
Mayor’s Office – Command Central
In the bottom left we had the mayor’s office (aka command central), where the mayor could monitor all things happening in his city. This is where we looked at the output from the city sensors, reported via Stream Analytics to Power BI.
We included Dynamics 365 information on the left to give a status on all ongoing city work orders (fixing problems), and on the right side we had live data from the mountain. One of the charts showed the temperature over time, and others measured movement. Below, we connected to the Norwegian weather API so that we could understand what conditions might affect our emergency response over the next few days.
We also created a webserver running the live feed of the camera with the controls. The conductor could then log in and power the train in any direction needed 🚆🚂
Citizen Self Service
When there was an issue in the city, the citizens could report it via a portal to Dynamics 365 Customer Service. We used The Portal Connector for this scenario.
Submitting the case would deliver it to Dynamics 365 Customer Service, where we used a few PCF components from pcf.gallery for picture rendering in CRM.
LEGO Factory – Field Service
Once the case had been evaluated, we would create a work order booking our trusted technician for the job. For this we used Field Service OOTB, and the technician used the new mobile app for service delivery. We also connected the warehouse for picking parts from the factory, based on the initial pictures the technician had seen.
Payment – Vipps
Once the technician had fixed the broken houses, he would send an invoice via the Norwegian payment service Vipps. The awesome thing was that it was all done with a Power Automate flow! Once the work order was complete, we simply created a payment via the Vipps API, and received our money.
If the IOT sensors detected crisis, they would start a flow that the mayor would have to approve.
From this flow we would also trigger SMS notifications to the citizens stored in Dataverse.
WOW… Simply WOW. It’s the best way to summarize this year’s hackathon.
Normally this 3-day hackathon is situated in the beautiful hills of Holmenkollen, but this year we were forced to go online for obvious reasons. One small difference was that we opened up for teams to gather locally at their companies’ offices. This way we could add a social factor within safety regulations, without having to keep everyone 100% at home office.
We decided to have a participation fee for the event, and everything extra was donated to the children’s cancer charity www.barnekreftforeningen.no ❤
This project has consumed most of my time the last few months being a part of the committee, and getting it all together has not been easy.
We were 5 teams and almost 40 participants in total, dedicating 3 whole days to fun, and this year’s topic was LEGO.
I have already posted about the Customer lookup, but there is a bug ATM that will prevent you from setting the Customer lookup correctly.
Luckily Microsoft has provided us with a temporary workaround while they figure out how they are going to support polymorphic lookups like Customer in dataflows 👌
When trying to map the Customer lookup, you won’t be able to see the correct lookup in the Dataflow. What you see may differ, but it could be a combination of fields like this:
All of the fields named like “Fieldsomething.Fieldsomethingelse” are lookups. The Customer lookup does not work as intended here, even though “ParentCustomerID” is the correct field to use. In a previous post I showed how to get around this issue, but Microsoft removed the project’s configuration options. After talking to support, I was shown how to make it work again!! 🙏💪
In our billing system we have the following structure: a debitor (aka Account) can potentially have many projects connected. The only way to connect data in a Dataflow is via keys in the Dynamics configuration. In our case we have a key, “AccountVAT”. This ID is a unique organization number used in Norway.
Both tables know the AccountID, but only the Account table holds the AccountVAT number. As you can see in the image above, the project doesn’t know what AccountVAT to connect to. Let’s fix that.
So I load up my 2 tables in the Dataflow. If you wonder how this is done, check my other posts on Dataflows.
Select the MERGE QUERIES option
Here we match the 2 tables based on AccountID. As you can see, there are lots of options for matching, but now you also get a nice visual for the matching status at the bottom.
The next step is choosing what data to “join” into the Projects table.
So I select the AccountVAT number as the field to join.
As you see in the picture below, the Debitor.Foretaksnr is now visible. This column is now joined from the Debitor table to the Project table.
The selected row below is special. Debitor.foretaksnr is a field from another table, and the destination field is of type Lookup, expecting “AccountNumber” as the matching value.
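Conceptually, the merge is just a left join: every project row picks up the AccountVAT value from the account row with the same AccountID. A minimal sketch of that join in plain JavaScript (the record shapes are assumptions for illustration; the real work is of course done by the Dataflow):

```javascript
// Left-join AccountVAT from the accounts table onto each project,
// matching on AccountID - the same join the Dataflow merge performs.
function mergeAccountVat(projects, accounts) {
  const vatById = new Map(accounts.map((a) => [a.AccountID, a.AccountVAT]));
  return projects.map((p) => ({
    ...p,
    AccountVAT: vatById.has(p.AccountID) ? vatById.get(p.AccountID) : null,
  }));
}
```

Projects without a matching account end up with a null AccountVAT, which mirrors the unmatched rows you can spot in the matching-status visual.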
The result is a project record linked to the correct account in CRM 👍💪
I applaud Microsoft for listening and moving in the right direction 🙏🙌👍, but they still seem to be missing the point of SaaS for Dynamics 🤷‍♂️.
Think of Dynamics like a phone with storage. The OS takes up parts of the storage and you can freely use the rest of the storage.
The whole point of SaaS is that I am paying for a service. I am not buying hardware and installing something from scratch. I am part of a larger shared solution, where I pay for what I use. Either include the system tables in the price, or reduce the storage included in Dynamics. Either way, it is only reasonable for a SaaS customer to pay for the data they produce.
Thank you for all the votes and the attention this received; it only goes to show that Microsoft does actually listen if we come together. In the future I hope we use our voices more often, because the number of votes on these issues is nothing compared to the number of consultants out there. We need to come together in numbers 🙂
MAAAAAASIVE update in the search experience for Dynamics, and I am extremely excited about it. The global search has received a major face-lift and is now looking really good. I personally never used the old relevance search, because I thought it looked bad and was confusing for the end customer.
My customers found this search confusing when the database started to grow. I therefore always told the users to choose categorized search instead.
This way it was easier for the user to find the Account/Contact/Oppty they were looking for.
New Search experience ✨🎉👍
The new search places the search bar in the middle on top, just like many other office applications. A fairly intuitive placement for the global search, and the results look awesome!!!
This is not the average Dynamics CRM post, but I was challenged by Malin Martnes to see if we could integrate the competency part of Dynamics HR with Customer Engagement. The reason we wanted to look into it was because we thought it would be really easy!!! Turns out we were wrong… hehe 😂
There are a lot of fields available, but just not the ones regarding competency.
What to do?!?!? Docs to the rescue !!
Thanks to a link from the docs, I learned that Finance and Operations has an option to export data. You can choose whatever dataset you want and export it to a file. This actually has nothing to do with HR, but is a feature of Finance and Operations.
The delay is there because the service sometimes needs a little time to respond. Just let it work :)
This part of the flow just checked whether the .zip file had been completed. Once it was complete, we could get the actual package.
At this point I had absolutely no idea what I was doing, but it was working. The body of the HTTP GET returned a .zip file, which I could save directly with a OneDrive connector.
The last step was then to extract the file from the .zip, and voila. I now had an automated Excel export from Dynamics HR.
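The delay-then-check part of the flow is really just a polling loop: ask the export service whether the package is done, wait a bit, and ask again. A sketch of that loop (checkReady is a placeholder for whatever status call you make against the export API — treat any endpoint details as assumptions):

```javascript
// Polls an async status check until the export package reports ready,
// waiting intervalMs between attempts - mirroring the delay + "is the
// .zip complete yet?" steps in the Power Automate flow.
async function waitForPackage(checkReady, { intervalMs = 5000, maxAttempts = 12 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    if (await checkReady()) return true;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Export package was not ready in time");
}
```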
For the last part I could simply connect a Dataflow for importing to Dynamics :)
Was there even a point to this?
Well… yes and no. The positive thing was learning that Finance and Operations has an export function I can use for extremely simple integrations. I might be able to use this at some later time.