By now I hope most people know https://pcf.gallery (run by Guido Preite). A great page for sharing community components (PCF) and exposing awesome contributions to the rest of the world. 🌎
What I like about the PCF Gallery is the simplicity of the site being only about PCF components. This is why I asked Guido if we could create a similar site for other components related to the Power Platform. He was kind enough to share his code for the project, so Matt Beard and I decided to give it a go. 🤗
First out in the list of future galleries is the CONNECTOR gallery. This site will contain all sorts of custom connectors for the Power Platform that you can share with the community. If you want to contribute to this gallery, you only have to share your custom connector file on GitHub, and we will post it!
Launching a new app or a new CRM system always leaves users with the same question: where do I find the application? At first I didn’t really understand the question, because I thought it was natural to bookmark the URL to your application, i.e. https://www.company.crm4.dynamics.com/***** etc.
Eventually I realized that most users actually use the waffle menu in Office 365 when navigating to applications they don’t use continuously.
They were expecting to see the application in the list when they clicked the waffle menu, because this would save them time.
Luckily this is not a problem 😊 Open All apps, and locate the app you are looking for.
And just like that you now have a quick navigation to your CRM or Power App application in the Microsoft 365 app launcher👊
My last post described the ACDC hackathon and what it’s all about, but this post is what our company ended up delivering for the Arctic Cloud Developer Challenge after 2 1/2 days of pure geeking 💪
Meet the Team
Areas: Microsoft 365, SharePoint, Developer, Power Apps
Areas: Dynamics 365, Power Apps, Developer
Areas: Microsoft 365, Azure, Developer
Areas: Power BI, Data Platform, Power Apps
Areas: Dynamics 365, Power Apps, Power Platform
Welcome to LEGO CITY
Our idea was to create an interconnected city with bleeding-edge technology for monitoring the citizens’ safety. The city is built next to a mountain that has recently been showing signs of activity. Geologists say it could blow at any time. Luckily the city has invested heavily in technology that could help save lives ❤
The city has a train track surrounding a lake. The train track is crucial for transporting LEGOs around, and for providing transportation for the factories producing parts. In the city you have a Mayor’s office, a LEGO factory, houses, and THE MOUNTAIN OF DOOM!! 🔥 Based on this drawing, our mission was to create the technology needed to make the city a safe place to live.
On the day of the event we started building right away, and this is what we created within a few hours 😅
But since this is a technical blog, I will get into the details of the making of the Connected LEGO City, which made this project one of the coolest city concepts I have seen in a while!
Mountain of DOOM!!🔥
Let’s start at the top, near the mountain. We placed 2 IoT sensors for monitoring temperature and movement, using the Azure IoT DevKit sensors for this purpose. Both IoT sensors were pushing data to Azure IoT Hub, and then over to Stream Analytics. The constant changes were reported to the Mayor’s office. The impressive part here was how quickly the data flowed from the sensors all the way to the dashboard.
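The DevKit itself runs embedded code, so this is not what we flashed to the device, but a minimal Python sketch of the same telemetry loop against Azure IoT Hub could look like this (the connection string, device id, and JSON payload shape are my own placeholders):

```python
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder: the device connection string from the IoT Hub device registration.
CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=mountain-sensor;SharedAccessKey=<key>"

def read_sensors():
    # Stand-in for the real DevKit readings (temperature + accelerometer movement).
    return {"temperature": 21.5, "movement": 0.02}

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)

try:
    while True:
        msg = Message(json.dumps(read_sensors()))
        msg.content_type = "application/json"
        msg.content_encoding = "utf-8"
        client.send_message(msg)  # telemetry flows to IoT Hub, then on to Stream Analytics
        time.sleep(5)
finally:
    client.shutdown()
```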
Train Track Switch
On the top left we connected an Arduino device with a servo switch attached. This was used to change the train track from the long track to the short track.
By sending a boolean to the device, it would mechanically change the train’s direction. You can barely see the device hidden beneath the track, with a wire for power under the blue “water”. To see it in action, just watch the video at the top.
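We won’t go into exactly how the boolean reached the Arduino here, but assuming a cloud-to-device message through Azure IoT Hub, the sending side could look roughly like this in Python (connection string, device id, and message shape are all assumptions for illustration):

```python
import json

from azure.iot.hub import IoTHubRegistryManager

# Placeholder: a service-level IoT Hub connection string with "service connect" rights.
SERVICE_CONNECTION_STRING = "HostName=<hub>.azure-devices.net;SharedAccessKeyName=service;SharedAccessKey=<key>"
DEVICE_ID = "track-switch"  # hypothetical device id for the Arduino

registry_manager = IoTHubRegistryManager(SERVICE_CONNECTION_STRING)

def set_track(long_track: bool) -> None:
    """Send a boolean to the device; the servo flips the track accordingly."""
    registry_manager.send_c2d_message(DEVICE_ID, json.dumps({"longTrack": long_track}))

set_track(False)  # switch the train onto the short track
```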
Mayor’s Office – Command Central
In the bottom left we had the Mayor’s office (aka command central), where the mayor could monitor all things happening in his city. This is where we were looking at the output from the city sensors, reported via Stream Analytics to Power BI.
We included Dynamics 365 information on the left to give a status on all ongoing city work orders (fixing problems), and on the right side we had live data from the mountain. One of the charts was showing the constant temperature, and others were measuring movement. Below, we connected to the Norwegian weather API, so that we could understand what conditions might affect our emergency response over the next few days.
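For anyone curious, the Norwegian weather API (api.met.no) can be queried along these lines; the compact Locationforecast endpoint and the coordinates are my choices for the example, and MET requires an identifying User-Agent header:

```python
import requests

# MET Norway requires a descriptive User-Agent identifying your application.
headers = {"User-Agent": "lego-city-demo/0.1 contact@example.com"}

# Assumption: Locationforecast 2.0 "compact" endpoint, Oslo-ish coordinates.
url = "https://api.met.no/weatherapi/locationforecast/2.0/compact"
params = {"lat": 59.91, "lon": 10.75}

resp = requests.get(url, params=params, headers=headers, timeout=10)
resp.raise_for_status()

# First timestep of the forecast: the current air temperature.
first = resp.json()["properties"]["timeseries"][0]
print(first["data"]["instant"]["details"]["air_temperature"])
```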
We also created a web server running a live feed from the camera, with the controls. The conductor could then log in and power the train in any direction needed 🚆🚂
Citizen Self Service
When there was an issue in the city, the citizens could report it via a portal to Dynamics 365 Customer Service. We connected with The Portal Connector for this scenario.
Submitting the case would deliver the case to Dynamics 365 Customer Service, where we used a few PCF components from pcf.gallery for picture rendering in CRM.
LEGO Factory – Field Service
Once the case had been evaluated, we would create a work order for booking our trusted technician for the job. For this we used Field Service out of the box, and the technician would use the new mobile app for service delivery. We also connected the warehouse for picking parts from the factory, based on the initial pictures that the technician would have seen.
Payment – Vipps
Once the technician had fixed the broken houses, he would send an invoice via the Norwegian payment service Vipps. The awesome thing was that it was all done using a Power Automate flow! Once the work order was complete, we simply created a payment through the Vipps API, and received our money.
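To give an idea of what the flow was doing under the hood, here is a rough sketch of initiating such a payment against what I understand to be the Vipps eCom v2 API (the endpoint, header names, and body fields are based on my reading of the Vipps docs, and every value is a placeholder):

```python
import requests

# All values are placeholders; see the Vipps eCommerce documentation for the real onboarding flow.
ACCESS_TOKEN = "<bearer token from the Vipps access token endpoint>"
SUBSCRIPTION_KEY = "<Ocp-Apim-Subscription-Key>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    "Content-Type": "application/json",
}

payment = {
    "merchantInfo": {
        "merchantSerialNumber": "123456",
        "callbackPrefix": "https://example.com/vipps/callbacks",
        "fallBack": "https://example.com/vipps/fallback",
    },
    "customerInfo": {"mobileNumber": "4712345678"},
    "transaction": {
        "orderId": "workorder-0042",
        "amount": 49900,  # amount in øre, i.e. NOK 499.00
        "transactionText": "LEGO City repair work order",
    },
}

resp = requests.post("https://api.vipps.no/ecomm/v2/payments", json=payment, headers=headers, timeout=10)
resp.raise_for_status()
print(resp.json())  # contains the URL the customer uses to approve the payment
```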
If the IoT sensors detected a crisis, they would start a flow that the mayor would have to approve.
From this flow we would also trigger SMS notifications to the citizens registered in Dataverse.
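We won’t go into the exact SMS service here, but purely as an illustration, with a provider like Twilio the notification step boils down to one call per citizen pulled from Dataverse (credentials and numbers below are placeholders):

```python
from twilio.rest import Client

# Hypothetical credentials; the actual SMS provider from the hackathon isn't named here.
client = Client("<account_sid>", "<auth_token>")

def notify_citizen(phone_number: str) -> None:
    client.messages.create(
        to=phone_number,
        from_="+4759000000",  # placeholder sender number
        body="Warning: the mountain is showing activity. Please follow evacuation routes.",
    )
```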
WOW… Simply WOW. It’s the best way to summarize this years hackathon.
Normally this 3-day hackathon is situated in the beautiful hills of Holmenkollen, but this year we were forced to go online for obvious reasons. One small difference was that we opened up for teams to gather locally at their companies’ offices. This way we could add a social factor within safety regulations, without having to keep everyone 100% at their home office.
We decided to have a participation fee for the event, and everything extra would be donated to the children’s cancer charity Barnekreftforeningen (www.barnekreftforeningen.no) ❤
Being part of the committee, this project has consumed most of my time over the last few months, and getting it all together has not been easy.
We were 5 teams in total, with almost 40 participants, who dedicated 3 whole days to fun where this year’s topic was LEGO.
In our billing system we have the following structure: a debitor (aka Account) can potentially have lots of projects connected. The only way to connect data in Data Flow is via keys in the Dynamics configuration. In our case we have a key “AccountVAT”. This ID is a unique organization number used in Norway.
Both tables know the AccountID, but only the Account table holds the AccountVAT number. As you see in the image above, the Project doesn’t know which AccountVAT to connect to. Let’s fix that.
So I load up my 2 tables in the dataflow. If you wonder how this is done, check my other posts on Data Flow.
Select the MERGE QUERIES option
Here we match the 2 tables based on AccountID. As you see here there are lots of options for matching, but now you also get a nice visual for matching status at the bottom.
Next step is choosing what data to “join” into the projects table.
So I select the AccountVAT number as the field to join.
As you see in the picture below, the Debitor.Foretaksnr is now visible. This column is now joined from the Debitor table to the Project table.
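If it helps to see the same join outside the dataflow UI, here is the equivalent left merge expressed in pandas (a sketch with made-up rows; the table and column names match the example above):

```python
import pandas as pd

# Minimal stand-ins for the two tables in the example.
debitor = pd.DataFrame({
    "AccountID": ["A1", "A2"],
    "Foretaksnr": ["987654321", "123456789"],  # the AccountVAT / organization number
})
project = pd.DataFrame({
    "AccountID": ["A1", "A1", "A2"],
    "ProjectName": ["Roof", "Garage", "Cabin"],
})

# Same idea as Merge Queries: match on AccountID, pull the AccountVAT into the project table.
merged = project.merge(
    debitor.rename(columns={"Foretaksnr": "Debitor.Foretaksnr"}),
    on="AccountID",
    how="left",
)
print(merged)
```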
The selected row below is a special field. Debitor.Foretaksnr is a field from another table, and the destination field is special too: it is of type Lookup, and expects “AccountNumber” as the matching key.
The result is project records linked to the correct account in CRM 👍💪
I applaud Microsoft for listening and moving in the right direction 🙏🙌👍, but they seem to still be missing the point of SaaS for Dynamics 🤷‍♂️.
Think of Dynamics like a phone with storage: the OS takes up part of the storage, and you can freely use the rest.
The whole point of SaaS is that I am paying for a service. I am not buying any hardware or installing something from scratch. I am part of a larger shared solution, where I pay for what I use. Either include the system tables in the price, or reduce the storage included in Dynamics. Either way, it is only reasonable for a customer in SaaS to pay for the data they themselves produce.
Thank you for all the votes and the attention this received, because it only goes to show that Microsoft does actually listen if we come together. In the future I hope we use our voices more often, because the number of votes on these issues is nothing compared to the number of consultants out there. We need to come together in numbers 🙂
I will keep calling it CDS Lite, in the smallest hope that it one day might actually get the name CDS Lite. It just makes perfect sense to give it a name like this. Alright, enough about the hope for a better naming future…
I know the system is only in preview and many things will change in the coming months, but I thought I would give importing data from CDS to CDS Lite a go. I had no idea if this was possible, but I was curious to see how it would react.
At the moment there is no way of connecting to the database through the normal tools I have used before:
SSIS – KingsWaySoft
Every time I try to connect, I get a response saying that I have to use an OAuth method to connect.
PS: Notice how Microsoft has put “LITE” in the error message? It’s a small hope, but it’s a hope 😂
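For reference, an OAuth-based connection to a CDS-style Web API generally looks something like this in Python with MSAL. This is a generic sketch (tenant, app registration, and org URL are placeholders), not something I have verified against the preview environment:

```python
import msal
import requests

# Placeholders: an Azure AD app registration with permissions to the environment.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-client-secret>"
RESOURCE = "https://<org>.crm4.dynamics.com"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=[f"{RESOURCE}/.default"])

# A token acquisition error would show up as a dict without "access_token".
resp = requests.get(
    f"{RESOURCE}/api/data/v9.1/accounts?$top=1",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=10,
)
print(resp.status_code)
```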
So the only way I found to insert data into the preview was via Flow.
Flow to the rescue
I am pretty sure the current docs recommend using Flow for getting data into CDS Lite (Oakdale), but I am sure this will change in the future somehow. How else are you going to fill 2 million records (my estimate based on the docs’ 2 GB database size)? Flow is not intended for data migration, but I had to use the tools I had available at the time 🙂
First I imported 5,694 accounts into my CDS database (with the top row as headings).
The CDS Lite (Oakdale) database was empty for accounts, and only included the following columns.
The flow was a little tricky. I actually had to create the flow from the CDS Lite (Oakdale) environment; it did not work otherwise. The Current Environment connector was fine for writing to CDS Lite (Oakdale).
The flow will only process 512 records if you don’t open up the paging. Go to the settings of the connector and increase the threshold.
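For comparison, when you call the Web API directly, paging works by following @odata.nextLink until it disappears. A small sketch (org URL and token are placeholders; see the OAuth sketch earlier):

```python
import requests

headers = {"Authorization": "Bearer <access token>"}
url = "https://<org>.crm4.dynamics.com/api/data/v9.1/accounts?$select=name"

records = []
while url:
    page = requests.get(url, headers=headers, timeout=30).json()
    records.extend(page["value"])
    url = page.get("@odata.nextLink")  # absent on the last page

print(len(records))
```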
I started the flow and patiently waited.
27 minutes later the accounts had been migrated, with a warning. I could only find 1 error in the migration, so I consider that a success. All of my accounts were now over in CDS Lite (Oakdale).
This is all just the beginning of a longer journey for CDS Lite (Oakdale). Everything is in preview, so many things will change once this is released, and in the years after that. I just wanted to see how to get data into the system. The reason why might become clearer in a later post, where I describe what I see as a future customer scenario for CDS and CDS Lite.
He created a PCF component for quick updating related entities without having to leave the record.
Updating related LOOKUP
Internally we have all of our active projects in a list. For every project we have a customer contact registered for continuous surveys (Power Automate, Forms Pro, SMS… cool stuff). For the system to work correctly, we need to make sure both the phone number and the email address are up to date at any given time, so the automated surveys keep working.
I have been posting a lot about Data Flow in the Power Platform, because I needed to use it internally. So far this tool is extremely interesting, but I am not sure it is 100% clear to me how it all works.
This is why I decided to import 94,000 contacts to 1 account, to see how fast/slow it was. I also wanted to monitor how much this affects the API or anything else in the CDS/Dynamics base. This post is mostly for my own curiosity.
The run itself didn’t show much in terms of timestamps, so I opened an Advanced Find.
So it used 1 hour for 94,883 records. Not really sure how to measure this precisely, because speed caps hit a little randomly, but I would say this is decent speed for an import: 94,883 contacts / 60 minutes / 60 seconds = 26.35 contacts per second.
The storage grew a fair bit: 2.72 GB -> 3.44 GB = 0.72 GB = 720 MB, or about 0.0076 MB per contact.
This is the old Address entity… does anyone even use this anymore???
How about API calls?
The first time I ran this, it was creates only. Then I did a second round, and all I got was 13,544 API calls. I am not sure I trust the analytics yet!
After running the import twice, I never got any indication of API usage. There must be something wrong with the statistics here? I did almost 100k upserts, and there is nothing to prove that I did, except for the creates under performing operations. This alone would make Data Flow an unbelievable integration platform for CDS because of the API calls we would be saving. I understand that this is probably not the case, and there must be something wrong with the reporting.
That being said, no one has been able to tell me if this costs any money. If this is considered part of the Dynamics/Power Platform license, I am impressed. A truly hidden gem.
I still don’t know if this is local to my tenant, but I have tried in 2 different orgs, and I always get an error with the Customer field for Contact. This post is almost identical to the Set Parent Lookup post, but there are some differences at the end.
I have one Excel file with 6 contacts. They are linked to the account with number 311. The unique identifier I have on contact is email.
I set up a key for email on contact, so that the system will understand which record to write to.
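This is exactly what alternate keys enable: addressing a record by a business key instead of its GUID. Outside of Data Flow, the same key allows an upsert straight against the Web API, sketched here with a placeholder org URL, token, and email:

```python
import requests

headers = {
    "Authorization": "Bearer <access token>",
    "Content-Type": "application/json",
}

# Upsert by alternate key: the contact is addressed by emailaddress1, not by GUID.
url = "https://<org>.crm4.dynamics.com/api/data/v9.1/contacts(emailaddress1='ola.nordmann@example.com')"

resp = requests.patch(url, json={"firstname": "Ola", "lastname": "Nordmann"}, headers=headers, timeout=10)
print(resp.status_code)  # 204 on success, with no content returned
```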
On the Account I have 2 fields for number matching, because I have different systems integrating, but it doesn’t matter in this case. PO Account Number is what we are going to use, and I will try to explain why (based on my findings).
PS: Both Account Number and PO Account Number are alternate keys on Account!!
Setup Data Flow
Go to https://make.powerapps.com and start a new Data Flow project. In this case I am choosing the Excel file for simplicity. I have uploaded the Excel file to OneDrive, so that it is available online at all times.
In the Excel file I had to make sure that the first row was headers before continuing.
During the mapping I will see both of the alternate keys I have for Account. Normally you probably only have 1 alternate key for Account, but my setup requires 2 because of 2 different systems integrating against 2 different numbers. For this blog I am using the value 311 in both, so it doesn’t really matter here.
When done mapping Account Number (again, it doesn’t matter which of the two I choose), continue to the next step. I choose a manual refresh, and it gets stuck here.
You will also see an error in the data flow projects.
Open data integrations, and open the one that failed.
From here you can open the mapping table
This is where you will most likely see a missing mapping. For some reason it cancels out my mapping. I have tried this in 2 environments, with the same issue. Every time I choose the Customer lookup, I have this problem.
Click on the destination field, and navigate WAY to the bottom.
This is where I believe the bug is located. Data Flow happens to be sensitive to which alternate key was created last. This key is the last thing I published on Account, and therefore it is the one in the list. If you only have 1 alternate key, that one will show.
Again, it doesn’t matter in this demo, because I have 311 in both key fields.
The Work Around
After you save the changes, open Data Integration. Here you will set up a schedule to run from admin.powerplatform.microsoft.com.
Click save schedule and wait for it to run. PS!! Remember to stop the schedule when done!
For the final result, you will see 6 contacts connected to the Account.
Next up: how long does it take to import 94,000 contacts to CDS via Data Flow!