Common Data Model Vs Common Data Service

By Rob Peledie


I thought I would put together a short explanation of Microsoft's "Common Data" naming, and specifically what the Common Data Model (CDM) and Common Data Service (CDS) mean and how they differ.

One of my drivers for this is that I didn't fully understand it myself, or at least I found it difficult to articulate the definitions, and there's surprisingly little written about the connection between the two terms.

So let's start off with the "official" definitions of both:

Common Data Model


The Common Data Model is a declarative specification, and definition of standard entities that represent commonly used concepts and activities across business and productivity applications, and is being extended to observational and analytical data as well. CDM provides well-defined, modular, and extensible business entities such as Account, Business Unit, Case, Contact, Lead, Opportunity, and Product, as well as interactions with vendors, workers, and customers, such as activities and service level agreements. Anyone can build on and extend CDM definitions to capture additional business-specific ideas.

Common Data Service


Common Data Service lets you securely store and manage data that’s used by business applications. Data within Common Data Service is stored within a set of entities. An entity is a set of records used to store data, similar to how a table stores data within a database. Common Data Service includes a base set of standard entities that cover typical scenarios, but you can also create custom entities specific to your organization and populate them with data using Power Query. App makers can then use PowerApps to build rich applications using this data.

There we go… that's clear, isn't it?

It still doesn't feel like a clear explanation, so let's try again.

I think the clue to understanding this is in the last word of each expression – Model and Service.

A model provides a framework or architecture that standardises something. For example, you might use the expression "the project became a model for other schemes" to imply that other projects will be based on it.

So in this context, the Common Data Model is a standardisation of data concepts.

On the Microsoft GitHub page for CDM, you can find a poster which I think really explains the concept:


The Common Data Model standard defines a common language for business entities covering, over time, the full range of business processes across sales, services, marketing, operations, finance, talent, and commerce and for the Customer, People, and Product entities at the core of a company’s business processes. The goal of CDM is to enable data and application interoperability spanning multiple channels, service implementations, and vendors. CDM provides self-describing data (structurally and semantically), enabling applications to easily read and understand the data.

I think that starts to focus our understanding a little better.

Now, if you really want to dig into the schema and get a good understanding, you can use the CDM Entity Navigator to drill into it.

So, how does CDS fit into this? Well, as with CDM, let's look at the last word – Service.

If we view the CDM as the standard for the entity schema, then CDS is the service that lets you store and manage data built on that standard.

To put it simply: if you were to spin up a new CDS environment, it would be built on the CDM schema and supply you with some of the standard entities and relationships defined there, while the actual data would be provisioned and managed by the service – CDS.

Hopefully the above has helped to explain the difference between CDM and CDS and how they fit into the landscape.

By Rob Peledie

AI Builder In PowerApps – Business Card Reader Example

So a few days ago (Early June 2019), Microsoft started to roll out previews of the AI Builder in PowerApps.

What is AI Builder?

According to the documentation (and it's worth bearing in mind that, because this is in preview, the documentation may change), AI Builder is:


“A new Power Platform capability that allows you to easily automate processes and predict outcomes to help improve business performance. AI Builder is a turnkey solution that brings the power of Microsoft Artificial Intelligence through a point and click experience. Using AI Builder, you can add intelligence to your apps even if you have no coding or data science skills.”

As it stands right now, this is in preview, and you have to enable it in your PowerApps settings.

For this quick article I thought I would go through a really simple example of scanning a business card, and then creating a new Lead record in Dynamics 365.

It's worth saying at this point that, while the following example is simple, it does show a really good use of AI and Microsoft's Cognitive Services. OCR (Optical Character Recognition) has been around for many years, but in the example below the AI is doing more than just converting an image into text: it's then analysing that text and making a decision as to what it is – looking at a word and deciding it's a first name as opposed to a last name or a company name. Putting that sort of power in our hands is awesome, and it will only get better and better!

To start with, create a new Canvas app in PowerApps, and use the Phone layout. If you have enabled the AI Builder preview, then you should see the options in the Insert menu.


Choose the Business Card Reader option. This will place the Business Card Reader component on the screen.

The Business Card Reader can currently extract the following when a business card is detected:

  • CleanedImage: The image after processing where the business card appears cropped and enhanced from the original image.
  • CompanyName: The company name in the business card, if found.
  • Department: The organization department in the business card, if found.
  • Email: The contact email address in the business card, if found.
  • FirstName: The contact first name in the business card, if found.
  • FullAddress: The contact full address in the business card, if found.
  • FullName: The contact full name in the business card, if found.
  • JobTitle: The contact job title in the business card, if found.
  • LastName: The contact last name in the business card, if found.
  • OriginalImage: The original image before processing.
  • Phone1: The first phone or fax number detected in the business card, if found.
  • Phone2: The second phone or fax number detected in the business card, if found.
  • Phone3: The third phone or fax number detected in the business card, if found.
  • Website: The website detected in the business card, if found.

I decided to keep things nice and simple for this demo, so I just included the following fields, which I would use to create a new Lead in Dynamics 365:

  • Company Name
  • First Name
  • Last Name
  • Email Address

I also included a text field that I would map over to the Description field in Dynamics (I'll show you later how I configured the Microsoft Flow to concatenate the Company Name – Last Name for the Topic on the Lead record).

The fields I added were Text Input controls, so if I needed to change anything in the captured data I could.

The formula for each field just references the Business Card Reader component name and the field, so for Company Name it was:

BusinessCardReader1.CompanyName

After you have completed these steps, save and publish your app. If you've followed all the steps above, you should be able to open the app, take a photo of a business card and, after processing, see the data appear in the fields.

So the next stage is to grab that data from our app, and use it to create a new lead.

For this we need to trigger a flow from a button on our screen. I added a button and called it "Submit to D365".

Once the button is on the screen, it needs to trigger a flow, which we'll create next. On the Action tab in PowerApps, choose Flows, then click "Create a new flow". This will open the Flow designer, which should already have the connector to your PowerApp (so the button press will trigger it).

So in the image above, you’ll see that I have added the “Create a New Record” Action in the flow, and chosen the Org Name and Entity – Lead for this example.

What you'll also notice is that on the right-hand side there are currently no field values available; instead you have the "Ask in PowerApps" option. This is so that the flow can grab the relevant values from the app. So if I choose the Last Name field in the Create a new record action and then "Ask in PowerApps", it will return a placeholder field for reference:

You’ll also see in the image above that I have concatenated some fields for the Topic.

Once you've done this, save your flow and go back to the app.

In the app, select your Submit button and use the following formula:

PowerAppsbutton.Run(Createanewrecord_LastName, Createanewrecord_CompanyName, Createanewrecord_Email, Createanewrecord_FirstName)

So this launches the flow and passes the data from the app.

NOTE: You'll need to change the names of the fields in the app to match the names above and in the flow (so add the Createanewrecord_ prefix).

If all looks good, you should have no red crosses, and your formula should look like this:

So after that, save and publish your app and give it a go!

By Rob Peledie

365Knowledge YouTube Channel Update

It’s been a number of years since 365Knowledge posted any videos on YouTube, so I thought it was time we started again.

Interestingly, although I had neglected the channel, the six videos had over 13,000 views in total, which is small in comparison to most channels, but still quite a number.

So… watch this space!

By Rob Peledie

Create Dynamics 365 Records From Azure Service Bus Queue – Via Logic App

I wanted to put together a solution where an external system could send a JSON payload through an Azure Service Bus queue and have a new record created in Dynamics 365.

There are no doubt better, or at least different ways to approach a scenario like this, but I wanted to brush up on some skills, so thought it was a nice use case, and one I will be needing soon for integrating an external Oracle system.

So, basically this is the running order of steps:

  • Create Azure Service Bus Queue
  • Create Logic App
  • Test (Using Postman to send JSON messages)

So, nothing too complex – but a scenario like this would have required a fair amount more development skill just a few years ago (not that many, to be fair), whereas now pretty well all of the above can be achieved in a codeless manner. As I have said before, "Don't use a sledgehammer to crack a walnut". I have also tried to add some links to useful references within this blog.

So firstly, set up an Azure Service Bus namespace and queue, and make sure you note the primary connection string, queue name, and primary key.

If you're planning on using something like Postman to test, then you'll need to generate a SAS token.

Open up a Cloud Shell and use the following PowerShell to generate one.

You'll need the Service Bus URI and queue name, as well as the policy name and key.

# Load System.Web for the UrlEncode helper
[Reflection.Assembly]::LoadWithPartialName("System.Web") | Out-Null

# URI of the entity you're granting access to: namespace plus queue name
$URI = "myNamespace.servicebus.windows.net/myQueueName"
$Access_Policy_Name = "RootManageSharedAccessKey"
$Access_Policy_Key = "myPrimaryKey"

# Token expires 300 seconds (5 minutes) from now
$Expires = ([DateTimeOffset]::Now.ToUnixTimeSeconds()) + 300

# String to sign: URL-encoded URI, a newline, then the expiry timestamp
$SignatureString = [System.Web.HttpUtility]::UrlEncode($URI) + "`n" + [string]$Expires

# HMAC-SHA256 the signature string with the policy key
$HMAC = New-Object System.Security.Cryptography.HMACSHA256
$HMAC.key = [Text.Encoding]::ASCII.GetBytes($Access_Policy_Key)
$Signature = $HMAC.ComputeHash([Text.Encoding]::ASCII.GetBytes($SignatureString))
$Signature = [Convert]::ToBase64String($Signature)

# Assemble the SharedAccessSignature token
$SASToken = "SharedAccessSignature sr=" + [System.Web.HttpUtility]::UrlEncode($URI) + "&sig=" + [System.Web.HttpUtility]::UrlEncode($Signature) + "&se=" + $Expires + "&skn=" + $Access_Policy_Name
$SASToken

To test the Service Bus queue I used Postman. You can find a number of useful Postman collections for Azure from Ludvig Falck on GitHub.

There are a few steps in Postman to set up the parameters, headers and body (I just set up a really simple JSON payload, as you can see).

Parameters

Add queueName as a Key and the queue name as the Value

Headers

For the headers, you'll need the SAS token you generated earlier; it goes in the Authorization header.

Body

For the body, I just used a really simple payload

{
    "email": "me@me.com",
    "first": "John",
    "last": "Smith"
}

Once this is completed, you should be able to hit "Send" and see the message appear in the Azure Service Bus queue.
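If you'd rather test from the command line than from Postman, here's a minimal PowerShell sketch that posts the same payload straight to the queue's REST endpoint, reusing the $SASToken generated earlier (the namespace and queue names are placeholders):

# Post a test message to the Service Bus queue's REST endpoint.
# Assumes $SASToken was produced by the script above; namespace and queue names are placeholders.
$namespace = "myNamespace"
$queueName = "myQueueName"
$uri = "https://$namespace.servicebus.windows.net/$queueName/messages"

$body = @{
    email = "me@me.com"
    first = "John"
    last  = "Smith"
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $uri `
    -Headers @{ Authorization = $SASToken } `
    -ContentType "application/json" `
    -Body $body

A successful send returns an HTTP 201, and the message should then show up in the queue just as it does from Postman.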

The next step is to grab that message from the Service Bus queue and use it to create a record – in this case, in Dynamics 365.

I decided to use Logic Apps as opposed to Flow, just to keep everything in Azure (although technically it’s all in Azure, but you get what I mean).

So this is the basic flow:

It's pretty easy to follow: when a message is received in the queue, grab it, parse the JSON, use the data to create a record, then complete the message in the queue.

One thing to bear in mind is that the Service Bus connector delivers the message content base64-encoded, so the JSON payload needs to be decoded before it can be used to create the Dynamics record. In the "Content" field of the Parse JSON action, add this expression:

decodeBase64(triggerBody()?['ContentData'])
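To see what that expression is doing, here's the equivalent decode done locally in PowerShell – the base64 string is just the sample payload from earlier, encoded here purely for illustration:

# The Service Bus connector hands Logic Apps the message body as base64 text in ContentData.
# decodeBase64() in the Logic App does the equivalent of this:
$contentData = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes('{"email":"me@me.com","first":"John","last":"Smith"}'))
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($contentData))
# Output: {"email":"me@me.com","first":"John","last":"Smith"}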

You'll also need to add the JSON schema. For the example payload I used this:

{
    "properties": {
        "email": {
            "type": "string"
        },
        "first": {
            "type": "string"
        },
        "last": {
            "type": "string"
        }
    },
    "type": "object"
}
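If you want to sanity-check that the payload actually matches this schema before wiring it into the Logic App, PowerShell's Test-Json cmdlet can do it locally – a quick sketch, assuming PowerShell 6.1 or later:

# Validate the sample payload against the schema used in the Parse JSON action.
# Test-Json requires PowerShell 6.1+ (it isn't in Windows PowerShell 5.1).
$schema = @'
{
    "properties": {
        "email": { "type": "string" },
        "first": { "type": "string" },
        "last": { "type": "string" }
    },
    "type": "object"
}
'@

$payload = '{"email":"me@me.com","first":"John","last":"Smith"}'
Test-Json -Json $payload -Schema $schema   # returns True when the payload fits the schema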

Once this is all done, save the Logic App and run it. If you've followed all the steps, sending a JSON payload from Postman should create a record in Dynamics.

By Rob Peledie

Count File Rows with PowerShell

Very quick post (more so I don’t forget how to do this!)

I had a CSV file which I knew had a lot of rows – more than Excel could handle to open – so I wanted to know the exact number without opening it.

PowerShell to the rescue:

# Stream the file line by line so the whole CSV never has to fit in memory
$lines = 0
$fileReader = New-Object IO.StreamReader 'NAME_OF_YOUR_CSV.csv'
while ($fileReader.ReadLine() -ne $null)
{
    $lines++
}
$fileReader.Close()
# Total line count (including the header row)
echo $lines

The result for me, after a short while 🙂 – 4,366,151.
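As an aside, for smaller files a one-liner does much the same job – a quick sketch, though it's generally a lot slower than the StreamReader loop on a multi-million-row file:

# One-liner alternative: count the lines as they stream through the pipeline.
# Noticeably slower than the StreamReader approach above on very large files.
(Get-Content 'NAME_OF_YOUR_CSV.csv' | Measure-Object).Count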

By Rob Peledie

Fun with Microsoft Flow, What3Words and Flic Button

There's some really cool IoT stuff going on, and it's getting easier to get involved and have a go.

I thought in this blog I would have a play with a couple of tools and see what we could do. Hopefully it’ll inspire you to have a go and improve!

For this article I am going to use:

  • Microsoft Flow
  • What3Words 
  • Flic

Before I go any further, let's just have a quick overview of these tools.

Microsoft Flow

Microsoft Flow is a tool that can integrate cloud-based apps and services so they interact with each other seamlessly. According to Microsoft, this cloud-based tool improves efficiency and productivity by enabling virtually anyone in an organization to automate many tedious and time-consuming business tasks and processes without developer intervention.

What3Words

what3words provides a precise and incredibly simple way to talk about location. They have divided the world into a grid of 3m x 3m squares and assigned each one a unique 3 word address.

Flic

Flic buttons are smart buttons that can be linked to a number of services. A button can be programmed to perform actions via an app or a hub, and can even trigger Microsoft Flows.

I know the Flic button can send a message with your coordinates, so I was interested to see if we could parse that message in Microsoft Flow, run it through the What3Words API and then action something with the results.

The resulting flow looks like this:

So let's run through it.

There are a couple of things you need to do with the Flic button to set this up.

In the Flic app on your phone or other device, you need to set up your connection to Microsoft Flow, and it's really important that you choose to send location data:

Once this is done, log in to Microsoft Flow and create a new flow from blank. You will then be able to add a Flic button as a trigger – the first time you do this, you'll need to add your Flic account as a connection. You'll then get to choose which button and which action.

Next, add an HTTP action. If your connection to Flic is successful, you'll have the outputs from the button press available to feed into your HTTP action – in this case it's used to call the What3Words API (this assumes you have already signed up for the What3Words API here and have an API key).

So here we have added the What3Words URI and, as you'll see, added the output from the Flic button as the coordinates. You'll also need to add your API key, as shown in the URI after &key= and in the Value field.
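If you want to sanity-check the API call outside of Flow first, here's a minimal PowerShell sketch of the same request – assuming the v3 convert-to-3wa endpoint and using the coordinates from the what3words sample response below (the API key is a placeholder):

# Convert a latitude/longitude pair to a 3 word address via the what3words API.
# Assumes the public v3 convert-to-3wa endpoint; YOUR_API_KEY is a placeholder.
$apiKey = "YOUR_API_KEY"
$lat = "51.520847"
$lng = "-0.195521"

$uri = "https://api.what3words.com/v3/convert-to-3wa?coordinates=$lat,$lng&key=$apiKey"
$result = Invoke-RestMethod -Method Get -Uri $uri

$result.words          # e.g. filled.count.soap
$result.nearestPlace   # e.g. Bayswater, London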

Once this is completed, the returned JSON needs to be parsed, so add the Parse JSON action.

For the schema, I used the "Use sample payload" option with the example from the What3Words documentation here, but just in case, here it is:

{
    "country": "GB",
    "square": {
        "southwest": {
            "lng": -0.195543,
            "lat": 51.520833
        },
        "northeast": {
            "lng": -0.195499,
            "lat": 51.52086
        }
    },
    "nearestPlace": "Bayswater, London",
    "coordinates": {
        "lng": -0.195521,
        "lat": 51.520847
    },
    "words": "filled.count.soap",
    "language": "en",
    "map": "https://w3w.co/filled.count.soap"
}                           

Finally, for this example I took the parsed data and added it to an email, but you could just as easily create a record in Dynamics 365, etc.

And that should be it! If it all works you'll get lots of lovely ticks on your flow, and an email with your 3 word address and a link to the map:

The possibilities are endless with these sorts of tools. The whole process took less than an hour, and the only "code" was the JSON payload that I copied and pasted!

Have a go.

By Rob Peledie

Microsoft AZ-900 Certification

I’m pleased to say I have just passed the AZ-900 exam.

Although my main area of expertise is and always has been Dynamics 365 (CRM), I find myself getting more and more involved with the wider Microsoft Business Solutions Ecosystem, and that includes Azure. 

Whether it's Logic Apps for workflow or Service Bus for integration, I find most weeks I'm delving into Azure.

For those thinking of taking the exam, I have to say I didn’t find it too difficult, and there are so many resources available to make sure you’re up to speed.

As a starting point, it's always worth checking out the details of what you'll be examined on; for AZ-900, the details can be found here.

If you want some great training and have a limited budget, I cannot recommend Microsoft Learn highly enough – partly because it's free, but mostly because it's awesome, and it's getting better almost daily.

By Rob Peledie

Dynamics 365 User Access Report with FetchXml and Power BI

Ok, so that’s not the snappiest of titles, but essentially it’s my lazy way of covering a couple of topics in one blog post:

  1. Need to report on Licensed Users and When they Last Accessed Dynamics
  2. Showing that in a Power BI Report using FetchXML

It's worth noting at this point how fantastic the Dynamics 365 community, and the Microsoft community generally, is. Most of what I did has already been done and blogged about for free. I've tried to reference everyone whose blog posts helped me.

So the first thing I needed was some FetchXML to give me the last login (or Access, in Audit entity terms) for each user. I got the following from www.mscrmsolution.com, but I could have hand-crafted it with FetchXML Builder in the XrmToolbox.

<fetch aggregate='true' >
  <entity name='audit' >
    <attribute name='createdon' alias='LastAccessTime' aggregate='max' />
    <filter>
      <condition attribute='operation' operator='eq' value='4' />
      <condition entityname='su' attribute='isdisabled' operator='eq' value='0' />
    </filter>
    <link-entity name='systemuser' from='systemuserid' to='objectid' alias='su' >
      <attribute name='systemuserid' alias='SystemUserId' groupby='true' />
      <attribute name='domainname' alias='DomainName' groupby='true' />
      <attribute name='fullname' alias='FullName' groupby='true' />
    </link-entity>
  </entity>
</fetch>

A couple of things worth noting here: operation = 4 is the Access operation in the Audit entity (as opposed to Update, Create, etc.), and the link to systemuser filters the results to enabled users only.

So at this point, I had the first piece done, as all I initially wanted was a way to see how long it was since users logged in. To get a quick view I just pasted the results in Excel and used a formula to calculate the days.

What I actually wanted to do was create a Power BI report that would connect to Dynamics, and use this FetchXml dynamically.

Ulrik Carlsson (CRMChartGuy) has a great post on this, and all credit to him for the following. (You may want to read his post in its entirety, as I have summarised some bits.)

Firstly, open up Power BI Desktop and choose Get Data –> Web.

When this opens, choose the Advanced option.

So there are a few steps here to get the connection working:

  1. The service root URL that you would otherwise use in Power BI, e.g. https://MYCRMORG.api.crm.dynamics.com/api/data/v9.1/
  2. The PLURAL name of the entity schema name, followed by ?fetchXml=
    Example: audits?fetchXml=
  3. The FetchXML above, URL-encoded (you can encode it at
    https://www.freeformatter.com/url-encoder.html, or locally with the PowerShell sketch after this list)
  4. Type in "Prefer" – it is not an option you can select
  5. Type exactly: odata.include-annotations="OData.Community.Display.V1.FormattedValue" (it's important that you type the quote marks to avoid formatting issues)
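If you'd rather build that encoded URL locally than paste the FetchXML into an online encoder, here's a minimal PowerShell sketch (the org URL is a placeholder):

# Build the Web API URL for the FetchXML query, URL-encoding the fetch as we go.
# MYCRMORG is a placeholder - swap in your own org's service root URL.
$serviceRoot = "https://MYCRMORG.api.crm.dynamics.com/api/data/v9.1/"

$fetchXml = @'
<fetch aggregate='true' >
  <entity name='audit' >
    <attribute name='createdon' alias='LastAccessTime' aggregate='max' />
    <filter>
      <condition attribute='operation' operator='eq' value='4' />
      <condition entityname='su' attribute='isdisabled' operator='eq' value='0' />
    </filter>
    <link-entity name='systemuser' from='systemuserid' to='objectid' alias='su' >
      <attribute name='systemuserid' alias='SystemUserId' groupby='true' />
      <attribute name='domainname' alias='DomainName' groupby='true' />
      <attribute name='fullname' alias='FullName' groupby='true' />
    </link-entity>
  </entity>
</fetch>
'@

# Percent-encode the FetchXML and append it to the plural entity name
$url = $serviceRoot + "audits?fetchXml=" + [uri]::EscapeDataString($fetchXml)
$url   # this is the URL the Power BI Web connector will call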

Click OK

Open up the Query Editor

Click on “To Table” in the convert area, then click on the expand button (two arrows pointing away from each other)

And there you have it… a live Power BI report showing user access.

I'm not going to go into how to pretty the report up – I'll leave that to you!

The process above would be the same for any FetchXML query, and if you don't want to hand-craft the fetch, you can start with an Advanced Find and export it out.

By Rob Peledie

You Say Flow, I Say Logic Apps

I've been using Flow for quite a while, but recently started utilising Azure Logic Apps for some pieces of work. On one of the forums I frequent, someone asked the question "What's the difference between Flow and Logic Apps?"

Well, the quick and simple answer is that in a lot of ways they're the same thing…

However, that’s not the real answer or the complete picture. There are a few differences that might help you decide which to use in specific circumstances.

There’s a really good overview of the similarities and differences here, but it’s summed up nicely by this:


Microsoft Flow and Logic Apps are both designer-first integration services that can create workflows. Both services integrate with various SaaS and enterprise applications.
Microsoft Flow is built on top of Logic Apps. They share the same workflow designer and the same connectors.
Microsoft Flow empowers any office worker to perform simple integrations (for example, an approval process on a SharePoint Document Library) without going through developers or IT. Logic Apps can also enable advanced integrations (for example, B2B processes) where enterprise-level Azure DevOps and security practices are required. It’s typical for a business workflow to grow in complexity over time. Accordingly, you can start with a flow at first, and then convert it to a logic app as needed.

There are loads of resources available to help you get started and decide which way to go. Here are a couple of quick summaries.

As I have mentioned before, Microsoft Learn has some great resources, and there is a really good Flow Learning Path to get you started. Nothing as yet on Learn for Logic Apps, but I’m sure there will be soon.

For me, when I looked at an integration with an external API to be used across the whole enterprise, I decided to base it on Logic Apps; whereas when I wanted a small solution to move attachments from Dynamics 365 to OneDrive (on my own company's Dynamics 365 instance), Flow was fine.

As we move to a more serverless architecture, SaaS and PaaS solutions like Flow and Logic Apps are putting the power of integration and communication back into the hands of the power user, rather than leaving it solely the remit of the coder. The turnaround time and the potential for tech debt are reduced (in my opinion), and it is a very exciting step forward.

By Rob Peledie

Learn More With….. Microsoft Learn!

I'm pretty well Microsoft through and through. However, a few years ago I had a detour into the depths of the realm known as Salesforce (it's fine, I'm OK now)…

Seriously though, there are a few things that Salesforce does quite well, and one of them is Trailhead – a free online learning platform with a gamification feel that allows users to acquire the skills needed to be anything from a good user to a Salesforce dev.

I always thought it was a great concept and executed really well, and something Microsoft could take note of.

Well…….

Fast forward a few years and we have Microsoft Learn!

Now I have to say, the look and feel is very similar to Trailhead, but that’s not a bad thing, after all, Salesforce did it well.

So what can you learn? Well, there are lots of learning paths you can take, including Azure, Dynamics, Power BI and more, but as I'm well and truly entrenched in the PowerApps/Dynamics space, that is the path I chose, and there really are some great modules.

The modules are so well put together and really informative. If you're going through one of the Azure modules, you even get a sandbox instance right in the browser to work through the lesson.

So give it a go… learn something new today.