It’s been a while since I wrote anything too technical, and during this lockdown I’ve had a chance to brush up on some things I haven’t really played with in Power Apps. So I thought I’d look at something a little different: connecting to an open API and consuming the results in a Power App.
Before I get started, there is also a video to go with this article.
There are plenty of open APIs you could use to have a play around, but I decided to use the one from CoinGecko, which gives lots of options for real-time cryptocurrency prices. You don’t need to sign up, and there’s no authentication required, which makes life easier if this is your first venture.
If you navigate to the CoinGecko API site, you’ll see the complete list of options and get a chance to test out your requests right there. For example, the first and most basic is to ping the API.
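As a sketch (the endpoint name here is taken from CoinGecko’s v3 API reference, so double-check it against the live docs), the ping request looks like this:

```
GET https://api.coingecko.com/api/v3/ping
```

A successful ping returns a small JSON message confirming the API is up.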
So for the example we’re going to build, let’s keep it nice and simple: a Power App that retrieves the current price of Bitcoin in three currencies, lets you choose your preferred currency and enter your holding, and shows your live value.
First, let’s create the flow that gets the data from the CoinGecko API. Go to flow.microsoft.com and log in. Once logged in, go to your flows and click New –> Instant – from blank.
You’ll then get the opportunity to choose your trigger, and in this case we want to choose PowerApps, as an action in our Power App will call the flow and the API.
Once you’ve clicked create, your flow will open, and the PowerApps trigger will be shown as the first step.
Before we go any further, let’s just step through the process we’re going to build. This is the finished flow.
If we break it down into simple steps: the Power App calls the flow; the flow uses the HTTP action to call the API; the JSON results are parsed; and the response is captured. This gives us the resulting data to be consumed in the Power App.
So let’s break it down, starting with that HTTP step. If I open up that step, this is what you’ll see.
The Method we’re going to use here is GET – because we’re getting data! Seriously though, if you want to know more about HTTP methods, then check out this.
The URI is where we form our request, and in this case it’s a really simple request for the price of Bitcoin in three currencies – USD, GBP and EUR.
Since there’s no authentication for this API, we can easily check that this request works by pasting the URI into a web browser.
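As a sketch – the endpoint and parameter names come from CoinGecko’s simple/price documentation, so verify them against the current API reference – the URI for our three-currency request looks something like this:

```
https://api.coingecko.com/api/v3/simple/price?ids=bitcoin&vs_currencies=usd,gbp,eur
```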
The results you see there are in JSON format. We now need to parse them and break them down into their parts so we can consume them in a readable format.
Here we add the Parse JSON action.
Firstly, you’ll see that we pick up the Body as Content – that’s the payload, or data, we are retrieving. The Schema may seem a little daunting, but there’s a neat little trick here: “Generate from sample”.
Here we can just paste in the data returned from our earlier test with the URL in the browser, and the flow will generate the schema for you!
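For the request above, the sample you paste in will be shaped roughly like this (the figures are made up purely for illustration):

```json
{
  "bitcoin": {
    "usd": 9000.12,
    "gbp": 7200.34,
    "eur": 8100.56
  }
}
```

Generate from sample then builds a schema with bitcoin as an object containing usd, gbp and eur number properties.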
Finally, we add the Response action, and you’ll see it’s almost identical to the Parse JSON step – just with the addition of the status code 200, which effectively means OK. This allows the response to be consumed in your Power App.
So, that should be your flow created. Don’t forget to name it and save it. Next let’s create a simple Power App and consume that data.
Go to make.powerapps.com and create a new blank canvas app. The first thing we’ll do is add a button – for this simple example, the button is going to call the flow, which in turn calls the CoinGecko API. I’ve renamed mine to “Update Prices”.
Now there are a few things we want to do in order to grab those prices. In the formula box I have added the following.
I’m not going to go into all the details here, but essentially we are creating a collection called “getrates”, and the data feeding it comes via the flow we previously created (mine is named “GetCoinGecko”).
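As a minimal sketch, assuming your flow is named GetCoinGecko as above, the button’s OnSelect formula would look something like this:

```
// Call the flow and store the returned data in a collection
ClearCollect(getrates, GetCoinGecko.Run())
```

The exact shape of what Run() returns depends on the schema you defined in the Response action.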
If you want to check your flow is working, just check your collection
I have also added a few labels: three to identify the currencies, and three to hold the data we pull back.
So far, then, we have a mechanism for getting the data from the API via the flow, triggered by the button, with the results stored in the collection. We now have to show it in the labels.
In the formula for the label for GBP I have added the following
This formula takes the first record in the collection, picks out the GBP value, and displays it.
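Assuming the Response schema exposes the currency values as lowercase field names (adjust to match your own schema – depending on how it is nested, you may need something like First(getrates).bitcoin.gbp), the GBP label’s Text property would be along these lines:

```
First(getrates).gbp
```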
Repeat for the USD and EUR data, and there you have it!
Have a play! Why not add a dropdown and an option to enter your holding, to show its live value (I wish I had 100!).
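As a rough sketch of that idea (the control names txtHolding and drpCurrency are made up for illustration, and the exact Selected property name can vary by control version):

```
// drpCurrency.Items – the currencies to choose from
["GBP", "USD", "EUR"]

// Text property of a label showing the live value of your holding
Value(txtHolding.Text) *
Switch(
    drpCurrency.Selected.Value,
    "GBP", First(getrates).gbp,
    "USD", First(getrates).usd,
    First(getrates).eur
)
```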
I had a situation recently where an instance (not one I managed, I hasten to add!) had started to grow in storage, to the point that it was at nearly 200% capacity, and I was asked to look at it. It turned out that plug-in tracing had been switched on and a plug-in was throwing errors. Unfortunately, there were now so many records in the trace log that it wouldn’t open (the usual generic, unhelpful SQL error).
A quick look at Organization Insights confirmed that this was consuming nearly 60 GB of storage, and a quick record count (using Record Counter in XrmToolBox) indicated there were approaching 10,000,000 records.
Normally I would use the standard Bulk Delete option, however for a couple of reasons I didn’t think this was a viable option:
In the end I decided to go back to trusty old SSIS, and (again) the lovely KingswaySoft solution for Dynamics 365.
The steps are essentially:
If you’ve used the SSIS Toolbox before, then you’ll know how to setup a connection.
To retrieve the data, I decided to go with FetchXML as I wanted to specify a date range, so the following was added to the Dynamics Source Data Flow:
<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="plugintracelog">
    <attribute name="plugintracelogid" />
    <filter type="and">
      <condition attribute="createdon" operator="on-or-before" value="2018-11-20" />
    </filter>
  </entity>
</fetch>
So the fetch above pulls back any trace log records created on or before 20/11/2018.
Once this is done, connect the source to the destination, and open up the destination component.
Add your connection in Connection Manager.
In Action, choose Delete.
In Destination Entity, choose the plugintracelog entity.
I enabled multithreading and chose 32 threads.
Once that is all done, save, build and run.
I don’t want to get into the whole GDPR discussion in this article, but with GDPR in mind, I think it’s important – and potentially reduces risk – to anonymise data in non-production environments.
Again, there are plenty of articles on whether it should be anonymisation or pseudonymisation, but for the purposes of this article, I am going to go through the steps of using SSIS to anonymise Dynamics 365 data.
Firstly, a few assumptions:
You have Visual Studio and SSIS set up. I’m personally using VS 2015 Pro Update 3 with SSDT.
You will also need a couple of essential Dynamics tools from the lovely people at KingswaySoft.
They provide a developer license model to get you started.
Once you have everything downloaded and setup, then you’re ready to go.
Create a new SSIS Project
In Connection Manager, create a new connection
Next, drag over the following components:
In the Dynamics Source Component, select the connection you just created and the entity (in this case, Lead).
Click on the columns section and choose which fields from Dynamics you want to anonymise. In this case I chose things like FirstName, LastName, Email, etc.
Connect the Dynamics CRM Source component to the Data Anonymiser component by dragging the connector.
Open the Data Anonymiser Component.
Here you will see the fields/columns you have chosen to anonymise, and the option to choose the data type.
So, for example, the email address will be anonymised with data of the email type.
Do this for all your fields/columns.
Once this is done, open the CRM Destination Component.
Next, click on the columns section and map the source to the destination.
So, to summarise:
Once this is all done, save your solution and run it:
So in the run above, 274 Lead records were anonymised.
I hope you found this quick tutorial helpful.