Microsoft Power BI lets you build dashboards and interactive reports from your existing data. It can connect to almost any data source, and comes with a range of built-in connectors for services like Google Analytics, Facebook, and QuickBooks Online.

We’ve been working on a range of Power BI connectors for our accommodation provider clients, giving them automatically updating business intelligence dashboards to monitor their Xero Accounting, NewBook booking data, Office 365 activity, MailChimp, Facebook, and now their phone calls.

Since phone numbers are unique, this data can be matched against existing customer data in Power BI, so our clients can find out which customers are calling, how often, and for how long.

This is a sample dashboard I put together this morning using the phone data we’re importing for a Queensland tourist park. The data comes from FoneDynamics, an online call tracking and analytics service based in Australia.

Clicking on dashboard elements opens the interactive Power BI reports. From here, you can drill down into the data and see how the elements relate to one another. In this example, we’ve selected ‘No’ under the Repeat callers chart. This gives us some detailed information about the first-time callers into the tourist park: how long they spent on the phone, how many first-time calls went unanswered, how long it took to answer those calls, and so on.

Since we’re using Power BI, we can open and interact with these reports on any device. Here’s the same dashboard running on an iPhone, where you can open the charts and set alerts on important data. For example, you can get notified when the missed calls for the month exceed a certain number.

The tech stuff

For this solution, we had to pull the data from a collection of CSV files stored on an FTP site. Power BI will connect to CSV files out of the box, though not via FTP, and it won’t join multiple CSVs into a single dataset.

I wrote a console app last night to connect to the FTP site, pick up each CSV, collect the phone records from it, and upload them into an Azure Storage Table.
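Here’s a rough Python sketch of the same process, using the current azure-data-tables package. The real console app is .NET, and the FTP host, credentials, and CSV column names below are placeholders, so treat this as an outline rather than the production code:

import csv
import io
from ftplib import FTP
from azure.data.tables import TableServiceClient

FTP_HOST = "ftp.example.com"                 # placeholder
CONN_STR = "<AzureStorageConnectionString>"  # placeholder

# Create (or open) the destination table.
table = TableServiceClient.from_connection_string(CONN_STR) \
    .create_table_if_not_exists("PhoneCalls")

ftp = FTP(FTP_HOST)
ftp.login("<user>", "<password>")

for name in ftp.nlst():
    if not name.lower().endswith(".csv"):
        continue
    # Download each CSV into memory and parse it.
    buf = io.BytesIO()
    ftp.retrbinary("RETR " + name, buf.write)
    reader = csv.DictReader(io.StringIO(buf.getvalue().decode("utf-8")))
    for row in reader:
        # One entity per call record: partitioning by date keeps a day's
        # calls together, and the unique call ID works as the row key.
        table.upsert_entity({
            "PartitionKey": row["CallDate"],
            "RowKey": row["CallId"],
            "Caller": row["Caller"],
            "DurationSeconds": int(row["Duration"]),
        })

ftp.quit()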

Here’s the data in an Azure Storage Table.

To upload the previous day’s phone records into Azure Table Storage automatically, I set up an Azure Web Job to run the process once a day.
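If you’re setting up something similar, one way to schedule a WebJob is a settings.job file deployed alongside the console app, containing a six-field CRON expression. For example, this (the 6 am timing is just an assumption for illustration) runs the job daily:

{ "schedule": "0 0 6 * * *" }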

Now that we have a regular, up-to-date source of phone data, we can connect to it using the Power BI Desktop app. We use this app to create our reports before publishing them to Power BI Online.

If you’d like to learn more about setting up Business Intelligence dashboards for your data, send an email with what you have in mind and we might be able to make it happen.

We use Microsoft’s AzCopy to move large amounts of data from external sources into Microsoft Azure Storage.

The typical scenario for us is a customer who is moving onto an Azure Virtual Machine, and wants their data stored on that machine.

AzCopy is a versatile command line utility that allows you to move files from another PC or server into Azure Storage, and from there into your Azure virtual machine.

When migrating data to an Azure VM, the data flows from the source machine up into Azure Storage, and from Azure Storage down onto the VM.

To get started, you need to install AzCopy from here.
Next, create an Azure Storage Account. You can do this in either the old portal or the new portal.
Once you’ve created a storage account, you’ll need to create a storage container. Here is a quick guide for the old portal and the new portal.
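If you’d rather script the container step than click through the portal, here’s a quick sketch using the Azure Python SDK (the azure-storage-blob package); the connection string and container name are placeholders:

from azure.storage.blob import BlobServiceClient

# Paste your storage account's connection string from the portal's access keys.
service = BlobServiceClient.from_connection_string("<connection string>")

# Container names must be lowercase.
service.create_container("nasdrive")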

Now that you’ve installed AzCopy, created a storage account and a container, you can put together your AzCopy commands. Typically you’ll create two: one that uploads your data into Azure Storage, and another that downloads it from Azure Storage onto your Azure Virtual Machine.

Here’s an example that will move local data from E:\NAS into Azure Storage, and then download it to E:\NAS on the destination virtual machine:

To Azure Blob Storage

AzCopy /Source:E:\NAS /Dest:https://<storageaccountname>.blob.core.windows.net/<containername> /DestKey:<LongStorageAccountKey> /S /V:C:\temp\NASDrive.log

From Azure Blob Storage

AzCopy /Source:https://<storageaccountname>.blob.core.windows.net/<containername> /Dest:E:\NAS /SourceKey:<LongStorageAccountKey> /S /V:C:\temp\NASDrive.log

These commands will also create a log file at C:\temp\NASDrive.log.

Running the AzCopy commands

Open Command Prompt and navigate to the location where AzCopy was installed. Typically this is “C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy”.

Paste in your first command and run it to start the upload. In the meantime, you can install AzCopy on the destination server.

Once the upload is complete, go to the destination server and run the second command to download the data from Azure Blob Storage.

If you experience errors

Occasionally you may experience errors in the AzCopy transfer (usually the upload), where certain files fail. The solution is usually to append the /NC parameter and run the command again. /NC limits the number of concurrent connections to Azure Storage; I usually set it to /NC:5, meaning at most five files will be uploaded concurrently. The upper limit is 512 concurrent connections.

Updated command for uploading to Azure Blob Storage

AzCopy /Source:E:\NAS /Dest:https://<storageaccountname>.blob.core.windows.net/<containername> /DestKey:<LongStorageAccountKey> /S /V:C:\temp\NASDrive.log /NC:5

If you’re rerunning the command, you will be asked whether you want to skip files that already exist. Choose All to skip them.

On the second transfer, you may notice that the number of files transferred differs from the number of files that failed in the first run. In this case, that’s because we removed some unnecessary files before restarting the upload.

AzCopy Documentation

For more information on AzCopy, see the documentation here.

 

Here’s how to create an Azure Storage Account Container in the old Microsoft Azure portal at https://manage.windowsazure.com, assuming you already have an Azure subscription and an existing Azure Storage account.

  1. You will need an existing storage account. See this guide for how to create one.
  2. Login to https://manage.windowsazure.com
  3. Open your existing storage account.
  4. Click Containers, then click Create a container.
  5. Choose a Name and Access type for it.
  6. Click the tick and wait for the container to be created. It should just take a few seconds.
  7. You can now refer to this container from other services using https://<yourstorageaccountname>.blob.core.windows.net/<yourcontainername> 

    Depending on the Access type you chose (e.g. private), you may need your storage account name and primary key to access this container. See the end of this guide for how to retrieve these.

 

Here’s how to create an Azure Storage Account Container in the new Azure portal, assuming you already have an Azure subscription and an existing Azure Storage account.

  1. Firstly you’ll need a storage account. See this guide for how to create one.
  2. Open your storage account by signing into https://portal.azure.com
  3. Click Blobs.
  4. Click Containers in the Blob service blade.
  5. Create the new container by clicking ‘+ Container’, then choose a Name and Access type for it.
  6. You can now refer to this container from other services using https://<yourstorageaccountname>.blob.core.windows.net/<yourcontainername>
    Depending on the Access type you chose, you may need your storage account name and primary key. See the end of this guide for how to retrieve these.

Creating an Azure Storage account through the new Azure Portal is quite simple. Here’s how you do it.

To create an Azure Storage account in the old portal, follow this quick guide instead.

  1. Log into your Azure Subscription at https://portal.azure.com
  2. Click New, Data + Storage, Storage Account.
  3. Choose Create.
  4. Type a name for your storage account.

    This will be the storage account name that you’ll use to access your storage.

    Choose a pricing tier.

    This sets the replication level of your storage across Azure’s physical datacentre locations. Higher levels of replication offer greater redundancy in case of a datacentre outage.

    Choose a Resource Group.

    This groups your related services together in Azure. If you don’t have any other services running yet, click Create a new resource group.

    Choose the subscription.

    This will add the storage account under the chosen subscription; you may only have one.

    Choose the location of the data.

    This is the physical location of the datacentres that will hold your storage account. If you’ve chosen Geo Replication, your data will also be replicated outside of this location.

    Choose whether you want to use Diagnostics.

    Enabling Diagnostics will send diagnostic data into a storage account for your own monitoring, alongside the regular monitoring charts and alerts for your storage resource.

  5. Now, click Create.
  6. Wait a few moments for your storage account to be created. Once completed, you can open your brand new Azure storage account!
  7. To make use of this Storage account, click ‘Keys’ on the settings blade to retrieve the storage account name and primary access key.

Creating an Azure Storage account through the current portal (non-preview) is quite simple. Here’s how you do it.

(To create an Azure Storage account in the new portal, follow this quick guide instead.)

 

  1. Login to https://manage.windowsazure.com
  2. Click Storage.
  3. Click New.
  4. Click Quick Create, then choose a name for the storage account.
    Choose the location where you want your data to be stored; make it close to you or your other Azure resources.
    Choose the Azure Subscription you want the Storage Account associated with.
    Choose the replication level for your storage. This will affect pricing, though higher levels of replication offer greater redundancy in case of a datacentre outage.
  5. Click Create Storage Account.

    Your storage account will show as ‘Creating’ in the storage window. Wait for it to change to Online.
  6. Once it’s created you can open it up. You now have a new Azure Storage account! Click Manage Access Keys at the bottom of the page to retrieve the Storage account keys. You’ll usually only need the Storage account name and Primary Key to make use of your storage account.

 

When switching from Google Apps/Google for Work to Office 365, you’ll usually want to migrate your Google Drive files as well as your mail.

There are a few online tools that will do this for free, or at a cost, with varying degrees of functionality. I came across this handy article that goes into more detail on these methods.

The method that stuck out to me was the new SharePoint Online Migration API from Microsoft, a free, PowerShell-driven process. Microsoft released an IT User Guide on the steps required while the API was in preview. This is the document I used, and it can be downloaded here.

I used a Microsoft Azure virtual machine to do the initial download of the Google Drive directory – about 150 GB of data. It downloaded incredibly fast on the Azure VM’s connection and completed in a couple of hours. I just used the Google Drive sync tool for this, though you can also use Google’s Takeout tool if you need to convert your Google Docs/Sheets/Slides to their Microsoft Office equivalents.

Once downloaded, I installed the updated SharePoint Online Management Shell and followed the instructions in the Word document provided above.

The next step was to create an Azure storage account, create the folders for the migration packages (a bunch of XML manifests outlining what needs to be migrated) and start the upload of the data.

I got an error during one of the first PowerShell commands that read ‘New-SPOMigrationPackage : The server could not be contacted’. Following the instructions in this blog post, I added the -NoAdLookup switch to the initial command to resolve it.

Next, we run the Set-SPOMigrationPackageAzureSource cmdlet to upload the data from the Azure VM to the Azure Storage account.

Once the data and the migration packages are uploaded, the migration can be kicked off via PowerShell, and the data is moved from Azure into OneDrive for Business/SharePoint Online. You can check the status of the migration using the Microsoft Azure Storage Explorer, or just keep an eye on the library that you’re migrating to.


 

The coolest thing about this method is that it avoids the upload throttling of the ‘Open with Explorer’ method, and the syncing issues of the OneDrive for Business sync client. Best of all, it preserves the date modified metadata of the original files.


 

Last week I connected NewBook to Microsoft’s Power BI for our accommodation provider clients. This weekend I gave Xero the same treatment, and it’s worked out better than I thought.

Some tech stuff

The architecture of this solution is pretty much the same as the NewBook solution. A .NET console app runs as an Azure web job on a set schedule. It pulls the up-to-date Xero data via the Xero API and transfers it into Azure Table Storage. Power BI connects to the table storage, transforms the data, creates the relationships between the flat tables, and makes it really easy to create cool reports and dashboards.
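As a rough illustration of what the console app does (ours is .NET; this minimal Python sketch assumes today’s Xero OAuth 2.0 API, and the token, tenant ID, connection string, and chosen fields are all placeholders):

import requests
from azure.data.tables import TableServiceClient

ACCESS_TOKEN = "<XeroAccessToken>"           # assumed already obtained
TENANT_ID = "<XeroTenantId>"                 # placeholder
CONN_STR = "<AzureStorageConnectionString>"  # placeholder

# Pull invoices from the Xero accounting API.
resp = requests.get(
    "https://api.xero.com/api.xro/2.0/Invoices",
    headers={
        "Authorization": "Bearer " + ACCESS_TOKEN,
        "Xero-tenant-id": TENANT_ID,
        "Accept": "application/json",
    },
)
resp.raise_for_status()

# Flatten each invoice into an Azure Table Storage entity.
table = TableServiceClient.from_connection_string(CONN_STR) \
    .create_table_if_not_exists("XeroInvoices")
for inv in resp.json()["Invoices"]:
    table.upsert_entity({
        "PartitionKey": "Invoices",
        "RowKey": inv["InvoiceID"],       # Xero's GUID for the invoice
        "Contact": inv["Contact"]["Name"],
        "Status": inv["Status"],
        "Total": float(inv["Total"]),
    })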

Discover more with Power BI reporting

Once our cloud-hosted application makes your Xero data available to Power BI, we can get to work building the reports. Most of these examples are built using the sample data in the Xero demo organisation. I’ve included a couple of reports from our own Xero organisation because I think they demonstrate the functionality of Power BI a bit better. I’ve edited out the names and figures, though you’ll get the idea.

 

The report above shows our custom tracking option breakdown, our total amount paid vs owing, and a donut chart of our top-paying customers. Clicking on a chart element updates the other chart elements. Clicking around these reports gave me some great insight into how we’re performing as an organisation.

 

This map uses your Xero contacts’ street and postal address data to give you an idea of how your customers are distributed. Our demo data only contained two address records, though the map also updates when other elements are selected. For example, you can find out whether customer location has anything to do with the sorts of products they purchase from you.

Create the perfect dashboard(s)

Once the reports are created, they can be pinned to a custom dashboard in Power BI. Dashboards can give you a quick overview on how everything’s running. You can create multiple dashboards, and view them on your phone, tablet, or computer. We have a dashboard on a TV in our office displaying our daily performance data from a few sources.

 

Ask the right questions

This is probably the coolest thing about Power BI – the ability to ask questions of your data. You don’t need to be a data scientist to put together these dashboards; just open up Power BI and tell it what you want. The Q&A feature builds charts and graphs and returns figures instantly. Once you’re happy with the response, you can pin it to your dashboard.

 

Check in from anywhere

The current Xero mobile apps don’t give you access to the same reports that you get on the full website. Now with the Power BI mobile apps, we can create and access our own custom Xero reporting and dashboards from any device.

 

Tap the mobile graphs to drill down into the data. Set alerts on important data to be notified when targets are exceeded or not met.

 

Export and share with anyone

Dashboards can be shared with other Power BI users via the web, tablet or mobile apps. You can also export, annotate and share individual chart elements if you don’t want to give someone access to your entire dashboard or dataset.

If you want to know more about getting this set up for your organisation, get in touch. We’re going to be offering this at per-user, per-month pricing bundled with Power BI. The bundle will include set up, training, and customisation of your first Xero to Power BI dashboard.

Powerful, easy, up-to-date reporting on all devices with Power BI

We support a large number of tourist park operators and accommodation providers running NewBook, a cloud-based reservation system. We got in touch with the developers last week and asked if we could get API access to a test tenant to see if we could connect NewBook to Power BI. We wanted to create some cool business intelligence dashboards with up-to-date reporting for our customers.

NewBook’s developers responded the same day with the details, and I spent the weekend putting it together. I had a lot of fun getting this working, and I think the end result is so powerful we’ll be offering it as a standalone service.

The architecture of the solution is pretty straightforward, though it took me a few iterations to settle on this one.

The solution works via a .NET console app running on Microsoft Azure. The app pulls up-to-date booking and revenue information from NewBook and sends it on to Azure Table Storage. Power BI connects to the table storage as a data source. We used Power BI Desktop to create some interesting graphs and charts, and published the results to a Power BI dashboard where they can be viewed on all devices.

What does it look like?

Here are some sample charts I put together yesterday, though you’re not restricted to these examples. One of the coolest things about Power BI is its powerful natural language Q&A feature: you can ask it pretty much anything and it’ll build the chart instantly. I’ll add an example of that below.

See a snapshot of your current guests.

Dive into the data to gain more insight. Dashboards link to the reports we published from Power BI Desktop, and clicking on a chart segment updates the other charts. Now we can see that 8 of our 36 bookings are here on business, most of them booked direct, staying in a permanent cabin or van.

We can also view historical financial and booking data. This can help you get an understanding of your business’ performance over time.

Even better on mobile

Power BI apps are available for mobile devices too, where the charts become even more interactive. Tapping the charts and rotating them by touch unlocks more information.

 

Here’s the natural language Q&A feature. You can ask it questions about your business and it’ll generate a chart that you can pin to your dashboards.

 

Dashboards can be created, annotated and shared amongst your team. You can display them on TVs, interact with them on phones and tablets, or just use them on your PC or laptop.

Interested in Power BI and NewBook?

If you’re interested in Power BI, or if you’d like to get your NewBook reporting set up like this, get in touch with us.

Interested in using NewBook to manage your resort or tourist park property? Find out more here.

In the previous post I outlined connecting our temperature/humidity sensor to a Raspberry Pi and taking a reading from it. The next step is to send that data to the cloud and make use of it in Power BI.

Here’s a basic outline of the process for connecting the sensor to the cloud.

Of course, Power BI isn’t the only thing you could use this data for, though our goal is to add this temperature reading to the Power BI dashboard in our office.

The first thing I needed to find was somebody who had done this before. A few days ago I found this post by Rakesh George that outlines his process: http://www.identitymine.com/blog/2015/06/19/iot-with-raspberry-pi-azure-and-windows-devices/

His temperature sensor setup is a lot more complicated than mine – it looks like he’s taking the binary values from the sensor and working out the temperature value. In our case, we had a great library from Adafruit that did all of that for us.

Compared to other IoT projects, this one is extremely simple. We’re not making use of Event Hubs or Stream Analytics or any of the high scale, high performance tools in Azure. We’re just sending a temperature and humidity value to Azure Table Storage once a minute, then connecting to that data via Power BI Desktop.

I downloaded Rakesh’s source files to see how it can apply to my setup.

Firstly, you’ll need to install the Azure SDK on the Raspberry Pi to make use of the Azure services; check the linuxruninstructions.txt file in Rakesh’s Python source download for instructions on this.

Most of the code relevant to Azure is located in the azuremodule.py file. I learnt a lot about the Azure Python SDK by reading and modifying this script file.

If you’re setting this up yourself, you’ll need to sign up for Microsoft Azure (free trials are available), create a storage account, and add the storage account name and primary key into the sample code.

After a little bit of tweaking, I was able to get it working: the Adafruit library retrieves the values from my sensor, and the script uploads them to my Azure Table Storage account every 60 seconds.
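For reference, the loop ended up looking roughly like this. It’s a sketch of my setup rather than a drop-in script: the sensor type, GPIO pin, and account/table names are specific to me, and the import path of the old Azure Python SDK varies between versions. The values are stored as strings, which is why the data types need setting later in Power BI’s query editor:

import time
import Adafruit_DHT
from azure.storage.table import TableService  # old-style SDK; path varies by version

table_service = TableService(account_name="<storageaccountname>",
                             account_key="<primarykey>")
table_service.create_table("ClimateReadings")  # no-op if it already exists

SENSOR = Adafruit_DHT.AM2302  # or DHT11/DHT22, depending on your sensor
PIN = 4                       # GPIO pin the sensor's data line is wired to

while True:
    humidity, temperature = Adafruit_DHT.read_retry(SENSOR, PIN)
    if humidity is not None and temperature is not None:
        table_service.insert_entity("ClimateReadings", {
            "PartitionKey": "office",
            "RowKey": str(time.time()),  # unique enough for one reading a minute
            "Temperature": str(temperature),
            "Humidity": str(humidity),
        })
    time.sleep(60)  # one reading per minute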

Once you’re collecting data, you can use the Power BI Desktop app to pull it into your dashboards for reporting. Since I’m still testing, I created a new dashboard for these temperature/humidity values.

Power BI Desktop can be downloaded from here: https://powerbi.microsoft.com/desktop

A basic version of Power BI is available free, or you can sign up for a 60-day trial of Power BI Pro. The free version has many of the same features, though it refreshes your dashboards daily instead of hourly.

Importing and preparing the data for Power BI

Click Get Data, choose Azure, then Microsoft Azure Table Storage.

Enter the storage account name, click OK, then enter the primary key.

Select the table to import.

Import the data, and use the query editor to set the data types and separate the columns for humidity and temperature values.

Create your Power BI graphs and publish them to Power BI.

Once published, you can access them via the browser or the Power BI apps on mobile devices.
