Microsoft Power BI lets you build dashboards and interactive reports from your existing data. It can connect to just about any data source, and it comes with a range of built-in connectors for services like Google Analytics, Facebook, and QuickBooks Online.

We’ve been working on a range of Power BI connectors for our accommodation provider clients, providing them with automatically updating business intelligence dashboards to monitor their Xero accounting, NewBook booking data, Office 365 activity, MailChimp, Facebook, and now their phone calls.

Since phone numbers are unique, this data can be matched against existing customer data in Power BI, so our clients can find out which customers are calling, how often, and for how long.
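For that matching to work, the caller IDs and the stored customer numbers need to agree on a format. Here's a minimal Python sketch of the idea; the customer names and numbers are made up, and a production matcher would handle more number formats than this:

```python
import re

def normalize_phone(number, default_country="61"):
    """Reduce a phone number to digits in international form (Australian default assumed)."""
    digits = re.sub(r"\D", "", number)
    if digits.startswith("0"):
        # Local form like 07 5551 2345 becomes 61755512345
        digits = default_country + digits[1:]
    return digits

# Hypothetical customer list keyed by normalized phone number
customers = {normalize_phone("07 5551 2345"): "J. Smith, Cabin 14"}

def match_caller(caller_id):
    """Look up a raw caller ID against the customer list; None if unknown."""
    return customers.get(normalize_phone(caller_id))
```

Once both sides are normalized like this, the join in Power BI is a straight key match.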

This is a sample dashboard I put together this morning using the phone data we’re importing for a Queensland tourist park. The data comes from FoneDynamics, an online call tracking and analytics service based in Australia.

Clicking on dashboard elements opens the interactive Power BI reports. From here, you can drill down into the data and see how the elements relate to one another. In this example, we’ve selected ‘No’ under the Repeat callers chart. This gives us some pretty detailed information about the first-time callers into the tourist park: how long they spent on the phone, how many first-time calls went unanswered, how long it took to answer those calls, and so on.

Since we’re using Power BI, we can open and interact with these reports on any device. Here’s the same dashboard running on an iPhone, where you can open the charts and set alerts on important data. For example, get notified when the missed calls for the month exceed a certain number.

The tech stuff

For this solution, we had to pull the data from a collection of CSV files stored on an FTP site. Power BI can connect to CSV files out of the box, though not over FTP, and it won’t join multiple CSVs into a single dataset.

I wrote a console app last night to connect to the FTP site, pick up each CSV, collect the phone records from it, and upload them into an Azure Storage Table.
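The app itself is .NET, but the shape of the job is simple enough to sketch. Here's a hedged Python version of the same idea using the standard library's ftplib and csv modules; the CSV column names and the partition/row key choices are hypothetical, not FoneDynamics' actual schema:

```python
import csv
import io
from ftplib import FTP

def row_to_entity(row):
    """Map one CSV call record to an Azure Table entity.
    Partition by call date, key by call ID (column names assumed)."""
    return {
        "PartitionKey": row["CallDate"],
        "RowKey": row["CallId"],
        "Caller": row["CallerNumber"],
        "DurationSeconds": int(row["Duration"]),
        "Answered": row["Answered"] == "Yes",
    }

def fetch_csv_entities(host, user, password, filename):
    """Download one CSV from the FTP site and yield table entities (untested sketch)."""
    ftp = FTP(host)
    ftp.login(user, password)
    buf = io.BytesIO()
    ftp.retrbinary(f"RETR {filename}", buf.write)
    ftp.quit()
    for row in csv.DictReader(io.StringIO(buf.getvalue().decode("utf-8"))):
        yield row_to_entity(row)
```

Each entity is then upserted into the table keyed on PartitionKey and RowKey, so re-running a day's import doesn't create duplicates.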

Here’s the data in an Azure Storage Table.


To keep the previous day’s phone records flowing into Azure Table Storage, I set up an Azure Web Job to run the process once a day.
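If you're setting this up today, a triggered WebJob can carry its own schedule in a settings.job file deployed alongside the executable. The CRON expression below (six fields, seconds first) runs the job daily; the 3 a.m. time is an arbitrary choice for this sketch:

```json
{
  "schedule": "0 0 3 * * *"
}
```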

Now that we have a regular, up-to-date source of phone data, we can connect to it using the Power BI Desktop app. We use this app to create our reports before publishing them to Power BI Online.

If you’d like to learn more about setting up Business Intelligence dashboards for your data, send an email with what you have in mind and we might be able to make it happen.

Once you’ve set up Azure Backup to protect your Azure Virtual Machine (see this guide for more info), the process to restore it is quite simple. I recommend running test restores on a regular schedule to make sure everything’s working correctly.

To restore an instance of your Azure Virtual Machine, follow this quick guide.

  1. Log into https://manage.windowsazure.com
  2. In the left side menu, select “Recovery Services”.
  3. Click to open the backup vault you created for the virtual machine.
  4. Click the “Protected items” tab across the top.
  5. Click “Restore” at the bottom to restore the selected virtual machine.
  6. Select a recovery point and click Next.
  7. Fill out the restore instance details. Give it a unique name, and make sure you select the appropriate cloud service, storage account, and virtual network if you’re following this in a disaster recovery scenario.
  8. Click the tick in the bottom right to start the restore process.
  9. Once the restore is completed, you can download the new RDP connection file and test it.

 

If you’re running Azure Virtual Machines in production, you’ll probably want to protect them with Azure Backup. The good news is Azure provides a simple way to protect an entire virtual machine, so you can easily restore it if things go wrong.

Protecting Virtual Machine instances differs from the typical Azure Backup client, which is usually installed on client PCs and on-premises servers.

Here’s how to set up Azure Backup for an Azure Virtual Machine:

  1. Log into https://manage.windowsazure.com
  2. Open Recovery Services.
  3. Click New.
  4. Choose Backup Vault, then Quick Create. Give it a name and place it in the same region as the virtual machine you’ll be protecting.
  5. The backup vault will appear in Recovery Services.
  6. Click on the vault and scroll down to Protect Azure Virtual Machines.
  7. Click Discover Virtual Machines. It will take a few minutes. Once discovery completes, you’ll be notified that virtual machines were found in the same region.
  8. Click Register and choose the virtual machine that you want to protect.
  9. Wait for the virtual machine status to change to Registered.
  10. Click Protect.
  11. Choose the virtual machine you just registered.
  12. Choose the Default Policy Settings, or configure your own, and click Finish.

 

We use Microsoft’s AzCopy to move large amounts of data from external sources into Microsoft Azure Storage.

The typical scenario for us is a customer who is moving onto an Azure Virtual Machine, and wants their data stored on that machine.

AzCopy is a versatile command line utility that allows you to move files from another PC or Server into Azure Storage, and then into your Azure virtual machine.

When migrating data to an Azure VM, the process has two hops: AzCopy uploads the data from the source machine into Azure Blob Storage, then downloads it from Blob Storage onto the destination virtual machine.

To get started, you need to install AzCopy from here.
Next, create an Azure Storage Account. You can do this in either the old portal or the new portal.
Once you’ve created a storage account, you’ll need to create a storage container. Here is a quick guide for the old portal and the new portal.

Now that you’ve installed AzCopy, created a storage account and a container, you can put together your AzCopy commands. Typically you’ll create two commands, one that uploads your data into Azure Storage, and the other downloads your data from Azure Storage into your Azure Virtual Machine.

Here’s an example that will move local data from E:\NAS into Azure Storage, and then download it to E:\NAS on the destination virtual machine:

To Azure Blob Storage

AzCopy /Source:E:\NAS /Dest:https://<storageaccountname>.blob.core.windows.net/<containername> /DestKey:<LongStorageAccountKey> /S /V:C:\temp\NASDrive.log

From Azure Blob Storage

AzCopy /Source:https://<storageaccountname>.blob.core.windows.net/<containername> /Dest:E:\NAS /SourceKey:<LongStorageAccountKey> /S /V:C:\temp\NASDrive.log

These commands will also create a log file at C:\temp\NASDrive.log.

Running the AzCopy commands

Running AzCopy To Migrate Data Into Azure

Open Command Prompt and navigate to the location where AzCopy was installed. Typically this is under “C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy”

Paste in your first command and run it to start the upload. In the meantime, you can install AzCopy on the destination server.

Once the upload is complete, go to the destination server and run the second command to download the data from Azure Blob Storage.

If you experience errors

AzCopy Warning After Data Migration Into Azure

Occasionally you may experience errors in the AzCopy transfer (usually the upload), where certain files will fail. The solution for this is usually to append the /NC: parameter and run the command again. The /NC: parameter limits the number of concurrent connections to Azure storage. I usually set it to /NC:5, where 5 is the maximum number of concurrent files that will be uploaded. The upper limit of concurrent connections is 512.

Updated command for uploading to Azure Blob Storage

AzCopy /Source:E:\NAS /Dest:https://<storageaccountname>.blob.core.windows.net/<containername> /DestKey:<LongStorageAccountKey> /S /V:C:\temp\NASDrive.log /NC:5

If you’re rerunning the command, you will be asked whether you want to skip files that already exist. Choose to skip All.

Here are the results of the second transfer. You may notice that the number of files transferred in this image is different to the number of files that failed in the previous image. In this case, we removed some unnecessary files before restarting the upload.

AzCopy Documentation

For more information on AzCopy, see the documentation here.

 

Here’s how to create an Azure Storage Account Container in the old Microsoft Azure portal at https://manage.windowsazure.com, assuming you already have an Azure subscription and existing Azure Storage account.

  1. You will need an existing storage account. See this guide for how to create one.
  2. Login to https://manage.windowsazure.com
  3. Open your existing storage account.
  4. Click Containers, then click Create a container.
  5. Choose a Name and Access type for it.
  6. Click the tick and wait for the container to be created. It should only take a few seconds.
  7. You can now refer to this container from other services using https://<yourstorageaccountname>.blob.core.windows.net/<yourcontainername> 

    Depending on the Access type you chose (eg. private), you may need your storage account name and primary key to access this container. See the end of this guide for how to retrieve these.

 

Here’s how to create an Azure Storage Account Container, assuming you already have an Azure subscription and existing Azure Storage account.

  1. Firstly you’ll need a storage account. See this guide for how to create one.
  2. Open your storage account by signing into https://portal.azure.com
  3. Click Blobs.
  4. Click Containers in the Blob service blade.
  5. Create the new container by clicking ‘+ Container’, then choosing a Name and Access type for it.
  6. You can now refer to this container from other services using https://<yourstorageaccountname>.blob.core.windows.net/<yourcontainername>
    Depending on the Access type you chose, you may need your storage account name and primary key. See the end of this guide for how to retrieve these.

Creating an Azure Storage account through the new Azure Portal is quite simple. Here’s how you do it.

To create an Azure Storage account in the old portal, follow this quick guide instead.

  1. Log into your Azure Subscription at https://portal.azure.com
  2. Click New, then Data + Storage, then Storage Account.
  3. Choose Create.
  4. Type a name for your storage account.

    This will be the storage account name that you’ll use to access your storage.

    Choose a pricing tier.

    This is the replication level of your storage across Azure’s physical data centre locations. Different levels of replication offer greater redundancy in case of a datacentre outage.

    Choose a Resource Group.

    This groups your related services together in Azure. If you don’t have any other services running yet, click Create a new resource group.

    Choose the subscription.

    This will add the storage account under the chosen subscription; you may only have one.

    Choose the location of the data

    This is the physical location of the datacentres that will hold your storage account. If you’ve chosen Geo Replication, your data will also be replicated outside of this location.

    Choose whether you want to use Diagnostics

    Disabling Diagnostics turns off the regular monitoring charts and alerts for your storage resource; enabling it sends diagnostic data into a storage account for your own monitoring.
    Azure Storage Account Details

  5. Now, click Create.
    Azure Storage Account Creation
  6. Wait a few moments for your storage account to be created. Once completed, you can open your brand new Azure storage account!
    Azure Storage Account Info
  7. To make use of this Storage account, click ‘Keys‘ on the settings blade to retrieve the storage account name and primary access key.
    Azure Storage Account Name and Key

Creating an Azure Storage account through the current portal (non-preview) is quite simple. Here’s how you do it.

(To create an Azure Storage account in the new portal, follow this quick guide instead.)

 

  1. Login to https://manage.windowsazure.com
  2. Click Storage.
  3. Click New
    Create New Storage Account
  4. Click Quick Create, then choose a name for the storage account.
    Choose the location you want your data stored; make it close to you or your other Azure resources.
    Choose the Azure Subscription you want the storage account associated with.
    Choose the replication level for your storage. This will affect pricing, though higher levels of replication offer greater redundancy in case of a datacentre outage.
  5. Click Create Storage Account.

    Your storage account will show as ‘Creating’ in the storage window. Wait for it to change to Online.
  6. Once it’s created you can open it up. You now have a new Azure Storage account! Click Manage Access Keys at the bottom of the page to retrieve the storage account keys. You’ll usually only need the Storage account name and Primary Key to make use of your storage account.

 

When switching from Google Apps/Google for Work to Office 365, you’ll usually want to migrate your Google Drive files as well as your mail.

There are a few online tools that will do this for free, or at a cost, with varying degrees of functionality. I came across this handy article that goes into more detail on these methods.

The method that stood out to me was the new SharePoint Online Migration API from Microsoft: a free, PowerShell-driven process. Microsoft released an IT User Guide on the steps required while it was in preview. This is the document I used, and it can be downloaded here.

I used a Microsoft Azure virtual machine to do the initial download of the Google Drive directory – about 150 GB of data. It downloaded incredibly fast on the Azure VM’s connection and completed in a couple of hours. I just used the Google Drive sync tool for this, though you can also use Google’s Takeout tool if you need to convert your Google Docs/Sheets/Slides to their Microsoft Office equivalents.

Once downloaded, I installed the updated SharePoint Online Management Shell, and followed the instructions in the provided Word documents above.

The next step was to create an Azure storage account, create the folders for the migration packages (a bunch of XML manifests outlining what needs to be migrated) and start the upload of the data.

I got an error during one of the first PowerShell commands that read ‘New-SPOMigrationPackage : The server could not be contacted‘. Following the instructions in this blog post, I added the -NoAdLookup switch to the initial command to resolve it.
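For reference, the package-creation command ends up looking something like the sketch below. The paths and URL are placeholders rather than the ones from this migration:

```powershell
# Placeholder paths/URLs; -NoAdLookup skips the Active Directory lookup that caused the error
New-SPOMigrationPackage -SourceFilesPath "E:\GoogleDriveExport" `
    -OutputPackagePath "E:\MigrationPackages" `
    -TargetWebUrl "https://contoso-my.sharepoint.com/personal/user_contoso_com" `
    -TargetDocumentLibraryPath "Documents" `
    -NoAdLookup
```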

Next, we run the Set-SPOMigrationPackageAzureSource cmdlet to upload the data from the Azure server to the Azure Storage account.

Once the data and the migration package is uploaded, the migration can be kicked off via PowerShell. Now the data is being moved from Azure to OneDrive for Business/SharePoint online. You can check the status of the Migration using the Microsoft Azure Storage Explorer, or just keep an eye on the library that you’re migrating to.
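Those two steps look roughly like this in the SharePoint Online Management Shell. The account names, keys, paths and URLs here are placeholders:

```powershell
# Upload the files and migration package to Azure Storage; returns the Azure locations object
$azureLocations = Set-SPOMigrationPackageAzureSource `
    -SourceFilesPath "E:\GoogleDriveExport" `
    -SourcePackagePath "E:\MigrationPackages" `
    -AccountName "<storageaccountname>" -AccountKey "<storageaccountkey>"

# Kick off the server-side import into OneDrive for Business/SharePoint Online
Submit-SPOMigrationJob -TargetWebUrl "https://contoso-my.sharepoint.com/personal/user_contoso_com" `
    -MigrationPackageAzureLocations $azureLocations `
    -Credentials (Get-Credential)
```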

Storage Explorer Migration Queue

 

The coolest thing about this method is that it avoids the upload throttling of the ‘Open with Explorer’ method, and the syncing issues of the OneDrive for Business sync client. Best of all, it preserves the date modified metadata of the original files.

File metadata is preserved

 

Last week I connected NewBook to Microsoft’s Power BI for our Accommodation provider clients. This weekend I gave Xero the same treatment, and it’s worked out better than I thought.

Some tech stuff

Xero to Azure to Power BI

The architecture of this solution is pretty much the same as the NewBook solution. A .NET console app runs as an Azure Web Job on a set schedule. It pulls the up-to-date Xero data via the Xero API and transfers it into Azure Table Storage. Power BI connects to the table storage, transforms the data, creates the relationships between the flat tables, and makes it really easy to create cool reports and dashboards.
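Our app is .NET, but the per-record transform is easy to sketch in Python. The field names below follow Xero's Invoices endpoint; the flattening and partitioning choices are ours for this sketch, not anything Xero or Power BI requires:

```python
def invoice_to_entity(invoice):
    """Flatten one Xero invoice into a flat Azure Table entity.
    Partitioning by status keeps common Power BI filters cheap."""
    return {
        "PartitionKey": invoice["Status"],
        "RowKey": invoice["InvoiceID"],
        "Contact": invoice["Contact"]["Name"],
        "Total": float(invoice["Total"]),
        "AmountDue": float(invoice["AmountDue"]),
    }

# A trimmed-down invoice in the shape the Xero API returns (values invented)
sample = {
    "InvoiceID": "a1b2c3d4-0000-0000-0000-000000000001",
    "Status": "AUTHORISED",
    "Contact": {"Name": "City Limousines"},
    "Total": "1191.00",
    "AmountDue": "1191.00",
}
```

With the nested structures flattened like this, Power BI only has to define the relationships between tables (contacts to invoices, for example) rather than reshape the data itself.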

Discover more with Power BI reporting

Once our cloud hosted application pulls your Xero data into Power BI, we can get to work building the reports. Most of these examples are built using the sample data in the Xero demo organisation. I’ve included a couple of reports from our own Xero organisation because I think they demonstrate the functionality of Power BI a bit better. I’ve edited out the names and figures, though you’ll get the idea.

 

The report above shows our custom tracking option breakdown, our total amount paid vs owing, and a donut chart of our top-paying customers. Clicking on a chart element updates the other chart elements. Clicking around these reports gave me some great insight into how we’re performing as an organisation.

 

This map uses your Xero contacts’ street and postal address info to give you an idea of how your customers are distributed. Our demo data only contained two address records, though the map also updates when other elements are selected. For example, you can find out whether customer location has anything to do with the sorts of products they purchase from you.

Create the perfect dashboard(s)

Once the reports are created, they can be pinned to a custom dashboard in Power BI. Dashboards can give you a quick overview on how everything’s running. You can create multiple dashboards, and view them on your phone, tablet, or computer. We have a dashboard on a TV in our office displaying our daily performance data from a few sources.

 

Ask the right questions

This is probably the coolest thing about Power BI – the ability to ask questions of your data. You don’t need to be a data scientist to put together these dashboards; just open up Power BI and tell it what you want. The Q&A feature builds charts and graphs and returns numbers instantly. Once you’re happy with the response, you can pin it to your dashboard.

 

Check in from anywhere

The current Xero mobile apps don’t give you access to the same reports you get on the full website. With the Power BI mobile apps, we can create and access our own custom Xero reports and dashboards from any device.

 

Tap the mobile graphs to drill down into the data. Set alerts on important data to be notified when targets are exceeded or not met.

 

Export and share with anyone

Dashboards can be shared with other Power BI users via the web, tablet or mobile apps. You can also export, annotate and share individual chart elements if you don’t want to give someone access to your entire dashboard or dataset.

If you want to know more about getting this set up for your organisation, get in touch. We’re going to offer this at per-user, per-month pricing, bundled with Power BI. The bundle will include set-up, training, and customisation of your first Xero to Power BI dashboard.