Creating an Azure Storage account through the new Azure Portal is quite simple. Here’s how you do it.

To create an Azure Storage account in the old portal, follow this quick guide instead.

  1. Log into your Azure Subscription at https://portal.azure.com
  2. Click New, Data + Storage, Storage Account
    Create Azure Storage Account
  3. Choose a deployment model, then click Create.
    Azure Storage Account Deployment Model
  4. Type a name for your storage account.

    This will be the storage account name that you’ll use to access your storage.

    Choose a pricing tier.

    This is the replication level of your storage across Azure’s physical datacentre locations. Higher levels of replication (from locally redundant up to geo-redundant storage) offer greater redundancy in case of a datacentre outage.

    Choose a Resource Group.

    This groups your related services together in Azure. If you don’t have any other services running yet, click Create a new resource group.

    Choose the subscription.

    This will add the storage account under the chosen subscription; you may only have one.

    Choose the location of the data.

    This is the physical location of the datacentres that will hold your storage account. If you’ve chosen Geo Replication, your data will also be replicated outside of this location.

    Choose whether you want to use Diagnostics.

    Enabling Diagnostics sends diagnostic data into a storage account for your own monitoring; disabling it turns off the regular monitoring charts and alerts for your storage resource.
    Azure Storage Account Details

  5. Now, click Create.
    Azure Storage Account Creation
  6. Wait a few moments for your storage account to be created. Once completed, you can open your brand new Azure storage account!
    Azure Storage Account Info
  7. To make use of this storage account, click ‘Keys’ on the settings blade to retrieve the storage account name and primary access key. You can then use these from code, as shown in the sketch below.
    Azure Storage Account Name and Key
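With the account name and key, you can connect to the storage account from code. Here’s a minimal sketch using the Azure Storage SDK for Python (the account name, key, container and file names are placeholders, and the import path may differ between SDK versions):

from azure.storage.blob import BlobService

# Placeholder credentials copied from the Keys blade
blob_service = BlobService(account_name='mystorageaccount',
                           account_key='<primary access key>')

# Create a container and upload a local file into it
blob_service.create_container('testcontainer')
blob_service.put_block_blob_from_path('testcontainer', 'hello.txt', 'hello.txt')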

Creating an Azure Storage account through the current portal (non-preview) is quite simple. Here’s how you do it.

(To create an Azure Storage account in the new portal, follow this quick guide instead.)


  1. Log in to https://manage.windowsazure.com
  2. Click Storage
    Azure Portal Login
  3. Click New
    Create New Storage Account
  4. Click Quick Create, then choose a name for the storage account.
    Choose the location where you want your data stored – ideally close to you or your other Azure resources.
    Choose the Azure Subscription you want the Storage Account associated with.
    Choose the replication level for your storage. This will affect pricing, though higher levels of replication offer greater redundancy in case of a datacentre outage.
    Quick Create Azure Storage Account
  5. Click Create Storage Account.

    Your storage account will show as ‘Creating’ in the storage window.
    Azure Storage Account Creating
    Wait for it to change to Online.
    Azure Storage Account Online
  6. Once it’s created, you can open it up. You now have a new Azure Storage account! Click Manage Access Keys at the bottom of the page to retrieve the Storage account keys. You’ll usually only need the Storage account name and Primary Key to make use of your storage account.
    Azure Storage Account Access Keys


When switching from Google Apps/Google for Work to Office 365, you’ll usually want to migrate your Google Drive files as well as your mail.

There are a few online tools that will do this for free, or at a cost, with varying degrees of functionality. I came across this handy article that goes into more detail on these methods.

The method that stood out to me was the new SharePoint Online Migration API from Microsoft – a free, PowerShell-driven process. Microsoft released an IT User Guide on the steps required while the API was in preview. This is the document I used, and it can be downloaded here.

I used a Microsoft Azure virtual machine to do the initial download of the Google Drive directory – about 150 GB of data. It downloaded incredibly fast over the Azure VM’s connection and completed in a couple of hours. I just used the Google Drive sync tool for this, though you can also use Google’s Takeout tool if you need to convert your Google Docs/Sheets/Slides to their Microsoft Office equivalents.

Once downloaded, I installed the updated SharePoint Online Management Shell and followed the instructions in the Word document provided above.

The next step was to create an Azure storage account, create the folders for the migration packages (a bunch of XML manifests outlining what needs to be migrated) and start the upload of the data.

I got an error during one of the first PowerShell commands that read ‘New-SPOMigrationPackage : The server could not be contacted’. Following the instructions in this blog post, I added the -NoAdLookup switch to the initial command to resolve it.

Next, we run the Set-SPOMigrationPackageAzureSource cmdlet to upload the data from the Azure VM to the Azure Storage account.

Once the data and the migration packages are uploaded, the migration can be kicked off via PowerShell, and the data is moved from Azure to OneDrive for Business/SharePoint Online. You can check the status of the migration using the Microsoft Azure Storage Explorer, or just keep an eye on the library that you’re migrating to.

Storage Explorer Migration Queue


The coolest thing about this method is that it avoids the upload throttling of the ‘Open with Explorer’ method, and the syncing issues of the OneDrive for Business sync client. Best of all, it preserves the date modified metadata of the original files.

File metadata is preserved


Last week I connected NewBook to Microsoft’s Power BI for our Accommodation provider clients. This weekend I gave Xero the same treatment, and it’s worked out better than I thought.

Some tech stuff

Xero to Azure to Power BI

The architecture of this solution is much the same as the NewBook solution. A .NET console app runs as an Azure WebJob on a set schedule. It pulls up-to-date Xero data via the Xero API and transfers it into Azure Table Storage. Power BI connects to the table storage, transforms the data, creates the relationships between the flat tables, and makes it really easy to create cool reports and dashboards.
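The production app is written in .NET, but the pattern is simple enough to sketch in a few lines of Python. Treat this as an illustration only – the request shown skips Xero’s OAuth signing, and the entity fields are a minimal subset:

import requests
from azure.storage.table import TableService

# Fetch invoices from the Xero API as JSON
# (real calls must be OAuth-signed; that plumbing is omitted here)
invoices = requests.get('https://api.xero.com/api.xro/2.0/Invoices',
                        headers={'Accept': 'application/json'}).json()['Invoices']

table = TableService(account_name='mystorageaccount', account_key='<primary key>')
table.create_table('XeroInvoices')

# Flatten each invoice into a table entity for Power BI to pick up
for invoice in invoices:
    table.insert_entity('XeroInvoices', {
        'PartitionKey': 'Invoices',
        'RowKey': invoice['InvoiceID'],
        'Total': invoice['Total'],
        'Status': invoice['Status'],
    })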

Discover more with Power BI reporting

Once our cloud hosted application pulls your Xero data into Power BI, we can get to work building the reports. Most of these examples are built using the sample data in the Xero demo organisation. I’ve included a couple of reports from our own Xero organisation because I think they demonstrate the functionality of Power BI a bit better. I’ve edited out the names and figures, though you’ll get the idea.


The report above shows our custom tracking option breakdown, our total amount paid vs owing, and a donut chart of our top paying customers. Clicking on a chart element updates the other chart elements. Clicking around these reports gave me some great insight into how we’re performing as an organisation.


This map uses the street and postal info from your Xero contacts to give you an idea of how your customers are distributed. Our demo data only contained two address records, though the map updates when other elements are selected too. For example, you can even find out whether customer location has anything to do with the sorts of products they purchase from you.

Create the perfect dashboard(s)

Once the reports are created, they can be pinned to a custom dashboard in Power BI. Dashboards can give you a quick overview on how everything’s running. You can create multiple dashboards, and view them on your phone, tablet, or computer. We have a dashboard on a TV in our office displaying our daily performance data from a few sources.


Ask the right questions

This is probably the coolest thing about Power BI – the ability to ask questions of your data. You don’t need to be a data scientist to put together these dashboards; just open up Power BI and tell it what you want. The Q&A feature builds charts and graphs and returns numbers instantly. Once you’re happy with the response, you can pin it to your dashboard.


Check in from anywhere

The current Xero mobile apps don’t give you access to the same reports that you get on the full website. Now with the Power BI mobile apps, we can create and access our own custom Xero reporting and dashboards from any device.


Tap the mobile graphs to drill down into the data. Set alerts on important data to be notified when targets are exceeded or not met.


Export and share with anyone

Dashboards can be shared with other Power BI users via the web, tablet or mobile apps. You can also export, annotate and share individual chart elements if you don’t want to give someone access to your entire dashboard or dataset.

If you want to know more about getting this set up for your organisation, get in touch. We’re going to be offering this on per-user, per-month pricing, bundled with Power BI. The bundle will include setup, training and customisation of your first Xero to Power BI dashboard.

Powerful, easy, up-to-date reporting on all devices with Power BI

We support a large number of tourist park operators and accommodation providers running NewBook, a cloud-based reservation system. We got in touch with the developers last week and asked for API access to a test tenant to see if we could connect NewBook to Power BI. We wanted to create some cool business intelligence dashboards with up-to-date reporting for our customers.

NewBook’s developers responded the same day with the details, and I spent the weekend putting it together. I had a lot of fun getting this working, and I think the end result is so powerful we’ll be offering it as a standalone service.

The architecture of the solution is pretty straightforward, though it took me a few iterations to settle on this one.

NewBook Power BI Solution Outline

The solution works via a .NET console app running on Microsoft Azure. The app pulls up-to-date booking and revenue information from NewBook and sends it on to Azure Table Storage. Power BI connects to the table storage as a data source. We used Power BI Desktop to create some interesting graphs and charts, and published the results to a Power BI dashboard where they can be viewed on all devices.

What does it look like?

Here are some sample charts that I put together yesterday, though you’re not restricted to the examples below. One of the coolest things about Power BI is its powerful natural language Q&A feature; you can ask it pretty much anything and it’ll build the chart instantly – I’ll add an example of that below.

See a snapshot of your current guests:

Dive into the data to gain more insight. Dashboards link with the reports we published from Power BI Desktop. Clicking on a chart segment will update the other charts. Now we can see that 8 of our 36 bookings are here on business, most of them booked direct, and are staying in a permanent cabin or van.

We can also view historical financial and booking data. This can help you get an understanding of your business’ performance over time.

Even better on mobile

Power BI apps are available for mobile devices too, where the charts become even more interactive. Tapping the charts and rotating them by touch unlocks more information.


Here’s the natural language Q&A feature. You can ask it questions about your business and it’ll generate a chart that you can pin to your dashboards.


Dashboards can be created, annotated and shared amongst your team. You can display them on TVs, interact with them on phones and tablets, or just use them on your PC or laptop.

Interested in Power BI and NewBook?

If you’re interested in Power BI, or if you’d like to get your NewBook reporting set up like this, get in touch with us.

Interested in using NewBook to manage your resort or tourist park property? Find out more here.

In the previous post I outlined connecting our temperature/humidity sensor to a Raspberry Pi and taking a reading from it. The next step is to send that data to the cloud and make use of it in Power BI.

Here’s a basic outline of the process:

Connecting the sensor to the Cloud

Of course, Power BI isn’t the only thing you could use this data for, though our goal is to add this temperature reading to the Power BI dashboard in our office.

The first thing I needed to find was somebody who had done this before. A few days ago I found this post by Rakesh George that outlines his process: http://www.identitymine.com/blog/2015/06/19/iot-with-raspberry-pi-azure-and-windows-devices/

His temperature sensor setup is a lot more complicated than mine – it looks like he’s taking the binary values from the sensor and working out the temperature value. In our case, we had a great library from Adafruit that did all of that for us.

Compared to other IoT projects, this one is extremely simple. We’re not making use of Event Hubs or Stream Analytics or any of the high scale, high performance tools in Azure. We’re just sending a temperature and humidity value to Azure Table Storage once a minute, then connecting to that data via Power BI Desktop.

I downloaded Rakesh’s source files to see how it can apply to my setup.

First, you’ll need to install the Azure SDK on the Raspberry Pi to make use of the Azure services. Check the linuxruninstructions.txt in Rakesh’s Python source download for instructions on this.
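On Raspbian this boils down to a pip install (assuming pip is available; ‘azure’ was the package name for the SDK bundle current at the time):

sudo pip install azure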

Most of the code relevant to Azure is located in the azuremodule.py file. I learnt a lot about the Azure Python SDK by reading and modifying this script file.

If you’re setting this up yourself, you’ll need to sign up for Microsoft Azure (free trials are available), create a storage account, and add the storage account name and primary key into the sample code.

After a little bit of tweaking, I was able to get it working, using the Adafruit library to retrieve the values from my sensor and uploading them to my Azure Table Storage account every 60 seconds.
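The core loop of my modified script looks roughly like this – a sketch rather than the exact production code, with placeholder credentials (the TableService import path varies between Azure Python SDK versions):

import time
import Adafruit_DHT
from azure.storage.table import TableService

table = TableService(account_name='mystorageaccount', account_key='<primary key>')
table.create_table('ClimateData')  # returns False if the table already exists

while True:
    # Read from a DHT11 on GPIO 4; read_retry polls until it gets a valid reading
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT11, 4)
    if humidity is not None and temperature is not None:
        table.insert_entity('ClimateData', {
            'PartitionKey': 'office',
            'RowKey': str(int(time.time())),  # timestamp as a unique row key
            'Temperature': temperature,
            'Humidity': humidity,
        })
    time.sleep(60)  # one reading per minute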

Once you’re collecting data, you can use the Power BI Desktop app to pull it into your dashboards for reporting. Since I’m just testing, I created a new dashboard for these temperature/humidity values.

Power BI Desktop can be downloaded from here: https://powerbi.microsoft.com/desktop

A basic version of Power BI is available for free, or you can sign up for a 60-day trial of Power BI Pro, which has many of the same features but refreshes your dashboards hourly instead of daily.

Importing and preparing the data for Power BI

Click Get Data, choose Azure, then Microsoft Azure Table Storage.

Get Data in Power BI

Enter the storage account name, click OK, then enter the primary key.

Connect to Azure Table Storage

Select the table to import.

Connect to data in Power BI

Import the data, and use the query editor to set the data types and separate the columns for humidity and temperature values.

Modify Data in Power BI

Create your Power BI graphs and publish them to Power BI.

Building Graphs in Power BI

Once published, you can access them via the browser or the Power BI apps on mobile devices.

View graphs on Power BI Mobile Devices

At GCITS we’re getting familiar with the Internet of Things (IoT). To start with, we’re connecting sensors to the cloud to make use of the real-time data.

As a quick test, we want to connect a temperature sensor in our office to our dashboard in Power BI. The first step is to get a reading from the temperature sensor via the Raspberry Pi.

I had a bit of trouble getting this working using some of the suggested wiring diagrams (it may just be an issue with my breadboard), so I’m posting this here in case it helps someone else.

You will need:

  • Raspberry Pi running Raspbian
  • A Breadboard
  • Assorted wires
  • 10k ohm resistor
  • DHT11 Temperature/Humidity Sensor

Wire it all up

The first step is to wire your Raspberry Pi up to the DHT11 temperature sensor.

Here’s a drawing of my wiring, as well as two photos that should make it clear.

DHT11 Wiring Diagram
Raspberry Pi Wiring
DHT11 Wiring On Breadboard


Set up your Raspberry Pi

Once you’re all set, you’ll need to install a library to easily pull data from the sensor. I’m getting started with Python, so I’m using the Python library provided by Adafruit. I found some great instructions in this PDF: https://learn.adafruit.com/downloads/pdf/dht-humidity-sensing-on-raspberry-pi-with-gdocs-logging.pdf

I’ll summarise it here:

Open the terminal on your Raspberry Pi and run the following commands:

git clone https://github.com/adafruit/Adafruit_Python_DHT.git
cd Adafruit_Python_DHT

This will clone the Adafruit Python library to your Pi.

To make sure you have the correct dependencies to use the library, you’ll also need to run these commands:

sudo apt-get update
sudo apt-get install build-essential python-dev python-openssl

Next, you’ll need to install the library we just cloned:

sudo python setup.py install

To confirm that you’ve successfully installed the library, test the sensor by navigating to the examples folder and running the test Python script:

cd examples
sudo ./AdafruitDHT.py 11 4

This tests pin GPIO 4 for the DHT11 sensor and returns the temperature and humidity values.
Temperature and Humidity Results
Temperature Outside


As you can see, the result seems to be pretty accurate.

The parameters at the end can be modified to suit your setup. If you’re using the DHT22 sensor, replace 11 with 22; if you’re using another GPIO pin, replace 4 with the appropriate GPIO pin number.

For example, sudo ./AdafruitDHT.py 22 17 refers to a DHT22 on GPIO 17.
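You can also call the library from your own script rather than the example – the equivalent read is just a couple of lines of Python (the sensor type and pin below match my DHT11-on-GPIO-4 wiring; adjust to suit yours):

import Adafruit_DHT

# read_retry polls the sensor until it returns a valid reading
humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT11, 4)
print('Temp={0:0.1f}C  Humidity={1:0.1f}%'.format(temperature, humidity))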

Stay tuned for future posts on how we’ll connect this data to the cloud and Power BI!

We ran into an interesting problem the other day when our internal app.PowerBI.com dashboard just stopped automatically updating. I tried to force a manual refresh but received this rather unhelpful error message.

Power BI Error Message

Here’s how we fixed it.

After some back and forth with the Power BI support team, we realised that the issue might have something to do with the OAuth security tokens stored in Zendesk. We were able to connect to the Zendesk data using the Power BI Desktop application, which generated a new OAuth request. From there, we asked Zendesk support how to delete the OAuth tokens manually so we could recreate them.

The support guys at Zendesk are pretty fantastic, and this time was no exception. They quickly sent through instructions for deleting the existing OAuth tokens, linking us to a page with a curl command that would revoke the tokens stored in Zendesk. After downloading curl and running the command, we were back in action and our data was live again! Here’s how you can do the same thing.

If you’re using Windows, download the latest version of curl.exe from http://www.paehl.com/open_source. You’ll need the version that includes WinSSL, since that’s how we’ll connect to your Zendesk platform. Unpack the 7zip archive and open a command prompt to the extracted “curl_X64_ssl\winssl” directory.

You can query the stored tokens using this command (remember to substitute the values in the curly braces with your own):
curl.exe https://{subdomain}.zendesk.com/api/v2/oauth/tokens.json -v -u {emailaddress}:{password}

This returned the following output:

ZenDesk OAuth Token

This gives us information about the current oAuth token, including the token ID (in red) which we could now use to revoke the token:
curl https://{subdomain}.zendesk.com/api/v2/oauth/tokens/{ID from the last command}.json -X DELETE -v -u {emailaddress}:{password}
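If you’d rather not download curl, the same two calls can be made with Python’s requests library – a quick sketch with placeholder credentials and token ID:

import requests

subdomain = 'yourcompany'                   # your Zendesk subdomain
auth = ('you@example.com', 'yourpassword')  # admin credentials

# List the stored OAuth tokens and print their IDs
tokens = requests.get('https://{0}.zendesk.com/api/v2/oauth/tokens.json'.format(subdomain), auth=auth).json()
for token in tokens['tokens']:
    print(token['id'], token.get('client_id'))

# Revoke a token by ID (123 is a placeholder - check it's the right one first)
requests.delete('https://{0}.zendesk.com/api/v2/oauth/tokens/{1}.json'.format(subdomain, 123), auth=auth)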

I had to do this a couple of times to delete all tokens. Be careful running this command if you have other applications that integrate with Zendesk – if you delete the wrong token, you will break them. In our case, we’d only connected Zendesk with Power BI and no other application, so I just deleted all tokens. Once they were gone, I deleted the Zendesk data and dashboards from Power BI and then reconnected to the service using the “Get Data” menu. As expected, this prompted me to authenticate so Power BI and Zendesk could create a new token.

Power BI Dashboard Ready

Everything was all good after this point and my automatic update was working again. If you’re interested in Power BI and Office 365, or if you want to get more value out of your existing subscription, get in touch and we can build something that works for you.


Microsoft delivered on a long-awaited feature of OneDrive last week – the syncing of shared folders.

Microsoft announced the feature back in February 2014 and called it Co-Owners. As far as I can tell, it’s no longer called co-owners, though the concept is still the same.

The new feature allows users of the consumer version of OneDrive to add folders that have been shared with them to their own OneDrive. Once added, these folders can be synced to computers and devices. Any changes are uploaded to OneDrive and appear for all users who access the folder.

Since the shared folder appears to be inside your own OneDrive, the feature works on all versions of the app – PC, Mac, iOS, Android and Windows Phone.

It came in handy last week for a client of mine who runs Macs and doesn’t have a business version of Office 365, since their parent organisation runs hosted Exchange through another provider. The feature was very easy to set up and involves just a few steps.

To add shared folders to your OneDrive

  1. Log in to www.onedrive.com as the user that has folders shared with them
  2. Click the Shared link on the left menu
    View Shared folders on OneDrive
  3. Select the folder by right-clicking or ticking the circle in the top right of the folder. Choose Add to My OneDrive. It’s either on the context menu if you right-click, or on the top menu if you select the folder.
    Add to OneDrive via right-click

    Add to OneDrive via top menu
  4. Once added to OneDrive, you can access this folder from any device, or sync it to your PC or Mac.


This week we’re exploring the capabilities of Microsoft Power BI (an Office 365 add-on) to give us a clear picture of our daily performance.

Power BI allows you to connect multiple data sources from a wide range of on-premises and cloud services and view the live data in a clean dashboard. You can view and share dashboards from the browser, the Windows app, or the mobile apps for iOS, Android or Windows Phone. In our case, we wanted the screen in our office to focus on a couple of things: the performance of our support team, and our website statistics.

We’re tracking the support team performance because it’s the core of our business, and the website performance because we’re focusing on delivering more useful content and would like to see how it’s received.

The data we need to track these metrics is stored in external silos – Zendesk and Google Analytics. Luckily, Power BI makes it easy to connect these data sources to a single dashboard.

Here’s a video of it starting up, and a photo of the finished dashboard:

Power BI Dashboard

This setup uses a Raspberry Pi connected to an Azure Virtual Machine running Power BI through the browser.

We’ll be adding new features in the next few weeks involving additional Raspberry Pis, some connected sensors and Azure SQL. Stay tuned!