Outlook for Windows: Shared calendar improvements

Microsoft have updated the Office 365 roadmap with some upcoming improvements to calendar sharing.

Apart from being simpler, these new calendar updates are also great for separate companies that use Office 365 and share resources like meeting rooms.

The current external sharing options are difficult to configure and only update every 3 hours. These new changes, however, will allow for a simple, instantly syncing calendar sharing experience, both internally and with external Office 365 and Outlook.com users.

See here for more info.

From Microsoft’s notes:

Introducing a new service backed model for sharing calendars with other Office 365 subscribers that improves performance and reliability and keeps all calendars in sync.

This update came from our Azure function which monitors the Office 365 Roadmap, generates an image and triggers a Microsoft Flow Approval to collect our input. See our knowledge base for more examples of our business process automation.

Connect Gravity Forms to Microsoft Flow

We use Gravity Forms on our website and it works pretty well. Whenever a form is completed, we receive an email – except sometimes we don’t. Just recently, I missed a few enquiries because of a configuration change on our site.

To stop this from happening, I went looking for alternative notification options for Gravity Forms that didn’t just rely on an email making it from our website to my inbox. I remembered that Zapier had a connector; however, I was disappointed to discover that it only works with Developer licenses, which cost $199 USD a year, and we’re running a $39 single-site license.

Luckily, Gravity Forms has some easy-to-follow API documentation that allows us to connect directly to our site’s forms and entries via REST methods.

This solution demonstrates how to build an Azure Function app in C# that retrieves the entries from a Gravity Form and sends them to Microsoft Flow. A Microsoft Flow checks each entry against a SharePoint list and, if the entry doesn’t already exist there, adds it.

The benefit of this solution is that it’s completely serverless and almost free (depending on the App Service plan). Also, since it’s in Microsoft Flow, you can do anything you want with the form entries. You could create a task in Planner, add a message to a Microsoft Teams channel, or pipe them directly into Dynamics 365 or Mailchimp.

The first step is to enable access to your Gravity Forms via the API.

Enable the Gravity Forms API and retrieve the form info

  1. Sign into your site’s WordPress Admin Panel and visit the Settings section of Gravity Forms.
  2. Enable the API, then retrieve and make a note of the Public API Key and Private API Key.
  3. Set the Impersonate account user. Your Function App will have the same form access permissions as the user that you choose here.
  4. Visit the form that you’d like to retrieve the entries for and make a note of the Form ID (eg. 1).
  5. Make a note of all the fields in the form. We’ll be adding these as columns to a SharePoint list.
  6. It’s also worth making a note of each field’s corresponding ID (eg. 1.3, 2, 3 etc), since the JSON representation of each field uses this ID and not the field name.
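
As a rough illustration (hypothetical values only – the exact response envelope depends on your Gravity Forms version), an entry in the API response keys its values by these field IDs rather than by field name:

{
  "id": "87",
  "form_id": "1",
  "date_created": "2017-09-01 03:12:45",
  "1.3": "Jane",
  "2": "jane@example.com",
  "3": "Please send me a quote."
}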

Create a SharePoint List to receive the form data

  1. Sign into your SharePoint site with your Office 365 account.
  2. Visit Site Contents and create a new SharePoint list with an appropriate name.
  3. You can rename the Title column to something else if Title isn’t appropriate. I changed mine to First Name.
  4. Create columns to match the field names in your Gravity Forms form, choosing column types that suit each field.

Create a Function App in Visual Studio 2017

In previous tutorials, we’ve created Azure Functions directly in the browser. This time we’ll be using Visual Studio to test and deploy our functions.

Open Visual Studio 2017 and make sure that it’s up to at least version 15.3. You’ll need to ensure that Azure Development tooling is installed, and that you can create Azure Function Apps. See here for a list of prerequisites.

  1. Go to File, New Project, Visual C#, Cloud, Azure Functions, then create a Function App and give it a name.

If Visual Studio is completely up to date and you’ve installed all the prerequisites but still can’t see an option to create Azure Functions, you may need to go to Tools > Extensions and Updates > Updates > Visual Studio Marketplace and install the Azure Functions and Web Jobs Tools update.

  2. An Azure Function App is pretty much an Azure Web App, and each Function App can contain multiple functions. To add a function to our Function App, right click on your project in the Solution Explorer and choose Add, New Item.
  3. Then select Azure Function and give your function a name – I’ve called this one GravityForms_Enquiries. Click Add.
  4. Choose Timer trigger. You’ll also want to specify how often you’d like the function to run using CRON scheduling (see the expression cheat sheet after the code below). The default value means that your function will execute every 5 minutes. In our published function, we’re going to check for form entries every 4 hours. While we’re debugging, we’re checking every minute – just until we’re ready to publish.
  5. Your new function should now open in the editor.
  6. Copy and paste the following code into your function. Replace the string placeholders in the RunAsync method with your own values, and make sure that you update your namespace and function app name (if you didn’t choose GravityForms_Enquiries too).
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web;
using System.Security.Cryptography;
using System.Net.Http.Headers;
using System.Text;

namespace GCITSFunctions
{
    public static class GravityForms_Enquiries
    {
        [FunctionName("GravityForms_Enquiries")]
        public static void Run([TimerTrigger("0 0 */4 * * *")]TimerInfo myTimer, TraceWriter log)
        {
            //Change the Timer Trigger above to "0 * * * * *" when debugging to avoid waiting too long for it to execute.

            log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
            
            string content = RunAsync().Result;

            // Remove the comments from the below four lines once you've retrieved the HTTP POST URL from Microsoft Flow and added it in.
            //HttpClient client = new HttpClient();
            //HttpContent jsoncontent = new StringContent(content, Encoding.UTF8, "application/json");
            //string flowRequest = "<Enter flow HTTP POST URL HERE>";
            //var result = client.PostAsync(flowRequest, jsoncontent).Result;
        }

        
        static async Task<string> RunAsync()
        {
            HttpClient client = new HttpClient();
            // Add the public and private keys for Gravity Forms
            string publicKey = "<Enter Gravity Forms Public API Key>";
            string privateKey = "<Enter Gravity Forms Private API Key>";
            string method = "GET";
            // Specify the form ID of the form you're retrieving entries for
            string formId = "1";
            string route = string.Format("forms/{0}/entries", formId);
            /* Paging specifies the number of entries that will be retrieved from your form in this call, eg. 1000. You can make this higher or lower if you like. 
            It will retrieve the most recent entries first. */
            string paging = "&paging[page_size]=1000";
            string expires = Security.UtcTimestamp(new TimeSpan(0, 1, 0)).ToString();
            // Pass the same expiry timestamp into the signature so the value signed matches the one sent in the URL.
            string signature = GenerateSignature(publicKey, privateKey, method, route, expires);
            /* Replace gcits.com with your own domain name. If the call doesn't work initially, you may need to make sure that 'pretty' permalinks are enabled on your site.
            See here for more information: https://www.gravityhelp.com/documentation/article/web-api/ */
            string url = string.Format("https://gcits.com/gravityformsapi/{0}?api_key={1}&signature={2}&expires={3}{4}", route, publicKey, signature, expires, paging);
            client.BaseAddress = new Uri(url);
            client.DefaultRequestHeaders.Accept.Clear();
            client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
            var response = await client.GetAsync(client.BaseAddress);

            string content = await response.Content.ReadAsStringAsync();
            return content;

        }

        public static string GenerateSignature(string publicKey, string privateKey, string method, string route, string expires)
        {
            string stringToSign = string.Format("{0}:{1}:{2}:{3}", publicKey, method, route, expires);
            return Security.Sign(stringToSign, privateKey);
        }


    }

    public class Security
    {

        public static string UrlEncodeTo64(byte[] bytesToEncode)
        {
            string returnValue
                = System.Convert.ToBase64String(bytesToEncode);

            return HttpUtility.UrlEncode(returnValue);
        }

        public static string Sign(string value, string key)
        {
            using (var hmac = new HMACSHA1(Encoding.ASCII.GetBytes(key)))
            {
                return UrlEncodeTo64(hmac.ComputeHash(Encoding.ASCII.GetBytes(value)));
            }
        }

        public static int UtcTimestamp(TimeSpan timeSpanToAdd)
        {
            TimeSpan ts = (DateTime.UtcNow.Add(timeSpanToAdd) - new DateTime(1970, 1, 1, 0, 0, 0));
            int expires_int = (int)ts.TotalSeconds;
            return expires_int;
        }
    }
}
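
For reference, Azure Functions timer triggers use six-field NCRONTAB expressions in the form {second} {minute} {hour} {day} {month} {day-of-week}. The three schedules used in this post are:

0 */5 * * * *    Every 5 minutes (the template default)
0 * * * * *      Every minute (while debugging)
0 0 */4 * * *    Every 4 hours (our published schedule)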

  7. Update the TimerTrigger to "0 * * * * *" while we debug.
  8. Add a reference to System.Web by right clicking on your project and choosing Add, Reference.
  9. Scroll down to System.Web, check the box and click OK.
  10. Next we need to add a connection string for an Azure Storage account to the local.settings.json file. You can retrieve a connection string from an existing storage account via the Azure Portal, or by downloading Azure Storage Explorer from www.storageexplorer.com, signing in and copying the connection string from the properties section at the bottom left. I recommend downloading Storage Explorer anyway, since it’s a handy tool for working with Azure Storage accounts. If you don’t have a storage account, you’ll need to make one.
  11. Once you’ve got the connection string, paste it into the AzureWebJobsStorage value of the local.settings.json file.
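
If you’re not sure where that value lives, a minimal local.settings.json looks something like this (the connection string here is a placeholder, not a real key):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
  }
}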

Get your Gravity Form entries as JSON

To use your form entries in Microsoft Flow, we’ll need to show Flow what they look like in JSON format. To do this, we’ll use Fiddler.

  1. Download Fiddler from www.telerik.com/fiddler, then install and run it.
  2. Fiddler allows you to analyse your computer’s internet traffic, among other things. You might see a lot of activity from irrelevant processes; you can right click these and filter them out.
  3. Once your Fiddler stream is a little less busy, we’ll run the function app. Return to Visual Studio and press F5.
  4. You’ll see the Azure Functions Core Tools window appear. This is a local version of the Azure Functions runtime that allows you to debug Azure Functions on your own computer before you deploy them.
  5. Wait for your Azure function to execute and log that it completed successfully.
  6. Now switch over to Fiddler and locate the call that it just made to your website. If all goes well, you should see a row with a result of 200 to your domain.
  7. Click this row, and choose to decode it on the bottom right.
  8. Select the Raw tab, and triple click the JSON result at the bottom to select it all, then copy it into Notepad for later. This is the JSON representation of your Gravity Forms form entries that we can use in Microsoft Flow.

Create a Microsoft Flow to receive the Gravity Forms entries

  1. Visit flow.microsoft.com and sign in with your Office 365 account.
  2. Create a new Blank flow, give it a name and start it with a Request trigger. Then click Use sample payload to generate schema.
  3. Paste the JSON payload that we saved from Fiddler and click Done.
  4. Add an action step so that we can save the flow and retrieve the HTTP POST URL. In this example I added a Notification action.
  5. Click Create Flow, then copy the URL that was created next to HTTP POST URL.
  6. Switch back over to Visual Studio 2017, paste the HTTP POST URL into the placeholder for the flowRequest string variable, then uncomment the last four lines of the Run method.
  7. Run the function again to confirm that it’s sending the JSON payload to Microsoft Flow. You should see the corresponding call appear as a new row in Fiddler.
  8. Inspecting the call on the top right under the Raw tab shows that it sent the JSON payload.
  9. When you return to Microsoft Flow, you should see a recent successful flow run.
  10. Open the flow run to see that the payload was received by the Request step.

Use Microsoft Flow to add the entries to a SharePoint list

  1. Remove the action below the Request trigger and add an Apply to each step. Add the entries output from the popout menu to the ‘Select an output’ field. Then add a SharePoint – Get items action into the Apply to each step, select or enter your SharePoint site, and choose the SharePoint list you created earlier.
  2. Click Show advanced options and add GFID eq ‘id’ to the Filter Query field, where ‘id’ is the id output from the Request trigger. Be sure to include the single quotes. This step checks the SharePoint list for existing entries with the same Gravity Forms entry ID.
  3. Next add a Condition step, and click Edit in advanced mode. Copy and paste the following into this field:
    @empty(body('Get_items')?['value'])

  4. This checks whether any items were returned from SharePoint that match that Gravity Forms ID. If there weren’t any, we’ll create one.
  5. Your flow should now be structured as a Request trigger, followed by an Apply to each containing the Get items action and the Condition.
  6. In the Yes section, add a SharePoint – Create item action. Select or enter your SharePoint site, and choose the relevant SharePoint list. Refer to your notes on which fields match which field IDs, then drag the form values into the corresponding SharePoint fields.
  7. I also added an Office 365 – Send an email action below this one so I get an extra email notification. You may want to wait until you’ve imported all existing form entries before adding this one. If you’ve got hundreds of entries that haven’t been added to SharePoint, you’ll get hundreds of emails.
  8. Click Update Flow, return to Visual Studio 2017 and run your function app again (press F5).
  9. Once it successfully completes, close the Azure Functions Core Tools window and head back over to Microsoft Flow to watch it process the entries.
  10. Next, visit your SharePoint list. You should now have the data from all website enquiry form entries in a single location. This data can now be used for all sorts of Microsoft Flows and business processes.

Publish your Function App to Azure

To make sure that your form data stays up to date, we need to publish our Function App to Azure.

  1. Switch to Visual Studio, and update the timer on your function to "0 0 */4 * * *" to make sure it doesn’t keep running every minute in the cloud.
  2. Now, right click on your project name and click Publish.
  3. Click Azure Function App, and choose Create New. (If you already have an existing Azure Function App, you can select Existing and specify the function app you’d like to deploy to.)
  4. Since we’re creating a new Azure Function App, we need to specify some details. As mentioned earlier, Function Apps are just Azure Web Apps. To deploy them, we need to create or choose an App Service.
  5. Give your Function App an app name, select your Azure subscription, and choose or create a Resource Group, App Service Plan and Storage Account.
  6. When creating an App Service plan, you can choose from a range of sizes. Pricing varies depending on the underlying VM size; however, the Consumption plan costs pretty much nothing. Choosing Consumption doesn’t give you a dedicated VM, and function runs are limited to 5 minutes’ duration. This applies to all functions within your Function App.
  7. Once you’re happy with your settings, click OK, then Create, and wait for your App Service to deploy.
  8. When it finishes, click Publish. Your function app is now deploying to Azure.
  9. Sign in to https://portal.azure.com to see it in action under Web Apps. By default, your functions are in read-only mode, and you won’t be able to view or edit the underlying C# code.
  10. To keep track of your function’s activity, you can see function runs in the Monitor section.

What is the Office 365 Unified Audit Log?

For security and compliance in Office 365, the Unified Audit Log is probably the most important tool of all. It tracks every user and account action across all of the Office 365 services. You can run reports on deletions, shares, downloads, edits, reads and so on, for all users and all products. You can also set up custom alerting to receive notifications whenever specific activities occur.
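
Once it’s enabled, you can also query it from Exchange Online PowerShell. As a quick sketch (the user and date range are hypothetical):

# Search the Unified Audit Log for one user's activity over the past week.
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
    -UserIds alice@example.com -ResultSize 100 |
    Select-Object CreationDate, UserIds, Operations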

For all of its usefulness, the most amazing thing about it is that it’s not turned on by default.

It can be extremely frustrating to come across a query or problem that could easily be resolved with access to the logs, only to find out they were never enabled in the first place. Here’s how to get it set up in your own organisation, or, if you’re a Microsoft Partner, how to script it for all of your customers using Delegated Administration and PowerShell.

How to enable the Unified Audit Log for a single Office 365 tenant

If you’re only managing your own tenant, it’s quite simple to turn it on. You can do this in two ways.

How to enable the Unified Audit Log via the Security and Compliance Center for a single Office 365 tenant

  1. Visit https://protection.office.com as an Office 365 admin.
  2. Click Search & investigation.
  3. Click Audit log search.
  4. If it’s not enabled, you’ll see a link to Start recording user and admin activities. Click it to enable the Unified Audit Log.

How to enable the Unified Audit Log via PowerShell for a single Office 365 tenant

  1. Connect to Exchange Online via PowerShell as an administrator by following this guide
  2. Make sure your Office 365 tenant is ready for the Unified Audit Log by enabling Organization Customization:
    Enable-OrganizationCustomization
  3. Run the following command to enable the Unified Audit Log:
    Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true
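
To confirm the change took effect, you can read the setting back in the same session (these are the same properties the scripts below report on):

Get-AdminAuditLogConfig | Format-List UnifiedAuditLogIngestionEnabled, UnifiedAuditLogFirstOptInDate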

How to enable the Unified Audit Log on multiple Office 365 tenants using Delegated Administration via PowerShell

I’ve recently written a few posts on running bulk PowerShell operations across all of your customers’ Office 365 tenants.

Since the PowerShell command for enabling the Unified Audit Log is just one line, I assumed we’d be able to add it as a script block and run it across all of our Office 365 customers at once.

When I tried setting this up, it initially appeared to be working, though I soon received the following error:

The remote server returned an error: (401) Unauthorized.

It looks like Microsoft don’t allow you to run this particular script using Delegated Administration, though I’m not too sure why. You also can’t enable it via https://protection.office.com using your delegated admin credentials; it just seems to revert you back to the settings for your own Office 365 tenant.

In order to enable the Unified Audit Log, we’ll need to activate it using an admin within the customer’s Office 365 tenant. The remainder of this blog post contains the instructions on how to script this process.

Disclaimer

Use the following scripts at your own risk. They are designed to temporarily create Global Admins with a standard password (chosen by you) in each of your customers’ environments. If all goes well, every admin that was created should be deleted automatically. If some tenants fail to enable the Unified Audit Log correctly, the new admin for those tenants will remain (I’ve included a script to remove these ones too). Also, see step 3 for a link to a script that reports on every unlicensed Office 365 Company Admin across your customers’ tenants. Use it to verify that none of these temporary admins remain.

This process has three parts:

  1. PowerShell Script One: Checking Unified Audit Log Status and creating admin users
  2. PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins
  3. PowerShell Script Three (And Optional Script): Removing unsuccessful admins and checking tenants for all unlicensed admins.

Things you should know beforehand

For the most part, these scripts work. Using these three scripts, I’ve enabled the Unified Audit Log on 227 of our 260 delegated Office 365 customers. However, there are a few error messages that can pop up, and a few reasons that will prevent it working for some Office 365 tenants at all.

Here are a few things to keep in mind:

  • It doesn’t work with LITEPACK and LITEPACK_P2 subscriptions

    In our case, these are Telstra customers running the older Office 365 Small Business and Office 365 Small Business Premium subscriptions. You can run our Office 365 Delegated Tenant license report to identify these customers.

  • It does not work on customers that don’t have any subscriptions, or only have expired subscriptions.

    It won’t work for Office 365 tenants that don’t have any Office 365 subscriptions, or whose subscriptions have expired. The script will fail for these organisations with the error: The tenant organization isn’t in an Active State. Complete the administrative tasks that are active for this organization, and then try again.

  • It does not work on customers that only have Dynamics CRM licenses

    This script doesn’t seem to run on customers that only have Dynamics CRM Online. It hasn’t been tested with customers that only have Dynamics 365.

  • You should wait before running the second PowerShell Script

    It can take a while for the temporary admin user to receive the appropriate permissions in your customer’s Office 365 organisation. If you run the second script too soon, the temporary admin may not be able to pull down all the Exchange Online cmdlets to perform the required tasks.

PowerShell Script One: Checking Unified Audit Log Status and creating admin users

This script uses your own delegated admin credentials. It creates a list of all of your Office 365 Customers and reports on their subscriptions. If they have at least one subscription (active or not) it attempts to run an Exchange Online cmdlet to check whether the Unified Audit Log is enabled. If it’s enabled, it does nothing and moves onto the next customer. If it’s disabled, it creates a new user, assigns it to the Company Administrator role and adds a row to a CSV with the tenant ID, customer name and user principal name.

To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.

Modify the placeholder variables at the top of the script and run it in PowerShell.

<# This script will connect to all delegated Office 365 tenants and check whether the Unified Audit Log is enabled. If it's not, it will create an Exchange admin user with a standard password. Once it's processed, you'll need to wait a few hours (preferably a day), then run the second script. The second script connects to your customers' Office 365 tenants via the new admin users and enables the Unified Audit Log ingestion. If successful, the second script will also remove the admin users created in this script. #>

#-------------------------------------------------------------

# Here are some things you can modify:

# This is your partner admin user name that has delegated administration permission

$UserName = "[email protected]"

# IMPORTANT: This is the default password for the temporary admin users. Don't leave this as Password123, create a strong password between 8 and 16 characters containing Lowercase letters, Uppercase letters, Numbers and Symbols.

$NewAdminPassword = "Password123"

# IMPORTANT: This is the default User Principal Name prefix for the temporary admin users. Don't leave this as gcitsauditadmin, create something UNIQUE that DOESN'T EXIST in any of your tenants already. If it exists, it'll be turned into an admin and then deleted.

$NewAdminUserPrefix = "gcitsauditadmin"

# This is the path for the exported CSVs. You can change this, though you'll need to make sure the path exists. This location is also referenced in the second script, so I recommend keeping it the same.

$CreatedAdminsCsv = "C:\temp\CreatedAdmins.csv"

$UALCustomersCsv = "C:\temp\UALCustomerStatus.csv"

# Here's the end of the things you can modify.

#-------------------------------------------------------------

# This script block gets the Audit Log config settings

$ScriptBlock = {Get-AdminAuditLogConfig}

$Cred = get-credential -Credential $UserName

# Connect to Azure Active Directory via Powershell

Connect-MsolService -Credential $cred

$Customers = Get-MsolPartnerContract -All

$CompanyInfo = Get-MsolCompanyInformation

Write-Host "Found $($Customers.Count) customers for $($CompanyInfo.DisplayName)"

Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "

foreach ($Customer in $Customers) {

	Write-Host $Customer.Name.ToUpper()
	Write-Host " "

	# Get license report

	Write-Host "Getting license report:"

	$CustomerLicenses = Get-MsolAccountSku -TenantId $Customer.TenantId

	foreach($CustomerLicense in $CustomerLicenses) {

		Write-Host "$($Customer.Name) is reporting $($CustomerLicense.SkuPartNumber) with $($CustomerLicense.ActiveUnits) Active Units. They've assigned $($CustomerLicense.ConsumedUnits) of them."

	}

	if($CustomerLicenses.Count -gt 0){

		Write-Host " "

		# Get the initial domain for the customer.

		$InitialDomain = Get-MsolDomain -TenantId $Customer.TenantId | Where {$_.IsInitial -eq $true}

		# Construct the Exchange Online URL with the DelegatedOrg parameter.

		$DelegatedOrgURL = "https://ps.outlook.com/powershell-liveid?DelegatedOrg=" + $InitialDomain.Name

		Write-Host "Getting UAL setting for $($InitialDomain.Name)"

		# Invoke-Command establishes a Windows PowerShell session based on the URL,
		# runs the command, and closes the Windows PowerShell session.

		$AuditLogConfig = Invoke-Command -ConnectionUri $DelegatedOrgURL -Credential $Cred -Authentication Basic -ConfigurationName Microsoft.Exchange -AllowRedirection -ScriptBlock $ScriptBlock -HideComputerName

		Write-Host " "
		Write-Host "Audit Log Ingestion Enabled:"
		Write-Host $AuditLogConfig.UnifiedAuditLogIngestionEnabled

		# Check whether the Unified Audit Log is already enabled and log status in a CSV.

		if ($AuditLogConfig.UnifiedAuditLogIngestionEnabled) {

			$UALCustomerExport = @{

				TenantId = $Customer.TenantId
				CompanyName = $Customer.Name
				DefaultDomainName = $Customer.DefaultDomainName
				UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
				UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
				DistinguishedName = $AuditLogConfig.DistinguishedName
			}

			$UALCustomersExport = @()

			$UALCustomersExport += New-Object psobject -Property $UALCustomerExport

			$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -notypeinformation -Path $UALCustomersCSV -Append

		}

		# If the Unified Audit Log isn't enabled, log the status and create the admin user.

		if (!$AuditLogConfig.UnifiedAuditLogIngestionEnabled) {

			$UALDisabledCustomers += $Customer

			$UALCustomersExport = @()

			$UALCustomerExport = @{

				TenantId = $Customer.TenantId
				CompanyName = $Customer.Name
				DefaultDomainName = $Customer.DefaultDomainName
				UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
				UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
				DistinguishedName = $AuditLogConfig.DistinguishedName
			}

			$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
			$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -notypeinformation -Path $UALCustomersCSV -Append


			# Build the User Principal Name for the new admin user

			$NewAdminUPN = -join($NewAdminUserPrefix,"@",$($InitialDomain.Name))

			Write-Host " "
			Write-Host "Audit Log isn't enabled for $($Customer.Name). Creating a user with UPN: $NewAdminUPN, assigning user to Company Administrators role."
			Write-Host "Adding $($Customer.Name) to CSV to enable UAL in second script."


			$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force
			$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)

			New-MsolUser -TenantId $Customer.TenantId -DisplayName "Audit Admin" -UserPrincipalName $NewAdminUPN -Password $NewAdminPassword -ForceChangePassword $false

			Add-MsolRoleMember -TenantId $Customer.TenantId -RoleName "Company Administrator" -RoleMemberEmailAddress $NewAdminUPN
	
			$AdminProperties = @{
				TenantId = $Customer.TenantId
				CompanyName = $Customer.Name
				DefaultDomainName = $Customer.DefaultDomainName
				UserPrincipalName = $NewAdminUPN
				Action = "ADDED"
			}

			$CreatedAdmins = @()
			$CreatedAdmins += New-Object psobject -Property $AdminProperties

			$CreatedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $CreatedAdminsCsv -Append

			Write-Host " "

		}

	}

Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "

}

Write-Host "Admin Creation Completed for tenants without Unified Audit Logging, please wait 12 hours before running the second script."


Write-Host " "

See the Unified Audit Log status for your customers

One of the outputs of this script is the UALCustomerStatus.csv file. You can make a copy of this, and rerun the process at the end to compare the results.

Browse the list of created admins

The script will also create a CSV containing the details for each admin created. This CSV will be imported by the second PowerShell Script and will be used to enable the Unified Audit Log on each tenant.

PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins

This script should be run at least a few hours after the first script to ensure that the admin permissions have had time to correctly apply. If you don’t wait long enough, your admin user may not have access to the required Exchange Online cmdlets.

You’ll need to update the password in this script to reflect the password you chose for your temporary admins in the first script.

To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.

Modify the placeholder variables at the top of the script and run it in PowerShell.

<# This script will use the admin users created by the first script to enable the Unified Audit Log in each tenant. If enabling the Unified Audit Log is successful, it'll remove the created admin. If it's not successful, it'll keep the admin in place and add it to another CSV. You can retry these tenants by modifying the $Customers value to import the RemainingAdminsCsv in the next run. #>

#-------------------------------------------------------------

# Here are some things you can modify:

# This is your partner admin user name that has delegated administration permission

$UserName = "[email protected]"

# IMPORTANT: This is the default password for the temporary admin users. Use the same password that you specified in the first script.

$NewAdminPassword = "Password123"

# This is the CSV containing the details of the created admins generated by the first script. If you changed the path in the first script, you'll need to change it here.

$Customers = import-csv "C:\temp\CreatedAdmins.csv"

# This CSV will contain a list of all admins removed by this script.

$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"

# This CSV will contain a list of all unsuccessful admins left unchanged by this script. Use it to retry this script without having to start again.

$RemainingAdminsCsv = "C:\temp\RemainingAdmins.csv"

#-------------------------------------------------------------

$Cred = get-credential -Credential $UserName

# Connect to Azure Active Directory via PowerShell (required for the Remove-MsolUser calls below).
Connect-MsolService -Credential $Cred

foreach ($Customer in $Customers) {

	Write-Host $Customer.CompanyName.ToUpper()
	Write-Host " "


	$NewAdminUPN = $Customer.UserPrincipalName

	$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force

	$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)

	Write-Host " "

	Write-Output "Getting the Exchange Online cmdlets as $NewAdminUPN"

	$Session = New-PSSession -ConnectionUri https://outlook.office365.com/powershell-liveid/ `
	-ConfigurationName Microsoft.Exchange -Credential $NewAdminCreds `
	-Authentication Basic -AllowRedirection
	Import-PSSession $Session -AllowClobber

	# Enable the customization of the Exchange Organisation
	
	Enable-OrganizationCustomization

	# Enable the Unified Audit Log

	Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true

	# Find out whether it worked

	$AuditLogConfigResult = Get-AdminAuditLogConfig

	Remove-PSSession $Session

	# If it worked, remove the Admin and add the removed admin details to a CSV

	if($AuditLogConfigResult.UnifiedAuditLogIngestionEnabled){

		# Remove the temporary admin
		Write-Host "Removing the temporary Admin"

		Remove-MsolUser -TenantId $Customer.TenantId -UserPrincipalName $NewAdminUPN -Force

		$AdminProperties = @{
			TenantId = $Customer.TenantId
			CompanyName = $Customer.CompanyName
			DefaultDomainName = $Customer.DefaultDomainName
			UserPrincipalName = $NewAdminUPN
			Action = "REMOVED"
		}

		$RemovedAdmins = @()
		$RemovedAdmins += New-Object psobject -Property $AdminProperties
		$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemovedAdminsCsv -Append
	
	}

	# If it didn't work, keep the Admin and add the admin details to another CSV. You can use the RemainingAdmins CSV if you'd like to try again.

	if(!$AuditLogConfigResult.UnifiedAuditLogIngestionEnabled){

		Write-Host "Enabling Audit Log Failed, keeping the temporary Admin"

		$AdminProperties = @{
			TenantId = $Customer.TenantId
			CompanyName = $Customer.CompanyName
			DefaultDomainName = $Customer.DefaultDomainName
			UserPrincipalName = $NewAdminUPN
			Action = "UNCHANGED"
		}

		$RemainingAdmins = @()
		$RemainingAdmins += New-Object psobject -Property $AdminProperties
		$RemainingAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemainingAdminsCsv -Append


	}

	Write-Host " "
	Write-Host "----------------------------------------------------------"
	Write-Host " "

}

View the successful Office 365 admins that were removed

If the Unified Audit Log was enabled successfully, the newly created Office 365 admin will be automatically removed. You can see the results of this in the RemovedAdmins CSV.

See the remaining Office 365 admins that couldn’t enable the Unified Audit Log

If the Unified Audit Log couldn’t be enabled, the Office 365 admin will remain unchanged. If you like, you can use the RemainingAdmins CSV in place of the CreatedAdmins CSV and rerun the second script. In our case, some tenants that couldn’t be enabled on the first try were able to be enabled on the second and third tries.

PowerShell Script Three: Removing unsuccessful admins

Any tenants that weren’t able to have their Unified Audit Log enabled via PowerShell will still have the Office 365 admin active. This script will import these admins from the RemainingAdminsCsv and remove them.

Once removed, it will add them to the RemovedAdmins CSV. You can compare this to the CreatedAdmins CSV from the first script to make sure they’re all gone.

<# This script removes the admin users that couldn't be used to enable the Unified Audit Log in the second script. It imports the remaining admins from the RemainingAdmins CSV, deletes each one, and logs the removal to the RemovedAdmins CSV. #>

#-------------------------------------------------------------

# Here are some things you can modify:

# This is your partner admin user name that has delegated administration permission

$UserName = "[email protected]"

# This CSV contains a list of all remaining unsuccessful admins left unchanged by the second script.

$RemainingAdmins = import-csv "C:\temp\RemainingAdmins.csv"

# This CSV will contain a list of all admins removed by this script.

$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"

#-------------------------------------------------------------

$Cred = get-credential -Credential $UserName

Connect-MsolService -Credential $cred

ForEach ($Admin in $RemainingAdmins) {

	$tenantID = $Admin.Tenantid

	$upn = $Admin.UserPrincipalName

	Write-Output "Deleting user: $upn"

	Remove-MsolUser -UserPrincipalName $upn -TenantId $tenantID -Force


	$AdminProperties = @{
		TenantId = $tenantID
		CompanyName = $Admin.CompanyName
		DefaultDomainName = $Admin.DefaultDomainName
		UserPrincipalName = $upn
		Action = "REMOVED"
	}

	$RemovedAdmins = @()
	$RemovedAdmins += New-Object psobject -Property $AdminProperties
	$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemovedAdminsCsv -Append


}

Want to see all the current Office 365 global administrators in your customers’ tenants?

To confirm that all of the created admins from these scripts have been removed, or just to see which global administrators have access to your customer tenants, you can run the scripts here. If required, there’s a second script that will block the credentials of the admins that you leave in the exported CSV.

I ran a small migration on Friday last week that took a lot longer than planned due to a few unexpected issues. The customer was migrating from Telstra’s Office 365 service (AKA syndicated account) to a Microsoft Office 365 account licensed via the CSP model.

This is a pretty standard migration that we run often, though this time I ran into a number of issues.

Since it was a small migration (4 users), and it needed to be moved over in a short amount of time, I opted to do a standard PST migration on an Azure VM. The plan was as follows:

  1. Disconnect the domain from the existing tenant
  2. Add it to the new tenant by moving it to new name servers and verifying the DNS
  3. Create the required users on the new tenant
  4. Set up the Exchange accounts for the old users via their .onmicrosoft.com addresses and export the mail via PSTs
  5. Set up the Exchange accounts for the new users and import the mail

Here are a few important details regarding the source tenant:

  • Office 365 licensed via Telstra on a syndicated account
  • Domain DNS hosted on Microsoft’s name servers

Here are some important details regarding the destination tenant:

  • Office 365 licensed via CSP model
  • Domain DNS hosted on external name servers

Problem #1: Cannot create a user with the same UPN as before

My first issue occurred when I started to set up the users on the new tenant. Three out of four worked correctly, though the fourth mailbox (unfortunately the most important one) would not create correctly on the new tenant.

I was given the error OrgIdMailboxRecentlyCreatedException and the message:

Hang on, we’re not quite ready

It looks like your account [email protected] was created 1 hour ago. It can take up to 24 hours to set up a mailbox.

I did some testing and discovered that the error message only appeared when I set the User Principal Name to match what it was on the old tenant ([email protected]). If I set it to a variation of the old name ([email protected]), it worked instantly.

I suspected this had something to do with how Office 365 maps usernames to mailboxes internally, and decided to give it a bit of time.

In the meantime, the customer needed to be able to send and receive as their previous email address. So I logged onto Exchange Online via PowerShell and ran the following cmdlets:

Set-Mailbox paulm -WindowsEmailAddress [email protected]
Set-Mailbox paulm -EmailAddresses [email protected], [email protected]

These cmdlets keep the user principal name as [email protected], while allowing the user to send and receive mail as [email protected].
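
With hypothetical names, the pattern looks like this (the uppercase SMTP: prefix marks the primary reply address):

# Keep the .onmicrosoft.com UPN while making the custom domain the primary SMTP address.
Set-Mailbox paulm -WindowsEmailAddress paul@customdomain.com
Set-Mailbox paulm -EmailAddresses "SMTP:paul@customdomain.com","smtp:paulm@newtenant.onmicrosoft.com"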

Once this was confirmed working, I set it up on the user’s Outlook profile and kicked off the migration.

I left this running overnight, and in the morning faced a new problem.

Problem #2: Cannot send to BigPond addresses due to SPF issue

The next issue was related to recipients using Telstra BigPond addresses. Emails sent to other external domains worked fine, though emails sent to BigPond accounts would fail instantly. They’d return a message stating that the message was rejected by the recipient email server.

The error message was <[email protected]> Sender rejected. IB506

The SPF headers in the returned email stated that there was no SPF record configured for the domain – that it “does not designate permitted sender hosts”.

I double checked the SPF records on the new name servers and they were configured correctly. I also sent test emails to my own external addresses, and the SPF headers were fine too.
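
For reference, a domain that sends mail only via Office 365 would typically publish an SPF TXT record like this:

v=spf1 include:spf.protection.outlook.com -all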

I’m not 100% sure what the cause of the issue was, though I suspect it was due to the fact that Microsoft used to manage the DNS for this domain via their internal Office 365 name servers. If that change takes a while to propagate throughout the system, Office 365 may still have been looking at the internal name servers for records relating to this domain.

Solution: Switch Name Servers back to Microsoft

To resolve this issue, I transferred management of the DNS records back to Microsoft’s Office 365 name servers, and within about half an hour, all of the issues were resolved.

I was able to rename the user to the correct email ([email protected]) and users no longer had issues mailing BigPond accounts.

Since I was trying everything to resolve this issue, I’m not sure what the ultimate fix was, though it seems that switching the name servers back to Microsoft on the new tenant did the most good.

I suspect that this was a DNS propagation problem within Office 365, and may have been resolved in time anyway. If you’re currently experiencing it, moving the DNS to Microsoft’s Office 365 name servers may speed up the resolution.

Apple has (very slightly) changed the way we add Office 365 email accounts to iOS 10, so we’ve taken the opportunity to update our most popular how-to video. We’re also giving a big plug for Outlook for iOS. Microsoft have done a great job with this app, and we think it’s a must-have for Office 365 users. See how you can get your email under control in our 3-minute video:

You can customise your Office 365 login screens via a service called Azure Active Directory (or Azure AD).

Microsoft Azure continues to transition to the new portal at portal.azure.com, and Azure AD is one of the last services to make the leap. Now that it’s in Preview on the new portal, we’ve made an updated video on how to easily brand your Office 365 login screens.

 

As Office 365 evolves, we need to refresh our training materials. So here’s our updated video tutorial on how to install Office from Office 365.

A common requirement for our customers is to forward emails to SharePoint Online lists. This email data usually comes from website forms or enquiry pages, though there’s no out-of-the-box way to extract the form data from an email and upload it to separate columns in a SharePoint list.

Previously I was using Cloud2050 Email Sync, though it relied on software installed on a PC, and only worked while that PC was running and Outlook was open.

Here’s a solution that operates completely in the cloud using Outlook Rules, MailParser.io and Microsoft Azure Logic Apps.

The solution looks like this:

  1. Office 365 forwards email from your website’s form to your mailparser.io address via an Outlook Rule or Exchange Transport Rule.
  2. MailParser.io receives the email, extracts the form data and sends it to an Azure logic app using a Generic HTTP Webhook.
  3. Your Azure Logic App receives the form data, connects to SharePoint Online and adds the form data into the appropriate SharePoint list columns.

Prerequisites:

  • Sign up for MailParser.io – a free 30 day trial is available
  • Sign up for Microsoft Azure – use your Office 365 account, a free 30 day trial is available
  • A SharePoint List set up with the fields required for your form

Setting up MailParser

  1. Once you’ve signed up for mailparser.io, sign in and click Create New Inbox.
  2. Give it a name and add some notes.
  3. You’ll be given an email address to forward your form emails to. Keep track of this address, as you’ll need it to receive the emails you send from Outlook or Exchange mail rules. Forward a couple of sample form emails to the address to get started.
  4. Once your emails are received, you can set up your parsing rules.
  5. Usually, mailparser will be able to automatically identify the field names and values from your forwarded email. If it doesn’t, click Try Something Else to give it some help; otherwise click OK, start with this.
  6. Now we start setting up our Generic Webhook. Click Webhook Integrations on the left menu, then click Add New Integration.
  7. Click Generic Webhook.
  8. Give it a descriptive name and type a sample URL (I used http://google.com) into the Target URL field. We need to use a sample first so that we can copy the webhook’s JSON payload. We then use this JSON payload to help generate the actual Target URL from Azure Logic Apps in the next steps.
  9. Next, click Save and test.
  10. Then Send test data. We expect this to fail, though it will give us the JSON payload.
  11. Copy the text from Body Payload into Notepad or Visual Studio Code.

Set up the Azure Logic App

  1. Log onto Azure at portal.azure.com. If you don’t already have a subscription, you can sign up using your Office 365 account.
  2. Click New, search for Logic App, and click Logic App.
  3. Click Create.
  4. Complete the fields, placing the Azure Logic App in the region of your choice. You can name the resource group whatever you like, or use an existing one. Click Create.
  5. Click Edit to start editing your Logic App.
  6. Search for Request and click the Request trigger.
  7. Now you can use your copied JSON Body Payload from MailParser.io as a reference for your Request Body JSON Schema. You’ll need to define the data type for each key-value pair in your JSON payload. This allows you to use the separate fields in your Azure Logic App, and add the field data into the appropriate SharePoint columns. The syntax of the Request Body JSON Schema is as follows:
{
    "type": "object",
    "properties": {
        "name": {
            "type": "string"
        },
        "email": {
            "type": "string"
        }
    },
    "required": ["name", "email"]
}

You can use Visual Studio Code, Notepad++ or Notepad to edit this schema so that it describes your JSON Payload.

Replace the properties values with the names of the keys in your JSON payload. Not all fields need to be added to the required array – only the ones that you need to create a valid SharePoint list entry.

In my case, the Body Payload copied from MailParser maps to a schema that follows this same pattern.

  8. Paste the schema into the Request Body JSON Schema field and click Save.
  9. You will then receive the URL that you can use in Mailparser.io to send your requests.
  10. Next click + New step.
  11. Type SharePoint and click SharePoint – Create item.
  12. You may need to add a connection to SharePoint Online. If you’re prompted, add a connection using an Office 365 account that has permission to write to the required SharePoint list. If you don’t have a SharePoint list available to accept the data, you’ll need to set one up now before proceeding.
  13. Next enter your site URL. The List Name drop down will be populated with the available lists. You should also see that the outputs from the Request step are available to use.
  14. The list columns that can accept strings, as well as a few other column types, will be available for you to modify. Click in each relevant column and select the relevant output.
  15. Once you’re finished, go back to the Request step in your Logic App and copy its URL.
  16. Return to MailParser.io, go back to Webhook Integrations, and click Edit.
  17. Paste the URL from your Logic App Request step into the Target URL.
  18. Click Save and test.
  19. Click Send test data.
  20. You should receive a response code of 202 to confirm it was sent successfully.
  21. You can now check Azure Logic Apps to confirm that it ran correctly.
  22. You should also see the new entry in your SharePoint Online list.

Setting up the Outlook Rule

Once you’ve confirmed it’s working, you can set up your mail rules in Outlook or via Exchange to automatically forward emails to your mailparser.io email address. The steps below cover an Outlook rule; a transport rule sketch follows them.

  1. Right click on an email sent via your web form. Click Rules, then Create rule.
  2. Choose a condition that matches all emails sent via your form, eg. Subject. Then click Advanced Options.
  3. Click Next.
  4. Tick forward it to people or public group, then click people or public group.
  5. Enter the email address from Mailparser.io, click OK, then click Next twice.
  6. Turn on the rule, and choose whether you want to run it on mail already in the same folder.
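
Alternatively, if you’d prefer a server-side rule that doesn’t depend on a mailbox, an Exchange Online transport rule can do the forwarding. A sketch (the subject filter and parser address are hypothetical):

# BCC every website enquiry email to the MailParser inbox address.
New-TransportRule -Name "Forward web enquiries to MailParser" `
    -SubjectContainsWords "Website Enquiry" `
    -BlindCopyTo "abc123@mailparser.io"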

And that’s it. From now on, any mail sent by your website’s form will be automatically forwarded into mailparser.io, broken up into the relevant fields, and added to SharePoint Online. You can also use Azure Logic Apps to automate a bunch of other business processes. Check out the documentation here.

Similar services to Azure Logic Apps include Microsoft Flow, Zapier and IFTTT.

Data Location

Microsoft has delivered Office 365 from their Australian datacenters since the end of May 2015.

It was a big deal at the time, and it’s still a major selling point for their cloud platform, especially amongst businesses that have strict data residency requirements.

If you’ve purchased Office 365 since May 31, 2015 with an Australian billing address, you’ll be accessing your services from the Australian datacenters already. If you purchased it before then, some of your services might have moved automatically, though some may still be delivered from the Asia Pacific region.

If you’d like to move, be quick – the option is only available until October 31, 2016.

How to request a move to Australia’s datacenters

To make sure your organisation’s data is being hosted in Australia, or to request a move, follow these instructions.

  1. Log into the Office 365 Admin portal as a Global Administrator.
  2. Click Settings, then Organization Profile.
  3. See your current Data location.
  4. To move your data, click Edit under Data residency option.
  5. Click the switch to Yes, then click Save.
  6. Within 12 months from October 31, 2016, your data will be migrated to the Australian datacenters. You will be notified once it’s complete.

Since it’s a complex operation, no exact date for your migration can be given. See this link for more info: https://msdn.microsoft.com/en-us/library/dn878163.aspx

We have a couple of customers that want to maintain distribution groups of external contacts that can be used company wide.

The way to do this as an Exchange admin is to create a Mail Contact for an external user first, and then add that mail contact to a distribution group. This can be quite an involved process, and you may not want to have users traversing the Exchange Admin Center to complete this sort of task.
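
Under the hood, those two steps map to a pair of Exchange Online cmdlets, which is essentially what the script below automates. A sketch with hypothetical names:

# Create a mail contact for the external user, then add it to a distribution group.
New-MailContact -Name "Jane External" -ExternalEmailAddress "jane@partner-example.com"
Add-DistributionGroupMember -Identity "All Partners" -Member "jane@partner-example.com"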

To make this easier, we’ve put together a PowerShell script that you can download here.

Assign the minimum permissions

Any Global Administrator will be able to run this PowerShell script, though if you want to give another user the ability to execute the commands, you’ll need to assign them to the appropriate role groups. These are Recipient Management and Organization Management. Keep in mind that even though these are the minimum permissions required to run this PowerShell script, they still enable the relevant user to do pretty much everything within Exchange. For a full list of the permissions granted, see Microsoft’s documentation for these role groups.

To give a user the correct permissions, connect to Exchange Online via PowerShell as a global administrator and run the following commands. Replace [email protected] with the identity of the relevant user.

Add-RoleGroupMember "Recipient Management" -Member [email protected]
Add-RoleGroupMember "Organization Management" -Member [email protected]

Running the PowerShell Script

Once the user has been granted access, they can run the PowerShell script under their own credentials.

  1. Download the script here.
  2. Rename it with a file extension of .ps1, eg. DistributionGroups.ps1.
  3. Run the script by right clicking the file and choosing Run with PowerShell.
  4. Press 1, then Enter to connect to Exchange Online. Press Enter again once the commandlets have downloaded.
  5. Follow the menu items within the PowerShell script to perform the following actions:
  • Add Mail Contacts to distribution groups
  • Get a list of distribution groups
  • Create a distribution group
  • Get a list of distribution group members
  • Remove a contact from all distribution groups