This update will bring extra document management capabilities from SharePoint into Microsoft Teams.
The current Microsoft Teams files experience
The document storage and collaboration functionality in Microsoft Teams is built on SharePoint. Every Microsoft Team is also an Office 365 Group, and each team has a group-connected SharePoint site which stores all the files shared amongst the team.
You can already reach this site from the files tab of your Microsoft Teams channels; however, the experience within Teams is a bit limited.
An updated Document Library experience in Microsoft Teams
This update brings the full functionality of a SharePoint Document Library into Microsoft Teams, with the ability to add and manage custom columns, sort and filter files with custom views, trigger workflows, and much more.
Sync files from Microsoft Teams with your PC or Mac
This is the standout feature in this update. The ability to sync files with a PC or Mac will be available from within Microsoft Teams. At Ignite this year, Microsoft demonstrated the new interface during the Content Collaboration in the Modern Workplace (BRK2451) session.
This screen capture demonstrates custom columns, views and formatting, as well as the new sync button within Microsoft Teams.
Office 365 Advanced Threat Protection and Office 365 Threat Intelligence logs can now be integrated into your SIEM solution.
Threats discovered by these services can be made available on the audit.general workload of the Office 365 Management APIs.
What are the Office 365 Management APIs?
The Office 365 Management APIs are essentially the API version of the Office 365 Unified Audit Log.
To get your Office 365 ATP info into your SIEM, you’ll need to have the Unified Audit Log enabled for your tenant. Unfortunately, it’s not enabled by default.
How to enable the Office 365 Unified Audit Log
The Office 365 Unified Audit Log is an important and useful tool which can help you secure your Microsoft Cloud environment. If you’re a Microsoft Partner, we have a longer article on enabling this for your customers’ tenants here, but to enable it for a single tenant, you have two options.
Enable the Office 365 Unified Audit Log via the Security and Compliance Center
You can log into the Security and Compliance Center at protection.office.com as a global or security administrator.
You’ll find the setting under Search and Investigation, Audit Log Search.
If the audit log isn’t enabled, click Start recording user and admin activities
Enable the Office 365 Unified Audit Log via PowerShell
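If you prefer PowerShell, here's a minimal sketch that does the same thing. It assumes you can open an Exchange Online remote PowerShell session as a global admin:
# Connect to Exchange Online remote PowerShell as a global admin
$Cred = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Cred -Authentication Basic -AllowRedirection
Import-PSSession $Session -AllowClobber
# Turn on Unified Audit Log ingestion for the tenant
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true
# Confirm the change, then clean up the session
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled
Remove-PSSession $Session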
Connect your SIEM to the Office 365 Management APIs
Once the audit log is enabled, threats discovered by Office 365 ATP and Threat Intelligence will be available on the audit.general endpoint of the Office 365 Management API. For more information on setting this up, see the official Microsoft documentation here.
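If you're building the integration yourself rather than using a SIEM vendor's connector, the flow is roughly: start a subscription to the Audit.General content type, then periodically list and download the available content blobs. Here's a rough PowerShell sketch; the tenant ID and bearer token are placeholders, and the token would normally come from an Azure AD app registration that has been granted access to the Office 365 Management APIs:
# Placeholders - replace with your tenant ID and an OAuth 2.0 access token issued for https://manage.office.com
$tenantId = "<tenant-guid>"
$headers = @{ Authorization = "Bearer <access-token>" }
# Start a subscription to the Audit.General content type (where ATP and Threat Intelligence events surface)
Invoke-RestMethod -Method Post -Headers $headers -Uri "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/start?contentType=Audit.General"
# List the available content blobs, then download each blob of audit events
$blobs = Invoke-RestMethod -Method Get -Headers $headers -Uri "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/content?contentType=Audit.General"
foreach ($blob in $blobs) {
    Invoke-RestMethod -Method Get -Headers $headers -Uri $blob.contentUri
}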
You’ve been able to open shared calendars in Outlook for iOS and Outlook for Android for a little while now, however this update makes it a lot easier.
How did Shared Calendars on Outlook for Mobile previously work?
The person who owned the calendar would send you a sharing invite
You accept the invite from within the Outlook mobile app
The shared calendar is added to your phone.
With this update to Outlook for iOS, you can now open calendars that are already shared with you.
How to open a shared calendar in Outlook for iOS
Switch to your calendars in Outlook for iOS
Open the left menu
Tap the add calendar button
Tap Add Shared Calendars
Search for the person or group whose calendar you already have permission to access, then tap the add button next to their name
The calendar will appear in your list
Can you open Shared Calendars on Outlook for Android too?
Yep, this feature is also available for Outlook for Android.
Some companies will block access to Outlook on the web entirely because they don’t want users to be able to download their company data externally. This new feature strikes a middle ground, so users can still access Outlook on the web, but admins can use conditional access to restrict downloads from Outlook on the web on personal or unmanaged devices.
What is Conditional Access?
Conditional access lets you define different security measures that take effect depending on how users are trying to access your company data. For example, a sign-in that Azure Active Directory considers risky might prompt for MFA, while a sign-in from inside your company network on a trusted device won't. An unmanaged or non-compliant device might not be able to access certain apps, while compliant devices can.
How to set up Conditional Access for Outlook on the web
Add the policy via Azure Active Directory Conditional Access
In this example, we are setting up a conditional access policy for non-compliant devices which prevents users from being able to download attachments via the browser.
Valid values for the -ConditionalAccessPolicy parameter are:
Off: No conditional access policy is applied to Outlook on the web. This is the default value.
ReadOnly: Users can’t download attachments to their local computer, and can’t enable Offline Mode on non-compliant computers. They can still view attachments in the browser.
ReadOnlyPlusAttachmentsBlocked: All restrictions from ReadOnly apply, but users can’t view attachments in the browser.
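The -ConditionalAccessPolicy parameter belongs to the Exchange Online Set-OwaMailboxPolicy cmdlet. As a minimal sketch, assuming your users are on the default OWA mailbox policy (adjust the policy name if you use a custom one):
# Apply the read-only restriction to the OWA mailbox policy assigned to the targeted users
Set-OwaMailboxPolicy -Identity "OwaMailboxPolicy-Default" -ConditionalAccessPolicy ReadOnly
# Check the current value
Get-OwaMailboxPolicy -Identity "OwaMailboxPolicy-Default" | Select-Object ConditionalAccessPolicy
Remember that the Azure AD conditional access policy also needs to use app-enforced restrictions for Exchange Online so that this OWA restriction only kicks in on the devices you targeted.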
Wait a few hours for the policy to apply. Once it takes effect, the previously selected users on non-compliant devices will not be able to download attachments via Outlook on the web.
What is the user experience?
The ReadOnly policy will ensure that users on non-compliant devices can’t download email attachments through Outlook on the web to their local device. They can only access them via the file viewers in the browser.
If you use the ReadOnlyPlusAttachmentsBlocked value, users will not be able to access attachments via the browser at all.
What license do I need for Conditional Access for Outlook on the web?
Conditional Access requires a subscription with Azure AD P1 or P2.
A compromised administrator account or an admin becoming a disgruntled ex-employee is a source of serious risk to a business. This is because traditionally admins can do whatever they want, whenever they want. To address this issue, Microsoft have developed Privileged Access Management.
What is Privileged Access Management?
Privileged Access Management works on the principle of zero standing access. That means that admins don’t have the ability to perform potentially damaging actions all of the time.
When they need to perform a task that may expose sensitive data or has potential to cause a lot of damage, they will be given just enough access to complete the task. And even then, only for a specific time and only following an audited approval process.
You can define which tasks require a privileged access request via the admin portal.
When admins want to perform one of these tasks, they can raise their requests for access via the portal or via PowerShell.
A sample PowerShell request to perform a task requiring privileged access approval looks like this:
New-ElevatedAccessRequest -Task 'Exchange\New-JournalRule' -Reason 'Setting Journal per request.' -DurationHours 4
Requests can be automatically or manually approved, and requestors are notified of the approval outcome via email. All privileged access requests and approval process information is recorded for internal reviews and auditors.
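On the approval side, Exchange Online PowerShell exposes matching cmdlets. A rough sketch, assuming the Get-ElevatedAccessRequest and Approve-ElevatedAccessRequest cmdlets and their -RequestId and -Comment parameters as documented by Microsoft:
# List privileged access requests so you can find the one awaiting approval
Get-ElevatedAccessRequest
# Approve a specific request; the request ID placeholder comes from the output above
$requestId = "<request-id>"
Approve-ElevatedAccessRequest -RequestId $requestId -Comment 'Journal rule change approved.'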
Privileged Access Management License requirements
Privileged access management requires Microsoft 365 E5, Office 365 E5 or the standalone Advanced Compliance SKU.
The popular Encrypt-Only policy for Office 365 Message Encryption can now be enabled automatically as part of a DLP (Data Loss Prevention) policy.
What is the Office 365 Encrypt-Only policy?
The Encrypt-only policy is useful because it encrypts the message and prevents it from being intercepted or scanned by other mail systems. To read the messages, recipients need to sign in via a Microsoft, Google, Yahoo or Office 365 account. If they don’t have any of those accounts, they can request a one time password to access and read the email.
It’s called Encrypt-only because other encryption options in Office 365 also enforce policies that prevent a message from being forwarded or printed. The Encrypt-Only policy just encrypts the message and prevents it from being accessed by anyone who shouldn’t.
Enabling Encrypt-Only via a DLP policy
If you are using Office 365 Message Encryption already, you can set up a DLP policy that will enable Encrypt-Only on email messages that match a certain DLP trigger. These policies are configurable in the Security and Compliance Center at https://protection.office.com.
Here is a policy that is set to trigger on emails containing Australian Financial Information:
The action for this policy is to apply the Encrypt-only message encryption policy:
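If you would rather script the policy than click through the portal, the same setup can be sketched in Security and Compliance Center PowerShell. Treat this as an assumption-laden sketch: it assumes the New-DlpCompliancePolicy and New-DlpComplianceRule cmdlets, the built-in Australian sensitive information type name, and "Encrypt" as the name of the encrypt-only template, so verify those names in your own tenant first:
# Create a DLP policy scoped to Exchange email (assumed cmdlet usage, verify in your tenant)
New-DlpCompliancePolicy -Name "Australian Financial Data - Encrypt" -ExchangeLocation All
# Add a rule that triggers on Australian financial information and applies encrypt-only protection
# ("Encrypt" is assumed to be the encrypt-only template name)
New-DlpComplianceRule -Name "Encrypt AU financial emails" -Policy "Australian Financial Data - Encrypt" -ContentContainsSensitiveInformation @{Name = "Australia Bank Account Number"} -EncryptRMSTemplate "Encrypt"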
This feature is available now for organisations with Microsoft 365 E3 and E5, Office 365 E3 and E5 or as part of the standalone Azure Information Protection SKUs.
So this is my first video post about a Microsoft 365 roadmap update.
If you follow me on LinkedIn, you might have noticed I’ve been doing a bunch of different updates lately for the Microsoft 365 roadmap. I do this because it’s fun to see all the things that are changing and being added to the platform.
The way that those updates work is that I have an Azure function checking the roadmap API every few hours and comparing it against a version I have in a Cosmos DB database. When it finds a new or a changed feature on the roadmap, it creates a picture using an API from Imgix and starts a Microsoft Flow approval process asking for my notes. Once approved, the image and those notes are pushed on to Buffer which posts the update on my social media.
I wanted to see if I could do the same thing with video so I’ve extended that solution a bit.
Now, when I’m prompted to add notes to a roadmap update I’m also prompted to add a video to a newly generated OneDrive folder. When I add the video of me discussing the update and approve the Microsoft Flow request, another Azure Function takes the video from OneDrive and sends that over to Azure Media Services. It’s then encoded to a smaller size and automatically transcribed. It then sends me the generated subtitles, which I can correct on my phone and then approve. Once corrected, the subtitles and the encoded video are sent over to a service called Cloudinary, which has a cool video editing API.
I’ve made a bunch of different transitions and animated logos depending on which service the update is tagged with, so that each video is a little bit customised. Finally, another function makes the video via the Cloudinary API.
The cool thing about this solution is that it's written entirely in PowerShell. I'm using Azure Functions here because they make it easy to build these automated solutions using a language that I'm familiar with. So the end result is that I can create a nicely formatted social media video with hardcoded subtitles from my phone. See an example of this above.
The other thing I can do with my phone is sign in using the Microsoft Authenticator app with passwordless sign-in, which is what this update is about.
We use Gravity Forms on our website and it works pretty well. Whenever a form is completed, we receive an email – except sometimes we don’t. Just recently, I missed a few enquiries because of a configuration change on our site.
To stop this from happening, I went looking for alternative notification options for Gravity Forms that didn’t just rely on an email making it from our website to my inbox. I remembered that Zapier had a connector, however I was disappointed to discover that it only works on Developer licenses which cost $199 USD a year, and we’re running a $39 single site license.
Luckily, Gravity Forms has some easy-to-follow API documentation that allows us to connect directly to our site's forms and entries via REST methods.
This solution demonstrates how to build an Azure Function app in C# that retrieves the entries from a Gravity Form and sends them to Microsoft Flow. A Microsoft Flow checks each entry against a SharePoint list and if it doesn’t exist, it adds it.
The benefit of this solution is that it's completely serverless and almost free (depending on the App Service plan). Also, since it's in Microsoft Flow, you can do anything you want with the form entries. You could create a task in Planner, add a message to a Microsoft Teams channel, or pipe them directly into Dynamics 365 or Mailchimp.
The first step is to enable access to your Gravity Forms via the API.
Enable the Gravity Forms API and retrieve the form info
Sign into your site’s WordPress Admin Panel and visit the Settings section of Gravity Forms
Enable the API. Retrieve and make a note of the Public API Key and Private API Key.
Set the Impersonate account user. Your Function App will have the same form access permissions as the user that you choose here
Visit the form that you’d like to retrieve the entries for and make a note of the Form ID (eg. 1)
Make a note of all the fields in the form. We’ll be adding these as columns to a SharePoint List.
It’s also worth making a note of each field’s corresponding ID (eg. 1.3, 2, 3 etc) since the JSON representation of each field uses this and not the field name.
Create a SharePoint List to receive the form data
Sign into your SharePoint site with your Office 365 Account.
Visit Site Contents and create a new SharePoint list with an appropriate name.
You can rename the Title column to something else if Title isn’t appropriate. I changed mine to First Name.
Create columns to match the field names in your Gravity Forms form. Here are the field names and types that we're using:
Create a Function App in Visual Studio 2017
In previous tutorials, we’ve created Azure Functions directly in the browser. This time we’ll be using Visual Studio to test and deploy our functions.
Open Visual Studio 2017 and make sure that it’s up to at least version 15.3. You’ll need to ensure that Azure Development tooling is installed, and that you can create Azure Function Apps. See here for a list of prerequisites.
Go to File, New Project, Visual C#, Cloud, Azure Functions then create a Function App and give it a name.
If Visual Studio is completely up to date and you've installed all the prerequisites, but you still can't see an option to create Azure Functions, you may need to go to Tools > Extensions and Updates > Updates > Visual Studio Marketplace and install the Azure Functions and Web Jobs Tools update.
An Azure Function app is pretty much an Azure Web App, and each Function App can contain multiple functions. To add a Function to our Function App, right click on your project in the Solution Explorer, choose Add, New Item.
Then select Azure Function and give your function a name – I’ve called this one GravityForms_Enquiries. Click Add.
Choose Timer trigger. You’ll also want to specify how often you’d like the function app to run using CRON scheduling. The default value means that your function will execute every 5 minutes. In our published function, we’re going to check for form entries every 4 hours. While we’re debugging, we’re checking every minute – just until we’re ready to publish.
Your Function should look like this
Copy and paste the following code into your function. Replace the string placeholders in the RunAsync method with your own values, and make sure that you update your namespace and function app name (if you didn’t choose GravityForms_Enquiries too).
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web;
using System.Security.Cryptography;
using System.Net.Http.Headers;
using System.Text;
namespace GCITSFunctions
{
public static class GravityForms_Enquiries
{
[FunctionName("GravityForms_Enquiries")]
public static void Run([TimerTrigger("0 0 */4 * * *")]TimerInfo myTimer, TraceWriter log)
{
//Change the Timer Trigger above to "0 * * * * *" when debugging to avoid waiting too long for it to execute.
log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
string content = RunAsync().Result;
// Remove the comments from the below four lines once you've retrieved the HTTP POST URL from Microsoft Flow and added it in.
//HttpClient client = new HttpClient();
//HttpContent jsoncontent = new StringContent(content, Encoding.UTF8, "application/json");
//string flowRequest = "<Enter flow HTTP POST URL HERE>";
//var result = client.PostAsync(flowRequest, jsoncontent).Result;
}
static async Task<string> RunAsync()
{
HttpClient client = new HttpClient();
// Add the public and private keys for Gravity Forms
string publicKey = "<Enter Gravity Forms Public API Key>";
string privateKey = "<Enter Gravity Forms Private API Key>";
string method = "GET";
// Specify the form ID of the form you're retrieving entries for
string formId = "1";
string route = string.Format("forms/{0}/entries", formId);
/* Paging specifies the number of entries that will be retrieved from your form in this call, eg. 1000. You can make this higher or lower if you like.
It will retrieve the most recent entries first. */
string paging = "&paging[page_size]=1000";
string expires = Security.UtcTimestamp(new TimeSpan(0, 1, 0)).ToString();
string signature = GenerateSignature(publicKey, privateKey, method, route);
/* Replace gcits.com with your own domain name. If the call doesn't work initially, you may need to make sure that 'pretty' permalinks are enabled on your site.
See here for more information: https://www.gravityhelp.com/documentation/article/web-api/ */
string url = string.Format("https://gcits.com/gravityformsapi/{0}?api_key={1}&signature={2}&expires={3}{4}", route, publicKey, signature, expires, paging);
client.BaseAddress = new Uri(url);
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
var response = await client.GetAsync(client.BaseAddress);
string content = response.Content.ReadAsStringAsync().Result;
return content;
}
public static string GenerateSignature(string publicKey, string privateKey, string method, string route)
{
string expires = Security.UtcTimestamp(new TimeSpan(0, 1, 0)).ToString();
string stringToSign = string.Format("{0}:{1}:{2}:{3}", publicKey, method, route, expires);
var sig = Security.Sign(stringToSign, privateKey);
return (sig);
}
}
public class Security
{
public static string UrlEncodeTo64(byte[] bytesToEncode)
{
string returnValue
= System.Convert.ToBase64String(bytesToEncode);
return HttpUtility.UrlEncode(returnValue);
}
public static string Sign(string value, string key)
{
using (var hmac = new HMACSHA1(Encoding.ASCII.GetBytes(key)))
{
return UrlEncodeTo64(hmac.ComputeHash(Encoding.ASCII.GetBytes(value)));
}
}
public static int UtcTimestamp(TimeSpan timeSpanToAdd)
{
TimeSpan ts = (DateTime.UtcNow.Add(timeSpanToAdd) - new DateTime(1970, 1, 1, 0, 0, 0));
int expires_int = (int)ts.TotalSeconds;
return expires_int;
}
}
}
Update the TimerTrigger to ‘0 * * * * *’ while we debug
Add a reference to your function for System.Web by right clicking on your project and choosing Add, Reference.
Scroll down to System.Web, check the box and click OK.
Next we need to add a connection string for an Azure Storage account into the local.settings.json file. You can retrieve a connection string from an existing storage account via the Azure Portal, or by downloading Azure Storage Explorer from www.storageexplorer.com, signing in and copying the Connection String from the bottom left properties section. I recommend downloading Storage Explorer anyway since it’s a handy tool for working with Azure Storage accounts. If you don’t have a storage account, you’ll need to make one.
Once you’ve got the Connection string, paste it in the AzureWebJobsStorage value of the local.settings.json file.
Get your Gravity Form entries as JSON
To use your form entries in Microsoft Flow, we'll need to show Flow what they look like in JSON format. To do this, we'll use Fiddler.
Fiddler allows you to analyse your computer's internet traffic, as well as a bunch of other things. You might see a lot of activity from irrelevant processes; you can right click and filter these processes out.
Once your Fiddler stream is a little less busy, we’ll run the function app. Return to Visual Studio and press F5.
You’ll see the Azure Functions Core Tools window appear. This is a local version of the Azure Functions runtime that allows you to debug Azure Functions on your own computer before you deploy them.
Wait for your Azure function to execute. It should display some text that looks like this:
Now switch over to Fiddler and locate the call that it just made to your website. If all goes well, you should see a row with a result of 200 to your domain.
Click this row, and choose to decode it on the bottom right.
Select the Raw tab, and triple click the JSON result at the bottom to select it all, then copy this into Notepad for later. This is the JSON representation of your Gravity Forms form entries that we can use in Microsoft Flow.
Create a Microsoft Flow to receive the Gravity Forms entries
Create a new Blank flow, give it a name and start it with a Request Trigger. Then click Use sample payload to generate schema
Paste the JSON payload that we saved from Fiddler and click Done.
Add an action step so that we can save the flow and retrieve the HTTP POST URL. In this example I added a Notification action.
Click Create Flow, then copy the URL that was created next to HTTP POST URL.
Switch back over to Visual Studio 2017 and paste the HTTP POST URL into the placeholder for the flowRequest string variable. Next, uncomment the last four lines of the Run method.
Run the Function again to confirm that it’s sending the JSON payload to Microsoft Flow. You should see a row in Fiddler that looks like this:
Inspecting the call on the top right under the Raw tab shows that it sent the JSON payload:
When you return to Microsoft Flow, you should see a recent successful Flow run.
Open the flow run to see that the payload was received by the Request step.
Use Microsoft Flow to add the entries to a SharePoint list
Remove the action below the Request trigger and add an Apply to each step. Add the entries output from the popout menu to the ‘Select an output’ field. Then add a SharePoint – Get Items action into the Apply to each step and select or enter your SharePoint site, then choose the SharePoint list you created earlier.
Click Show advanced options and add GFID eq ‘id’ into the Filter Query field. Where ‘id’ is the id output from the Request trigger. Be sure to include the single quotes. This step checks the SharePoint List for existing entries with the same Gravity Forms entry ID.
Next add a Condition step, and click Edit in advanced mode. Copy and paste the following into this field:
@empty(body('Get_items')?['value'])
This checks whether any items were returned from SharePoint that match that Gravity Forms ID. If there weren’t any, we’ll create one.
Your flow layout should now look like this:
In the Yes section, add a SharePoint – Create Item action. Select or enter your SharePoint site, and choose the relevant SharePoint list. Refer to your notes on which fields match which field IDs, then drag the form values into the corresponding SharePoint fields.
I also added an Office 365 – Send an email action below this one so I get an extra email notification. You may want to wait until you’ve already imported all existing form entries before you add this one. If you’ve got hundreds of entries that haven’t been added to SharePoint, you’ll get hundreds of emails.
Click Update Flow, return to Visual Studio 2017 and run your Function app again (press F5).
Once it successfully completes, close the Azure Functions Core Tools window and head back over to Microsoft Flow to see it process. It should display the following when it’s done:
Next, visit your SharePoint list. You should now have the data from all Website Enquiry form entries in a single location. This data can now be used for all sorts of Microsoft Flows and business processes.
Publish your Function App to Azure
To make sure that your form data stays up to date, we need to publish our Function App to Azure.
Switch to Visual Studio, and update the timer on your function app to '0 0 */4 * * *' to make sure it doesn't keep running each minute in the cloud.
Now, right click on your project name and click Publish
Click Azure Function App, and choose Create New. (If you already have an existing Azure Function App, you can Select Existing, and specify the function app you’d like to deploy to.)
Since we’re creating a new Azure Function App we need to specify some details. As mentioned earlier, Function Apps are just Azure Web Apps. To deploy them, we need to create or choose an App Service.
Give your Function App an App Name, select your Azure subscription, choose or create a Resource Group, App Service Plan and Storage Account.
When creating an App Service plan, you can choose from the following sizes. Pricing varies depending on the underlying VM size, however the Consumption plan costs pretty much nothing. Choosing Consumption doesn’t give you a dedicated VM, and function runs are limited to 5 minutes duration. This applies to all functions within your Function App.
Once you’re happy with your settings, click OK, then Create, and wait for your App Service to deploy.
When it finishes, click Publish. Your function app is now deploying to Azure.
Sign in to https://portal.azure.com to see it in action under Web Apps. By default, your functions are in Read Only mode, and you won’t be able to view or edit the underlying C# code.
To keep track of your function’s activity, you can see function runs in the Monitor Section.
For security and compliance in Office 365, the Unified Audit Log is probably the most important tool of all. It tracks every user and account action across all of the Office 365 services. You can run reports on deletions, shares, downloads, edits, reads etc, for all users and all products. You can also set up custom alerting to receive notifications whenever specific activities occur.
For all its usefulness, the most amazing thing about it is that it's not turned on by default.
It can be extremely frustrating to come across a query or problem that could easily be resolved with access to the logs, only to find out they were never enabled in the first place. Here's how to get it set up in your own organisation, or if you're a Microsoft Partner, how to script it for all of your customers using Delegated Administration and PowerShell.
How to enable the Unified Audit Log for a single Office 365 tenant
If you’re only managing your own tenant, it’s quite simple to turn it on. You can do this in two ways.
How to enable the Unified Audit Log via the Security and Compliance Center for a single Office 365 tenant
Since the PowerShell command for enabling the Unified Audit Log is just one line, I assumed we’d be able to add it as a script block and run it across all of our Office 365 customers at once.
When I tried setting this up, it initially appeared to be working, though I soon received the following error:
The remote server returned an error: (401) Unauthorized.
It looks like Microsoft don't allow you to run this particular script using Delegated Administration, though I'm not too sure why. You also can't enable it via https://protection.office.com using your delegated admin credentials; it just reverts you back to the settings for your own Office 365 tenant.
In order to enable the Unified Audit Log, we’ll need to activate it using an admin within the customer’s Office 365 tenant. The remainder of this blog post contains the instructions on how to script this process.
Disclaimer
Use the following scripts at your own risk. They are designed to temporarily create Global Admins with a standard password (chosen by you) in each of your customers' environments. If all goes well, every admin that was created should be deleted automatically. If some tenants fail to enable the Unified Audit Log correctly, the new admin for those tenants will remain (I've included a script to remove these ones too). Also, see step 3 for a link to a script that reports on every unlicensed Office 365 Company Admin in your Office 365 tenant. Use it to verify that none of these temporary admins remain.
This process has three parts
PowerShell Script One: Checking Unified Audit Log Status and creating admin users
PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins
PowerShell Script Three (And Optional Script): Removing unsuccessful admins and checking tenants for all unlicensed admins.
Things you should know beforehand
For the most part, these scripts work. Using these three scripts, I’ve enabled the Unified Audit Log on 227 of our 260 delegated Office 365 customers. However, there are a few error messages that can pop up, and a few reasons that will prevent it working for some Office 365 tenants at all.
Here are a few things to keep in mind:
It doesn’t work with LITEPACK and LITEPACK_P2 subscriptions
In our case these are Telstra customers running the older Office 365 Small Business and Office 365 Small Business Premium subscriptions. You can run our Office 365 Delegated Tenant license report to identify these customers.
It does not work on customers that don't have any subscriptions, or that only have expired subscriptions.
It won’t work for Office 365 tenants that don’t have any Office 365 subscriptions, or if their Office 365 subscriptions have expired. The script will fail for these organisations with the error: The tenant organization isn’t in an Active State. Complete the administrative tasks that are active for this organization, and then try again.
It does not work on customers that only have Dynamics CRM licenses
This script doesn’t seem to run on customers that only have Dynamics CRM Online. It hasn’t been tested with customers that only have Dynamics 365.
You should wait before running the second PowerShell Script
It can take a while for the temporary admin user to receive the appropriate permissions in your customers' Office 365 organisations. If you run the second script too soon, the temporary admin may not be able to pull down all the Exchange Online cmdlets to perform the required tasks.
PowerShell Script One: Checking Unified Audit Log Status and creating admin users
This script uses your own delegated admin credentials. It creates a list of all of your Office 365 Customers and reports on their subscriptions. If they have at least one subscription (active or not) it attempts to run an Exchange Online cmdlet to check whether the Unified Audit Log is enabled. If it’s enabled, it does nothing and moves onto the next customer. If it’s disabled, it creates a new user, assigns it to the Company Administrator role and adds a row to a CSV with the tenant ID, customer name and user principal name.
To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.
Modify the placeholder variables at the top of the script and run it in PowerShell.
<# This script will connect to all delegated Office 365 tenants and check whether the Unified Audit Log is enabled. If it's not, it will create an Exchange admin user with a standard password. Once it's processed, you'll need to wait a few hours (preferably a day), then run the second script. The second script connects to your customers' Office 365 tenants via the new admin users and enables the Unified Audit Log ingestion. If successful, the second script will also remove the admin users created in this script. #>
#-------------------------------------------------------------
# Here are some things you can modify:
# This is your partner admin user name that has delegated administration permission
$UserName = "[email protected]"
# IMPORTANT: This is the default password for the temporary admin users. Don't leave this as Password123, create a strong password between 8 and 16 characters containing Lowercase letters, Uppercase letters, Numbers and Symbols.
$NewAdminPassword = "Password123"
# IMPORTANT: This is the default User Principal Name prefix for the temporary admin users. Don't leave this as gcitsauditadmin, create something UNIQUE that DOESNT EXIST in any of your tenants already. If it exists, it'll be turned into an admin and then deleted.
$NewAdminUserPrefix = "gcitsauditadmin"
# This is the path for the exported CSVs. You can change this, though you'll need to make sure the path exists. This location is also referenced in the second script, so I recommend keeping it the same.
$CreatedAdminsCsv = "C:\temp\CreatedAdmins.csv"
$UALCustomersCsv = "C:\temp\UALCustomerStatus.csv"
# Here's the end of the things you can modify.
#-------------------------------------------------------------
# This script block gets the Audit Log config settings
$ScriptBlock = {Get-AdminAuditLogConfig}
$Cred = get-credential -Credential $UserName
# Connect to Azure Active Directory via Powershell
Connect-MsolService -Credential $cred
$Customers = Get-MsolPartnerContract -All
$CompanyInfo = Get-MsolCompanyInformation
Write-Host "Found $($Customers.Count) customers for $($CompanyInfo.DisplayName)"
Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "
foreach ($Customer in $Customers) {
Write-Host $Customer.Name.ToUpper()
Write-Host " "
# Get license report
Write-Host "Getting license report:"
$CustomerLicenses = Get-MsolAccountSku -TenantId $Customer.TenantId
foreach($CustomerLicense in $CustomerLicenses) {
Write-Host "$($Customer.Name) is reporting $($CustomerLicense.SkuPartNumber) with $($CustomerLicense.ActiveUnits) Active Units. They've assigned $($CustomerLicense.ConsumedUnits) of them."
}
if($CustomerLicenses.Count -gt 0){
Write-Host " "
# Get the initial domain for the customer.
$InitialDomain = Get-MsolDomain -TenantId $Customer.TenantId | Where {$_.IsInitial -eq $true}
# Construct the Exchange Online URL with the DelegatedOrg parameter.
$DelegatedOrgURL = "https://ps.outlook.com/powershell-liveid?DelegatedOrg=" + $InitialDomain.Name
Write-Host "Getting UAL setting for $($InitialDomain.Name)"
# Invoke-Command establishes a Windows PowerShell session based on the URL,
# runs the command, and closes the Windows PowerShell session.
$AuditLogConfig = Invoke-Command -ConnectionUri $DelegatedOrgURL -Credential $Cred -Authentication Basic -ConfigurationName Microsoft.Exchange -AllowRedirection -ScriptBlock $ScriptBlock -HideComputerName
Write-Host " "
Write-Host "Audit Log Ingestion Enabled:"
Write-Host $AuditLogConfig.UnifiedAuditLogIngestionEnabled
# Check whether the Unified Audit Log is already enabled and log status in a CSV.
if ($AuditLogConfig.UnifiedAuditLogIngestionEnabled) {
$UALCustomerExport = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.Name
DefaultDomainName = $Customer.DefaultDomainName
UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
DistinguishedName = $AuditLogConfig.DistinguishedName
}
$UALCustomersexport = @()
$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -notypeinformation -Path $UALCustomersCSV -Append
}
# If the Unified Audit Log isn't enabled, log the status and create the admin user.
if (!$AuditLogConfig.UnifiedAuditLogIngestionEnabled) {
$UALDisabledCustomers += $Customer
$UALCustomersExport =@()
$UALCustomerExport = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.Name
DefaultDomainName = $Customer.DefaultDomainName
UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
DistinguishedName = $AuditLogConfig.DistinguishedName
}
$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -notypeinformation -Path $UALCustomersCSV -Append
# Build the User Principal Name for the new admin user
$NewAdminUPN = -join($NewAdminUserPrefix,"@",$($InitialDomain.Name))
Write-Host " "
Write-Host "Audit Log isn't enabled for $($Customer.Name). Creating a user with UPN: $NewAdminUPN, assigning user to Company Administrators role."
Write-Host "Adding $($Customer.Name) to CSV to enable UAL in second script."
$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force
$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)
New-MsolUser -TenantId $Customer.TenantId -DisplayName "Audit Admin" -UserPrincipalName $NewAdminUPN -Password $NewAdminPassword -ForceChangePassword $false
Add-MsolRoleMember -TenantId $Customer.TenantId -RoleName "Company Administrator" -RoleMemberEmailAddress $NewAdminUPN
$AdminProperties = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.Name
DefaultDomainName = $Customer.DefaultDomainName
UserPrincipalName = $NewAdminUPN
Action = "ADDED"
}
$CreatedAdmins = @()
$CreatedAdmins += New-Object psobject -Property $AdminProperties
$CreatedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $CreatedAdminsCsv -Append
Write-Host " "
}
}
Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "
}
Write-Host "Admin Creation Completed for tenants without Unified Audit Logging, please wait 12 hours before running the second script."
Write-Host " "
See the Unified Audit Log status for your customers
One of the outputs of this script is the UALCustomerStatus.csv file. You can make a copy of this, and rerun the process at the end to compare the results.
Browse the list of created admins
The script will also create a CSV containing the details for each admin created. This CSV will be imported by the second PowerShell Script and will be used to enable the Unified Audit Log on each tenant.
PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins
This script should be run at least a few hours after the first script to ensure that the admin permissions have had time to correctly apply. If you don’t wait long enough, your admin user may not have access to the required Exchange Online cmdlets.
You’ll need to update the password in this script to reflect the password you chose for your temporary admins in the first script.
To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.
Modify the placeholder variables at the top of the script and run it in PowerShell.
<# This script will use the admin users created by the first script to enable the Unified Audit Log in each tenant. If enabling the Unified Audit Log is successful, it'll remove the created admin. If it's not successful, it'll keep the admin in place and add it to another CSV. You can retry these tenants by modifying the $Customers value to import the RemainingAdminsCsv in the next run. #>
#-------------------------------------------------------------
# Here are some things you can modify:
# This is your partner admin user name that has delegated administration permission
$UserName = "[email protected]"
# IMPORTANT: This is the default password for the temporary admin users. Use the same password that you specified in the first script.
$NewAdminPassword = "Password123"
# This is the CSV containing the details of the created admins generated by the first script. If you changed the path in the first script, you'll need to change it here.
$Customers = import-csv "C:\temp\CreatedAdmins.csv"
# This CSV will contain a list of all admins removed by this script.
$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"
# This CSV will contain a list of all unsuccessful admins left unchanged by this script. Use it to retry this script without having to start again.
$RemainingAdminsCsv = "C:\temp\RemainingAdmins.csv"
#-------------------------------------------------------------
$Cred = get-credential -Credential $UserName
foreach ($Customer in $Customers) {
Write-Host $Customer.CompanyName.ToUpper()
Write-Host " "
$NewAdminUPN = $Customer.UserPrincipalName
$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force
$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)
Write-Host " "
Write-Output "Getting the Exchange Online cmdlets as $NewAdminUPN"
$Session = New-PSSession -ConnectionUri https://outlook.office365.com/powershell-liveid/ `
-ConfigurationName Microsoft.Exchange -Credential $NewAdminCreds `
-Authentication Basic -AllowRedirection
Import-PSSession $Session -AllowClobber
# Enable the customization of the Exchange Organisation
Enable-OrganizationCustomization
# Enable the Unified Audit Log
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true
# Find out whether it worked
$AuditLogConfigResult = Get-AdminAuditLogConfig
Remove-PSSession $Session
# If it worked, remove the Admin and add the removed admin details to a CSV
if($AuditLogConfigResult.UnifiedAuditLogIngestionEnabled){
# Remove the temporary admin
Write-Host "Removing the temporary Admin"
Remove-MsolUser -TenantId $Customer.TenantId -UserPrincipalName $NewAdminUPN -Force
$AdminProperties = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.CompanyName
DefaultDomainName = $Customer.DefaultDomainName
UserPrincipalName = $NewAdminUPN
Action = "REMOVED"
}
$RemovedAdmins = @()
$RemovedAdmins += New-Object psobject -Property $AdminProperties
$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemovedAdminsCsv -Append
}
# If it didn't work, keep the Admin and add the admin details to another CSV. You can use the RemainingAdmins CSV if you'd like to try again.
if(!$AuditLogConfigResult.UnifiedAuditLogIngestionEnabled){
Write-Host "Enabling Audit Log Failed, keeping the temporary Admin"
$AdminProperties = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.CompanyName
DefaultDomainName = $Customer.DefaultDomainName
UserPrincipalName = $NewAdminUPN
Action = "UNCHANGED"
}
$RemainingAdmins = @()
$RemainingAdmins += New-Object psobject -Property $AdminProperties
$RemainingAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemainingAdminsCsv -Append
}
Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "
}
View the successful Office 365 admins that were removed
If the Unified Audit Log was enabled successfully, the newly created Office 365 admin will be automatically removed. You can see the results of this in the RemovedAdmins CSV.
See the remaining Office 365 admins that couldn’t enable the Unified Audit Log
If the Unified Audit Log couldn’t be enabled, the Office 365 admin will remain unchanged. If you like, you can use the RemainingAdmins CSV in place of the CreatedAdmins CSV and rerun the second script. In our case, some tenants that couldn’t be enabled on the first try, were able to be enabled on the second and third tries.
PowerShell Script Three: Removing unsuccessful admins
Any tenants whose Unified Audit Log couldn't be enabled via PowerShell will still have the temporary Office 365 admin active. This script imports those admins from the RemainingAdmins CSV and removes them.
Once removed, it adds them to the RemovedAdmins CSV. You can compare this to the CreatedAdmins CSV from the first script to make sure they're all gone.
<# This script removes any temporary admin users that remain after the second script, i.e. the tenants where the Unified Audit Log couldn't be enabled. It imports the remaining admins from the RemainingAdmins CSV, deletes each one, and logs the removals to the RemovedAdmins CSV. #>
#-------------------------------------------------------------
# Here are some things you can modify:
# This is your partner admin user name that has delegated administration permission
$UserName = "[email protected]"
# This CSV contains a list of all remaining unsuccessful admins left unchanged by the second script.
$RemainingAdmins = import-csv "C:\temp\RemainingAdmins.csv"
# This CSV will contain a list of all admins removed by this script.
$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"
#-------------------------------------------------------------
$Cred = get-credential -Credential $UserName
Connect-MsolService -Credential $cred
ForEach ($Admin in $RemainingAdmins) {
$tenantID = $Admin.Tenantid
$upn = $Admin.UserPrincipalName
Write-Output "Deleting user: $upn"
Remove-MsolUser -UserPrincipalName $upn -TenantId $tenantID -Force
$AdminProperties = @{
TenantId = $tenantID
CompanyName = $Admin.CompanyName
DefaultDomainName = $Admin.DefaultDomainName
UserPrincipalName = $upn
Action = "REMOVED"
}
$RemovedAdmins = @()
$RemovedAdmins += New-Object psobject -Property $AdminProperties
$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemovedAdminsCsv -Append
}
Want to see all the current Office 365 global administrators in your customers' tenants?
To confirm that all of the created admins from these scripts have been removed, or just to see which global administrators have access to your customer tenants, you can run the scripts here. If required, there’s a second script that will block the credentials of the admins that you leave in the exported CSV.
I ran a small migration on Friday last week that took a lot longer than planned due to a few unexpected issues. The customer was migrating from Telstra’s Office 365 service (AKA syndicated account) to a Microsoft Office 365 account licensed via the CSP model.
This is a pretty standard migration that we run often, though this time I ran into a number of issues.
Since it was a small migration (4 users), and it needed to be moved over in a short amount of time, I opted to do a standard PST migration on an Azure VM. The plan was as follows:
Disconnect the domain from the existing tenant
Add it to the new tenant by moving it to new name servers and verifying the DNS
Create the required users on the new tenant
Set up the Exchange accounts for the old users via their .onmicrosoft.com addresses and export the mail via PSTs
Set up the Exchange accounts for the new users and import the mail
Here’s a few important details regarding the source tenant:
Office 365 licensed via Telstra on a syndicated account
Domain DNS hosted on Microsoft’s name servers
Here are some important details regarding the destination tenant:
Office 365 licensed via CSP model
Domain DNS hosted on external name servers
Problem #1: Cannot create a user with the same UPN as before
My first issue occurred when I started to set up the users on the new tenant. Three out of four worked correctly, though the fourth mailbox (unfortunately the most important one) would not create correctly on the new tenant.
I was given the error OrgIdMailboxRecentlyCreatedException and the message:
Hang on, we’re not quite ready
It looks like your account was created 1 hour ago. It can take up to 24 hours to set up a mailbox.
I did some testing and discovered that the error message only appears when I set the User Principal Name to match what it was on the old tenant. If I set it up to use a variation of the old name, it worked instantly.
I suspected this was to do with some internal updating issue with how Office 365 maps usernames to mailboxes, and decided to give it a bit of time.
In the meantime, the customer needed to be able to send and receive as their previous email address. So I logged onto Exchange Online via PowerShell and ran a couple of cmdlets to set that up.
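The exact commands aren't shown here, but as a rough sketch (placeholder identities, not the actual addresses involved), adding the old address back to the new mailbox and making it the primary SMTP address looks something like this:
# Placeholder identities: replace with the new mailbox's UPN and the customer's previous address.
# The uppercase "SMTP:" entry becomes the primary (sending) address; lowercase "smtp:" entries are aliases.
Set-Mailbox -Identity "jane.smith2@example.com.au" -EmailAddresses "SMTP:jane.smith@example.com.au","smtp:jane.smith@exampletenant.onmicrosoft.com"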
Once this was confirmed working, I set it up on the user’s Outlook profile and kicked off the migration.
I left this running overnight, and in the morning faced a new problem.
Problem #2: Cannot send to BigPond addresses due to SPF issue
The next issue was related to recipients using Telstra BigPond addresses. Emails sent to external domains worked fine, though emails sent to BigPond accounts would fail instantly. They'd return a message stating that the message was rejected by the recipient's email server.
The SPF headers in the returned email stated that there were no SPF records configured for the domain.
I double checked the SPF records on the new name servers and they were configured correctly. I also sent test emails to my own external addresses, and the SPF headers were fine too.
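For reference, a quick way to double check what the outside world sees for a domain's SPF record is to query its TXT records directly (the domain below is a placeholder):
# Query the public TXT records for the domain and pick out the SPF record.
# A typical Office 365 SPF record looks like: v=spf1 include:spf.protection.outlook.com -all
Resolve-DnsName -Name "example.com.au" -Type TXT | Where-Object { $_.Strings -match "^v=spf1" } | Select-Object -ExpandProperty Strings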
I'm not 100% sure what the cause of the issue was, though I suspect it's due to the fact that Microsoft used to manage the DNS for this domain via their internal Office 365 name servers. If that change takes a while to propagate throughout the system, Office 365 may still have been looking at the internal name servers for records relating to this domain.
Solution: Switch Name Servers back to Microsoft
To resolve this issue, I transferred management of the DNS records back to Microsoft’s Office 365 name servers, and within about half an hour, all of the issues were resolved.
I was able to rename the user to the correct email address, and users no longer had issues mailing BigPond accounts.
Since I was trying everything to resolve this issue, I'm not sure what the ultimate fix was, though it seems that switching the name servers back to Microsoft on the new tenant did the most good.
I suspect that this was a DNS propagation problem within Office 365, and may have been resolved in time anyway. If you’re currently experiencing it, moving the DNS to Microsoft’s Office 365 name servers may speed up the resolution.