Many companies are allowing staff to work from home or remotely indefinitely, raising questions about how to protect work data on personal or otherwise uncontrolled devices.
As IT experts in remote work, Gold Coast IT Support offer the following information to help.
Because we can lose company data in a variety of ways across different devices, we need to apply a variety of protection measures. Let’s take a look at the features in Microsoft 365 that can allow companies to protect their data while users are working remotely.
Use Mobile Application Management
Despite the name, mobile application management doesn’t just apply to mobile devices; it can also protect Windows 10 devices. Mobile Application Management policies can protect company data on both managed and unmanaged devices.
It works by applying protections to the apps your teams use to access company data, like Outlook, Teams, OneDrive and SharePoint.
You can enforce restrictions on these apps to prevent data being saved, cut, copied or pasted.
You can also require a PIN when the app starts or block the app from running on a jailbroken phone or tablet.
This feature can be used to selectively wipe company data from a user’s device without affecting their personal files. This is handy for organisations where staff use their personal computers and mobile devices to access company information remotely.
Set up conditional access policies
We can use Conditional Access to enforce restrictions on non-compliant or unmanaged devices, such as blocking access entirely or preventing particular actions, like stopping users from saving attachments in Outlook on the web or syncing files to OneDrive.
We can apply these protections in other ways to apps like OneDrive and SharePoint, preventing users from syncing data to their personal devices by either blocking access or allowing limited, web-only access.
Use Cloud App Security to protect data on third-party apps
These protections don’t just relate to Microsoft 365 apps like OneDrive, SharePoint and Outlook; we can use Microsoft Cloud App Security to apply additional protections to apps like Dropbox Business too. Applying protection to a third-party app like Dropbox Business can prevent users from downloading your company data to unmanaged devices.
Apps like Dropbox Business also provide their own security measures, allowing you to block access and wipe company data when a device next comes online.
Configure idle session time outs
To lessen the likelihood of the wrong people accessing company information on a shared device, we can configure idle session timeouts. These sign users out after a period of inactivity, just like your bank does.
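As a sketch, the idle-session timeout for SharePoint Online and OneDrive on the web can be configured with the SharePoint Online Management Shell (the tenant admin URL and the timeout values below are placeholders you’d adjust to suit):

```powershell
# Assumes the Microsoft.Online.SharePoint.PowerShell module is installed.
# Replace the URL with your own tenant admin URL.
Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com"

# Warn users after 30 minutes of inactivity, then sign them out at 60 minutes.
Set-SPOBrowserIdleSignOut -Enabled $true `
    -WarnAfter (New-TimeSpan -Minutes 30) `
    -SignOutAfter (New-TimeSpan -Minutes 60)
```

This setting applies to browser sessions where users haven’t chosen to stay signed in, which makes it most useful on shared or unmanaged devices.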
Get alerts on suspicious activities
Cloud App Security includes built-in alerts that trigger on potentially suspicious activities. We can use these to get notified about things like mass deletions, mass downloads and unusual volumes of external sharing.
Protect sensitive data with Data Loss Prevention
We can use data loss prevention to restrict or impose conditions on the sharing of sensitive information. These policies can trigger on certain keywords like project names or sensitive information types like credit card numbers, driver’s license details or tax file information. Once a file containing this info is detected, it can display a warning, be blocked from being sent or have encryption applied.
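As a hedged sketch of how such a policy might be scripted (the policy and rule names are made up, and the parameters are assumed from the Security & Compliance Center PowerShell module), a basic rule that blocks sharing of credit card numbers could look like:

```powershell
# Run in a Security & Compliance Center PowerShell session.
# Create a DLP policy covering Exchange, SharePoint and OneDrive.
New-DlpCompliancePolicy -Name "Financial Data Policy" `
    -ExchangeLocation All -SharePointLocation All -OneDriveLocation All

# Add a rule that triggers on credit card numbers and blocks access.
New-DlpComplianceRule -Name "Block credit card sharing" `
    -Policy "Financial Data Policy" `
    -ContentContainsSensitiveInformation @{Name = "Credit Card Number"} `
    -BlockAccess $true
```

The same policies can also be created interactively in the compliance portal, which is often easier for fine-tuning keywords and notification text.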
Using Cloud App Security, we can apply additional data loss prevention measures to third-party apps like Box and Dropbox Business.
Use Sensitivity Labels
But what happens if all this fails, and someone downloads company data to a personal, unmanaged device? To protect against this, we can apply sensitivity labels. These labels define how sensitive a particular piece of content is and in turn can enforce protections on our data. What’s more, these protections apply no matter where the data ends up. These baked-in protections can limit who can access a file and what they can do with it, preventing the wrong people from opening, copying, saving, forwarding or printing sensitive documents or emails.
In many cases, these protections can be applied automatically by scanning for those same keywords and sensitive information types that data loss prevention uses.
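As an illustrative sketch (the label name is a placeholder, and encryption and auto-labelling conditions are typically finished in the compliance portal), a label can be created and published from a Security & Compliance PowerShell session:

```powershell
# Create a sensitivity label; encryption and auto-labelling conditions
# can then be configured on the label or in the compliance portal.
New-Label -Name "Confidential" `
    -DisplayName "Confidential" `
    -Tooltip "Contains sensitive company data - restricted distribution"

# Publish the label to users with a label policy.
New-LabelPolicy -Name "Default label policy" -Labels "Confidential"
```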
As you can probably tell by now, there’s a lot you can do to protect your sensitive data when people are working from home. If you need help with any of this, reach out to us below.
Elliot Munro | 14 April 2021 | How do I protect company data when my team works from home?
This update will bring extra document management capabilities from SharePoint into Microsoft Teams.
The current Microsoft Teams files experience
The document storage and collaboration functionality in Microsoft Teams is built on SharePoint. Every Microsoft Team is also an Office 365 Group, and each team has a group-connected SharePoint site which stores all the files shared amongst the team.
You can already reach this site from the Files tab of your Microsoft Teams channels; however, the experience within Teams is a bit limited.
An updated Document Library experience in Microsoft Teams
This update brings the full functionality of a SharePoint document library into Microsoft Teams, with the ability to add and manage custom columns, sort and filter files with custom views, trigger workflows and much more.
Sync files from Microsoft Teams with your PC or Mac
This is the standout feature of this update: the ability to sync files with a PC or Mac will be available from within Microsoft Teams. At Ignite this year, Microsoft demonstrated the new interface during the Content Collaboration in the Modern Workplace – BRK2451 session.
This screen capture demonstrates custom columns, views and formatting, as well as the new sync button within Microsoft Teams.
Elliot Munro | 30 October 2018 | A new SharePoint-powered files experience is coming to Microsoft Teams
For security and compliance in Office 365, the Unified Audit Log is probably the most important tool of all. It tracks every user and account action across all of the Office 365 services. You can run reports on deletions, shares, downloads, edits, reads and more, for all users and all products. You can also set up custom alerting to receive notifications whenever specific activities occur.
For all of its usefulness, the most amazing thing about it is that it’s not turned on by default.
It can be extremely frustrating when you come across a query or problem that could easily be resolved if we had access to the logs, only to find out they were never enabled in the first place. Here’s how to get it set up in your own organisation, or if you’re a Microsoft Partner, how to script it for all of your customers using Delegated Administration and PowerShell.
How to enable the Unified Audit Log for a single Office 365 tenant
If you’re only managing your own tenant, it’s quite simple to turn it on. You can do this in two ways.
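For reference, the PowerShell route is a single cmdlet once you’re connected to Exchange Online. This sketch assumes the current Exchange Online PowerShell module; the UPN is a placeholder:

```powershell
# Connect to Exchange Online as an admin (placeholder UPN).
Connect-ExchangeOnline -UserPrincipalName [email protected]

# Turn on Unified Audit Log ingestion and confirm the setting.
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled
```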
How to enable the Unified Audit Log via the Security and Compliance Center for a single Office 365 tenant
Since the PowerShell command for enabling the Unified Audit Log is just one line, I assumed we’d be able to add it as a script block and run it across all of our Office 365 customers at once.
When I tried setting this up, it initially appeared to be working, though I soon received the following error:
The remote server returned an error: (401) Unauthorized.
It looks like Microsoft doesn’t allow you to run this particular script using Delegated Administration, though I’m not too sure why. You also can’t enable it via https://protection.office.com using your delegated admin credentials; it just seems to revert you to the settings for your own Office 365 tenant.
In order to enable the Unified Audit Log, we’ll need to activate it using an admin within the customer’s Office 365 tenant. The remainder of this blog post contains the instructions on how to script this process.
Disclaimer
Use the following scripts at your own risk. They are designed to temporarily create Global Admins with a standard password (chosen by you) on each of your customer’s environments. If all goes well, every admin that was created should be deleted automatically. If some tenants fail to enable the Unified Audit Log correctly, the new admin for those tenants will remain (I’ve included a script to remove these ones too). Also, see step 3 for a link to a script that reports on every Unlicensed Office 365 Company Admin in your Office 365 tenant. Use it to verify that none of these temporary admins remain.
This process has three parts
PowerShell Script One: Checking Unified Audit Log Status and creating admin users
PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins
PowerShell Script Three (And Optional Script): Removing unsuccessful admins and checking tenants for all unlicensed admins.
Things you should know beforehand
For the most part, these scripts work. Using these three scripts, I’ve enabled the Unified Audit Log on 227 of our 260 delegated Office 365 customers. However, there are a few error messages that can pop up, and a few reasons that will prevent it working for some Office 365 tenants at all.
Here are a few things to keep in mind:
It doesn’t work with LITEPACK and LITEPACK_P2 subscriptions
In our case these are Telstra customers running the older Office 365 Small Business and Office 365 Small Business Premium subscriptions. You can run our Office 365 Delegated Tenant license report to identify these customers.
It does not work on customers that don’t have any subscriptions, or only have expired subscriptions.
It won’t work for Office 365 tenants that don’t have any Office 365 subscriptions, or if their Office 365 subscriptions have expired. The script will fail for these organisations with the error: The tenant organization isn’t in an Active State. Complete the administrative tasks that are active for this organization, and then try again.
It does not work on customers that only have Dynamics CRM licenses
This script doesn’t seem to run on customers that only have Dynamics CRM Online. It hasn’t been tested with customers that only have Dynamics 365.
You should wait before running the second PowerShell Script
It can take a while for the temporary admin user to receive the appropriate permissions in your customer’s Office 365 organisation. If you run the second script too soon, the temporary admin may not be able to pull down all the Exchange Online cmdlets needed to perform the required tasks.
PowerShell Script One: Checking Unified Audit Log Status and creating admin users
This script uses your own delegated admin credentials. It creates a list of all of your Office 365 Customers and reports on their subscriptions. If they have at least one subscription (active or not) it attempts to run an Exchange Online cmdlet to check whether the Unified Audit Log is enabled. If it’s enabled, it does nothing and moves onto the next customer. If it’s disabled, it creates a new user, assigns it to the Company Administrator role and adds a row to a CSV with the tenant ID, customer name and user principal name.
To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.
Modify the placeholder variables at the top of the script and run it in PowerShell.
<# This script will connect to all delegated Office 365 tenants and check whether the Unified Audit Log is enabled. If it's not, it will create an Exchange admin user with a standard password. Once it's processed, you'll need to wait a few hours (preferably a day), then run the second script. The second script connects to your customers' Office 365 tenants via the new admin users and enables the Unified Audit Log ingestion. If successful, the second script will also remove the admin users created in this script. #>
#-------------------------------------------------------------
# Here are some things you can modify:
# This is your partner admin user name that has delegated administration permission
$UserName = "[email protected]"
# IMPORTANT: This is the default password for the temporary admin users. Don't leave this as Password123, create a strong password between 8 and 16 characters containing Lowercase letters, Uppercase letters, Numbers and Symbols.
$NewAdminPassword = "Password123"
# IMPORTANT: This is the default User Principal Name prefix for the temporary admin users. Don't leave this as gcitsauditadmin, create something UNIQUE that DOESNT EXIST in any of your tenants already. If it exists, it'll be turned into an admin and then deleted.
$NewAdminUserPrefix = "gcitsauditadmin"
# This is the path for the exported CSVs. You can change this, though you'll need to make sure the path exists. This location is also referenced in the second script, so I recommend keeping it the same.
$CreatedAdminsCsv = "C:\temp\CreatedAdmins.csv"
$UALCustomersCsv = "C:\temp\UALCustomerStatus.csv"
# Here's the end of the things you can modify.
#-------------------------------------------------------------
# This script block gets the Audit Log config settings
$ScriptBlock = {Get-AdminAuditLogConfig}
$Cred = get-credential -Credential $UserName
# Connect to Azure Active Directory via Powershell
Connect-MsolService -Credential $cred
$Customers = Get-MsolPartnerContract -All
$CompanyInfo = Get-MsolCompanyInformation
Write-Host "Found $($Customers.Count) customers for $($CompanyInfo.DisplayName)"
Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "
foreach ($Customer in $Customers) {
Write-Host $Customer.Name.ToUpper()
Write-Host " "
# Get license report
Write-Host "Getting license report:"
$CustomerLicenses = Get-MsolAccountSku -TenantId $Customer.TenantId
foreach($CustomerLicense in $CustomerLicenses) {
Write-Host "$($Customer.Name) is reporting $($CustomerLicense.SkuPartNumber) with $($CustomerLicense.ActiveUnits) Active Units. They've assigned $($CustomerLicense.ConsumedUnits) of them."
}
if($CustomerLicenses.Count -gt 0){
Write-Host " "
# Get the initial domain for the customer.
$InitialDomain = Get-MsolDomain -TenantId $Customer.TenantId | Where {$_.IsInitial -eq $true}
# Construct the Exchange Online URL with the DelegatedOrg parameter.
$DelegatedOrgURL = "https://ps.outlook.com/powershell-liveid?DelegatedOrg=" + $InitialDomain.Name
Write-Host "Getting UAL setting for $($InitialDomain.Name)"
# Invoke-Command establishes a Windows PowerShell session based on the URL,
# runs the command, and closes the Windows PowerShell session.
$AuditLogConfig = Invoke-Command -ConnectionUri $DelegatedOrgURL -Credential $Cred -Authentication Basic -ConfigurationName Microsoft.Exchange -AllowRedirection -ScriptBlock $ScriptBlock -HideComputerName
Write-Host " "
Write-Host "Audit Log Ingestion Enabled:"
Write-Host $AuditLogConfig.UnifiedAuditLogIngestionEnabled
# Check whether the Unified Audit Log is already enabled and log status in a CSV.
if ($AuditLogConfig.UnifiedAuditLogIngestionEnabled) {
$UALCustomerExport = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.Name
DefaultDomainName = $Customer.DefaultDomainName
UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
DistinguishedName = $AuditLogConfig.DistinguishedName
}
$UALCustomersExport = @()
$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -notypeinformation -Path $UALCustomersCSV -Append
}
# If the Unified Audit Log isn't enabled, log the status and create the admin user.
if (!$AuditLogConfig.UnifiedAuditLogIngestionEnabled) {
$UALDisabledCustomers += $Customer
$UALCustomersExport = @()
$UALCustomerExport = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.Name
DefaultDomainName = $Customer.DefaultDomainName
UnifiedAuditLogIngestionEnabled = $AuditLogConfig.UnifiedAuditLogIngestionEnabled
UnifiedAuditLogFirstOptInDate = $AuditLogConfig.UnifiedAuditLogFirstOptInDate
DistinguishedName = $AuditLogConfig.DistinguishedName
}
$UALCustomersExport += New-Object psobject -Property $UALCustomerExport
$UALCustomersExport | Select-Object TenantId,CompanyName,DefaultDomainName,UnifiedAuditLogIngestionEnabled,UnifiedAuditLogFirstOptInDate,DistinguishedName | Export-Csv -notypeinformation -Path $UALCustomersCSV -Append
# Build the User Principal Name for the new admin user
$NewAdminUPN = -join($NewAdminUserPrefix,"@",$($InitialDomain.Name))
Write-Host " "
Write-Host "Audit Log isn't enabled for $($Customer.Name). Creating a user with UPN: $NewAdminUPN, assigning user to Company Administrators role."
Write-Host "Adding $($Customer.Name) to CSV to enable UAL in second script."
$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force
$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)
New-MsolUser -TenantId $Customer.TenantId -DisplayName "Audit Admin" -UserPrincipalName $NewAdminUPN -Password $NewAdminPassword -ForceChangePassword $false
Add-MsolRoleMember -TenantId $Customer.TenantId -RoleName "Company Administrator" -RoleMemberEmailAddress $NewAdminUPN
$AdminProperties = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.Name
DefaultDomainName = $Customer.DefaultDomainName
UserPrincipalName = $NewAdminUPN
Action = "ADDED"
}
$CreatedAdmins = @()
$CreatedAdmins += New-Object psobject -Property $AdminProperties
$CreatedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $CreatedAdminsCsv -Append
Write-Host " "
}
}
Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "
}
Write-Host "Admin Creation Completed for tenants without Unified Audit Logging, please wait 12 hours before running the second script."
Write-Host " "
See the Unified Audit Log status for your customers
One of the outputs of this script is the UALCustomerStatus.csv file. You can make a copy of this, and rerun the process at the end to compare the results.
Browse the list of created admins
The script will also create a CSV containing the details for each admin created. This CSV will be imported by the second PowerShell Script and will be used to enable the Unified Audit Log on each tenant.
PowerShell Script Two: Enabling Unified Audit Log on all Office 365 tenants and removing successful admins
This script should be run at least a few hours after the first script to ensure that the admin permissions have had time to correctly apply. If you don’t wait long enough, your admin user may not have access to the required Exchange Online cmdlets.
You’ll need to update the password in this script to reflect the password you chose for your temporary admins in the first script.
To use the script, copy and paste it into a PowerShell document. You can use Visual Studio Code, PowerShell ISE, or Notepad etc.
Modify the placeholder variables at the top of the script and run it in PowerShell.
<# This script will use the admin users created by the first script to enable the Unified Audit Log in each tenant. If enabling the Unified Audit Log is successful, it'll remove the created admin. If it's not successful, it'll keep the admin in place and add it to another CSV. You can retry these tenants by modifying the $Customers value to import the RemainingAdminsCsv in the next run. #>
#-------------------------------------------------------------
# Here are some things you can modify:
# This is your partner admin user name that has delegated administration permission
$UserName = "[email protected]"
# IMPORTANT: This is the default password for the temporary admin users. Use the same password that you specified in the first script.
$NewAdminPassword = "Password123"
# This is the CSV containing the details of the created admins generated by the first script. If you changed the path in the first script, you'll need to change it here.
$Customers = import-csv "C:\temp\CreatedAdmins.csv"
# This CSV will contain a list of all admins removed by this script.
$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"
# This CSV will contain a list of all unsuccessful admins left unchanged by this script. Use it to retry this script without having to start again.
$RemainingAdminsCsv = "C:\temp\RemainingAdmins.csv"
#-------------------------------------------------------------
$Cred = get-credential -Credential $UserName
foreach ($Customer in $Customers) {
Write-Host $Customer.CompanyName.ToUpper()
Write-Host " "
$NewAdminUPN = $Customer.UserPrincipalName
$secpasswd = ConvertTo-SecureString $NewAdminPassword -AsPlainText -Force
$NewAdminCreds = New-Object System.Management.Automation.PSCredential ($NewAdminUPN, $secpasswd)
Write-Host " "
Write-Output "Getting the Exchange Online cmdlets as $NewAdminUPN"
$Session = New-PSSession -ConnectionUri https://outlook.office365.com/powershell-liveid/ `
-ConfigurationName Microsoft.Exchange -Credential $NewAdminCreds `
-Authentication Basic -AllowRedirection
Import-PSSession $Session -AllowClobber
# Enable the customization of the Exchange Organisation
Enable-OrganizationCustomization
# Enable the Unified Audit Log
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true
# Find out whether it worked
$AuditLogConfigResult = Get-AdminAuditLogConfig
Remove-PSSession $Session
# If it worked, remove the Admin and add the removed admin details to a CSV
if($AuditLogConfigResult.UnifiedAuditLogIngestionEnabled){
# Remove the temporary admin
Write-Host "Removing the temporary Admin"
Remove-MsolUser -TenantId $Customer.TenantId -UserPrincipalName $NewAdminUPN -Force
$AdminProperties = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.CompanyName
DefaultDomainName = $Customer.DefaultDomainName
UserPrincipalName = $NewAdminUPN
Action = "REMOVED"
}
$RemovedAdmins = @()
$RemovedAdmins += New-Object psobject -Property $AdminProperties
$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemovedAdminsCsv -Append
}
# If it didn't work, keep the Admin and add the admin details to another CSV. You can use the RemainingAdmins CSV if you'd like to try again.
if(!$AuditLogConfigResult.UnifiedAuditLogIngestionEnabled){
Write-Host "Enabling Audit Log Failed, keeping the temporary Admin"
$AdminProperties = @{
TenantId = $Customer.TenantId
CompanyName = $Customer.CompanyName
DefaultDomainName = $Customer.DefaultDomainName
UserPrincipalName = $NewAdminUPN
Action = "UNCHANGED"
}
$RemainingAdmins = @()
$RemainingAdmins += New-Object psobject -Property $AdminProperties
$RemainingAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemainingAdminsCsv -Append
}
Write-Host " "
Write-Host "----------------------------------------------------------"
Write-Host " "
}
View the successful Office 365 admins that were removed
If the Unified Audit Log was enabled successfully, the newly created Office 365 admin will be automatically removed. You can see the results of this in the RemovedAdmins CSV.
See the remaining Office 365 admins that couldn’t enable the Unified Audit Log
If the Unified Audit Log couldn’t be enabled, the Office 365 admin will remain unchanged. If you like, you can use the RemainingAdmins CSV in place of the CreatedAdmins CSV and rerun the second script. In our case, some tenants that couldn’t be enabled on the first try were able to be enabled on the second and third tries.
PowerShell Script Three: Removing unsuccessful admins
Any tenants that weren’t able to have their Unified Audit Log enabled via PowerShell will still have the temporary Office 365 admin active. This script will import these admins from the RemainingAdminsCsv and remove them.
Once removed, it will add them to the RemovedAdmins CSV. You can compare this to the CreatedAdmins CSV from the first script to make sure they’re all gone.
<# This script removes the temporary admin users that remain after the second script. It imports the RemainingAdmins CSV, deletes each listed admin from the customer's tenant, and logs each removal to the RemovedAdmins CSV. #>
#-------------------------------------------------------------
# Here are some things you can modify:
# This is your partner admin user name that has delegated administration permission
$UserName = "[email protected]"
# This CSV contains a list of all remaining unsuccessful admins left unchanged by the second script.
$RemainingAdmins = import-csv "C:\temp\RemainingAdmins.csv"
# This CSV will contain a list of all admins removed by this script.
$RemovedAdminsCsv = "C:\temp\RemovedAdmins.csv"
#-------------------------------------------------------------
$Cred = get-credential -Credential $UserName
Connect-MsolService -Credential $cred
ForEach ($Admin in $RemainingAdmins) {
$tenantID = $Admin.Tenantid
$upn = $Admin.UserPrincipalName
Write-Output "Deleting user: $upn"
Remove-MsolUser -UserPrincipalName $upn -TenantId $tenantID -Force
$AdminProperties = @{
TenantId = $tenantID
CompanyName = $Admin.CompanyName
DefaultDomainName = $Admin.DefaultDomainName
UserPrincipalName = $upn
Action = "REMOVED"
}
$RemovedAdmins = @()
$RemovedAdmins += New-Object psobject -Property $AdminProperties
$RemovedAdmins | Select-Object TenantId,CompanyName,DefaultDomainName,UserPrincipalName,Action | Export-Csv -notypeinformation -Path $RemovedAdminsCsv -Append
}
Want to see all the current Office 365 global administrators in your customers’ tenants?
To confirm that all of the created admins from these scripts have been removed, or just to see which global administrators have access to your customer tenants, you can run the scripts here. If required, there’s a second script that will block the credentials of the admins that you leave in the exported CSV.
Elliot Munro | 21 March 2017 | Enabling the Unified Audit Log on all delegated Office 365 tenants via PowerShell
A common requirement for our customers is forwarding emails to SharePoint Online lists. This email data usually comes from website forms or enquiry pages, though there’s no out-of-the-box way to extract the form data from an email and upload it to separate columns in a SharePoint list.
Previously I was using Cloud2050 Email Sync, though it relied on software installed on a PC, and only worked while that PC was running and Outlook was open.
Here’s a solution that operates completely in the cloud using Outlook Rules, MailParser.io and Microsoft Azure Logic Apps.
The solution looks like this:
Office 365 forwards email from your website’s form to your mailparser.io address via an Outlook Rule or Exchange Transport Rule.
MailParser.io receives the email, extracts the form data and sends it to an Azure logic app using a Generic HTTP Webhook.
Your Azure Logic App receives the form data, connects to SharePoint Online and adds the form data into the appropriate SharePoint list columns.
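As a hedged example of the first step using an Exchange transport rule (the rule name, subject words and destination address are placeholders; your actual MailParser inbox address comes from the setup section below), run in Exchange Online PowerShell:

```powershell
# Send a copy of matching form emails to the MailParser inbox,
# while still delivering the original to its recipient.
New-TransportRule -Name "Copy web form emails to MailParser" `
    -SubjectContainsWords "Website Enquiry" `
    -BlindCopyTo "[email protected]"
```

Using BlindCopyTo rather than a redirect means the original recipient still receives the email as normal.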
Prerequisites:
Sign up for MailParser.io – a free 30 day trial is available
Sign up for Microsoft Azure – use your Office 365 account, a free 30 day trial is available
A SharePoint List set up with the fields required for your form
Setting up MailParser
Once you’ve signed up for mailparser.io, sign in and click Create New Inbox
Give it a name and add some notes:
You’ll be given an email address to forward your form emails to. Keep track of this address, as you’ll need it to receive the emails you send from Outlook or Exchange mail rules. Forward a couple of sample form emails to the address to get started.
Once your emails are received, you can set up your Parsing Rules:
Usually, MailParser will be able to automatically identify the field names and values from your forwarded email. If it doesn’t, click Try Something Else to give it some help; otherwise click OK, start with this.
Now we start setting up our Generic Webhook. Click Webhook Integrations on the left menu, then click Add New Integration.
Click Generic Webhook.
Give it a descriptive name and type in a sample URL (I used http://google.com) into the Target URL field. We need to use a sample first so that we can copy the webhook’s JSON payload. We then use this JSON payload to help generate the actual TargetURL from Azure Logic Apps in the next steps.
Next, click Save and test.
Then Send test data. We expect this to fail, though it will give us the JSON payload.
Copy the text from Body Payload into Notepad or Visual Studio Code.
Set up the Azure Logic App
Log onto Azure at portal.azure.com. If you don’t already have a subscription, you can sign up using your Office 365 account.
Click New, search for Logic App, and click Logic App
Click Create
Complete the fields, placing the Azure Logic App in the region of your choice. You can name the Resource group whatever you like, or use an existing one. Click Create.
Click Edit to start editing your logic app.
Search for Request and click the Request Trigger
Now you can use your copied JSON Body Payload from MailParser.io as a reference for your Request Body JSON Schema. You’ll need to define the data type for each key-value pair in your JSON payload. This allows you to use the separate fields in your Azure Logic App and add the field data into the appropriate SharePoint columns. The syntax of the Request Body JSON Schema is as follows:
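A minimal sketch of such a schema (the property names here are hypothetical stand-ins for your own form fields):

```json
{
    "type": "object",
    "properties": {
        "name":    { "type": "string" },
        "email":   { "type": "string" },
        "message": { "type": "string" }
    },
    "required": [ "name", "email", "message" ]
}
```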
You can use Visual Studio Code, Notepad++ or Notepad to edit this schema so that it describes your JSON Payload.
Replace the properties values with the name of the keys in your JSON payload. Not all fields need to be added to the required array, only the ones that you need to create a valid SharePoint list entry.
In my case, this JSON body becomes the following JSON Schema.
Paste the Schema into the Request Body Schema and click Save.
You will then receive the URL that you can use in Mailparser.io to send your requests:
Next click + New step.
Type SharePoint and click SharePoint – Create item.
You may need to add a Connection to SharePoint Online. If you’re prompted, add a connection using an Office 365 account that has permission to write to the required SharePoint list. If you don’t have a SharePoint list available to accept the data, you’ll need to set one up now before proceeding.
Next enter your site URL. The List Name drop down will be populated with the available lists. You should also see that the Outputs from the Request step are available to use.
The list columns that can accept strings, as well as a few other column types will be available for you to modify. Click in each relevant column and select the relevant output.
Once you’re finished, go back to the Request Step in your Logic App and copy the URL from the Request step
Return to MailParser.io, go back to Webhook integrations, and click Edit.
Paste the URL from your Logic App Request step into the Target URL.
Click Save and test.
Click Send test data.
You should receive a response code of 202 to confirm it was sent successfully.
You can now check Azure Logic Apps to confirm that it ran correctly.
You should also see the new entry in your SharePoint Online list.
Setting up the Outlook Rule
Once you’ve confirmed it’s working, you can set up your mail rules in Outlook or via Exchange to automatically forward emails to your mailparser.io email address.
Right click on an email sent via your web form. Click Rules, then Create rule.
Choose a condition that matches all emails sent via your form, e.g. the Subject. Then click Advanced Options…
Click Next.
Tick forward it to people or public group, then click people or public group.
Enter the email address from Mailparser.io, click OK, then click Next twice.
Turn on the rule, and choose whether you want to run it on mail already in the same folder.
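If you’d prefer to set this up centrally via Exchange rather than in the Outlook client, the same forwarding behaviour can be sketched as an Exchange Online transport rule in PowerShell. The subject filter and Mailparser.io address below are placeholders for your own values:

```powershell
# Connect to Exchange Online first (e.g. via Connect-ExchangeOnline).
# Blind-copy any mail matching the web form's subject line to Mailparser.io,
# leaving the original message in the recipient's mailbox untouched.
New-TransportRule -Name "Forward web form mail to Mailparser" `
    -SubjectContainsWords "Website Enquiry" `
    -BlindCopyTo "yourinbox@mailparser.io"
```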
And that’s it. From now on, any mail sent by your website’s form will be automatically forwarded into mailparser.io, broken up into the relevant fields, and added to SharePoint Online. You can also use Azure Logic Apps to automate a bunch of other business processes. Check out the documentation here.
In the past few months we’ve put together a few Data Rooms for our clients – virtual places where internal and external parties can be invited to view confidential, or commercially sensitive data. One of the features of Office 365 that’s allowed us to protect these files is Information Rights Management.
What is Information Rights Management?
Information Rights Management is a feature that is only available on the Enterprise Office 365 plans. It allows you to control the security of your data, and prevent users from printing or marking up your documents, or even from accessing them after a specified period of time. When used in conjunction with the existing security features of SharePoint Online or OneDrive for Business, Information Rights Management can be used to lock down your important and sensitive data quite comprehensively.
Who can use it?
In order to get this set up, you must be using an Office 365 Enterprise plan, like Office 365 E1, Office 365 E3, Office 365 E4 or Office 365 E5.
Office 365 Small Business, Small Business Premium, Midsize Business, Business Essentials, Business or Business Premium miss out on the extra security measures available in Information Rights Management.
How to set up Information Rights Management in Office 365
Information Rights Management is not enabled by default; it needs to be manually configured. Here’s how it’s done.
If you’re not already in the Office 365 Admin center, click the App Launcher on the top left, then click the Admin tile.
Under Service Settings, click Rights Management.
Click Manage, under Protect your information.
Click Activate to activate Rights Management for your organisation.
Click Activate again.
Wait for Rights Management to activate.
Navigate to the SharePoint admin center by clicking SharePoint under Admin on the left menu.
Click Settings in the SharePoint admin center.
Click Use the IRM service specified in your configuration, then Refresh your settings.
Wait for the settings to be applied.
Navigate to the SharePoint Document Library that you want to apply Information Rights Management to, then click the Library tab at top, followed by Library Settings.
Under Permissions and Management, choose Information Rights Management.
Tick the box to restrict permissions on this library on download. Then, give your Permission Policy a name and a description that you’d like to appear for the users. Choose the settings of the policy for this document library.
Any Office documents uploaded to this document library will have the IRM policy added when they are opened or downloaded. If you would prefer that users cannot download documents, and can only view them in the browser, give them View Only permission in SharePoint Online.
A note about Information Rights Management and PDF files
In our experience, Information Rights Management works best when working with Office documents, because they can be securely opened in the browser and don’t have to be downloaded to the computer. PDF documents can also have IRM policies applied to them, though since there is no way to open these in the browser, they must be downloaded and opened with a PDF reader that supports IRM. There are a couple of options for this, though the experience may not be ideal for the user who is accessing the data, since they may have to install a supported PDF reader.
For the best user experience, you may decide to secure PDF files using the dedicated PDF security features. You can still store these PDFs on SharePoint Online or OneDrive, though it is best to keep these in a separate document library without IRM applied.
Alternatively, you can convert your PDF documents to Word Documents, for consistent IRM policies and security for all of your files.
Elliot Munro, 2016-02-27: Use Information Rights Management to protect data on Office 365
When switching from Google Apps/Google for Work to Office 365, you’ll usually want to migrate your Google Drive files as well as your mail.
There are a few online tools that will do this for free, or at a cost, with varying degrees of functionality. I came across this handy article that goes into more detail on these methods.
The method that stuck out to me was the new SharePoint Online Migration API from Microsoft: a free, PowerShell-driven process. Microsoft released an IT User Guide on the steps required while it was in preview. This is the document I used, and it can be downloaded here.
I used a Microsoft Azure virtual machine to do the initial download of the Google Drive directory – about 150 GB of data. It downloaded incredibly fast on the Azure VM’s connection and completed in a couple of hours. I just used the Google Drive sync tool for this, though you can also use Google’s Takeout tool if you need to convert your Google Docs/Sheets/Slides to their Microsoft Office equivalents.
Once downloaded, I installed the updated SharePoint Online Management Shell, and followed the instructions in the provided Word documents above.
The next step was to create an Azure storage account, create the folders for the migration packages (a bunch of XML manifests outlining what needs to be migrated) and start the upload of the data.
I got an error during one of the first PowerShell commands that read ‘New-SPOMigrationPackage : The server could not be contacted‘. Following the instructions in this blog post, I added the -NoAdLookup switch to the initial command to resolve it.
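As a sketch, the package-creation command with the -NoAdLookup switch looks something like this (the local paths, site URL and library name are placeholders for your own values):

```powershell
# Create the migration package manifests for the downloaded Google Drive files.
# -NoAdLookup skips the Active Directory lookup that triggered the
# "server could not be contacted" error.
New-SPOMigrationPackage -SourceFilesPath "C:\GoogleDriveData" `
    -OutputPackagePath "C:\MigrationPackage" `
    -TargetWebUrl "https://yourtenant.sharepoint.com/sites/files" `
    -TargetDocumentLibraryPath "Documents" `
    -NoAdLookup
```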
Next, we run the Set-SPOMigrationPackageAzureSource cmdlet to upload the data from the Azure VM to the Azure Storage account.
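A rough sketch of that upload step, assuming placeholder storage account details:

```powershell
# Upload the source files and package manifests to Azure Storage,
# capturing the returned Azure locations for use by the migration job.
$azureLocations = Set-SPOMigrationPackageAzureSource `
    -SourceFilesPath "C:\GoogleDriveData" `
    -SourcePackagePath "C:\MigrationPackage" `
    -AccountName "yourstorageaccount" `
    -AccountKey "<storage account key>"
```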
Once the data and the migration package are uploaded, the migration can be kicked off via PowerShell. The data is then moved from Azure to OneDrive for Business/SharePoint Online. You can check the status of the migration using the Microsoft Azure Storage Explorer, or just keep an eye on the library that you’re migrating to.
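Kicking off the migration can be sketched along these lines (the credentials and target URL are placeholders, and $azureLocations is the object returned by Set-SPOMigrationPackageAzureSource):

```powershell
# Queue the migration job; SharePoint Online then pulls the content
# from Azure Storage into the target site.
$creds = Get-Credential
Submit-SPOMigrationJob -TargetWebUrl "https://yourtenant.sharepoint.com/sites/files" `
    -MigrationPackageAzureLocations $azureLocations `
    -Credentials $creds
```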
The coolest thing about this method is that it avoids the upload throttling of the ‘Open with Explorer’ method, and the syncing issues of the OneDrive for Business sync client. Best of all, it preserves the date modified metadata of the original files.
Elliot Munro, 2015-10-18: Migrating from Google Drive to OneDrive for Business and SharePoint Online