Getting Creative: a Bespoke Solution for Feature Update Deployments

This is the first blog post in what I hope will be a series of posts demonstrating several custom solutions I created for things such as feature update deployments, managing local admin password rotation, provisioning Windows 10 devices, managing drive mappings and more. My reasons for creating these solutions were to overcome some of the current limitations in existing products or processes, make things more cloud-first and independent of existing on-prem infrastructure where possible, and to more exactly meet the requirements of the business.

Although I will try to provide a generalised version of the source code where possible, I am not providing complete solutions that you can go ahead and use as is. Rather my intention is to inspire your own creativity, to give working examples of what could be done if you have the time and resource, and to provide source code as a reference or starting point for your own solutions should you wish to create them!

Someone asked me recently how we deploy feature updates and it was a difficult question to answer other than to say we use a custom-built process. Having used some of the existing methods available (ConfigMgr Software Updates, ConfigMgr custom WaaS process, Windows Update for Business) we concluded there were shortcomings in each of them, and this provided inspiration to create our own, customized process to give us the control, reliability, user experience and reporting capability that we desired. Don’t get me wrong – I am not saying these methods aren’t good – they just couldn’t do things exactly the way we wanted.

So I set out to create a bespoke process – one that we could customize according to our needs, that was largely independent of our existing Configuration Manager infrastructure and that could run on any device with internet access. This required making use of cloud services in Azure as well as a lot of custom scripting! In this blog, I’ll try to cover what I did and how it works.

User Experience

First, let’s look at the end user experience of the feature update installation process – this was key for us: we wanted to improve the user experience by keeping it simple yet informative, and able to respond appropriately to any upgrade issues.

Once the update is available to a device, a toast notification is displayed notifying the user that an update is available. Initially, this displays once a day and automatically dismisses after 25 seconds. (I’ve blanked out our corporate branding in all these images)

We use a soft deadline – i.e. the update is never forced on the user. Enforcing compliance is handled by user communications and involvement from our local technicians. With one week left before the deadline, we increase the frequency of the notifications to twice per day.

Once the deadline has passed, we take a more aggressive approach with the notifications, modifying the image and text and displaying them every 30 minutes; they stay on the screen until the user actions or dismisses them.

The update can be installed via a shortcut on the desktop, or in the last notification it can be initiated from the notification itself.

Once triggered, a custom UI is displayed introducing the user to the update and what to expect.

When the user clicks Begin, we check that a power adapter is connected and no removable USB devices are attached – if any are, we prompt the user to remove them first.

The update runs in three phases or stages – these correspond to the PreDownload, Install and Finalize commands on the update (more on that later). The progress of each stage is polled from the registry, as is the Setup Phase and Setup SubPhase.
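
For illustration, here is a minimal sketch of that polling loop. The registry path and value names are assumptions based on what Windows setup is commonly observed to write during an upgrade, not necessarily the exact keys our solution reads:

# Hedged sketch: poll upgrade progress from the volatile setup registry key.
# The path and value names (SetupProgress, SetupPhase, SetupSubPhase) are assumptions.
$Key = 'HKLM:\SYSTEM\Setup\MoSetup\Volatile'
Do {
    $Volatile = Get-ItemProperty -Path $Key -ErrorAction SilentlyContinue
    If ($Volatile) {
        Write-Host ("{0}% | Phase: {1} | SubPhase: {2}" -f $Volatile.SetupProgress,$Volatile.SetupPhase,$Volatile.SetupSubPhase)
    }
    Start-Sleep -Seconds 5
} Until ($Volatile.SetupProgress -eq 100)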

Note that the user cannot cancel the update once it starts and this window will remain on the screen and on top of all other windows until the update is complete. The user can click the Hide me button, and this will shrink the window like so:

This little window also cannot be removed from the screen, but it can be moved around and is small enough to be unobtrusive. When the update has finished installing, or when the user clicks Restore, the main window will automatically display again and report the result of the update.

The colour scheme is based on Google’s Material Design, by the way.

If the update failed during the online phase, the user can still initiate the update from the desktop shortcut but toast notifications will no longer display as reminders. The idea is that IT can attempt to remediate the device and run the update again after.

If successful, the user can click Restart to restart the computer immediately. Then the offline phase of the upgrade runs, where you see the usual light blue screen and white Windows update text reporting that updates are being installed.

Once complete, the user will be brought back to the login screen, and we won’t bother them anymore.

If the update rolled back during the offline phase, we will detect this next time they log in and notify them one time:

Logging and Reporting

The entire update process is logged right from the start to a log file on the local machine. We also send ‘status messages’ at key points during the process and these find their way to an Azure SQL database which becomes the source for reporting on update progress across the estate (more on this later).

A PowerBI report gives visual indicators of update progress as well as a good amount of detail from each machine, including update status, whether it passed or failed readiness checks (and if it failed, why), whether it passed the compatibility assessment, the error code if the assessment or the install failed, whether any hard blocks were found, SetupDiag results (2004 onward), how long the update took to install and a bunch of other stuff we find useful.

Since 2004 though, we have started inventorying certain registry keys using ConfigMgr to give us visibility of devices that won’t upgrade because of a Safeguard hold or other reason, so we can target the upgrade only to devices that aren’t reporting any known compatibility issues.
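
As a rough illustration, the appraiser data we inventory lives under the key below. The value names (UpgEx, RedReason, GatedBlockId) come from community documentation of the compatibility appraiser and may differ between target releases:

# Hedged sketch: read the upgrade experience indicators the appraiser writes per target release.
$Base = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\TargetVersionUpgradeExperienceIndicators'
Get-ChildItem -Path $Base -ErrorAction SilentlyContinue | ForEach-Object {
    $Indicators = Get-ItemProperty -Path $_.PSPath
    [PSCustomObject]@{
        TargetRelease = $_.PSChildName
        UpgEx         = $Indicators.UpgEx        # e.g. Green / Red
        RedReason     = $Indicators.RedReason
        GatedBlockId  = $Indicators.GatedBlockId # Safeguard hold ID, if any
    }
}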

If a device performs a rollback, we can get it to upload key logs and registry key dumps to an Azure storage account where an administrator can remotely diagnose the issue.

How does it work?

Now let’s dive into the process in more technical detail.

Deployment Script

The update starts life with a simple PowerShell script that does the following:

  • Creates a local directory to use to cache content, scripts and logs etc
  • Temporarily stores some domain credentials in the registry of the local SYSTEM account as encrypted secure strings, for accessing content from a ConfigMgr distribution point if necessary (more on this later; see the sketch after this list)
  • Downloads a manifest file that contains a list of all files and file versions that need to be downloaded to run the update. These include scripts, DLLs (for the UI), XML definition files for scheduled tasks, etc.
  • Each file is then downloaded to the cache directory from an Azure CDN
  • Three scheduled tasks are then registered on the client:
    • A ‘preparer’ task which runs prerequisite actions
    • A ‘file updater’ task which keeps local files up-to-date in case we wish to change something
    • A ‘content cleanup’ task which is responsible for cleaning up in the event the device gets upgraded through any means
  • A ‘status message’ is then sent as an http request, creating a new record for the device in the Azure SQL database
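
To illustrate the credential stashing mentioned above, here is a hedged sketch. ConvertFrom-SecureString encrypts with DPAPI, so the value can only be decrypted by the same account (SYSTEM) on the same machine; the key path and value names are illustrative, and the plain-text password would come from the deployment parameters rather than being hard-coded:

# Hedged sketch: stash credentials (DPAPI-encrypted, SYSTEM scope) for later tasks.
$Key = 'HKLM:\SOFTWARE\MyCompany\FeatureUpdate'   # illustrative path
New-Item -Path $Key -Force | Out-Null
$Secure = ConvertTo-SecureString $PlainPassword -AsPlainText -Force
Set-ItemProperty -Path $Key -Name 'DPUser' -Value 'CONTOSO\svc-dpaccess'
Set-ItemProperty -Path $Key -Name 'DPPass' -Value ($Secure | ConvertFrom-SecureString)

# Later, in another script running as SYSTEM on the same machine:
$Reg = Get-ItemProperty -Path $Key
$Cred = [PSCredential]::new($Reg.DPUser,($Reg.DPPass | ConvertTo-SecureString))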

This script can be deployed through any method you wish, including Configuration Manager, Intune or just manually; however, it should be run in the SYSTEM context.

Content

All content needed for the update process to run is put into a container in a storage account in Azure. This storage account is exposed via an Azure Content Delivery Network (CDN). This means that clients can get all the content they need directly from an internet location with minimal latency no matter where they are in the world.

Feature Update Files

The files for the feature update itself are the ESD file and WindowsUpdateBox.exe that Windows Update uses. You can get these files from Windows Update, WSUS, or as in our case, from Configuration Manager via WSUS. We simply download the feature updates to a deployment package in ConfigMgr and grab the content from there.

You could of course use an ISO image and run setup.exe, but the ESD files are somewhat smaller in size and are sufficient for purpose.

The ESD files are put into the Azure CDN so the client can download them from there, but we also allow the client the option to get the FU content from a local ConfigMgr distribution point if they are connected to the corporate network locally. Having this option allows considerably quicker content download. Since IIS on the distribution points is not open to anonymous authentication, we use the domain credentials stamped to the registry to access the DP and download the content directly from IIS (credentials are cleaned from the registry after use).
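
A hedged sketch of that DP download path, assuming $Cred was reconstructed from the registry as shown earlier; the package URL under SMS_DP_SMSPKG$ is illustrative:

# Hedged sketch: pull the ESD from a ConfigMgr DP over IIS with Windows authentication.
$Source = 'https://dp01.contoso.com/SMS_DP_SMSPKG$/ABC00123/install.esd'   # illustrative
Start-BitsTransfer -Source $Source -Destination "$CacheDir\install.esd" -Credential $Cred -Authentication Negotiate -ErrorAction Stop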

Status Messages

Similar to how ConfigMgr sends status messages to a management point, this solution also sends status messages at key points during the process. This works by using Azure Event Grid to receive the message sent from the client as an http request. The Event Grid sends the message to an Azure Function, and the Azure Function is responsible for updating the Azure SQL database created for this purpose with the current upgrade status of the device. The reason for doing it this way is that sending an http request to Event Grid is very quick and doesn’t hold up the process. Event Grid forwards the message to the Azure Function and can retry the message in case it can’t get through immediately (although I’ve never experienced any failures or dead-lettering in practice). The Azure Function uses a Managed Identity to access the SQL database, which means the SQL database never needs to be exposed outside of its sandbox in Azure, and no credentials are needed to update the database.
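
For illustration, a status message could be posted to an Event Grid custom topic like this. The topic endpoint, key and data properties are placeholders, but the envelope (an array of events with id, eventType, subject, eventTime, data and dataVersion, authenticated with the aeg-sas-key header) is the standard Event Grid schema:

# Hedged sketch: send a 'status message' to an Event Grid custom topic.
$TopicEndpoint = 'https://fu-status.westeurope-1.eventgrid.azure.net/api/events'  # illustrative
$TopicKey = '<topic access key>'
$Events = @(
    @{
        id          = [guid]::NewGuid().ToString()
        eventType   = 'FeatureUpdate.StatusMessage'
        subject     = $env:COMPUTERNAME
        eventTime   = (Get-Date).ToUniversalTime().ToString('o')
        dataVersion = '1.0'
        data        = @{ ComputerName = $env:COMPUTERNAME; Status = 'PreDownload complete' }
    }
)
Invoke-RestMethod -Uri $TopicEndpoint -Method Post -ContentType 'application/json' -Headers @{ 'aeg-sas-key' = $TopicKey } -Body (ConvertTo-Json -InputObject $Events -Depth 3)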

We then use PowerBI to report on the data in the database to give visibility of where in the process every device is, whether there are any issues that need addressing, and all the stats that are useful for understanding things like whether devices get content from Azure or a local DP, what their approximate bandwidth is, how long downloads took, whether they were wired or wireless, make and model, upgrade time, etc.

Preparation Script

After the initial deployment script has run, the entire upgrade process is driven by scheduled tasks on the client. The first task to run is the Preparation script and this attempts to run every hour until successful completion. This script does the following things:

  • Create the registry keys for the upgrade. These keys are stamped with the update progress and the results of the various actions such as pre-req checks, downloads etc. When we send a ‘status message’ we simply read these keys and send them on. Having progress stamped in the local registry is useful if we need to troubleshoot on the device directly.
  • Run readiness checks, such as
    • Checking for client OS
    • Checking disk space
  • Check for internet connectivity
  • Determine the approximate bandwidth to the Azure CDN and measure latency. This is done by downloading a 100 MB file from the CDN and timing the download, and by using psping.exe to measure latency. From this, we can calculate an approximate download time for the main ESD file.
  • Determine if the device is connected by wire or wireless
  • Determine if the device is connected to the corporate network
  • If the device is on the corporate network, we check latency to all the ConfigMgr distribution points to determine which one will be the best DP to get content from
  • Determine whether OS is business or consumer and which language. This helps us figure out which ESD file to use.
  • Download WindowsUpdateBox.exe and verify the hash
  • Download the feature update ESD file and verify the hash
    • Downloads of FU content are done using BITS transfers as this proved the most reliable method. Code is added to handle BITS transfer errors for resilience (a simplified sketch follows this list).
  • Assuming all the above is done successfully, the Preparation task will be disabled and the PreDownload task created.
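
The resilient BITS download mentioned above could look something like this simplified sketch, assuming $EsdUrl, $CacheDir and $ExpectedHash are set earlier in the script:

# Simplified sketch: BITS download with basic retry and hash verification.
$Attempt = 0
Do {
    $Attempt++
    Try {
        Start-BitsTransfer -Source $EsdUrl -Destination "$CacheDir\install.esd" -Priority Foreground -ErrorAction Stop
        $Success = $true
    }
    Catch {
        $Success = $false
        Write-Warning "BITS attempt $Attempt failed: $($_.Exception.Message)"
        Start-Sleep -Seconds (60 * $Attempt)   # simple linear back-off
    }
} Until ($Success -or $Attempt -ge 5)
If ((Get-FileHash -Path "$CacheDir\install.esd" -Algorithm SHA256).Hash -ne $ExpectedHash) {
    Throw "Hash mismatch on downloaded ESD"
}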

PreDownload Script

The purpose of this script is to run the equivalent of a compatibility assessment. When using the ESD file, this is done with the /PreDownload switch on WindowsUpdateBox.exe. Should the PreDownload fail, the error code will be logged to the registry. Since 2004, we also read the SetupDiag results and stamp these to the registry. We also check the Compat*.xml files to look for any hard blocks and if found, we log the details to the registry.

If the PreDownload fails, we change the schedule of the task to run twice a week. This allows for remediation to be performed on the device before attempting the PreDownload assessment again.

If the PreDownload succeeds, we disable the PreDownload task and create two new ones – a Notification task and an Upgrade task.

We also create a desktop shortcut that the user can use to initiate the upgrade.

Notification Script

The Notification script runs in the user context and displays toast notifications to notify the user that the upgrade is available, what the deadline is and how to upgrade, as already mentioned.
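
The notification mechanics aren’t shown in full here, but as a bare-bones hedged sketch, a toast can be raised from a user-context script like this (our real notifications add branding, images and action buttons):

# Hedged sketch: minimal toast notification from a user-context PowerShell script.
$null = [Windows.UI.Notifications.ToastNotificationManager, Windows.UI.Notifications, ContentType = WindowsRuntime]
$null = [Windows.UI.Notifications.ToastNotification, Windows.UI.Notifications, ContentType = WindowsRuntime]
$null = [Windows.Data.Xml.Dom.XmlDocument, Windows.Data.Xml.Dom.XmlDocument, ContentType = WindowsRuntime]
# AppId (AUMID) of Windows PowerShell, so the toast has a valid sender
$AppId = '{1AC14E77-02E7-4E5D-B744-2EB1AE5198B7}\WindowsPowerShell\v1.0\powershell.exe'
$ToastXml = @"
<toast duration="short">
  <visual>
    <binding template="ToastGeneric">
      <text>A feature update to Windows 10 is available</text>
      <text>Please install it before the deadline. Use the desktop shortcut to begin.</text>
    </binding>
  </visual>
</toast>
"@
$Xml = New-Object Windows.Data.Xml.Dom.XmlDocument
$Xml.LoadXml($ToastXml)
[Windows.UI.Notifications.ToastNotificationManager]::CreateToastNotifier($AppId).Show([Windows.UI.Notifications.ToastNotification]::new($Xml))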

Upgrade Script

When the user clicks the desktop shortcut or the ‘Install now’ button on the toast notification, the upgrade is initiated. Because the upgrade needs to run with administrative privilege, the only thing the desktop shortcut and the toast notification button do is create an entry in the Application event log. The upgrade scheduled task is triggered when this event is created and the task runs in SYSTEM context. The UI is displayed in the user session with the help of the handy ServiceUI.exe from the MDT toolkit.
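
For illustration, the handoff could look like the hedged sketch below. The event source, ID and message are illustrative (the real ones aren’t shown here); the source would be registered once by the deployment script, and the Upgrade task’s definition XML contains an event trigger subscription matching the event:

# Hedged sketch: the user-side action just writes an Application event...
$Source = 'FeatureUpdate'   # illustrative; registered beforehand with New-EventLog (needs admin)
[System.Diagnostics.EventLog]::WriteEntry($Source,'User initiated the feature update','Information',1000)

# ...and the SYSTEM scheduled task's definition XML subscribes to that event, e.g.:
$Subscription = @"
<QueryList>
  <Query Id="0" Path="Application">
    <Select Path="Application">*[System[Provider[@Name='FeatureUpdate'] and EventID=1000]]</Select>
  </Query>
</QueryList>
"@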

Upgrade UI

The user interface part of the upgrade is essentially a WPF application coded in PowerShell. The UI displays some basic upgrade information for the user and once they click ‘Begin’ we run the upgrade in three stages (sketched after the list):

  1. PreDownload. Even though we ran this already, we run it again before installing just to make sure nothing has changed since; it doesn’t take long to run.
  2. Install. This uses the /Install switch on WindowsUpdateBox.exe and runs the main part of the online phase of the upgrade.
  3. Finalize. This uses the /Finalize switch and finalizes the update in preparation for a computer restart.
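
Here is a hedged sketch of that staged run. The switch names are as described above, but any other arguments WindowsUpdateBox.exe needs, and the real error handling and registry stamping, are omitted:

# Hedged sketch: run the three upgrade stages in order, stopping on failure.
$Box = "$CacheDir\WindowsUpdateBox.exe"
Foreach ($Stage in 'PreDownload','Install','Finalize') {
    $Process = Start-Process -FilePath $Box -ArgumentList "/$Stage" -Wait -PassThru -NoNewWindow
    If ($Process.ExitCode -ne 0) {
        # Stamp the failure to the registry / UI and stop here
        Write-Error ("Stage {0} failed with exit code 0x{1:X8}" -f $Stage,$Process.ExitCode)
        Break
    }
}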

The progress of each of these phases is tracked in the registry and displayed in the UI using progress bars. If there is an issue, we notify the user and IT can get involved to remediate.

If successful, the user can restart the computer immediately or at a later point (though we discourage the latter!). We don’t stop the user from working while the upgrade is running in the online phase and we allow them to partially hide the upgrade window so the upgrade does not hinder user productivity (similar to how WUfB installs an update in the background).

After the user restarts the computer, the usual Windows Update screens take over until the update has installed and the user is brought to the login screen again.

Drivers and Stuff

We had considered upgrading drivers and even apps with this process, as we did for the 1903 upgrade, however user experience was important for us and we didn’t want the upgrade to take any longer than necessary, so we decided not to chain anything onto the upgrade process itself but handle other things separately. That being said, because this is a custom solution it is perfectly possible to incorporate additional activities into it if desired.

Rollback

In the event that the OS was rolled back during the offline phase, a scheduled task will run that will detect this and raise a toast notification to inform the user. We have a script that will gather logs and data from the device and upload it to a storage account in Azure where an administrator can remotely diagnose the issue. I plan to incorporate that as an automatic part of the process in a future version.
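
The upload itself can be as simple as a PUT per file against a blob container SAS URL; the container URL, token and log selection below are illustrative:

# Hedged sketch: upload rollback logs to an Azure blob container via a SAS token.
$ContainerUrl = 'https://mystorageaccount.blob.core.windows.net/rollbacklogs'   # illustrative
$SasToken = '<container SAS token>'
$Files = Get-ChildItem -Path "$env:windir\Panther" -Include 'setupact.log','setuperr.log' -Recurse -ErrorAction SilentlyContinue
Foreach ($File in $Files) {
    $BlobUri = "$ContainerUrl/$($env:COMPUTERNAME)/$($File.Name)$SasToken"
    Invoke-WebRequest -Uri $BlobUri -Method Put -InFile $File.FullName -Headers @{ 'x-ms-blob-type' = 'BlockBlob' } -UseBasicParsing
}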

Updater Script

The solution creates an Updater scheduled task which runs once per day. The purpose of this task is to keep the solution up to date. If we want to change something in the process, add some code to a file or whatever is necessary, the Updater will take care of this.

It works by downloading a manifest file from the Azure CDN. This file contains all the files used by the solution with their current versions. If we update something, we upload the new files to the Azure storage account, purge them from the CDN and update the manifest file.

The Updater script will download the current manifest, detect that something has changed and download the required files to the device.
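
As a sketch of that logic (using file hashes rather than version numbers for simplicity, and assuming a CSV manifest of FileName,SHA256 pairs on the CDN):

# Hedged sketch: compare local files against the published manifest and refresh changed ones.
$CdnBase = 'https://myendpoint.azureedge.net/fucontent'   # illustrative
$Manifest = (Invoke-RestMethod -Uri "$CdnBase/manifest.csv") | ConvertFrom-Csv
Foreach ($Entry in $Manifest) {
    $LocalFile = Join-Path $CacheDir $Entry.FileName
    $LocalHash = If (Test-Path $LocalFile) { (Get-FileHash -Path $LocalFile -Algorithm SHA256).Hash }
    If ($LocalHash -ne $Entry.SHA256) {
        # New or changed file - download the current version from the CDN
        Start-BitsTransfer -Source "$CdnBase/$($Entry.FileName)" -Destination $LocalFile
    }
}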

Cleanup Script

A Cleanup task is also created. When this task detects that the OS has been upgraded to the required version, it will remove all the scheduled tasks and cached content to leave no footprint on the device other than the log file and the registry keys.

Source Files

You can find a generalised version of the code used in this solution in my GitHub repo as a reference. As mentioned before though, there are many working parts to the solution, including the Azure services, and I haven’t documented their configuration here.

Final Comments

The main benefit of this solution for us is that it is completely customised to our needs. Although it is relatively complex to create, it is also relatively easy to maintain and to adapt for new W10 versions. We do still take advantage of products like ConfigMgr to allow devices to get content from a local DP if they are corporate connected, ConfigMgr / Update Compliance / Desktop Analytics for helping us determine device compatibility, and ConfigMgr or Intune to actually get the deployment script to the device. We also make good use of Azure services for the status messages and the cloud database, as well as PowerBI for reporting. So the solution still utilizes existing Microsoft products while giving us the control and customisations that we need to provide a better upgrade experience for our users.

Get a daily admin Audit Report for MEM / Intune

In an environment where you have multiple admin users it’s useful to audit admin activities so everyone can be aware of changes that others have made. I do this for Endpoint Configuration Manager with a daily email report built from admin status messages, so I decided to create something similar for Intune / MEM.

Admin actions are already audited for you in MEM (Tenant Administration > Audit logs) so it’s simply a case of getting that data into an email report. You can do this with Graph (which gives you more data actually) but I decided to use Log Analytics for this instead.

You need a Log Analytics workspace, and you need to configure Diagnostics settings in the MEM portal to send AuditLogs to the workspace.

Then, to automate sending a daily report, create a service principal in Azure AD with just the permissions necessary to read data from the Log Analytics workspace. You can do this easily from the Azure portal using CloudShell. In the example below, I’m creating a new service principal with the role “Log Analytics Reader” scoped just to the Log Analytics workspace where the AuditLogs are sent to.

$DisplayName = "MEM-Reporting"
$Role = "Log Analytics Reader"
$Scope = "/subscriptions/<subscriptionId>/resourcegroups/<resourcegroupname>/providers/microsoft.operationalinsights/workspaces/<workspacename>"

$sp = New-AzADServicePrincipal -DisplayName $DisplayName -Role $Role -Scope $Scope

With the service principal created, you’ll need to make a note of the ApplicationId:

$sp.ApplicationId

And the secret:

$SP.Secret | ConvertFrom-SecureString -AsPlainText

Of course, if you prefer you can use certificate authentication instead of using the secret key.

Below is a PowerShell script that uses the Az PowerShell module to connect to the log analytics workspace as the service principal, query the IntuneAuditLogs for entries in the last 24 hours, then send them in an HTML email report. Run it with your favourite automation tool.

You’ll need the app Id and secret from the service principal, your tenant Id, your log analytics workspace Id, and don’t forget to update the email parameters.

Sample email report
# Script to send a daily audit report for admin activities in MEM/Intune
# Requirements:
# - Log Analytics Workspace
# - Intune Audit Logs saved to workspace
# - Service Principal with 'Log Analytics Reader' role in workspace
# - Azure Az PowerShell modules

# Azure resource info
$ApplicationId = "abc73938-0000-0000-0000-9b01316a9123" # Service Principal Application Id
$Secret = "489j49r-0000-0000-0000-e2dc6451123" # Service Principal Secret
$TenantID = "abc894e7-00000-0000-0000-320d0334b123" # Tenant ID
$LAWorkspaceID = "abcc1e47-0000-0000-0000-b7ce2b2bb123" # Log Analytics Workspace ID
$Timespan = (New-TimeSpan -Hours 24)

# Email params
$EmailParams = @{
    To = 'trevor.jones@smsagent.blog'
    From = 'MEMReporting@smsagent.blog'
    Smtpserver = 'smsagent.mail.protection.outlook.com'
    Port = 25
    Subject = "MEM Audit Report | $(Get-Date -Format ddMMMyyyy)"
}

# Html CSS style
$Style = @"
<style>
table {
    border-collapse: collapse;
    font-family: sans-serif;
    font-size: 12px;
}
td, th {
    border: 1px solid #ddd;
    padding: 6px;
}
th {
    padding-top: 8px;
    padding-bottom: 8px;
    text-align: left;
    background-color: #3700B3;
    color: #03DAC6;
}
</style>
"@

# Connect to Azure with Service Principal
$Creds = [PSCredential]::new($ApplicationId,(ConvertTo-SecureString $Secret -AsPlainText -Force))
Connect-AzAccount -ServicePrincipal -Credential $Creds -Tenant $TenantID

# Run the Log Analytics Query
$Query = "IntuneAuditLogs | sort by TimeGenerated desc"
$Results = Invoke-AzOperationalInsightsQuery -WorkspaceId $LAWorkspaceID -Query $Query -Timespan $Timespan
$ResultsArray = [System.Linq.Enumerable]::ToArray($Results.Results)

# Convert the results to a datatable
$DataTable = New-Object System.Data.DataTable
$Columns = @("Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID")
foreach ($Column in $Columns)
{
    [void]$DataTable.Columns.Add($Column)
}
foreach ($Result in $ResultsArray)
{
    $Properties = $Result.Properties | ConvertFrom-Json
    [void]$DataTable.Rows.Add(
        $Properties.ActivityDate,
        $Result.Identity,
        $Properties.Actor.ApplicationName,
        $Result.OperationName,
        $Result.ResultType,
        $Properties.TargetDisplayNames[0],
        $Properties.TargetObjectIDs[0]
    )
}

# Send an email
If ($DataTable.Rows.Count -ge 1)
{
    $HTML = $DataTable |
        ConvertTo-Html -Property "Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID" -Head $Style -Body "<h2>MEM Admin Activities in the last 24 hours</h2>" |
        Out-String
    Send-MailMessage @EmailParams -Body $HTML -BodyAsHtml
}

Installing and Configuring Additional Languages during Windows Autopilot

I was experimenting with different ways to get additional languages installed and configured during Windows Autopilot and it proved to be an interesting challenge. The following is what I settled on in the end and what produced the results that I wanted.

Here were my particular requirements, but you can customize this per your own need:

  • The primary language should be English (United Kingdom)
  • An additional secondary language of English (United States)
  • Display language should be English (United Kingdom)
  • Default input override should be English (United Kingdom)
  • System locale should be English (United Kingdom)
  • The administrative defaults for the Welcome screen and New user accounts must have a display language, input language, format and location matching the primary language (UK / UK English)
  • All optional features for the primary language should be installed (handwriting, optical character recognition, etc)

To achieve this, I basically created three elements:

  1. Installed the Local Experience Pack for English (United Kingdom)
  2. Deployed a PowerShell script running in administrative context that sets the administrative language defaults and system locale
  3. Deployed a PowerShell script running in user context that sets the correct order in the user preferred languages list

This was deployed during Autopilot to a Windows 10 1909 (United States) base image.

Local Experience Packs

Local Experience Packs (LXPs) are the modern way to go for installing additional languages since Windows 10 1803. These are published to the Microsoft Store and are automatically updated. They also install more quickly than the traditional cab language packs that you would install with DISM.

LXPs are available in the Microsoft Store for Business, so they can be synced with Intune and deployed as apps. However, the problem with using LXPs as apps during Autopilot is the order of things. The LXP needs to be installed before the PowerShell script that configures the language defaults runs, and since PowerShell scripts are not currently tracked in the ESP, and apps are the last thing to install in the device setup phase, the scripts will very likely run before the app is installed.

To get around that, I decided to get the LXP from the Volume Licensing Service Center instead. Then I uploaded this to a storage account in Azure, where it gets downloaded and installed by the PowerShell script. This way I can control the order and be sure the LXP is installed before making configuration changes.

When downloading from the VLSC, be sure to select the Multilanguage option:

Then get the highlighted ISO. The 1903 LXPs work for 1909 also.

Get the applicable appx file and the license file from the ISO, zip them, and upload the zip file into an Azure Storage account.

When uploading the zip file, be sure to choose the Account Key authentication type:

Once uploaded, click on the blob and go to the Generate SAS page. Choose Read permissions, set an appropriate expiry date, then copy the Blob SAS URL. You will need this to download the file with PowerShell.

Administrative PowerShell Script

Now let’s create a PowerShell script that will:

  • Download and install the Local Experience Pack
  • Install any optional features for the language
  • Configure language and regional settings and defaults

Here’s the script I’m using for that.

# Admin-context script to set the administrative language defaults, system locale and install optional features for the primary language

# Language codes
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"
$SecondaryInputCode = "0409:00000409"
$PrimaryGeoID = "242"

# Enable side-loading
# Required for appx/msix prior to build 18956 (1909 insider)
New-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\AppModelUnlock -Name AllowAllTrustedApps -Value 1 -PropertyType DWORD -Force

# Provision Local Experience Pack
$BlobURL = "https://mystorageaccount.blob.core.windows.net/mycontainer/en-gb.zip?sp=r&st=2020-03-26T18:02:28Z&se=2050-03-27T00:02:28Z&spr=https&sv=2019-02-02&sr=b&sig=91234567890OYr%2BI0RcryhGFy1DNMlzhfIWbQ%3D"
$DownloadedFile = "$env:LOCALAPPDATA\en-GB.zip"
Try
{
    $WebClient = New-Object System.Net.WebClient
    $WebClient.DownloadFile($BlobURL, $DownloadedFile)
    Unblock-File -Path $DownloadedFile -ErrorAction SilentlyContinue
    Expand-Archive -Path $DownloadedFile -DestinationPath $env:LOCALAPPDATA -Force -ErrorAction Stop
    Add-AppxProvisionedPackage -Online -PackagePath "$env:LOCALAPPDATA\en-gb\LanguageExperiencePack.en-gb.Neutral.appx" -LicensePath "$env:LOCALAPPDATA\en-gb\License.xml" -ErrorAction Stop
    Remove-Item -Path $DownloadedFile -Force -ErrorAction SilentlyContinue
}
Catch
{
    Write-Host "Failed to install Local Experience Pack: $_"
}

# Install optional features for primary language
$UKCapabilities = Get-WindowsCapability -Online | Where {$_.Name -match "$PrimaryLanguage" -and $_.State -ne "Installed"}
$UKCapabilities | foreach {
    Add-WindowsCapability -Online -Name $_.Name
}

# Apply custom XML to set administrative language defaults
$XML = @"
<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">
  <!-- user list -->
  <gs:UserList>
    <gs:User UserID="Current" CopySettingsToDefaultUserAcct="true" CopySettingsToSystemAcct="true"/>
  </gs:UserList>
  <!-- GeoID -->
  <gs:LocationPreferences>
    <gs:GeoID Value="$PrimaryGeoID"/>
  </gs:LocationPreferences>
  <gs:MUILanguagePreferences>
    <gs:MUILanguage Value="$PrimaryLanguage"/>
    <gs:MUIFallback Value="$SecondaryLanguage"/>
  </gs:MUILanguagePreferences>
  <!-- system locale -->
  <gs:SystemLocale Name="$PrimaryLanguage"/>
  <!-- input preferences -->
  <gs:InputPreferences>
    <gs:InputLanguageID Action="add" ID="$PrimaryInputCode" Default="true"/>
    <gs:InputLanguageID Action="add" ID="$SecondaryInputCode"/>
  </gs:InputPreferences>
  <!-- user locale -->
  <gs:UserLocale>
    <gs:Locale Name="$PrimaryLanguage" SetAsCurrent="true" ResetAllSettings="false"/>
  </gs:UserLocale>
</gs:GlobalizationServices>
"@
New-Item -Path $env:TEMP -Name "en-GB.xml" -ItemType File -Value $XML -Force
$Process = Start-Process -FilePath Control.exe -ArgumentList "intl.cpl,,/f:""$env:Temp\en-GB.xml""" -NoNewWindow -PassThru -Wait
$Process.ExitCode

A quick walkthrough:

First, I’ve entered the locale IDs for the primary and secondary languages, as well as the keyboard layout hex codes, and finally the Geo location ID for the primary language as variables.

Then we set a registry key to allow side-loading (required for older W10 versions for the install of appx/msix).

Next we download and install the LXP. You’ll need to enter the URL you copied earlier for the Azure blob, and update the zip filename as required, as well as the LXP filename.

Then we install any optional features for the primary language that aren’t already installed.

Then we define the content of an XML file that will be used to set the language and locale preferences. Obviously customize that per your requirement.

Then we save that content to a file and apply it.

Create the PowerShell script in Intune, make sure you don’t run it using the logged-on credentials, and deploy it to your Autopilot AAD group.

User PowerShell Script

Now we need to create a very simple script that will run in the user context. This script simply makes sure that the list of preferred languages is in the correct order, as by default it will look like this:

This script will run for each user that logs in. It won’t run immediately, so the order may be wrong when you first log in, but it doesn’t take long before it runs. Create the script in Intune, remember to run it using the logged-on credentials, and deploy it to your Autopilot AAD group.

# User-context script to set the Language List
# Language codes
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"
$SecondaryInputCode = "0409:00000409"
# Set preferred languages
$NewLanguageList = New-WinUserLanguageList -Language $PrimaryLanguage
$NewLanguageList.Add([Microsoft.InternationalSettings.Commands.WinUserLanguage]::new($SecondaryLanguage))
$NewLanguageList[1].InputMethodTips.Clear()
$NewLanguageList[1].InputMethodTips.Add($PrimaryInputCode)
$NewLanguageList[1].InputMethodTips.Add($SecondaryInputCode)
Set-WinUserLanguageList $NewLanguageList -Force

The Result

After running the Autopilot deployment and logging in, everything checks out 🙂

Managing Intune PowerShell Scripts with Microsoft Graph

In this blog I’ll cover how to list, get, create, update, delete and assign PowerShell scripts in Intune using Microsoft Graph and PowerShell.

Although you can use the Invoke-WebRequest or Invoke-RestMethod cmdlets when working with MS Graph, I prefer to use the Microsoft.Graph.Intune module, aka Intune PowerShell SDK, as it more nicely handles getting an auth token and we don’t have to create any headers, so get that module installed.

In the Graph API, PowerShell scripts live under the deviceManagementScript resource type and these are still only available in the beta schema so they are subject to change.

Connect to MS Graph

First off, let’s connect to MS Graph and set the schema to beta:

If ((Get-MSGraphEnvironment).SchemaVersion -ne "beta")
{
    $null = Update-MSGraphEnvironment -SchemaVersion beta
}
$Graph = Connect-MSGraph

List PowerShell Scripts

Now we can list the PowerShell scripts we have in Intune:

$URI = "deviceManagement/deviceManagementScripts"
$IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
If ($IntuneScripts.value)
{
    $IntuneScripts = $IntuneScripts.value
}

If we take a look at the results, we’ll see that the script content is not included when we list scripts. It is included when we get a single script, as we’ll see next.

Get a PowerShell Script

To get a specific script, we need to know its Id. To get that, first let’s create a simple function where we can pass a script name and use the Get method to retrieve the script details.

Function Get-IntunePowerShellScript {
    Param($ScriptName)
    $URI = "deviceManagement/deviceManagementScripts" 
    $IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
    If ($IntuneScripts.value)
    {
        $IntuneScripts = $IntuneScripts.value
    }
    $IntuneScript = $IntuneScripts | Where {$_.displayName -eq "$ScriptName"}
    Return $IntuneScript
}

Now we can use this function to get the script Id and then call Get again adding the script Id to the URL:

$ScriptName = "Escrow Bitlocker Recovery Keys to AAD"
$Script = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($Script.id)"
$IntuneScript = Invoke-MSGraphRequest -HttpMethod GET -Url $URI

If we look at the result, we can see that the script content is now returned, albeit in binary form:

View Script Content

To view the script, we simply need to convert it:

$Base64 =[Convert]::FromBase64String($IntuneScript.scriptContent)
[System.Text.Encoding]::UTF8.GetString($Base64)

Create a Script

Now let’s create a new script. To create a script, we read in a script file and convert it into base64. We add this together with other required parameters into some JSON before posting the request.

When reading and converting the script content, use UTF-8. Other character sets may not decode properly at run-time on the client side and can result in script execution failure.

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD"
    RunAsAccount = "system" # or user
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts"
$Response = Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

We can now see our script in the portal:

Update a Script

To update an existing script, we follow a similar process to creating a new script, we create some JSON that contains the updated parameters then call the Patch method to update it. But first we need to get the Id of the script we want to update, using our previously created function:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

In this example I have updated the content in the source script file so I need to read it in again, as well as updating the description of the script:

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD (Updated 2020-03-19)"
    RunAsAccount = "system"
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.id)"
$Response = Invoke-MSGraphRequest -HttpMethod PATCH -Url $URI -Content $Json

We can call Get on the script again and check the lastModifiedDateTime entry to verify that the script was updated, or check in the portal.

Add an Assignment

Before the script will execute anywhere it needs to be assigned to a group. To do that, we need the objectId of the AAD group we want to assign it to. To work with AAD groups I prefer to use the AzureAD module, so install that before continuing.

We need to again get the script that we want to assign:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

Then get the Azure AD group:

$AzureAD = Connect-AzureAD -AccountId $Graph.UPN
$GroupName = "Intune - [Test] Bitlocker Key Escrow"
$Group = Get-AzureADGroup -SearchString $GroupName

Then we prepare the necessary JSON and post the assignment:

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($Group.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

To replace the current assignment with a new assignment, simply change the group name and run the same code again. To add an additional assignment or multiple assignments, you’ll need to post all the assignments at the same time, for example:

$GroupNameA = "Intune - [Test] Bitlocker Key Escrow"
$GroupNameB = "Intune - [Test] Autopilot SelfDeploying Provisioning"
$GroupA = Get-AzureADGroup -SearchString $GroupNameA
$GroupB = Get-AzureADGroup -SearchString $GroupNameB

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupA.ObjectId)"
        },
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupB.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

Delete an Assignment

I haven’t yet figured out how to delete an assignment – the current documentation appears to be incorrect. If you can figure this out please let me know!

Delete a Script

To delete a script, we simply get the script Id and call the Delete method on it:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)"
Invoke-MSGraphRequest -HttpMethod DELETE -Url $URI 

Delete Device Records in AD / AAD / Intune / Autopilot / ConfigMgr with PowerShell

I’ve done a lot of testing with Windows Autopilot in recent times. Most of my tests are done in virtual machines, which are ideal as I can simply dispose of them after. But you also need to clean up the device records that were created in Azure Active Directory, Intune, the Autopilot registration service, Microsoft Endpoint Configuration Manager (if you’re using it) and Active Directory in the case of Hybrid-joined devices.

To make this a bit easier, I wrote the following PowerShell script. You simply enter the device name and it’ll go and search for that device in any of the above locations that you specify and delete the device records.

The script assumes you have the appropriate permissions, and requires the Microsoft.Graph.Intune and AzureAD PowerShell modules, as well as the Configuration Manager module if you want to delete from there.

You can delete from all of the above locations with the -All switch, or you can specify any combination, for example -AAD -Intune -ConfigMgr, or -AD -Intune etc.

In the case of the Autopilot device registration, the device must also exist in Intune before you attempt to delete it, as the Intune record is used to determine the serial number of the device.

Please test thoroughly before using on any production device!

Examples

Delete-AutopilotedDeviceRecords -ComputerName PC01 -All
@(
    'PC01'
    'PC02'
    'PC03'
) | foreach {
    Delete-AutopilotedDeviceRecords -ComputerName $_ -AAD -Intune
}

Output

Script

[CmdletBinding(DefaultParameterSetName='All')]
Param
(
    [Parameter(ParameterSetName='All',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
    [Parameter(ParameterSetName='Individual',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
    $ComputerName,
    [Parameter(ParameterSetName='All')]
    [switch]$All = $True,
    [Parameter(ParameterSetName='Individual')]
    [switch]$AD,
    [Parameter(ParameterSetName='Individual')]
    [switch]$AAD,
    [Parameter(ParameterSetName='Individual')]
    [switch]$Intune,
    [Parameter(ParameterSetName='Individual')]
    [switch]$Autopilot,
    [Parameter(ParameterSetName='Individual')]
    [switch]$ConfigMgr
)

Set-Location $env:SystemDrive

# Load required modules
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-Host "Importing modules…" -NoNewline
        If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
        {
            Import-Module Microsoft.Graph.Intune -ErrorAction Stop
        }
        If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
        {
            Import-Module AzureAD -ErrorAction Stop
        }
        If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
        {
            Import-Module $env:SMS_ADMIN_UI_PATH.Replace('i386','ConfigurationManager.psd1') -ErrorAction Stop
        }
        Write-Host "Success" -ForegroundColor Green
    }
    Catch
    {
        Write-Host "$($_.Exception.Message)" -ForegroundColor Red
        Return
    }
}

# Authenticate with Azure
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-Host "Authenticating with MS Graph and Azure AD…" -NoNewline
        $intuneId = Connect-MSGraph -ErrorAction Stop
        $aadId = Connect-AzureAD -AccountId $intuneId.UPN -ErrorAction Stop
        Write-Host "Success" -ForegroundColor Green
    }
    Catch
    {
        Write-Host "Error!" -ForegroundColor Red
        Write-Host "$($_.Exception.Message)" -ForegroundColor Red
        Return
    }
}

Write-Host "$($ComputerName.ToUpper())" -ForegroundColor Yellow
Write-Host "===============" -ForegroundColor Yellow

# Delete from AD
If ($PSBoundParameters.ContainsKey("AD") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-Host "Retrieving " -NoNewline
        Write-Host "Active Directory " -ForegroundColor Yellow -NoNewline
        Write-Host "computer account…" -NoNewline
        $Searcher = [ADSISearcher]::new()
        $Searcher.Filter = "(sAMAccountName=$ComputerName`$)"
        [void]$Searcher.PropertiesToLoad.Add("distinguishedName")
        $ComputerAccount = $Searcher.FindOne()
        If ($ComputerAccount)
        {
            Write-Host "Success" -ForegroundColor Green
            Write-Host " Deleting computer account…" -NoNewline
            $DirectoryEntry = $ComputerAccount.GetDirectoryEntry()
            $Result = $DirectoryEntry.DeleteTree()
            Write-Host "Success" -ForegroundColor Green
        }
        Else
        {
            Write-Host "Not found!" -ForegroundColor Red
        }
    }
    Catch
    {
        Write-Host "Error!" -ForegroundColor Red
        $_
    }
}

# Delete from Azure AD
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-Host "Retrieving " -NoNewline
        Write-Host "Azure AD " -ForegroundColor Yellow -NoNewline
        Write-Host "device record/s…" -NoNewline
        [array]$AzureADDevices = Get-AzureADDevice -SearchString $ComputerName -All:$true -ErrorAction Stop
        If ($AzureADDevices.Count -ge 1)
        {
            Write-Host "Success" -ForegroundColor Green
            Foreach ($AzureADDevice in $AzureADDevices)
            {
                Write-Host " Deleting DisplayName: $($AzureADDevice.DisplayName) | ObjectId: $($AzureADDevice.ObjectId) | DeviceId: $($AzureADDevice.DeviceId) …" -NoNewline
                Remove-AzureADDevice -ObjectId $AzureADDevice.ObjectId -ErrorAction Stop
                Write-Host "Success" -ForegroundColor Green
            }
        }
        Else
        {
            Write-Host "Not found!" -ForegroundColor Red
        }
    }
    Catch
    {
        Write-Host "Error!" -ForegroundColor Red
        $_
    }
}

# Delete from Intune
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-Host "Retrieving " -NoNewline
        Write-Host "Intune " -ForegroundColor Yellow -NoNewline
        Write-Host "managed device record/s…" -NoNewline
        [array]$IntuneDevices = Get-IntuneManagedDevice -Filter "deviceName eq '$ComputerName'" -ErrorAction Stop
        If ($IntuneDevices.Count -ge 1)
        {
            Write-Host "Success" -ForegroundColor Green
            If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("All"))
            {
                foreach ($IntuneDevice in $IntuneDevices)
                {
                    Write-Host " Deleting DeviceName: $($IntuneDevice.deviceName) | Id: $($IntuneDevice.Id) | AzureADDeviceId: $($IntuneDevice.azureADDeviceId) | SerialNumber: $($IntuneDevice.serialNumber) …" -NoNewline
                    Remove-IntuneManagedDevice -managedDeviceId $IntuneDevice.Id -Verbose -ErrorAction Stop
                    Write-Host "Success" -ForegroundColor Green
                }
            }
        }
        Else
        {
            Write-Host "Not found!" -ForegroundColor Red
        }
    }
    Catch
    {
        Write-Host "Error!" -ForegroundColor Red
        $_
    }
}

# Delete Autopilot device
If ($PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
    If ($IntuneDevices.Count -ge 1)
    {
        Try
        {
            Write-Host "Retrieving " -NoNewline
            Write-Host "Autopilot " -ForegroundColor Yellow -NoNewline
            Write-Host "device registration…" -NoNewline
            $AutopilotDevices = New-Object System.Collections.ArrayList
            foreach ($IntuneDevice in $IntuneDevices)
            {
                $URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities?`$filter=contains(serialNumber,'$($IntuneDevice.serialNumber)')"
                $AutopilotDevice = Invoke-MSGraphRequest -Url $URI -HttpMethod GET -ErrorAction Stop
                [void]$AutopilotDevices.Add($AutopilotDevice)
            }
            Write-Host "Success" -ForegroundColor Green
            foreach ($Device in $AutopilotDevices)
            {
                Write-Host " Deleting SerialNumber: $($Device.value.serialNumber) | Model: $($Device.value.model) | Id: $($Device.value.id) | GroupTag: $($Device.value.groupTag) | ManagedDeviceId: $($Device.value.managedDeviceId) …" -NoNewline
                $URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities/$($Device.value.Id)"
                $AutopilotDevice = Invoke-MSGraphRequest -Url $URI -HttpMethod DELETE -ErrorAction Stop
                Write-Host "Success" -ForegroundColor Green
            }
        }
        Catch
        {
            Write-Host "Error!" -ForegroundColor Red
            $_
        }
    }
}

# Delete from ConfigMgr
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-Host "Retrieving " -NoNewline
        Write-Host "ConfigMgr " -ForegroundColor Yellow -NoNewline
        Write-Host "device record/s…" -NoNewline
        $SiteCode = (Get-PSDrive -PSProvider CMSITE -ErrorAction Stop).Name
        Set-Location ("$SiteCode" + ":") -ErrorAction Stop
        [array]$ConfigMgrDevices = Get-CMDevice -Name $ComputerName -Fast -ErrorAction Stop
        Write-Host "Success" -ForegroundColor Green
        foreach ($ConfigMgrDevice in $ConfigMgrDevices)
        {
            Write-Host " Deleting Name: $($ConfigMgrDevice.Name) | ResourceID: $($ConfigMgrDevice.ResourceID) | SMSID: $($ConfigMgrDevice.SMSID) | UserDomainName: $($ConfigMgrDevice.UserDomainName) …" -NoNewline
            Remove-CMDevice -InputObject $ConfigMgrDevice -Force -ErrorAction Stop
            Write-Host "Success" -ForegroundColor Green
        }
    }
    Catch
    {
        Write-Host "Error!" -ForegroundColor Red
        $_
    }
}
Set-Location $env:SystemDrive

Setting the Computer Description During Windows Autopilot

I’ve been getting to grips with Windows Autopilot recently and, having a long history working with SCCM, I’ve found it hard not to compare it with the power of traditional OSD using a task sequence. In fact, one of my goals was to basically try to reproduce what I’m doing in OSD with Autopilot in order to end up with the same result – and it’s been a challenge.

I like the general concept of Autopilot and don’t get me wrong – it’s getting better all the time – but it still has its shortcomings that require a bit of creativity to work around. One of the things I do during OSD is to set the computer description in AD. That’s fairly easy to do in a task sequence; you can just script it and run the step using credentials that have the permission to make that change.

In Autopilot however (hybrid AAD join scenario), although you can run PowerShell scripts too, they will only run in SYSTEM context during the Autopilot process. That means you either need to give computer accounts the permission to change their own properties in AD, or you have to find a way to run that code using alternate credentials. You can run scripts in the context of the logged-on user, but I don’t want to do that – in fact I disable the user ESP – I want to use a specific account that has those permissions.

You could use SCCM to do it post-deployment if you are co-managing the device, but ideally I want everything to be native to Autopilot where possible, and move away from the hybrid mentality of do what you can with Intune, and use SCCM for the rest.

It is possible to execute code in another user context from SYSTEM context, but when making changes in AD the DirectoryEntry operation kept erroring with “An operations error occurred”. After researching, I realized it is due to AD not accepting the authentication token as it’s being passed a second time and not directly. I tried creating a separate PowerShell process, a background job, a runspace with specific credentials – nothing would play ball. Anyway, I found a way to get around that by using the AccountManagement .Net class, which allows you to create a context using specific credentials.

In this example, I’m setting the computer description based on the model and serial number of the device. You need to provide the username and password for the account you will perform the AD operation with. I’ve put the password in clear text in this example, but in the real world we store the credentials in an Azure Key Vault and load them in dynamically at runtime with some POSH code to avoid storing them in the script. I hope in the future we will be able to run PowerShell scripts with Intune in a specific user context, as you can with steps in an SCCM task sequence.

# Set credentials
$ADAccount = "mydomain\myADaccount"
$ADPassword = 'Pa$$w0rd' # single quotes so that '$$' is not expanded

# Set initial description
$Model = Get-WMIObject -Class Win32_ComputerSystem -Property Model -ErrorAction Stop| Select -ExpandProperty Model
$SerialNumber = Get-WMIObject -Class Win32_BIOS -Property SerialNumber -ErrorAction Stop | Select -ExpandProperty SerialNumber
$Description = "$Model - $SerialNumber"

# Set some type accelerators
Add-Type -AssemblyName System.DirectoryServices.AccountManagement -ErrorAction Stop
$Accelerators = [PowerShell].Assembly.GetType("System.Management.Automation.TypeAccelerators")
$Accelerators::Add("PrincipalContext",[System.DirectoryServices.AccountManagement.PrincipalContext])
$Accelerators::Add("ContextType",[System.DirectoryServices.AccountManagement.ContextType])
$Accelerators::Add("Principal",[System.DirectoryServices.AccountManagement.ComputerPrincipal])
$Accelerators::Add("IdentityType",[System.DirectoryServices.AccountManagement.IdentityType])

# Connect to AD and set the computer description
$Domain = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
$PrincipalContext = [PrincipalContext]::new([ContextType]::Domain,$Domain,$ADAccount,$ADPassword)
$Account = [Principal]::FindByIdentity($PrincipalContext,[IdentityType]::Name,$env:COMPUTERNAME)
$LDAPObject = $Account.GetUnderlyingObject()
If ($LDAPObject.Properties["description"][0])
{
    $LDAPObject.Properties["description"][0] = $Description
}
Else
{
    [void]$LDAPObject.Properties["description"].Add($Description)
}
$LDAPObject.CommitChanges()
$Account.Dispose()

Querying for Devices in Azure AD and Intune with PowerShell and Microsoft Graph

Recently I needed to get a list of devices in both Azure Active Directory and Intune and I found that using the online portals I could not filter devices by the parameters that I needed. So I turned to Microsoft Graph to get the data instead. You can use the Microsoft Graph Explorer to query via the Graph REST API, however, the query capabilities of the API are still somewhat limited. To find the data I needed, I had to query the Graph REST API using PowerShell, where I can take advantage of the greater filtering capabilities of PowerShell’s Where-Object.

To use the Graph API, you need to authenticate first. A cool guy named Dave Falkus has published a number of PowerShell scripts on GitHub that use the Graph API with Intune, and these contain some code to authenticate with the API. Rather than re-invent the wheel, we can use his functions to get the authentication token that we need.

First, we need the AzureRM or Azure AD module installed, as we use the authentication libraries that are included with them.

Next, open one of the scripts that Dave has published on GitHub, for example here, and copy the function Get-AuthToken into your script.

Then also copy the Authentication code region into your script, i.e. the section between the following:


#region Authentication
...
#endregion

If you run this code it’ll ask you for an account name to authenticate with from your Azure AD. Once authenticated, we have a token we can use with the Graph REST API saved as a globally-scoped variable $authToken.

Get Devices from Azure AD

To get devices from Azure AD, we can use the following function, which I take no credit for as I have simply modified a function written by Dave.


Function Get-AzureADDevices {

    [cmdletbinding()]
    Param()

    $graphApiVersion = "v1.0"
    $Resource = "devices"
    $QueryParams = ""

    try {
        $uri = "https://graph.microsoft.com/$graphApiVersion/$($Resource)$QueryParams"
        Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get
    }
    catch {
        $ex = $_.Exception
        $errorResponse = $ex.Response.GetResponseStream()
        $reader = New-Object System.IO.StreamReader($errorResponse)
        $reader.BaseStream.Position = 0
        $reader.DiscardBufferedData()
        $responseBody = $reader.ReadToEnd();
        Write-Host "Response content:`n$responseBody" -f Red
        Write-Error "Request to $uri failed with HTTP Status $($ex.Response.StatusCode) $($ex.Response.StatusDescription)"
        write-host
        break
    }
}

In the $graphApiVersion variable, you can use the current version of the API.

Now we can run the following code, which will use the API to return all devices in your Azure AD and save them to a hash table that organizes the results by operating system version.


# Return the data
$ADDeviceResponse = Get-AzureADDevices
$ADDevices = $ADDeviceResponse.Value
$NextLink = $ADDeviceResponse.'@odata.nextLink'
# Need to loop the requests because only 100 results are returned each time
While ($NextLink -ne $null)
{
    $ADDeviceResponse = Invoke-RestMethod -Uri $NextLink -Headers $authToken -Method Get
    $NextLink = $ADDeviceResponse.'@odata.nextLink'
    $ADDevices += $ADDeviceResponse.Value
}

Write-Host "Found $($ADDevices.Count) devices in Azure AD" -ForegroundColor Yellow
$ADDevices.operatingSystem | group -NoElement

$DeviceTypes = $ADDevices.operatingSystem | group -NoElement | Select -ExpandProperty Name
$AzureADDevices = @{}
Foreach ($DeviceType in $DeviceTypes)
{
    $AzureADDevices.$DeviceType = $ADDevices | where {$_.operatingSystem -eq "$DeviceType"} | Sort displayName
}

Write-host "Devices have been saved to a variable. Enter '`$AzureADDevices' to view."

It will tell you how many devices it found, and how many devices there are by operating system version / device type.

[Screenshot: device counts by operating system in the PowerShell ISE]

We can now use the $AzureADDevices hash table to query the data as we wish.

For example, here I search for an iPhone that belongs to a particular user:


$AzureADDevices.Iphone | where {$_.displayName -match 'nik'}

Here I am looking for the count of Windows devices that are hybrid Azure AD joined, and display the detail in the GridView.


($AzureADDevices.Windows | where {$_.trustType -eq 'ServerAd'}).Count
$AzureADDevices.Windows | where {$_.trustType -eq 'ServerAd'} | Out-GridView

And here I’m looking for all MacOS devices that are not compliant with policy.


($AzureADDevices.MacOS | where {$_.isCompliant -ne $true}) | Out-GridView

Get Devices from Intune

To get devices from Intune, we can take a similar approach. Again, no credit for this function as it's modified from Dave's code.


Function Get-IntuneDevices(){

    [cmdletbinding()]
    Param()

    # Defining Variables
    $graphApiVersion = "v1.0"
    $Resource = "deviceManagement/managedDevices"

    try {
        $uri = "https://graph.microsoft.com/$graphApiVersion/$Resource"
        (Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get).Value
    }
    catch {
        $ex = $_.Exception
        $errorResponse = $ex.Response.GetResponseStream()
        $reader = New-Object System.IO.StreamReader($errorResponse)
        $reader.BaseStream.Position = 0
        $reader.DiscardBufferedData()
        $responseBody = $reader.ReadToEnd()
        Write-Host "Response content:`n$responseBody" -ForegroundColor Red
        Write-Error "Request to $uri failed with HTTP Status $($ex.Response.StatusCode) $($ex.Response.StatusDescription)"
        Write-Host
        break
    }

}

Running the following code will return all devices in Intune and save them to a hash table, again organised by operating system. (Note that the managedDevices endpoint can also page its results via @odata.nextLink; if you have a large estate, handle the paging as we did for Azure AD.)


$MDMDevices = Get-IntuneDevices

Write-Host "Found $($MDMDevices.Count) devices in Intune" -ForegroundColor Yellow
$MDMDevices.operatingSystem | group -NoElement

$IntuneDeviceTypes = $MDMDevices.operatingSystem | group -NoElement | Select -ExpandProperty Name
$IntuneDevices = @{}
Foreach ($IntuneDeviceType in $IntuneDeviceTypes)
{
    $IntuneDevices.$IntuneDeviceType = $MDMDevices | where {$_.operatingSystem -eq "$IntuneDeviceType"} | Sort displayName
}

Write-host "Devices have been saved to a variable. Enter '`$IntuneDevices' to view."

Now we can query data using the $IntuneDevices variable.

Here I am querying for the count of compliant and non-compliant iOS devices.


$IntuneDevices.iOS | group complianceState -NoElement

Here I am querying for all non-compliant iOS devices, specifying the columns I want to see, sorting the results and outputting them in table format.


$IntuneDevices.iOS |
    where {$_.complianceState -eq "noncompliant"} |
    Select userDisplayName,deviceName,imei,managementState,complianceGracePeriodExpirationDateTime |
    Sort userDisplayName |
    ft

All Windows devices sorted by username:


$IntuneDevices.Windows | Select userDisplayName,deviceName | Sort userDisplayName

Windows devices managed by SCCM:


$IntuneDevices.Windows | where {$_.managementAgent -eq "ConfigurationManagerClientMdm"} | Out-GridView

Windows devices enrolled using Windows auto enrollment:


$IntuneDevices.Windows | where {$_.deviceEnrollmentType -eq "windowsAutoEnrollment"} | Out-GridView

Windows devices enrolled by SCCM co-management:


$IntuneDevices.Windows | where {$_.deviceEnrollmentType -eq "windowsCoManagement"} | Out-GridView

You can, of course, expand this into users and other resource types, not just devices. You just need the right URL construct for the data type you want to query.
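
For example, here is a quick sketch that uses the same token to query users instead of devices, with the same paging pattern as before:


# Query the users endpoint with the existing authentication token
$uri = "https://graph.microsoft.com/v1.0/users"
$UserResponse = Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get
$Users = $UserResponse.Value
# Loop the requests in case the results are paged
While ($UserResponse.'@odata.nextLink' -ne $null)
{
    $UserResponse = Invoke-RestMethod -Uri $UserResponse.'@odata.nextLink' -Headers $authToken -Method Get
    $Users += $UserResponse.Value
}
$Users | Select displayName,userPrincipalName | Sort displayName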

Intune Client-Side Logs in Windows 10

Note to self (and anyone interested!) about the client-side location of logs and management components of Intune on a Windows 10 device.

Diagnostic Report

A diagnostic report can be generated client-side from Settings > Accounts > Access work or school > Connected to <Tenant>’s Azure AD > Info > Create report

The report will be saved to:

C:\Users\Public\Documents\MDMDiagnostics\MDMDiagReport.html
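
The report can also be generated from the command line using the built-in MdmDiagnosticsTool – I believe the syntax below is correct for recent Windows 10 builds, but check the tool’s own help on your device:


MdmDiagnosticsTool.exe -out C:\Temp\MDMDiagnostics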

Intune Management Extension

Information on the parameters for the IME can be found in the registry:

HKLM:\Software\Microsoft\EnterpriseDesktopAppManagement\<SID>\MSI\<ProductCode>

The MSI itself can be found here, together with an installer log:

C:\Windows\System32\config\systemprofile\AppData\Local\mdm

Note: if you disconnect a device from Azure AD and rejoin it again, you will need to reinstall the IME as it will have a different device identifier.

IME logs can be found here:

C:\ProgramData\Microsoft\IntuneManagementExtension\Logs

The logs are:

  • AgentExecutor
  • ClientHealth
  • IntuneManagementExtension

Script Execution

When a PowerShell script is run on the client from Intune, the scripts and the script output will be stored here, but only until execution is complete:

C:\Program files (x86)\Microsoft Intune Management Extension\Policies\Scripts

C:\Program files (x86)\Microsoft Intune Management Extension\Policies\Results

A transcript of the script execution can be found underneath C:\_showmewindows (a hidden folder)

The full content of the script will also be logged in the IntuneManagementExtension.log (be careful of sensitive data in scripts!)

The error code and result output of the script can also be found in the registry:

HKLM:\Software\Microsoft\IntuneManagementExtension\Policies\<UserGUID>\<ScriptGUID>
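
A quick sketch to dump these with PowerShell (assuming the key layout above – the exact value names can vary between IME versions):


# Enumerate the script policy keys and display their stored properties
Get-ChildItem -Path 'HKLM:\SOFTWARE\Microsoft\IntuneManagementExtension\Policies' -Recurse |
    ForEach-Object { Get-ItemProperty -Path $_.PSPath }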

Event Logs

There are a couple of MDM event logs which can be found here:

Applications and Services Logs > Microsoft > Windows > DeviceManagement-Enterprise-Diagnostics-Provider
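
To pull recent entries from the Admin channel with PowerShell (assuming the log name below – run Get-WinEvent -ListLog *DeviceManagement* to confirm it on your build):


Get-WinEvent -LogName 'Microsoft-Windows-DeviceManagement-Enterprise-Diagnostics-Provider/Admin' -MaxEvents 20 |
    Select TimeCreated,Id,LevelDisplayName,Message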

Services

The IME runs as a service called “Microsoft Intune Management Extension”. You can restart this to force a check for new policies.
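
For example, using the display name above:


Get-Service -DisplayName 'Microsoft Intune Management Extension' | Restart-Service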

Scheduled Task

The IME runs a health evaluation every day as a scheduled task, and logs the results in the ClientHealth.log:

Microsoft > Intune > Intune Management Extension Health Evaluation
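
You can also kick off the evaluation manually – a sketch using the task name shown above:


Get-ScheduledTask -TaskName 'Intune Management Extension Health Evaluation' | Start-ScheduledTask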

If you know of any other log locations, please let me know!

Lots of great info on the IME by Oliver Kieselbach here and here.

Getting Data from the Intune Data Warehouse with PowerShell

The Intune Data Warehouse is a great addition to the Microsoft Intune service allowing visibility of historical data for reporting, data and trend analysis for your Microsoft MDM environment. It comes with an OData feed that allows you to connect to the data with PowerBI, Microsoft’s reporting and data visualization service.

The Data Warehouse RESTful API (currently in Beta) can be used to get data from the warehouse using a REST client. I decided to explore how to do this with PowerShell so I can run some ad-hoc queries and analyse trends in the data.

To get data from the Intune Data Warehouse we need to do three main things:

  1. Create a native App in Azure and give it access to the Intune Data Warehouse
  2. Authenticate with Azure using OAuth 2.0 and get an access token
  3. Invoke the RESTful web service using http

Create a Native App in Azure

In your Azure portal, go to Azure Active Directory > App registrations. Click New application registration.

Give it a name, make sure it is a Native app (do not use Web app / API) and use the redirect URI https://login.live.com/oauth20_desktop.srf .

[Screenshot: app registration]

Click Create.

Once created, make a note of the Application ID as we will need this later.

Now, in the App in the Settings blade, click Required permissions > Add > Select an API and select Microsoft Intune API.

In the Add API access blade click Select permissions and grant the delegated permission Get data warehouse information from Microsoft Intune.

[Screenshot: API permissions]

Save your changes.

Authenticate with Azure

To authenticate with Azure I wrote the following function in PowerShell:

Function New-IntuneDataWarehouseAccessToken {
    # Function to get an access token for the Intune Data Warehouse
    # To be used in conjunction with the function Get-IntuneDataWarehouseData
    # Will download NuGet if required to install the latest Active Directory Authentication Library package
    [CmdletBinding()]
    Param(
        [Parameter()]
        $NuGetDirectory = "$Env:USERPROFILE\NuGet",
        [Parameter()]
        $RedirectURL = "https://login.live.com/oauth20_desktop.srf", # this is the redirect URI of your Intune Data Warehouse native app in Azure
        [Parameter()]
        $ClientID = "8d0d82ed-f664-4b38-93d8-75ad70165832" # this is the application ID of your Intune Data Warehouse native app in Azure
    )
    # Create a NuGet directory in the user profile area if the supplied path does not exist
    If (!(Test-Path $NuGetDirectory))
    {
        $null = New-Item -Path $Env:USERPROFILE -Name NuGet -ItemType Directory
        $NuGetDirectory = "$Env:USERPROFILE\NuGet"
    }
    # Check whether the Microsoft.IdentityModel.Clients.ActiveDirectory package is already in the NuGet directory
    # If not, download NuGet and use it to install the package
    If ((Get-ChildItem $NuGetDirectory -Directory).Name -notmatch "Microsoft.IdentityModel.Clients.ActiveDirectory")
    {
        # Download NuGet to the user profile and create a temporary alias
        $sourceNugetExe = "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
        $targetNugetExe = "$NuGetDirectory\nuget.exe"
        Invoke-WebRequest $sourceNugetExe -OutFile $targetNugetExe
        Set-Alias nuget $targetNugetExe -Scope Script
        # Download the latest Active Directory Authentication Library package
        nuget install Microsoft.IdentityModel.Clients.ActiveDirectory -OutputDirectory $NuGetDirectory
    }
    # Add the ADAL library
    $DLLPath = "$NuGetDirectory\" + "$((Get-ChildItem $NuGetDirectory -Filter "Microsoft.IdentityModel.Clients.ActiveDirectory*" | Sort Name -Descending | Select -First 1).Name)" + "\lib\net45"
    Add-Type -Path "$DLLPath\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
    # Create the authentication context
    $AuthenticationContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext("https://login.windows.net/common/oauth2/authorize")
    # Get an access token for the user
    $Resource = "https://api.manage.microsoft.com/"
    $PlatformParams = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters("Auto")
    $Result = $AuthenticationContext.AcquireTokenAsync($Resource,$ClientID,$RedirectURL,$PlatformParams).Result
    $script:AccessToken = $Result.AccessToken
    Return "Your access token expires at $($Result.ExpiresOn.DateTime)"
}

What the code does…

To authenticate with Azure we need to use the Active Directory Authentication Library from Microsoft. This library is actually included in resources like the ConfigMgr client and the AzureRM PowerShell module, but these do not have the latest versions of the library and the methods it contains have changed over time. So I prefer to use the latest version of the library (3.17.2 at the time of writing) which is available as a NuGet package.

The function will download NuGet and use it to download the latest version of the ADAL library to your user profile area. Once we have the library, we load it into PowerShell with Add-Type and then acquire an access token.

The access token expires after an hour so once created, the token will probably be good for your current session. If it expires, simply run the function again. The access token will be saved to a variable in the script scope and will be used by the function that queries the data warehouse.

How to use it…

Make sure the ClientID and RedirectURL parameters contain the required values for your app and tenant. I recommend setting them as the parameter defaults in the function so you don’t have to pass them every time.

Simply run the function:


New-IntuneDataWarehouseAccessToken

If you have not previously authenticated with Azure in your current session you will be prompted to sign in to your Azure account:

[Screenshot: Azure sign-in prompt]

The first time you use the native app you created, you will also be prompted for permission:

[Screenshot: permissions consent prompt]

Invoke the Web Service using the OData feed

Now that we have an access token, we can invoke the web service using http. I wrote the following PowerShell function to do that:

Function Get-IntuneDataWarehouseData {
    # Function to query the Intune Data Warehouse for data
    # Requires an access token to be created first via the New-IntuneDataWarehouseAccessToken function
    [CmdletBinding()]
    Param(
        [Parameter()] # this is the custom feed URL for your tenant for the Intune Data Warehouse
        $WarehouseUrl = "https://fef.msun02.manage.microsoft.com/ReportingService/DataWarehouseFEService?api-version=beta",
        [Parameter()]
        $DataEntity = "devices", # this is the data entity you wish to query
        [Parameter()]
        $Filter, # OData query parameter
        [Parameter()]
        $Top, # OData query parameter
        [Parameter()]
        $OrderBy, # OData query parameter
        [Parameter()]
        $Select, # OData query parameter
        [Parameter()]
        $Skip, # OData query parameter
        [Parameter()][Switch]
        $ListDataEntities # Use this switch to list the available data entities
    )
    # Create the custom URL
    If ($ListDataEntities)
    {
        $UriBuilder = New-Object System.UriBuilder($WarehouseUrl)
    }
    Else
    {
        $URI = $WarehouseUrl.Insert($WarehouseUrl.IndexOf("?"), "/$DataEntity")
        # Add query parameters
        If ($Filter -ne $null)
        {
            $URI = "$URI&`$filter=$Filter"
        }
        If ($Select -ne $null)
        {
            $URI = "$URI&`$select=$Select"
        }
        If ($Top -ne $null)
        {
            $URI = "$URI&`$top=$Top"
        }
        If ($OrderBy -ne $null)
        {
            $URI = "$URI&`$orderby=$OrderBy"
        }
        If ($Skip -ne $null)
        {
            $URI = "$URI&`$skip=$Skip"
        }
        $UriBuilder = New-Object System.UriBuilder($URI)
    }
    # Create an HttpClient (load the assembly first in case it is not already available in Windows PowerShell)
    Add-Type -AssemblyName System.Net.Http
    $HttpClient = New-Object System.Net.Http.HttpClient
    $HttpClient.Timeout = [timespan]"00:10:00" # Extend the timeout to 10 minutes in case of slow network / high data volume
    $HttpClient.DefaultRequestHeaders.Authorization = New-Object System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", $AccessToken)
    # Load the data
    $Result = (($HttpClient.GetAsync($UriBuilder.Uri).Result).Content.ReadAsStringAsync().Result | ConvertFrom-Json).Value
    If ($ListDataEntities)
    {
        $Result = $Result.URL | Sort
    }
    $HttpClient.Dispose()
    Return $Result
}

What the code does…

The function uses the custom OData feed URL for your tenant, creates an http client to invoke the web service and gets data for the data entity (ie collection, or table) that you specify. The results are then returned to the PowerShell console.

You can read more about the data model for the warehouse and get a reference for the various data entities and their relationships on the Microsoft Docs site.

How to use it…

Make sure the following parameter is set in the function:

  • WarehouseUrl

This URL is the custom feed URL for your tenant and you can find it from the Intune blade in Azure. On the Overview blade, on the right you’ll find Other tasks, and underneath it, Set up Intune Data Warehouse.

[Screenshot: Set up Intune Data Warehouse in the Azure portal]

To list the data entities that are available to query, use the ListDataEntities switch:


Get-IntuneDataWarehouseData -ListDataEntities

[Screenshot: list of available data entities]

To return the data from a specific data entity, use the DataEntity parameter. This example returns data from the devices table.


Get-IntuneDataWarehouseData -DataEntity devices

[Screenshot: results from the devices data entity]

Working with the Data

The API supports a few query parameters in the OData protocol v4.0, so rather than returning all the results in the data entity, you can narrow them down. However, I have noticed that the query parameters do not always work as expected when they are combined, at least in the beta version of the API.

For example, you can use the Filter parameter to return only matching results. This query finds a specific device:


Get-IntuneDataWarehouseData -DataEntity devices -Filter "deviceName eq 'SW-IT-LT-AZURE1'"

You can select only specific properties to be returned in the results using the Select parameter:


Get-IntuneDataWarehouseData -DataEntity devices -Select 'deviceName,serialNumber,lastContact'

Select the top 5 results:


Get-IntuneDataWarehouseData -DataEntity devices -Top 5

Skip 10 results and return the rest:


Get-IntuneDataWarehouseData -DataEntity devices -Skip 10

Sort results by a particular property:


Get-IntuneDataWarehouseData -DataEntity devices -OrderBy 'lastContact'

You can learn more about how to use query parameters in the OData protocol from the Microsoft Developer site for the Graph API.

These query parameters are certainly helpful, but for relational queries where you want to reference more than one table or entity, the API falls short and more complex PowerShell code is required.

For example, here I am searching for the ethernet MAC address from the most recent hardware inventory for a device, and even with query parameters applied I still need to do some filtering and sorting in PowerShell. Remember that the data warehouse only contains snapshots of historic data, so for current data you could use the Graph API instead – but this is just for an example.


$DeviceName = "SW-IT-LT-AZURE1"
Get-IntuneDataWarehouseData -DataEntity 'mdmDeviceInventoryHistories' -Select 'deviceKey,datekey,ethernetMac' |
    Where {$_.devicekey -eq ((Get-IntuneDataWarehouseData -DataEntity 'devices' -Filter "deviceName eq '$DeviceName'").devicekey)} |
    Sort datekey -Descending |
    Select -First 1 |
    Select -ExpandProperty ethernetMac 

However, this code takes a while to run because it queries the data warehouse more than once to get the data. Another way to do this is to first load the required data entities into memory; then I can query them more quickly and run other queries against this cached data.

The following code will load just those two entities into a hash table:


$DataEntities = "mdmDeviceInventoryHistories","devices"
$DataHash = @{}
foreach ($DataEntity in $DataEntities)
{
    Write-host "Loading $DataEntity"
    [void]$DataHash.Add($DataEntity,(Get-IntuneDataWarehouseData -DataEntity $DataEntity))
}

Then I can run the following code to get the ethernet MAC address and it returns the result instantly:


$DeviceName = "SW-IT-LT-AZURE1"
$DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select -First 1 |
    Select -ExpandProperty ethernetMac 

You could load the entire data warehouse into memory using the following code, then you can simply work with the resultant hashtable:


$DataEntities = Get-IntuneDataWarehouseData -ListDataEntities
$DataHash = @{}
foreach ($DataEntity in $DataEntities)
{
    Write-host "Loading $DataEntity"
    [void]$DataHash.Add($DataEntity,(Get-IntuneDataWarehouseData -DataEntity $DataEntity))
}

The benefit of a data warehouse, of course, is that you can review snapshots of data over a period of time, analyse them for trends and identify when things changed. The following example uses the data hashtable to report the device inventory history of a specific device over time (note that it also references the ‘dates’ entity, so make sure that entity is loaded too – for example by loading the full warehouse as above). The Intune Data Warehouse keeps up to 90 days of historic data. In particular, I want to see how much the free space on disk is changing over time.


$DeviceName = "SW-IT-LT-158"
$Results = $DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select dateKey, deviceName,
        softwareVersion,
        @{e={$([math]::Round(($_.storageFree / 1GB),2))};l="storageFree (GB)"},
        @{e={$([math]::Round(($_.storageTotal / 1GB),2))};l="storageTotal (GB)"}
foreach ($Result in $results){
    $Result | Add-Member -MemberType NoteProperty -Name date -Value (($DataHash['dates'] | Where {$_.dateKey -eq $Result.dateKey}).fullDate | Get-Date -Format "dd MMM yyyy")
}
$Results | Select deviceName,date,softwareVersion,'storageFree (GB)','storageTotal (GB)' | ft

You can readily see that it’s necessary to manipulate the data quite a bit to get the results I want. For example, to do something equivalent to a ‘join’ in SQL I am using Where-Object, and to add data from another table to my results I am using Add-Member. I am also converting the storage values into GB and formatting the date using the UK short date format.

[Screenshot: device inventory history results]

The results are returned in an array object, but for data like this it can also be useful to use a datatable, as you would for SQL data. Then you can add or remove columns, change the column order, set the datatype for a column, change headers and so on.

This code does exactly the same thing as the last example, but using a datatable for the results.


$DeviceName = "sw-it-lt-158"
$Datatable = New-Object System.Data.DataTable
[void]$Datatable.Columns.AddRange(@('deviceName','date','softwareVersion','storageFree (GB)','storageTotal (GB)'))
$Results = $DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select dateKey,
        deviceName,
        softwareVersion,
        @{e={$([math]::Round(($_.storageFree / 1GB),2))};l="storageFree (GB)"},
        @{e={$([math]::Round(($_.storageTotal / 1GB),2))};l="storageTotal (GB)"}
foreach ($Result in $results){
    [datetime]$Date = ($DataHash['dates'] | Where {$_.dateKey -eq $Result.dateKey}).fullDate
    [void]$DataTable.Rows.Add($Result.deviceName,$Date.ToShortDateString(),$Result.softwareVersion,$Result.'storageFree (GB)', $Result.'storageTotal (GB)')
}
$Datatable | ft

Reviewing the results I can see that the available disk space is decreasing slightly over time. It would be nice to see that data represented graphically, and of course this is where the integration with PowerBI will shine, but we can also generate graphical charts in PowerShell, so let’s give that a go.

Here is a function I wrote that will generate a spline chart using the .Net chart controls and display it in a WPF window. It takes a single series of data: you provide a title, a data object as an array, and the X and Y axis names (which must match the property names in the data object).

Function New-SingleSeriesSplineChart {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        $Title,
        [Parameter(Mandatory=$true)]
        $Data,
        [Parameter(Mandatory=$true)]
        $AxisX,
        [Parameter(Mandatory=$true)]
        $AxisY
    )
    # Add required assemblies
    Add-Type -AssemblyName PresentationFramework,System.Windows.Forms,System.Windows.Forms.DataVisualization,System.Drawing
    # Create a WPF Window
    $Window = New-Object System.Windows.Window
    $Window.Title = $Title
    $Window.Height = 800
    $Window.Width = 800
    $Window.WindowStartupLocation = "CenterScreen"
    # Add an image to the Window
    $Image = New-Object System.Windows.Controls.Image
    $Image.Height = "NaN"
    $Image.Width = "NaN"
    $Window.AddChild($Image)
    # Function to create a .Net spline chart
    Function Create-SplineChart {
        param($Title,$Data,$AxisX,$AxisY)
        # Create a chart object
        $Chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
        $Chart.Width = 800
        $Chart.Height = 800
        $Chart.Left = 10
        $Chart.Top = 10
        # Create a chartarea to draw on and add this to the chart
        $ChartArea = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
        $ChartArea.AxisY.Minimum = 0
        $ChartArea.AxisX.Minimum = 0
        $ChartArea.AxisX.Interval = 1
        $ChartArea.AxisX.IsLabelAutoFit = $false
        $ChartArea.AxisX.LabelStyle.Angle = -45
        $ChartArea.Area3DStyle.Enable3D = $True
        $ChartArea.Area3DStyle.Inclination = "10"
        $ChartArea.Area3DStyle.Rotation = "10"
        $ChartArea.BackColor = "AliceBlue"
        $ChartArea.AxisX.LabelStyle.Font = (New-Object System.Drawing.Font -ArgumentList "Segoe UI", "12")
        $Chart.ChartAreas.Add($ChartArea)
        [void]$Chart.Series.Add($AxisY)
        # Add a legend
        $Legend = New-Object System.Windows.Forms.DataVisualization.Charting.Legend
        $Chart.Legends.Add($Legend)
        $Chart.Legends[0].Docking = "Bottom"
        $Chart.Legends[0].Font = (New-Object System.Drawing.Font -ArgumentList "Segoe UI", "12")
        $Chart.Legends[0].Alignment = "Center"
        # Add a datapoint for each value specified in the provided data
        $Data | foreach {
            $datapoint = New-Object System.Windows.Forms.DataVisualization.Charting.DataPoint(0, $_.$AxisY)
            $datapoint.AxisLabel = $_.$AxisX
            $Chart.Series[$AxisY].Points.Add($datapoint)
        }
        # Set the chart type
        $Chart.Series[$AxisY].ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Spline
        # Set the title of the chart
        $TitleObj = New-Object System.Windows.Forms.DataVisualization.Charting.Title
        $Chart.Titles.Add($TitleObj)
        $Chart.Titles[0].Font = (New-Object System.Drawing.Font -ArgumentList "Segoe UI", "18")
        $Chart.Titles[0].Text = $Title
        # Save the chart to a memory stream
        $Stream = New-Object System.IO.MemoryStream
        $Chart.SaveImage($Stream,"png")
        $script:ImageStream = $Stream.GetBuffer()
        $Stream.Dispose()
    }
    # Add an event to display the chart when the window is opened
    $Window.Add_ContentRendered({
        # Create the chart
        Create-SplineChart -Title $Title -Data $Data -AxisX $AxisX -AxisY $AxisY
        # Set the image source
        $Image.Source = $ImageStream
        $This.Activate()
    })
    # Display window
    $null = $Window.Dispatcher.InvokeAsync{$Window.ShowDialog()}.Wait()
}

To generate the chart, I will use the results from my previous example (not the datatable but the array), sort them by date, select the last 20 data snapshots, select the X and Y axis data into a new object and provide this to the chart function:


$Data = $Results | Sort dateKey | Select date,'storageFree (GB)' | Select -Last 20
New-SingleSeriesSplineChart -Title "Trend of Available Free Storage on SW-IT-LT-158" -Data $Data -AxisX "date" -AxisY "storageFree (GB)"

Now I have a nice graphical view 🙂

[Screenshot: spline chart of available free storage]

I have focused just on devices in this blog, but there is lots of data available in the Intune Data Warehouse, including users, policies, compliance, configurations, MAM data and more, all of which can provide valuable insights into your MDM estate. Whether you use PowerShell, PowerBI, Excel or another tool, the ability to view and analyse historic data is a welcome improvement to the ever-evolving Intune service.