Use Proactive remediations to report on or install the Microsoft Update Health tools

Microsoft recently made a download available for their Update Health tools. If you’re using Microsoft Endpoint Manager and enrolling or co-managing Windows devices, these tools need to be installed to make use of the capability for expediting quality updates.

For devices connected to Windows Update or Windows Update for Business these tools should already be installed, but in some cases they aren’t – and those devices can’t have updates expedited.

If the tools aren’t installed, MS would like some feedback so they can figure out why. But since they’ve made the tools available for download, we also have the option to deploy them manually if desired – something you might want to do when migrating from another software update solution such as MEMCM to MEM / WUfB, for example, so the tools are pre-deployed and ready for the expedite feature right away.

I created some scripts that can be used with Proactive remediations – the detection script will report on whether the Update Health tools are installed, and optionally you can use the remediation script to go ahead and download and install the tools if they are missing, without needing to package them as an app.

Devices with the tools installed will report the tools version and install date in the Pre-remediation detection output. Devices where the tools are missing will report “Update tools not installed” and if remediated will show the same tools info in the Post-remediation detection output.
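If you want a starting point, here’s a minimal sketch of what the detection logic could look like. It assumes the tools register themselves in the standard Uninstall registry keys with the display name ‘Microsoft Update Health Tools’ – verify the name and adjust the output to suit your own requirements.

# Proactive remediations detection sketch: exit 0 = compliant (installed), exit 1 = run remediation
$UninstallKeys = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
                 'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
$App = Get-ItemProperty -Path $UninstallKeys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -eq 'Microsoft Update Health Tools' }
If ($App)
{
    Write-Output "$($App.DisplayName) | Version: $($App.DisplayVersion) | Installed: $($App.InstallDate)"
    Exit 0
}
Write-Output "Update tools not installed"
Exit 1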

I’d suggest running a one-time deployment when you need it rather than running it repeatedly on a schedule.

Querying Windows Build Version History with the Intune Data Warehouse and PowerShell

I had an interesting requirement recently which was to review the OS build numbers of a group of computers over time. This would reveal not only when they got patched, but which patches they installed and when they installed a feature update, for example. Since we aren’t using the Data warehouse in MEMCM and our historic hardware inventory data doesn’t contain the full OS build numbers with the patch level, I turned to MEM and the Intune Data Warehouse to get the information.

A couple of years ago I posted a blog detailing how to query the IDW with PowerShell, using the recommended app registration in Azure and the whole 9 yards, but this time I simplified things a bit.

The collection we need to query is the devicePropertyHistories collection which contains the osVersion property. According to the MS docs, the amount of snapshot data retained varies from table to table, but in my tenant I was able to retrieve 60 days of historic data from this collection.

Unfortunately, the OData query options are limited for the data warehouse, supporting only a subset of the options you might normally use querying Microsoft Graph, for example. This means we basically need to pull in the entire content of the collection, then search on it. We can limit the number of days we want to retrieve data for though, which can speed up the REST API responses.

To start, let’s set the $ProgressPreference variable – this speeds up our web requests by not displaying the progress bar.

$ProgressPreference = 'SilentlyContinue'

Then let’s define the URL we want to use to connect to the data warehouse. You can get the base URL from the MEM console > Reports > Data warehouse. We’re using the API version 1.0 and I’ve added the devicePropertyHistories collection to the URL.

$DataWarehouseURL = "https://fef.<mytenant>.manage.microsoft.com/ReportingService/DataWarehouseFEService/devicePropertyHistories?api-version=v1.0"
# Optionally set the maximum number of days to query
#$DataWarehouseURL = "https://fef.<mytenant>.manage.microsoft.com/ReportingService/DataWarehouseFEService/devicePropertyHistories?api-version=v1.0&maxhistorydays=14"

Next, let’s get an access token. This assumes that your account has the appropriate permissions or role as detailed here. We’ll use the Az.Account module, so get that installed if you don’t have it.

# Get token for current user (needs appropriate role like Intune Service Administrator)
# Install-Module Az.Accounts
Connect-AzAccount 
$Token = Get-AzAccessToken -ResourceUrl "https://api.manage.microsoft.com/"

Now we’ll prepare the request header, invoke the request, convert and filter the results to find only Windows 10 devices and add them to an arraylist.

$headers = @{'Authorization'="Bearer " + $Token.Token}
$Result = [System.Collections.ArrayList]::new()
Write-host "Processing $DataWarehouseURL"
$WebRequest = Invoke-WebRequest -Uri $DataWarehouseURL -Method Get -Headers $headers
$Content = $WebRequest.Content | ConvertFrom-Json
[void]$Result.Add(($Content.value | Where {$_.osVersion -match "10.0.1" -and $_.jailBroken -eq "Unknown"}))

Because the results are paged and there are more records than will be returned in a single request, we loop through the ‘nextLink’ URL from each response until we’ve built the entire dataset. This can take some time depending how far back you are querying and how much data you have.

Do {
    Write-host "Processing $($Content.'@odata.nextLink')"
    $WebRequest = Invoke-WebRequest -Uri $Content.'@odata.nextLink' -Method Get -Headers $headers
    $Content = $WebRequest.Content | ConvertFrom-Json
    [void]$Result.Add(($Content.value | Where {$_.osVersion -match "10.0.1" -and $_.jailBroken -eq "Unknown"}))
}
While ($null -ne $Content.'@odata.nextLink')

Finally, we flip the arraylist into a regular array (just because I want to :))

$Results = @()
Foreach ($item in $Result)
{
    $Results += $item
}

Now the $Results variable contains our entire dataset, which you can view and query as you wish.

In my case, I want to search for the osVersion changes over time for an individual machine, so:

$ComputerName = 'PC001'
$Records = $Results | 
    Where {$_.deviceName -eq $ComputerName} | 
    Select deviceName,osVersion,@{l='Date';e={(Get-Date ("$(($_.dateKey.ToString()).Substring(0,4))" + " " + "$(($_.dateKey.ToString()).Substring(4,2))" + " " + "$(($_.dateKey.ToString()).Substring(6,2))").ToDateTime((Get-Culture).DateTimeFormat)).ToLongDateString()}}
$Records

The dateKey isn’t in a DateTime-friendly format, so we manipulate it a bit to avoid having to cross-reference the dates entity.
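Incidentally, judging from the code above the dateKey is just a yyyyMMdd integer, so you could also parse it directly if you prefer – a small illustration, assuming that format holds for your data:

# e.g. a dateKey of 20210614 becomes a friendly date string
$dateKey = 20210614
[datetime]::ParseExact($dateKey.ToString(),'yyyyMMdd',[System.Globalization.CultureInfo]::InvariantCulture).ToLongDateString()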

Viewing the results, I can see this device is on 20H2 and is installing the monthly CUs not long after they are released.

In another example, I can see the device was on 20H2 and got upgraded to 21H1 with the current CU, and a couple of days later, after ‘Patch Tuesday’ (is it a myth?), it got a newer CU.

Finally, for a group of computers I wanted to see what patch level they were at before they got the 20H2 feature update installed, and what patch level they were at after. I prepared a CSV file with computer names, and ran the following code, exporting the results to another CSV file.

$RecordArray = @()
$PCList = Get-Content D:\Temp\Successes.csv -ReadCount 0
foreach ($PC in $PCList)
{
    $PC
    $Records = $Results | Where {$_.deviceName -eq $PC} | Select deviceName,osVersion,@{l='Date';e={(Get-Date ("$(($_.dateKey.ToString()).Substring(0,4))" + " " + "$(($_.dateKey.ToString()).Substring(4,2))" + " " + "$(($_.dateKey.ToString()).Substring(6,2))").ToDateTime((Get-Culture).DateTimeFormat)).ToLongDateString()}}
    If ($Records)
    {
        $UpdateRecord = $Records | Where {$_.osVersion -match "10.0.19042."} | Select -First 1
        $PriorUpdateRecordIndex = $Records.IndexOf($UpdateRecord) -1
        $PriorUpdateRecord = $Records[$PriorUpdateRecordIndex]
        $RecordArray += [PSCustomObject]@{
            deviceName = $PC
            osVersionBeforeUpdate = $PriorUpdateRecord.osVersion
            dateBeforeUpdate = $PriorUpdateRecord.Date
            osVersionAfterUpdate = $UpdateRecord.osVersion
            dateAfterUpdate = $UpdateRecord.Date
        }
    }
}
$RecordArray | Export-CSV  D:\Temp\Successes_Upgrades.csv -Force -NoTypeInformation

This made for some interesting reading!

There’s a wealth of data to access in the Intune Data warehouse and you can use PowerShell, any REST API client, or even Power BI to query and report on the data. Give it a go!

Create custom Intune reports with Microsoft Graph, Azure Automation and Power BI

Microsoft Endpoint Manager aka Intune has been around for a while now and has evolved quite significantly since its early days and the old Silverlight portal (remember that?). Historically Intune hasn’t been particularly good with its reporting capability, but since the end of 2019 and the announcement of the new reporting framework, things have started to improve, with more built-in reports appearing all the time and now the ability to create your own Azure Monitor workbooks from log analytics – welcome additions for sure.

That being said, if you use Power BI as a corporate reporting service there’s no better place to create custom reports according to your needs and share those reports with interested parties without having to give them access to Intune and the MEM portal. Intune data can be queried using Microsoft Graph, yet there is currently no native way to use Microsoft Graph as a data source in Power BI short of developing a custom connector. With the advent of log analytics data for Intune, we will be able to export log analytics queries to Power BI using the M query language, which looks promising.

In the meantime, we need to use a little creativity to get data out of Intune and into Power BI to furnish a custom report. I’ve seen examples of using Logic Apps and Function apps in Azure as an intermediary process, which is cool. I decided to take a slightly different approach and use an Azure automation account to simply export data from Microsoft Graph on a schedule and dump it into an Azure storage account. Power BI supports the use of blob storage as a data source so this works quite nicely.

  • An Azure automation runbook queries Microsoft Graph, organizes the data a bit then exports it into CSV files
  • The CSV files are then uploaded as blobs in a container in a storage account
  • Power BI then uses those CSV files as its data source, allowing us to create custom reports from the data (a rough sketch of the first two steps follows)
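To make that concrete, here’s a rough sketch of the first two steps as they might appear in the runbook. The resource names are placeholders and it assumes the automation account’s managed identity (or Run as account) has the necessary Graph permissions and a blob contributor role on the storage account – paging of the Graph results is omitted for brevity.

# Authenticate with the automation account's managed identity
Connect-AzAccount -Identity | Out-Null
$Token = Get-AzAccessToken -ResourceUrl "https://graph.microsoft.com"
$Headers = @{ Authorization = "Bearer $($Token.Token)" }
# Query Microsoft Graph for enrolled devices
$Uri = "https://graph.microsoft.com/beta/deviceManagement/managedDevices"
$Devices = (Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get).value
# Export to CSV and upload as a blob
$CsvPath = Join-Path $env:TEMP "IntuneDevices.csv"
$Devices | Export-Csv -Path $CsvPath -NoTypeInformation
$Context = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount
Set-AzStorageBlobContent -File $CsvPath -Container "intunereports" -Blob "IntuneDevices.csv" -Context $Context -Force | Out-Null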

There are a few advantages to this approach:

  • You can use Power BI to create your own custom reports from any Microsoft Graph data across Microsoft 365, and you can combine data from different sources in a single report
  • Reports support using a scheduled refresh in the Power BI service for keeping data up-to-date
  • Everything is in the cloud – there is no requirement for on-prem resources or a data gateway
  • You can use a managed identity or a Run as account for simplified, secure no-credential authentication
  • Graph data can be manipulated using PowerShell before export allowing customization of the final data set

On the last point, some data in Microsoft Graph contains nested data, e.g. a field may contain its own set of key-value pairs that you would want to expand out into their own fields to report on them. Or you may wish to create your own calculated fields from the data, for example how many days since the last sync of an enrolled device.
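As a simple illustration of that kind of manipulation (assuming $Devices already holds managedDevice objects returned from Graph, as in the runbook sketch above – the nested ‘deviceActionResults’ property and the column names are just examples):

# Flatten a nested property and add a calculated 'days since last sync' column before export
$Export = $Devices | Select-Object deviceName, operatingSystem, osVersion,
    @{l='DaysSinceLastSync'; e={ (New-TimeSpan -Start ([datetime]$_.lastSyncDateTime) -End (Get-Date)).Days }},
    @{l='LastActionState';   e={ ($_.deviceActionResults | Select-Object -Last 1).actionState }}
$Export | Export-Csv -Path (Join-Path $env:TEMP "IntuneDevices.csv") -NoTypeInformation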

Additionally Microsoft recently announced support for enabling a managed identity for an automation account, and this is excellent for simplifying and improving the security of access to Microsoft Graph.

The only slight disadvantage to this approach perhaps is that data in the report won’t be live – it will only be as good as your refresh schedule.

Just to give an example, here are a couple of screenshots of a report I created from Intune data for enrolled devices, including Android, iOS and Windows.

I’ve created a new docs site where you can download this report as a template, the runbook used to export the data as well as access detailed instructions on how to set up a data export from MS Graph into Power BI.

docs.smsagent.blog

There’s also a bonus runbook that exports a list of unhealthy MEMCM clients and sends it as an email report. We don’t want any unhealthy MEMCM clients now do we?!

Getting Creative: a Bespoke Solution for Feature Update Deployments

This is the first blog post in what I hope will be a series of posts demonstrating several custom solutions I created for things such as feature update deployments, managing local admin password rotation, provisioning Windows 10 devices, managing drive mappings and more. My reasons for creating these solutions were to overcome some of the current limitations in existing products or processes, make things more cloud-first and independent of existing on-prem infrastructure where possible, and to more exactly meet the requirements of the business.

Although I will try to provide a generalised version of the source code where possible, I am not providing complete solutions that you can go ahead and use as is. Rather my intention is to inspire your own creativity, to give working examples of what could be done if you have the time and resource, and to provide source code as a reference or starting point for your own solutions should you wish to create them!

Someone asked me recently how we deploy feature updates and it was a difficult question to answer other than to say we use a custom-built process. Having used some of the existing methods available (ConfigMgr Software Updates, ConfigMgr custom WaaS process, Windows Update for Business) we concluded there were shortcomings in each of them, and this provided inspiration to create our own, customized process to give us the control, reliability, user experience and reporting capability that we desired. Don’t get me wrong – I am not saying these methods aren’t good – they just couldn’t do things exactly the way we wanted.

So I set out to create a bespoke process – one that we could customize according to our needs, that was largely independent of our existing Configuration Manager infrastructure and that could run on any device with internet access. This required making use of cloud services in Azure as well as a lot of custom scripting! In this blog, I’ll try to cover what I did and how it works.

User Experience

First, let’s look at the end user experience of the feature update installation process – this was key for us: to improve the user experience, keeping it simple yet informative, and able to respond appropriately to any upgrade issues.

Once the update is available to a device, a toast notification is displayed notifying the user that an update is available. Initially, this displays once a day and automatically dismisses after 25 seconds. (I’ve blanked out our corporate branding in all these images)

We use a soft deadline – i.e. the update is never forced on the user. Enforcing compliance is handled by user communications and involvement from our local technicians. With one week left before the deadline, we increase the frequency of the notifications to twice per day.

If the deadline has passed, we take a more aggressive approach with the notifications, modifying the image and text and displaying it every 30 minutes, and it doesn’t leave the screen unless the user actions it or dismisses it.

The update can be installed via a shortcut on the desktop, or in the last notification it can be initiated from the notification itself.

Once triggered, a custom UI is displayed introducing the user to the update and what to expect.

When the user clicks Begin, we check that a power adapter is connected and no removable USB devices are attached – if they are, we prompt the user to remove them first.

The update runs in three phases or stages – these correspond to the PreDownload, Install and Finalize commands on the update (more on that later). The progress of each stage is polled from the registry, as is the Setup Phase and Setup SubPhase.

Note that the user cannot cancel the update once it starts and this window will remain on the screen and on top of all other windows until the update is complete. The user can click the Hide me button, and this will shrink the window like so:

This little window also cannot be removed from the screen, but it can be moved around and is small enough to be unobtrusive. When the update has finished installing, or when the user clicks Restore, the main window will automatically display again and report the result of the update.

The colour scheme is based on Google’s material design, by the way.

If the update failed during the online phase, the user can still initiate the update from the desktop shortcut but toast notifications will no longer display as reminders. The idea is that IT can attempt to remediate the device and run the update again after.

If successful, the user can click Restart to restart the computer immediately. Then the offline phase of the upgrade runs, where you see the usual light blue screen and white Windows update text reporting that updates are being installed.

Once complete, the user will be brought back to the login screen, and we won’t bother them anymore.

If the update rolled back during the offline phase, we will detect this next time they log in and notify them one time:

Logging and Reporting

The entire update process is logged right from the start to a log file on the local machine. We also send ‘status messages’ at key points during the process and these find their way to an Azure SQL database which becomes the source for reporting on update progress across the estate (more on this later).

A PowerBI report gives visual indicators of update progress as well as a good amount of detail from each machine, including: update status; whether it passed or failed readiness checks and, if failed, why; whether it passed the compatibility assessment; the error code if the assessment or the install failed; whether any hard blocks were found; SetupDiag results (2004 onward); how long the update took to install; and a bunch of other stuff we find useful.

Since 2004 though, we have started inventorying certain registry keys using ConfigMgr to give us visibility of devices that won’t upgrade because of a Safeguard hold or other reason, so we can target the upgrade only to devices that aren’t reporting any known compatibility issues.

If a device performs a rollback, we can get it to upload key logs and registry key dumps to an Azure storage account where an administrator can remotely diagnose the issue.

How does it work?

Now let’s dive into the process in more technical detail.

Deployment Script

The update starts life with a simple PowerShell script that does the following:

  • Creates a local directory to use to cache content, scripts and logs etc
  • Temporarily stores some domain credentials in the registry of the local SYSTEM account as encrypted secure strings for accessing content from a ConfigMgr distribution point if necessary (more on this later)
  • Downloads a manifest file that contains a list of all files and file versions that need to be downloaded to run the update. These include scripts, dlls (for the UI), xml definition files for scheduled tasks etc
  • Each file is then downloaded to the cache directory from an Azure CDN
  • 3 scheduled tasks are then registered on the client:
    • A ‘preparer’ task which runs prerequisite actions
    • A ‘file updater’ task which keeps local files up-to-date in case we wish to change something
    • A ‘content cleanup’ task which is responsible for cleaning up in the event the device gets upgraded through any means
  • A ‘status message’ is then sent as an http request, creating a new record for the device in the Azure SQL database

This script can be deployed through any method you wish, including Configuration Manager, Intune or just manually, however it should be run in SYSTEM context.

Content

All content needed for the update process to run is put into a container in a storage account in Azure. This storage account is exposed via an Azure Content Delivery Network (CDN). This means that clients can get all the content they need directly from an internet location with minimal latency no matter where they are in the world.

Feature Update Files

The files for the feature update itself are the ESD file and WindowsUpdateBox.exe that Windows Update uses. You can get these files from Windows Update, WSUS, or as in our case, from Configuration Manager via WSUS. We simply download the feature updates to a deployment package in ConfigMgr and grab the content from there.

You could of course use an ISO image and run setup.exe, but the ESD files are somewhat smaller in size and are sufficient for purpose.

The ESD files are put into the Azure CDN so the client can download them from there, but we also allow the client the option to get the FU content from a local ConfigMgr distribution point if they are connected to the corporate network locally. Having this option allows considerably quicker content download. Since IIS on the distribution points is not open to anonymous authentication, we use the domain credentials stamped to the registry to access the DP and download the content directly from IIS (credentials are cleaned from the registry after use).
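As a rough sketch of that download path (the registry location, value names and DP URL here are purely illustrative, and the encrypted standard string can only be decrypted by the same SYSTEM account that created it):

# Rebuild the credential stored by the deployment script, then pull the ESD from the DP over IIS
$User = Get-ItemPropertyValue -Path 'HKLM:\SOFTWARE\FUSolution' -Name 'User'
$EncryptedPwd = Get-ItemPropertyValue -Path 'HKLM:\SOFTWARE\FUSolution' -Name 'Pwd'
$Cred = [PSCredential]::new($User,(ConvertTo-SecureString -String $EncryptedPwd))
$Source = "http://dp01.contoso.com/SMS_DP_SMSPKG`$/ABC00123/19042.en-gb.esd"
Start-BitsTransfer -Source $Source -Destination "$CacheDir\19042.en-gb.esd" -Credential $Cred -Authentication Negotiate   # $CacheDir is the solution's local cache folder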

Status Messages

Similar to how ConfigMgr sends status messages to a management point, this solution also sends status messages at key points during the process. This works by using Azure Event Grid to receive the message sent from the client as an http request. The Event Grid sends the message to an Azure Function, and the Azure Function is responsible for updating the Azure SQL database created for this purpose with the current upgrade status of the device. The reason for doing it this way is that sending an http request to Event Grid is very quick and doesn’t hold up the process. Event Grid forwards the message to the Azure Function and can retry the message in case it can’t get through immediately (although I’ve never experienced any failures or dead-lettering in practice). The Azure Function uses a Managed Identity to access the SQL database, which means the SQL database never needs to be exposed outside of its sandbox in Azure, and no credentials are needed to update the database.
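Just to illustrate what a ‘status message’ amounts to on the client side, here’s a hedged sketch – the topic endpoint, access key and payload are placeholders and the real solution sends rather more data:

# Post a status message to an Event Grid topic
$TopicEndpoint = "https://fu-statusmessages.westeurope-1.eventgrid.azure.net/api/events"
$TopicKey = "<topic access key>"
$Event = @(
    @{
        id          = [guid]::NewGuid().ToString()
        eventType   = "FeatureUpdate.StatusMessage"
        subject     = $env:COMPUTERNAME
        eventTime   = (Get-Date).ToUniversalTime().ToString('o')
        data        = @{ ComputerName = $env:COMPUTERNAME; Status = "PreDownload complete" }
        dataVersion = "1.0"
    }
)
Invoke-RestMethod -Uri $TopicEndpoint -Method Post -Headers @{ 'aeg-sas-key' = $TopicKey } -Body (ConvertTo-Json $Event -Depth 3) -ContentType 'application/json'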

We then use PowerBI to report on the data in the database to give visibility of where in the process every device is, if there are any issues that need addressing and all the stats that are useful for understanding whether devices get content from Azure or a local DP, what their approximate bandwidth is, how long downloads took, whether they were wired or wireless, make and model, upgrade time etc.

Preparation Script

After the initial deployment script has run, the entire upgrade process is driven by scheduled tasks on the client. The first task to run is the Preparation script and this attempts to run every hour until successful completion. This script does the following things:

  • Create the registry keys for the upgrade. These keys are stamped with the update progress and the results of the various actions such as pre-req checks, downloads etc. When we send a ‘status message’ we simply read these keys and send them on. Having progress stamped in the local registry is useful if we need to troubleshoot on the device directly.
  • Run readiness checks, such as
    • Checking for client OS
    • Checking disk space
  • Check for internet connectivity
  • Determine the approximate bandwidth to the Azure CDN and measure latency. This is done by downloading a 100MB file from the CDN and timing the download, and by using ‘psping.exe’ to measure latency. From this, we can calculate an approximate download time for the main ESD file (see the sketch after this list).
  • Determine if the device is connected by wire or wireless
  • Determine if the device is connected to the corporate network
  • If the device is on the corporate network, we check latency to all the ConfigMgr distribution points to determine which one will be the best DP to get content from
  • Determine whether OS is business or consumer and which language. This helps us figure out which ESD file to use.
  • Download WindowsUpdateBox.exe and verify the hash
  • Download the feature update ESD file and verify the hash
    • Downloads of FU content are done using BITS transfers as this proved the most reliable method. Code is added to handle BITS transfer errors to add resilience.
  • Assuming all the above is done successfully, the Preparation task will be disabled and the PreDownload task created.
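Here’s the kind of calculation the bandwidth check boils down to – the test file URL, ESD size and psping location are placeholders:

# Time the 100MB test download, estimate throughput and the ESD download time
$TestUrl = "https://mycdn.azureedge.net/fu-content/100MB.bin"
$Elapsed = Measure-Command { Start-BitsTransfer -Source $TestUrl -Destination "$env:TEMP\100MB.bin" }
$Mbps = [math]::Round((100 * 8) / $Elapsed.TotalSeconds,2)
$EsdSizeMB = 3200
$EstimatedDownloadMinutes = [math]::Round((($EsdSizeMB * 8) / $Mbps) / 60,1)
# TCP latency to the CDN endpoint using Sysinternals psping
$LatencyOutput = & "$CacheDir\psping.exe" -accepteula -n 4 "mycdn.azureedge.net:443"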

PreDownload Script

The purpose of this script is to run the equivalent of a compatibility assessment. When using the ESD file, this is done with the /PreDownload switch on WindowsUpdateBox.exe. Should the PreDownload fail, the error code will be logged to the registry. Since 2004, we also read the SetupDiag results and stamp these to the registry. We also check the Compat*.xml files to look for any hard blocks and if found, we log the details to the registry.

If the PreDownload failed, we change the schedule of the task to run twice a week. This allows for remediation to be performed on the device before attempting the PreDownload assessment again.

If the PreDownload succeeds, we disable the PreDownload task and create two new ones – a Notification task and an Upgrade task.

We also create a desktop shortcut that the user can use to initiate the upgrade.

Notification Script

The Notification script runs in the user context and displays toast notifications to notify the user that the upgrade is available, what the deadline is and how to upgrade, as already mentioned.

Upgrade Script

When the user clicks the desktop shortcut or the ‘Install now’ button on the toast notification, the upgrade is initiated. Because the upgrade needs to run with administrative privilege, the only thing the desktop shortcut and toast notification button does is to create an entry in the Application event log. The upgrade scheduled task is triggered when this event is created and the task runs in SYSTEM context. The UI is displayed in the user session with the help of the handy ServiceUI.exe from the MDT toolkit.
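A stripped-down sketch of that mechanism (the event source name and event ID are made up; the source has to be registered in advance from an elevated context, because creating one requires admin rights):

# Registered once by the SYSTEM-context deployment script
New-EventLog -LogName Application -Source "FeatureUpdateSolution" -ErrorAction SilentlyContinue
# What the desktop shortcut / toast button runs in the user context
Write-EventLog -LogName Application -Source "FeatureUpdateSolution" -EventId 5000 -EntryType Information -Message "User initiated the feature update"
# The upgrade scheduled task uses an event trigger with a subscription along these lines
$Subscription = "<QueryList><Query Id='0' Path='Application'><Select Path='Application'>*[System[Provider[@Name='FeatureUpdateSolution'] and EventID=5000]]</Select></Query></QueryList>"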

Upgrade UI

The user interface part of the upgrade is essentially a WPF application coded in PowerShell. The UI displays some basic upgrade information for the user and once they click ‘Begin’ we run the upgrade in 3 stages (a rough sketch follows the list):

  1. PreDownload. Even though we ran this already, we run again before installing just to make sure nothing has changed since, and it doesn’t take long to run.
  2. Install. This uses the /Install switch on WindowsUpdateBox.exe and runs the main part of the online phase of the upgrade.
  3. Finalize. This uses the /Finalize switch and finalizes the update in preparation for a computer restart.
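As a very rough sketch of the staged invocation (WindowsUpdateBox.exe needs additional arguments – ESD path, working directory and so on – which I’ve omitted here, so treat this purely as an outline):

$Box = "$CacheDir\WindowsUpdateBox.exe"   # $CacheDir is the solution's local cache folder
Foreach ($Stage in 'PreDownload','Install','Finalize')
{
    # Run each stage in turn and stop if one of them fails
    $Process = Start-Process -FilePath $Box -ArgumentList "/$Stage" -Wait -PassThru -NoNewWindow
    If ($Process.ExitCode -ne 0)
    {
        # Stamp the failure to the registry and surface it in the UI
        Break
    }
}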

The progress of each of these phases is tracked in the registry and displayed in the UI using progress bars. If there is an issue, we notify the user and IT can get involved to remediate.

If successful, the user can restart the computer immediately or at a later point (though we discourage this!). We don’t stop the user from working while the upgrade is running in the online phase and we allow them to partially hide the upgrade window so the upgrade does not hinder user productivity (similar to how WUfB installs an update in the background).

After the user restarts the computer, the usual Windows Update screens take over until the update has installed and the user is brought to the login screen again.

Drivers and Stuff

We had considered upgrading drivers and even apps with this process, as we did for the 1903 upgrade, however user experience was important for us and we didn’t want the upgrade to take any longer than necessary, so we decided not to chain anything onto the upgrade process itself but handle other things separately. That being said, because this is a custom solution it is perfectly possible to incorporate additional activities into it if desired.

Rollback

In the event that the OS was rolled back during the offline phase, a scheduled task will run that will detect this and raise a toast notification to inform the user. We have a script that will gather logs and data from the device and upload it to a storage account in Azure where an administrator can remotely diagnose the issue. I plan to incorporate that as an automatic part of the process in a future version.
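For the log upload itself, a PUT to a blob with a SAS token is enough – a minimal sketch, where the container SAS is a placeholder and the choice of logs to gather is yours:

# Upload a setup log to an Azure blob container using a write-enabled SAS
$SasBase = "https://mystorageaccount.blob.core.windows.net/rollbacklogs"
$SasToken = "?<container SAS token with write permission>"
$LogFile = "$env:windir\Panther\setupact.log"
$BlobName = "$($env:COMPUTERNAME)_setupact.log"
Invoke-WebRequest -Uri "$SasBase/$BlobName$SasToken" -Method Put -InFile $LogFile -Headers @{'x-ms-blob-type'='BlockBlob'} -UseBasicParsing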

Updater Script

The solution creates an Updater scheduled task which runs once per day. The purpose of this task is to keep the solution up to date. If we want to change something in the process, add some code to a file or whatever is necessary, the Updater will take care of this.

It works by downloading a manifest file from the Azure CDN. This file contains all the files used by the solution with their current versions. If we update something, we upload the new files to the Azure storage account, purge them from the CDN and update the manifest file.

The Updater script will download the current manifest, detect that something has changed and download the required files to the device.
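In outline, the comparison looks something like this – the manifest structure (name/version pairs under a Files property) is an assumption for illustration:

# Compare the CDN manifest against the local copy and download anything newer
$CdnBase = "https://mycdn.azureedge.net/fu-content"
$Remote = Invoke-RestMethod -Uri "$CdnBase/manifest.json"
$Local = Get-Content -Path "$CacheDir\manifest.json" -Raw | ConvertFrom-Json
Foreach ($File in $Remote.Files)
{
    $Current = $Local.Files | Where-Object { $_.Name -eq $File.Name }
    If (-not $Current -or [version]$File.Version -gt [version]$Current.Version)
    {
        Invoke-WebRequest -Uri "$CdnBase/$($File.Name)" -OutFile "$CacheDir\$($File.Name)" -UseBasicParsing
    }
}
# Save the new manifest as the local baseline for the next run
$Remote | ConvertTo-Json -Depth 3 | Set-Content -Path "$CacheDir\manifest.json"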

Cleanup Script

A Cleanup task is also created. When this task detects that the OS has been upgraded to the required version, it will remove all the scheduled tasks and cached content to leave no footprint on the device other than the log file and the registry keys.
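The cleanup logic amounts to something like the following – the task names and target build number are placeholders:

# Once the device is on (or above) the target build, remove the solution's footprint
$TargetBuild = 19042
$CurrentBuild = [int](Get-ItemPropertyValue -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name CurrentBuildNumber)
If ($CurrentBuild -ge $TargetBuild)
{
    'FU-Preparer','FU-FileUpdater','FU-PreDownload','FU-Notification','FU-Upgrade' | ForEach-Object {
        Unregister-ScheduledTask -TaskName $_ -Confirm:$false -ErrorAction SilentlyContinue
    }
    Remove-Item -Path $CacheDir -Recurse -Force -ErrorAction SilentlyContinue
    # Remove the cleanup task itself last
    Unregister-ScheduledTask -TaskName 'FU-Cleanup' -Confirm:$false -ErrorAction SilentlyContinue
}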

Source Files

You can find a generalised version of the code used in this solution in my Github repo as a reference. As mentioned before though, there are many working parts to the solution including the Azure services and I haven’t documented their configuration here.

Final Comments

The main benefit of this solution for us is that it is completely customised to our needs. Although it is relatively complex to create, it is also relatively easy to maintain as well as adapt the solution for new W10 versions. We do still take advantage of products like ConfigMgr to allow devices to get content from a local DP if they are corporate connected, and ConfigMgr / Update Compliance / Desktop Analytics for helping us determine device compatibility and ConfigMgr or Intune to actually get the deployment script to the device. We also make good use of Azure services for the status messages and the cloud database, as well as PowerBI for reporting. So the solution still utilizes existing Microsoft products while giving us the control and customisations that we need to provide a better upgrade experience for our users.

Get a daily admin Audit Report for MEM / Intune

In an environment where you have multiple admin users it’s useful to audit admin activities so everyone can be aware of changes that others have made. I do this for Endpoint Configuration Manager with a daily email report built from admin status messages, so I decided to create something similar for Intune / MEM.

Admin actions are already audited for you in MEM (Tenant Administration > Audit logs) so it’s simply a case of getting that data into an email report. You can do this with Graph (which gives you more data actually) but I decided to use Log Analytics for this instead.

You need a Log Analytics workspace, and you need to configure Diagnostics settings in the MEM portal to send AuditLogs to the workspace.

Then, in order to automate sending a daily report, create a service principal in Azure AD with just the permissions necessary to read data from the Log Analytics workspace. You can do this easily from the Azure portal using CloudShell. In the example below, I’m creating a new service principal with the role “Log Analytics Reader” scoped just to the Log Analytics workspace where the AuditLogs are sent.

$DisplayName = "MEM-Reporting"
$Role = "Log Analytics Reader"
$Scope = "/subscriptions/<subscriptionId>/resourcegroups/<resourcegroupname>/providers/microsoft.operationalinsights/workspaces/<workspacename>"

$sp = New-AzADServicePrincipal -DisplayName $DisplayName -Role $Role -Scope $Scope

With the service principal created, you’ll need to make a note of the ApplicationId:

$sp.ApplicationId

And the secret:

$SP.Secret | ConvertFrom-SecureString -AsPlainText

Of course, if you prefer you can use certificate authentication instead of using the secret key.

Below is a PowerShell script that uses the Az PowerShell module to connect to the log analytics workspace as the service principal, query the IntuneAuditLogs for entries in the last 24 hours, then send them in an HTML email report. Run it with your favourite automation tool.

You’ll need the app Id and secret from the service principal, your tenant Id, your log analytics workspace Id, and don’t forget to update the email parameters.

Sample email report
# Script to send a daily audit report for admin activities in MEM/Intune
# Requirements:
# – Log Analytics Workspace
# – Intune Audit Logs saved to workspace
# – Service Principal with 'Log Analytics reader' role in workspace
# – Azure Az PowerShell modules
# Azure resource info
$ApplicationId = "abc73938-0000-0000-0000-9b01316a9123" # Service Principal Application Id
$Secret = "489j49r-0000-0000-0000-e2dc6451123" # Service Principal Secret
$TenantID = "abc894e7-00000-0000-0000-320d0334b123" # Tenant ID
$LAWorkspaceID = "abcc1e47-0000-0000-0000-b7ce2b2bb123" # Log Analytics Workspace ID
$Timespan = (New-TimeSpan -Hours 24)
# Email params
$EmailParams = @{
    To = 'trevor.jones@smsagent.blog'
    From = 'MEMReporting@smsagent.blog'
    Smtpserver = 'smsagent.mail.protection.outlook.com'
    Port = 25
    Subject = "MEM Audit Report | $(Get-Date -Format ddMMMyyyy)"
}
# Html CSS style
$Style = @"
<style>
table {
    border-collapse: collapse;
    font-family: sans-serif;
    font-size: 12px;
}
td, th {
    border: 1px solid #ddd;
    padding: 6px;
}
th {
    padding-top: 8px;
    padding-bottom: 8px;
    text-align: left;
    background-color: #3700B3;
    color: #03DAC6;
}
</style>
"@
# Connect to Azure with Service Principal
$Creds = [PSCredential]::new($ApplicationId,(ConvertTo-SecureString $Secret -AsPlainText -Force))
Connect-AzAccount -ServicePrincipal -Credential $Creds -Tenant $TenantID
# Run the Log Analytics Query
$Query = "IntuneAuditLogs | sort by TimeGenerated desc"
$Results = Invoke-AzOperationalInsightsQuery -WorkspaceId $LAWorkspaceID -Query $Query -Timespan $Timespan
$ResultsArray = [System.Linq.Enumerable]::ToArray($Results.Results)
# Convert the results to a datatable
$DataTable = New-Object System.Data.DataTable
$Columns = @("Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID")
foreach ($Column in $Columns)
{
    [void]$DataTable.Columns.Add($Column)
}
foreach ($result in $ResultsArray)
{
    $Properties = $Result.Properties | ConvertFrom-Json
    [void]$DataTable.Rows.Add(
        $Properties.ActivityDate,
        $result.Identity,
        $Properties.Actor.ApplicationName,
        $result.OperationName,
        $result.ResultType,
        $Properties.TargetDisplayNames[0],
        $Properties.TargetObjectIDs[0]
    )
}
# Send an email
If ($DataTable.Rows.Count -ge 1)
{
    $HTML = $Datatable |
        ConvertTo-Html -Property "Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID" -Head $Style -Body "<h2>MEM Admin Activities in the last 24 hours</h2>" |
        Out-String
    Send-MailMessage @EmailParams -Body $html -BodyAsHtml
}

Installing and Configuring Additional Languages during Windows Autopilot

I was experimenting with different ways to get additional languages installed and configured during Windows Autopilot and it proved to be an interesting challenge. The following is what I settled on in the end and what produced the results that I wanted.

Here were my particular requirements, but you can customize this per your own need:

  • The primary language should be English (United Kingdom)
  • An additional secondary language of English (United States)
  • Display language should be English (United Kingdom)
  • Default input override should be English (United Kingdom)
  • System locale should be English (United Kingdom)
  • The administrative defaults for the Welcome screen and New user accounts must have a display language, input language, format and location matching the primary language (UK / UK English)
  • All optional features for the primary language should be installed (handwriting, optical character recognition, etc)

To achieve this, I basically created three elements:

  1. Installed the Local Experience Pack for English (United Kingdom)
  2. Deployed a powershell script running in administrative context that sets the administrative language defaults and system locale
  3. Deployed a powershell script running in user context that sets the correct order in the user preferred languages list

This was deployed during Autopilot to a Windows 10 1909 (United States) base image.

Local Experience Packs

Local Experience Packs (LXPs) are the modern way to go for installing additional languages since Windows 10 1803. These are published to the Microsoft Store and are automatically updated. They also install more quickly than the traditional cab language packs that you would install with DISM.

LXPs are available in the Microsoft Store for Business, so they can be synced with Intune and deployed as apps. However, the problem with using LXPs as apps during Autopilot is the order of things. The LXP needs to be installed before the PowerShell script that configures the language defaults runs, and since PowerShell scripts are not currently tracked in the ESP, and apps are the last thing to install in the device setup phase, the scripts will very likely run before the app is installed.

To get around that, I decided to get the LXP from the Volume Licensing Center instead. Then I uploaded this to a storage account in Azure, where it gets downloaded and installed by the PowerShell script. This way I can control the order and be sure the LXP is installed before making configuration changes.

When downloading from the VLC, be sure to select the Multilanguage option:

Then get the highlighted ISO. The 1903 LXPs work for 1909 also.

Get the applicable appx file and the license file from the ISO, zip them, and upload the zip file into an Azure Storage account.

When uploading the zip file, be sure to choose the Account Key authentication type:

Once uploaded, click on the blob and go to the Generate SAS page. Choose Read permissions, set an appropriate expiry date, then copy the Blob SAS URL. You will need this to download the file with PowerShell.

Administrative PowerShell Script

Now let’s create a PowerShell script that will:

  • Download and install the Local Experience Pack
  • Install any optional features for the language
  • Configure language and regional settings and defaults

Here’s the script I’m using for that.

# Admin-context script to set the administrative language defaults, system locale and install optional features for the primary language
# Language codes
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"
$SecondaryInputCode = "0409:00000409"
$PrimaryGeoID = "242"
# Enable side-loading
# Required for appx/msix prior to build 18956 (1909 insider)
New-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\AppModelUnlock -Name AllowAllTrustedApps -Value 1 -PropertyType DWORD -Force
# Provision Local Experience Pack
$BlobURL = "https://mystorageaccount.blob.core.windows.net/mycontainer/en-gb.zip?sp=r&st=2020-03-26T18:02:28Z&se=2050-03-27T00:02:28Z&spr=https&sv=2019-02-02&sr=b&sig=91234567890OYr%2BI0RcryhGFy1DNMlzhfIWbQ%3D"
$DownloadedFile = "$env:LOCALAPPDATA\en-GB.zip"
Try
{
    $WebClient = New-Object System.Net.WebClient
    $WebClient.DownloadFile($BlobURL, $DownloadedFile)
    Unblock-File -Path $DownloadedFile -ErrorAction SilentlyContinue
    Expand-Archive -Path $DownloadedFile -DestinationPath $env:LOCALAPPDATA -Force -ErrorAction Stop
    Add-AppxProvisionedPackage -Online -PackagePath "$env:LOCALAPPDATA\en-gb\LanguageExperiencePack.en-gb.Neutral.appx" -LicensePath "$env:LOCALAPPDATA\en-gb\License.xml" -ErrorAction Stop
    Remove-Item -Path $DownloadedFile -Force -ErrorAction SilentlyContinue
}
Catch
{
    Write-Host "Failed to install Local Experience Pack: $_"
}
# Install optional features for primary language
$UKCapabilities = Get-WindowsCapability -Online | Where {$_.Name -match "$PrimaryLanguage" -and $_.State -ne "Installed"}
$UKCapabilities | foreach {
    Add-WindowsCapability -Online -Name $_.Name
}
# Apply custom XML to set administrative language defaults
$XML = @"
<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">
    <!-- user list -->
    <gs:UserList>
        <gs:User UserID="Current" CopySettingsToDefaultUserAcct="true" CopySettingsToSystemAcct="true"/>
    </gs:UserList>
    <!-- GeoID -->
    <gs:LocationPreferences>
        <gs:GeoID Value="$PrimaryGeoID"/>
    </gs:LocationPreferences>
    <gs:MUILanguagePreferences>
        <gs:MUILanguage Value="$PrimaryLanguage"/>
        <gs:MUIFallback Value="$SecondaryLanguage"/>
    </gs:MUILanguagePreferences>
    <!-- system locale -->
    <gs:SystemLocale Name="$PrimaryLanguage"/>
    <!-- input preferences -->
    <gs:InputPreferences>
        <gs:InputLanguageID Action="add" ID="$PrimaryInputCode" Default="true"/>
        <gs:InputLanguageID Action="add" ID="$SecondaryInputCode"/>
    </gs:InputPreferences>
    <!-- user locale -->
    <gs:UserLocale>
        <gs:Locale Name="$PrimaryLanguage" SetAsCurrent="true" ResetAllSettings="false"/>
    </gs:UserLocale>
</gs:GlobalizationServices>
"@
New-Item -Path $env:TEMP -Name "en-GB.xml" -ItemType File -Value $XML -Force
$Process = Start-Process -FilePath Control.exe -ArgumentList "intl.cpl,,/f:""$env:Temp\en-GB.xml""" -NoNewWindow -PassThru -Wait
$Process.ExitCode

A quick walkthrough:

First, I’ve entered the locale IDs for the primary and secondary languages, as well as the keyboard layout hex codes, and finally the Geo location ID for the primary language as variables.

Then we set a registry key to allow side-loading (required for older W10 versions for the install of appx/msix).

Next we download and install the LXP. You’ll need to enter the URL you copied earlier for the Azure blob, and update the zip filename as required, as well as the LXP filename.

Then we install any optional features for the primary language that aren’t already installed.

Then we define the content of an XML file that will be used to set the language and locale preferences. Obviously customize that per your requirement.

Then we save that content to a file and apply it.

Create the PowerShell script in Intune, make sure you don’t run it using the logged on credentials, and deploy it to your Autopilot AAD group.

User PowerShell Script

Now we need to create a very simple script that will run in the user context. This script simply makes sure that the list of preferred languages is in the correct order, as by default it will look like this:

This script will run for each user that logs in. It won’t run immediately so the order may be wrong when you first log in, but it doesn’t take long before it runs. Create the script in Intune, remember to run it using the logged on credentials, and deploy it to your Autopilot AAD group.

# User-context script to set the Language List
# Language codes
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"
$SecondaryInputCode = "0409:00000409"
# Set preferred languages
$NewLanguageList = New-WinUserLanguageList -Language $PrimaryLanguage
$NewLanguageList.Add([Microsoft.InternationalSettings.Commands.WinUserLanguage]::new($SecondaryLanguage))
$NewLanguageList[1].InputMethodTips.Clear()
$NewLanguageList[1].InputMethodTips.Add($PrimaryInputCode)
$NewLanguageList[1].InputMethodTips.Add($SecondaryInputCode)
Set-WinUserLanguageList $NewLanguageList -Force

The Result

After running the Autopilot deployment and logging in, everything checks out 🙂

Managing Intune PowerShell Scripts with Microsoft Graph

In this blog I’ll cover how to list, get, create, update, delete and assign PowerShell scripts in Intune using Microsoft Graph and PowerShell.

Although you can use the Invoke-WebRequest or Invoke-RestMethod cmdlets when working with MS Graph, I prefer to use the Microsoft.Graph.Intune module, aka Intune PowerShell SDK, as it more nicely handles getting an auth token and we don’t have to create any headers, so get that module installed.

In the Graph API, PowerShell scripts live under the deviceManagementScript resource type and these are still only available in the beta schema so they are subject to change.

Connect to MS Graph

First off, let’s connect to MS Graph and set the schema to beta:

If ((Get-MSGraphEnvironment).SchemaVersion -ne "beta")
{
    $null = Update-MSGraphEnvironment -SchemaVersion beta
}
$Graph = Connect-MSGraph

List PowerShell Scripts

Now we can list the PowerShell scripts we have in Intune:

$URI = "deviceManagement/deviceManagementScripts"
$IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
If ($IntuneScripts.value)
{
    $IntuneScripts = $IntuneScripts.value
}

If we take a look at the results, we’ll see that the script content is not included when we list scripts. It is included when we get a single script, as we’ll see next.

Get a PowerShell Script

To get a specific script, we need to know its Id. To get that, first let’s create a simple function where we can pass a script name and use the Get method to retrieve the script details.

Function Get-IntunePowerShellScript {
    Param($ScriptName)
    $URI = "deviceManagement/deviceManagementScripts" 
    $IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
    If ($IntuneScripts.value)
    {
        $IntuneScripts = $IntuneScripts.value
    }
    $IntuneScript = $IntuneScripts | Where {$_.displayName -eq "$ScriptName"}
    Return $IntuneScript
}

Now we can use this function to get the script Id and then call Get again adding the script Id to the URL:

$ScriptName = "Escrow Bitlocker Recovery Keys to AAD"
$Script = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($Script.id)"
$IntuneScript = Invoke-MSGraphRequest -HttpMethod GET -Url $URI

If we look at the result, we can see that the script content is now returned, albeit in binary form:

View Script Content

To view the script, we simply need to convert it:

$Base64 =[Convert]::FromBase64String($IntuneScript.scriptContent)
[System.Text.Encoding]::UTF8.GetString($Base64)

Create a Script

Now let’s create a new script. To create a script we will read in a script file and convert it into base64. We add this together with other required parameters into some JSON before posting the request.

When reading and converting the script content, use UTF8. Other character sets may not decode properly at run-time on the client side and can result in script execution failure.

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD"
    RunAsAccount = "system" # or user
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts"
$Response = Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

We can now see our script in the portal:

Update a Script

To update an existing script, we follow a similar process to creating a new script, we create some JSON that contains the updated parameters then call the Patch method to update it. But first we need to get the Id of the script we want to update, using our previously created function:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

In this example I have updated the content in the source script file so I need to read it in again, as well as updating the description of the script:

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD (Updated 2020-03-19)"
    RunAsAccount = "system"
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.id)"
$Response = Invoke-MSGraphRequest -HttpMethod PATCH -Url $URI -Content $Json

We can call Get on the script again and check the lastModifiedDateTime entry to verify that the script was updated, or check in the portal.

Add an Assignment

Before the script will execute anywhere it needs to be assigned to a group. To do that, we need the objectId of the AAD group we want to assign it to. To work with AAD groups I prefer to use the AzureAD module, so install that before continuing.

We need to again get the script that we want to assign:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

Then get the Azure AD group:

$AzureAD = Connect-AzureAD -AccountId $Graph.UPN
$GroupName = "Intune - [Test] Bitlocker Key Escrow"
$Group = Get-AzureADGroup -SearchString $GroupName

Then we prepare the necessary JSON and post the assignment:

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($Group.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

To replace the current assignment with a new assignment, simply change the group name and run the same code again. To add an additional assignment or multiple assignments, you’ll need to post all the assignments at the same time, for example:

$GroupNameA = "Intune - [Test] Bitlocker Key Escrow"
$GroupNameB = "Intune - [Test] Autopilot SelfDeploying Provisioning"
$GroupA = Get-AzureADGroup -SearchString $GroupNameA
$GroupB = Get-AzureADGroup -SearchString $GroupNameB

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupA.ObjectId)"
        },
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupB.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

Delete an Assignment

I haven’t yet figured out how to delete an assignment – the current documentation appears to be incorrect. If you can figure this out please let me know!

Delete a Script

To delete a script, we simply get the script Id and call the Delete method on it:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)"
Invoke-MSGraphRequest -HttpMethod DELETE -Url $URI 

Delete Device Records in AD / AAD / Intune / Autopilot / ConfigMgr with PowerShell

I’ve done a lot of testing with Windows Autopilot in recent times. Most of my tests are done in virtual machines, which are ideal as I can simply dispose of them after. But you also need to clean up the device records that were created in Azure Active Directory, Intune, the Autopilot registration service, Microsoft Endpoint Configuration Manager (if you’re using it) and Active Directory in the case of Hybrid-joined devices.

To make this a bit easier, I wrote the following PowerShell script. You simply enter the device name and it’ll go and search for that device in any of the above locations that you specify and delete the device records.

The script assumes you have the appropriate permissions, and requires the Microsoft.Graph.Intune and AzureAD PowerShell modules, as well as the Configuration Manager module if you want to delete from there.

You can delete from all of the above locations with the -All switch, or you can specify any combination, for example -AAD -Intune -ConfigMgr, or -AD -Intune etc.

In the case of the Autopilot device registration, the device must also exist in Intune before you attempt to delete it as the Intune record is used to determine the serial number of the device.

Please test thoroughly before using on any production device!

Examples

Delete-AutopilotedDeviceRecords -ComputerName PC01 -All
@(
    'PC01'
    'PC02'
    'PC03'
) | foreach {
    Delete-AutopilotedDeviceRecords -ComputerName $_ -AAD -Intune
}

Output

Script

[CmdletBinding(DefaultParameterSetName='All')]
Param
(
[Parameter(ParameterSetName='All',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
[Parameter(ParameterSetName='Individual',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
$ComputerName,
[Parameter(ParameterSetName='All')]
[switch]$All = $True,
[Parameter(ParameterSetName='Individual')]
[switch]$AD,
[Parameter(ParameterSetName='Individual')]
[switch]$AAD,
[Parameter(ParameterSetName='Individual')]
[switch]$Intune,
[Parameter(ParameterSetName='Individual')]
[switch]$Autopilot,
[Parameter(ParameterSetName='Individual')]
[switch]$ConfigMgr
)
Set-Location $env:SystemDrive
# Load required modules
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Importing modules…" NoNewline
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module Microsoft.Graph.Intune ErrorAction Stop
}
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module AzureAD ErrorAction Stop
}
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module $env:SMS_ADMIN_UI_PATH.Replace('i386','ConfigurationManager.psd1') ErrorAction Stop
}
Write-host "Success" ForegroundColor Green
}
Catch
{
Write-host "$($_.Exception.Message)" ForegroundColor Red
Return
}
}
# Authenticate with Azure
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Authenticating with MS Graph and Azure AD…" NoNewline
$intuneId = Connect-MSGraph ErrorAction Stop
$aadId = Connect-AzureAD AccountId $intuneId.UPN ErrorAction Stop
Write-host "Success" ForegroundColor Green
}
Catch
{
Write-host "Error!" ForegroundColor Red
Write-host "$($_.Exception.Message)" ForegroundColor Red
Return
}
}
Write-host "$($ComputerName.ToUpper())" ForegroundColor Yellow
Write-Host "===============" ForegroundColor Yellow
# Delete from AD
If ($PSBoundParameters.ContainsKey("AD") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Active Directory " ForegroundColor Yellow NoNewline
Write-host "computer account…" NoNewline
$Searcher = [ADSISearcher]::new()
$Searcher.Filter = "(sAMAccountName=$ComputerName`$)"
[void]$Searcher.PropertiesToLoad.Add("distinguishedName")
$ComputerAccount = $Searcher.FindOne()
If ($ComputerAccount)
{
Write-host "Success" ForegroundColor Green
Write-Host " Deleting computer account…" NoNewline
$DirectoryEntry = $ComputerAccount.GetDirectoryEntry()
$Result = $DirectoryEntry.DeleteTree()
Write-Host "Success" ForegroundColor Green
}
Else
{
Write-host "Not found!" ForegroundColor Red
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
# Delete from Azure AD
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Azure AD " ForegroundColor Yellow NoNewline
Write-host "device record/s…" NoNewline
[array]$AzureADDevices = Get-AzureADDevice SearchString $ComputerName All:$true ErrorAction Stop
If ($AzureADDevices.Count -ge 1)
{
Write-Host "Success" ForegroundColor Green
Foreach ($AzureADDevice in $AzureADDevices)
{
Write-host " Deleting DisplayName: $($AzureADDevice.DisplayName) | ObjectId: $($AzureADDevice.ObjectId) | DeviceId: $($AzureADDevice.DeviceId)" NoNewline
Remove-AzureADDevice ObjectId $AzureADDevice.ObjectId ErrorAction Stop
Write-host "Success" ForegroundColor Green
}
}
Else
{
Write-host "Not found!" ForegroundColor Red
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
# Delete from Intune
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Intune " ForegroundColor Yellow NoNewline
Write-host "managed device record/s…" NoNewline
[array]$IntuneDevices = Get-IntuneManagedDevice Filter "deviceName eq '$ComputerName'" ErrorAction Stop
If ($IntuneDevices.Count -ge 1)
{
Write-Host "Success" ForegroundColor Green
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("All"))
{
foreach ($IntuneDevice in $IntuneDevices)
{
Write-host " Deleting DeviceName: $($IntuneDevice.deviceName) | Id: $($IntuneDevice.Id) | AzureADDeviceId: $($IntuneDevice.azureADDeviceId) | SerialNumber: $($IntuneDevice.serialNumber)" NoNewline
Remove-IntuneManagedDevice managedDeviceId $IntuneDevice.Id Verbose ErrorAction Stop
Write-host "Success" ForegroundColor Green
}
}
}
Else
{
Write-host "Not found!" ForegroundColor Red
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
# Delete Autopilot device
If ($PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
If ($IntuneDevices.Count -ge 1)
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Autopilot " ForegroundColor Yellow NoNewline
Write-host "device registration…" NoNewline
$AutopilotDevices = New-Object System.Collections.ArrayList
foreach ($IntuneDevice in $IntuneDevices)
{
$URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities?`$filter=contains(serialNumber,'$($IntuneDevice.serialNumber)')"
$AutopilotDevice = Invoke-MSGraphRequest Url $uri HttpMethod GET ErrorAction Stop
[void]$AutopilotDevices.Add($AutopilotDevice)
}
Write-Host "Success" ForegroundColor Green
foreach ($device in $AutopilotDevices)
{
Write-host " Deleting SerialNumber: $($Device.value.serialNumber) | Model: $($Device.value.model) | Id: $($Device.value.id) | GroupTag: $($Device.value.groupTag) | ManagedDeviceId: $($device.value.managedDeviceId)" NoNewline
$URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities/$($device.value.Id)"
$AutopilotDevice = Invoke-MSGraphRequest Url $uri HttpMethod DELETE ErrorAction Stop
Write-Host "Success" ForegroundColor Green
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
}
# Delete from ConfigMgr
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "ConfigMgr " ForegroundColor Yellow NoNewline
Write-host "device record/s…" NoNewline
$SiteCode = (Get-PSDrive PSProvider CMSITE ErrorAction Stop).Name
Set-Location ("$SiteCode" + ":") ErrorAction Stop
[array]$ConfigMgrDevices = Get-CMDevice Name $ComputerName Fast ErrorAction Stop
Write-Host "Success" ForegroundColor Green
foreach ($ConfigMgrDevice in $ConfigMgrDevices)
{
Write-host " Deleting Name: $($ConfigMgrDevice.Name) | ResourceID: $($ConfigMgrDevice.ResourceID) | SMSID: $($ConfigMgrDevice.SMSID) | UserDomainName: $($ConfigMgrDevice.UserDomainName)" NoNewline
Remove-CMDevice InputObject $ConfigMgrDevice Force ErrorAction Stop
Write-Host "Success" ForegroundColor Green
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
Set-Location $env:SystemDrive

Setting the Computer Description During Windows Autopilot

I’ve been getting to grips with Windows Autopilot recently and, having a long history working with SCCM, I’ve found it hard not to compare it with the power of traditional OSD using a task sequence. In fact, one of my goals was to reproduce with Autopilot what I’m doing in OSD and end up with the same result – and it’s been a challenge.

Don’t get me wrong – I like the general concept of Autopilot and it’s getting better all the time – but it still has shortcomings that require a bit of creativity to work around. One of the things I do during OSD is to set the computer description in AD. That’s fairly easy to do in a task sequence; you can simply script it and run the step using credentials that have permission to make that change.

In Autopilot, however (in the hybrid Azure AD join scenario), although you can also run PowerShell scripts, they will only run in the SYSTEM context during the Autopilot process. That means you either need to give computer accounts permission to change their own properties in AD, or you have to find a way to run that code using alternate credentials. You can run scripts in the context of the logged-on user, but I don’t want to do that – in fact I disable the user ESP – I want to use a specific account that has those permissions.

You could use SCCM to do it post-deployment if you are co-managing the device, but ideally I want everything to be native to Autopilot where possible, and to move away from the hybrid mentality of “do what you can with Intune and use SCCM for the rest”.

It is possible to execute code in another user context from the SYSTEM context, but when making changes in AD the DirectoryEntry operation kept erroring with “An operations error occurred”. After some research, I realized this is because AD will not accept the authentication token when it is passed on indirectly rather than supplied directly. I tried creating a separate PowerShell process, a background job, and a runspace with specific credentials – nothing would play ball. In the end I found a way around it by using the System.DirectoryServices.AccountManagement .NET classes, which allow you to create a context with specific credentials.

In this example, I’m setting the computer description based on the model and serial number of the device. You need to provide the username and password for the account that will perform the AD operation. I’ve put the password in clear text in this example, but in the real world we store the credentials in an Azure Key Vault and load them dynamically at runtime with some PowerShell code to avoid storing them in the script (a rough sketch of that is included below). I hope that in the future we will be able to run PowerShell scripts with Intune in a specific user context, as you can with steps in an SCCM task sequence.
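
As an illustration of the Key Vault approach, here is a hedged sketch rather than our exact production code – it assumes the Az.Accounts and Az.KeyVault modules are available, a vault called "MyKeyVault", and a service principal that has read access to the secrets (all placeholder names and values):

# Authenticate to Azure with a service principal (placeholder values)
$SPCredential = [pscredential]::new("<application id>",(ConvertTo-SecureString "<client secret>" -AsPlainText -Force))
Connect-AzAccount -ServicePrincipal -Credential $SPCredential -Tenant "<tenant id>" | Out-Null

# Pull the AD account name and password from the vault at runtime
$ADAccount = [System.Net.NetworkCredential]::new('',(Get-AzKeyVaultSecret -VaultName "MyKeyVault" -Name "ADAccount").SecretValue).Password
$ADPassword = [System.Net.NetworkCredential]::new('',(Get-AzKeyVaultSecret -VaultName "MyKeyVault" -Name "ADPassword").SecretValue).Password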

# Set credentials
$ADAccount = "mydomain\myADaccount"
$ADPassword = "Pa$$w0rd"

# Set initial description
$Model = Get-WMIObject -Class Win32_ComputerSystem -Property Model -ErrorAction Stop | Select-Object -ExpandProperty Model
$SerialNumber = Get-WMIObject -Class Win32_BIOS -Property SerialNumber -ErrorAction Stop | Select-Object -ExpandProperty SerialNumber
$Description = "$Model - $SerialNumber"

# Set some type accelerators
Add-Type -AssemblyName System.DirectoryServices.AccountManagement -ErrorAction Stop
$Accelerators = [PowerShell].Assembly.GetType("System.Management.Automation.TypeAccelerators")
$Accelerators::Add("PrincipalContext",[System.DirectoryServices.AccountManagement.PrincipalContext])
$Accelerators::Add("ContextType",[System.DirectoryServices.AccountManagement.ContextType])
$Accelerators::Add("Principal",[System.DirectoryServices.AccountManagement.ComputerPrincipal])
$Accelerators::Add("IdentityType",[System.DirectoryServices.AccountManagement.IdentityType])

# Connect to AD and set the computer description
$Domain = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
$PrincipalContext = [PrincipalContext]::new([ContextType]::Domain,$Domain,$ADAccount,$ADPassword)
$Account = [Principal]::FindByIdentity($PrincipalContext,[IdentityType]::Name,$env:COMPUTERNAME)
$LDAPObject = $Account.GetUnderlyingObject()
If ($LDAPObject.Properties["description"][0])
{
    $LDAPObject.Properties["description"][0] = $Description
}
Else
{
    [void]$LDAPObject.Properties["description"].Add($Description)
}
$LDAPObject.CommitChanges()
$Account.Dispose()