Get the current patch level for Windows 10 with PowerShell

I was working on some updates to our unified reporting solution for Windows Updates (i.e. WUfB + MEMCM) and I wanted to figure out, simply from the OS build number, whether a Windows 10 workstation has the latest cumulative update installed. The only reliable and usable static list I could find for Windows 10 build numbers is Microsoft’s Windows 10 Update History web page, so I decided to build a PowerShell script that parses the page to get current patch info.

The script below can be used to report which OS build a Windows 10 workstation is currently on as well as which update is the latest update available to the device. It can also report on all Windows updates published for the version of Windows 10 a workstation is currently on.

Run the script as is and it will show you:

  • Current OS version
  • Current OS Edition
  • Current OS Build number
  • The installed update that corresponds to that build number, as well as the KB number and a link to the info page
  • The latest available update for the OS version
Get-CurrentPatchInfo

Compare the latest available update with the currently installed one to know if the OS is up-to-date.
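For example, something like this (assuming you've saved the script locally as Get-CurrentPatchInfo.ps1, or wrapped it in a function of that name):

# A quick sketch of the comparison, assuming the script is saved as Get-CurrentPatchInfo.ps1
$PatchInfo = & .\Get-CurrentPatchInfo.ps1
If ($PatchInfo.CurrentInstalledUpdateKB -eq $PatchInfo.LatestAvailableUpdateKB)
{
    Write-Host "OS is up to date ($($PatchInfo.CurrentInstalledUpdate))"
}
Else
{
    Write-Host "Update available: $($PatchInfo.LatestAvailableUpdate) ($($PatchInfo.LatestAvailableUpdateKB))"
}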

If there are Preview or Out-of-band updates available that are more recent than the one you have installed, you can exclude those from being reported as the latest available update, so you can just focus on the cumulative updates.

Get-CurrentPatchInfo -ExcludePreview -ExcludeOutofBand

You can also list all Windows updates that Microsoft have published for your OS version like so:

Get-CurrentPatchInfo -ListAllAvailable

Again focus on just the cumulative updates if you want by excluding Preview and Out-of-band updates from the list:

Get-CurrentPatchInfo -ListAllAvailable -ExcludePreview -ExcludeOutofBand

Obviously the script requires internet access, and I’ve tested it on PowerShell 5.1 and 7.1.

[CmdletBinding()]
Param(
[switch]$ListAllAvailable,
[switch]$ExcludePreview,
[switch]$ExcludeOutofBand
)
$ProgressPreference = 'SilentlyContinue'
$URI = "https://aka.ms/WindowsUpdateHistory" # Windows 10 release history
Function Get-MyWindowsVersion {
[CmdletBinding()]
Param
(
$ComputerName = $env:COMPUTERNAME
)
$Table = New-Object System.Data.DataTable
$Table.Columns.AddRange(@("ComputerName","Windows Edition","Version","OS Build"))
$ProductName = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name ProductName).ProductName
Try
{
$Version = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name ReleaseID -ErrorAction Stop).ReleaseID
}
Catch
{
$Version = "N/A"
}
$CurrentBuild = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name CurrentBuild).CurrentBuild
$UBR = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name UBR).UBR
$OSVersion = $CurrentBuild + "." + $UBR
$TempTable = New-Object System.Data.DataTable
$TempTable.Columns.AddRange(@("ComputerName","Windows Edition","Version","OS Build"))
[void]$TempTable.Rows.Add($env:COMPUTERNAME,$ProductName,$Version,$OSVersion)
Return $TempTable
}
Function Convert-ParsedArray {
Param($Array)
$ArrayList = New-Object System.Collections.ArrayList
foreach ($item in $Array)
{
[void]$ArrayList.Add([PSCustomObject]@{
Update = $item.outerHTML.Split('>')[1].Replace('</a','').Replace('&#x2014;','')
KB = "KB" + $item.href.Split('/')[-1]
InfoURL = "https://support.microsoft.com" + $item.href
OSBuild = $item.outerHTML.Split('(OS ')[1].Split()[1] # Just for sorting
})
}
Return $ArrayList
}
If ($PSVersionTable.PSVersion.Major -ge 6)
{
$Response = Invoke-WebRequest -Uri $URI -ErrorAction Stop
}
else
{
$Response = Invoke-WebRequest -Uri $URI -UseBasicParsing -ErrorAction Stop
}
If (!($Response.Links))
{ throw "Response was not parsed as HTML"}
$VersionDataRaw = $Response.Links | where {$_.outerHTML -match "supLeftNavLink" -and $_.outerHTML -match "KB"}
$CurrentWindowsVersion = Get-MyWindowsVersion -ErrorAction Stop
If ($ListAllAvailable)
{
If ($ExcludePreview -and $ExcludeOutofBand)
{
$AllAvailable = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0] -and $_.outerHTML -notmatch "Preview" -and $_.outerHTML -notmatch "Out-of-band"}
}
ElseIf ($ExcludePreview)
{
$AllAvailable = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0] -and $_.outerHTML -notmatch "Preview"}
}
ElseIf ($ExcludeOutofBand)
{
$AllAvailable = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0] -and $_.outerHTML -notmatch "Out-of-band"}
}
Else
{
$AllAvailable = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0]}
}
$UniqueList = (Convert-ParsedArray -Array $AllAvailable) | Sort OSBuild -Descending -Unique
$Table = New-Object System.Data.DataTable
[void]$Table.Columns.AddRange(@('Update','KB','InfoURL'))
foreach ($Update in $UniqueList)
{
[void]$Table.Rows.Add(
$Update.Update,
$Update.KB,
$Update.InfoURL
)
}
Return $Table
}
$CurrentPatch = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'} | Select -First 1
If ($ExcludePreview -and $ExcludeOutofBand)
{
$LatestAvailablePatch = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0] -and $_.outerHTML -notmatch "Out-of-band" -and $_.outerHTML -notmatch "Preview"} | Select -First 1
}
ElseIf ($ExcludePreview)
{
$LatestAvailablePatch = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0] -and $_.outerHTML -notmatch "Preview"} | Select -First 1
}
ElseIf ($ExcludeOutofBand)
{
$LatestAvailablePatch = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0] -and $_.outerHTML -notmatch "Out-of-band"} | Select -First 1
}
Else
{
$LatestAvailablePatch = $VersionDataRaw | where {$_.outerHTML -match $CurrentWindowsVersion.'OS Build'.Split('.')[0]} | Select -First 1
}
$Table = New-Object System.Data.DataTable
[void]$Table.Columns.AddRange(@('OSVersion','OSEdition','OSBuild','CurrentInstalledUpdate','CurrentInstalledUpdateKB','CurrentInstalledUpdateInfoURL','LatestAvailableUpdate','LatestAvailableUpdateKB','LatestAvailableUpdateInfoURL'))
[void]$Table.Rows.Add(
$CurrentWindowsVersion.Version,
$CurrentWindowsVersion.'Windows Edition',
$CurrentWindowsVersion.'OS Build',
$CurrentPatch.outerHTML.Split('>')[1].Replace('</a','').Replace('&#x2014;',''),
"KB" + $CurrentPatch.href.Split('/')[-1],
"https://support.microsoft.com" + $CurrentPatch.href,
$LatestAvailablePatch.outerHTML.Split('>')[1].Replace('</a','').Replace('&#x2014;',''),
"KB" + $LatestAvailablePatch.href.Split('/')[-1],
"https://support.microsoft.com" + $LatestAvailablePatch.href
)
Return $Table

Calculating the Offline Time for a Windows 10 Upgrade

For my Windows 10 feature update installation process, I like to gather lots of statistics around the upgrade itself as well as the devices it runs on, so we can later report on these. These stats can be useful for identifying areas of potential improvement in the upgrade process. One stat I gather is the offline time for the upgrade, i.e. the time between when the downlevel (online) phase completes and the computer restarts, and when the offline phases complete and the OS is brought back to the logon screen again. Knowing this value across the estate helps to gauge the user experience and how much time is being spent waiting for the offline phases to complete.

Calculating this value is actually straightforward: search the SYSTEM event log for the last time the computer was restarted and compare it with the OS installation time, which gets recorded in WMI after the offline phases have completed successfully. The only catch is that after the offline phase is complete the event logs are refreshed and previous log entries are removed, so you have to search the event log in the Windows.old folder instead. You also have to do this before the Windows.old folder gets automatically removed (depending on your policy) and manual rollback is no longer possible.

The PowerShell code below searches for the most recent event ID 1074, compares its date with the OS install date value in WMI (use the CIM cmdlets to get an automatic conversion to [DateTime]) and outputs the difference as a TimeSpan which you can log however you want.

The good news is that for a 20H2 upgrade from media – at least in my various tests – the offline time has been impressively low.

$Params = @{
    Path = "$env:SystemDrive\Windows.old\Windows\System32\winevt\Logs\System.evtx"
    Id = 1074
}
$LatestRestartEvent = (Get-WinEvent -FilterHashtable $Params -ErrorAction SilentlyContinue | Select -First 1)
$InstallFinishedDate = Get-CimInstance Win32_OperatingSystem | Select -ExpandProperty InstallDate
If ($LatestRestartEvent)
{
    $UpgradeOfflineTime = $InstallFinishedDate - $LatestRestartEvent.TimeCreated
}

Getting Creative: a Bespoke Solution for Feature Update Deployments

This is the first blog post in what I hope will be a series of posts demonstrating several custom solutions I created for things such as feature update deployments, managing local admin password rotation, provisioning Windows 10 devices, managing drive mappings and more. My reasons for creating these solutions were to overcome some of the current limitations in existing products or processes, make things more cloud-first and independent of existing on-prem infrastructure where possible, and to more exactly meet the requirements of the business.

Although I will try to provide a generalised version of the source code where possible, I am not providing complete solutions that you can go ahead and use as is. Rather, my intention is to inspire your own creativity, to give working examples of what could be done if you have the time and resources, and to provide source code as a reference or starting point for your own solutions should you wish to create them!

Someone asked me recently how we deploy feature updates and it was a difficult question to answer other than to say we use a custom-built process. Having used some of the existing methods available (ConfigMgr Software Updates, ConfigMgr custom WaaS process, Windows Update for Business) we concluded there were shortcomings in each of them, and this provided inspiration to create our own, customized process to give us the control, reliability, user experience and reporting capability that we desired. Don’t get me wrong – I am not saying these methods aren’t good – they just couldn’t do things exactly the way we wanted.

So I set out to create a bespoke process – one that we could customize according to our needs, that was largely independent of our existing Configuration Manager infrastructure and that could run on any device with internet access. This required making use of cloud services in Azure as well as a lot of custom scripting! In this blog, I’ll try to cover what I did and how it works.

User Experience

First, let’s look at the end user experience of the feature update installation process – this was key for us: we wanted to improve the user experience by keeping it simple yet informative, while still being able to respond appropriately to any upgrade issues.

Once the update is available to a device, a toast notification is displayed notifying the user that an update is available. Initially, this displays once a day and automatically dismisses after 25 seconds. (I’ve blanked out our corporate branding in all these images)

We use a soft deadline – ie the update is never forced on the user. Enforcing compliance is handled by user communications and involvement from our local technicians. With one week left before the deadline, we increase the frequency of the notifications to twice per day.

If the deadline has passed, we take a more aggressive approach with the notifications, modifying the image and text, displaying it every 30 minutes and it doesn’t leave the screen unless the user actions or dismisses it.

The update can be installed via a shortcut on the desktop, or in the last notification it can be initiated from the notification itself.

Once triggered, a custom UI is displayed introducing the user to the update and what to expect.

When the user clicks Begin, we check that a power adapter is connected and that no removable USB devices are attached – if they are, we prompt the user to remove them first.
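Those readiness checks can be done with a couple of quick queries; here's a rough sketch of the kind of thing (not the exact code from the solution):

# Rough sketch of the pre-flight checks (not the exact code from the solution)
Add-Type -AssemblyName System.Windows.Forms
# PowerLineStatus is 'Online' when a power adapter is connected
$OnACPower = [System.Windows.Forms.SystemInformation]::PowerStatus.PowerLineStatus -eq 'Online'
# Any USB-attached disks?
$USBDrives = Get-CimInstance -ClassName Win32_DiskDrive -Filter "InterfaceType='USB'"
If (-not $OnACPower) { Write-Warning "Connect the power adapter before starting the upgrade" }
If ($USBDrives) { Write-Warning "Remove USB storage devices before starting the upgrade" }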

The update runs in three phases or stages – these correspond to the PreDownload, Install and Finalize commands on the update (more on that later). The progress of each stage is polled from the registry, as is the Setup Phase and Setup SubPhase.
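The post doesn’t list the exact registry locations, but as an illustration, I believe setup surfaces these values under a volatile MoSetup key – the key and value names below are assumptions, so verify them on a device mid-upgrade:

# Illustration only – key and value names are assumptions, check them on a device mid-upgrade
$VolatileKey = 'HKLM:\SYSTEM\Setup\MoSetup\Volatile'
$Values = Get-ItemProperty -Path $VolatileKey -ErrorAction SilentlyContinue
Write-Host "Phase: $($Values.SetupPhase)  SubPhase: $($Values.SetupSubPhase)  Progress: $($Values.SetupProgress)%"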

Note that the user cannot cancel the update once it starts and this window will remain on the screen and on top of all other windows until the update is complete. The user can click the Hide me button, and this will shrink the window like so:

This little window also cannot be removed from the screen, but it can be moved around and is small enough to be unobtrusive. When the update has finished installing, or when the user clicks Restore, the main window will automatically display again and report the result of the update.

The colour scheme is based on Google’s material design, by the way.

If the update failed during the online phase, the user can still initiate the update from the desktop shortcut but toast notifications will no longer display as reminders. The idea is that IT can attempt to remediate the device and run the update again after.

If successful, the user can click Restart to restart the computer immediately. Then the offline phase of the upgrade runs, where you see the usual light blue screen and white Windows update text reporting that updates are being installed.

Once complete, the user will be brought back to the login screen, and we won’t bother them anymore.

If the update rolled back during the offline phase, we will detect this next time they log in and notify them one time:

Logging and Reporting

The entire update process is logged right from the start to a log file on the local machine. We also send ‘status messages’ at key points during the process and these find their way to an Azure SQL database which becomes the source for reporting on update progress across the estate (more on this later).

A PowerBI report gives visual indicators of update progress as well as a good amount of detail from each machine, including: update status; whether it passed or failed readiness checks and, if failed, why; whether it passed the compatibility assessment; the error code if the assessment or the install failed; whether any hard blocks were found; SetupDiag results (2004 onward); how long the update took to install; and a bunch of other stuff we find useful.

Since 2004 though, we have started inventorying certain registry keys using ConfigMgr to give us visibility of devices that won’t upgrade because of a Safeguard hold or other reason, so we can target the upgrade only to devices that aren’t reporting any known compatibility issues.

If a device performs a rollback, we can get it to upload key logs and registry key dumps to an Azure storage account where an administrator can remotely diagnose the issue.

How does it work?

Now let’s dive into the process in more technical detail.

Deployment Script

The update starts life with a simple PowerShell script that does the following:

  • Creates a local directory to use to cache content, scripts and logs etc
  • Temporarily stores some domain credentials in the registry of the local SYSTEM account as encrypted secure strings for accessing content from a ConfigMgr distribution point if necessary (more on this later)
  • Downloads a manifest file that contains a list of all files and file versions that need to be downloaded to run the update. These include scripts, dlls (for the UI), xml definition files for scheduled tasks etc
  • Each file is then downloaded to the cache directory from an Azure CDN
  • Three scheduled tasks are then registered on the client:
    • A ‘preparer’ task which runs prerequisite actions
    • A ‘file updater’ task which keeps local files up-to-date in case we wish to change something
    • A ‘content cleanup’ task which is responsible for cleaning up in the event the device gets upgraded through any means
  • A ‘status message’ is then sent as an http request, creating a new record for the device in the Azure SQL database

This script can be deployed through any method you wish, including Configuration Manager, Intune or just manually, however it should be run in SYSTEM context.

Content

All content needed for the update process to run is put into a container in a storage account in Azure. This storage account is exposed via an Azure Content Delivery Network (CDN). This means that clients can get all the content they need directly from an internet location with minimal latency no matter where they are in the world.

Feature Update Files

The files for the feature update itself are the ESD file and WindowsUpdateBox.exe that Windows Update uses. You can get these files from Windows Update, WSUS, or as in our case, from Configuration Manager via WSUS. We simply download the feature updates to a deployment package in ConfigMgr and grab the content from there.

You could of course use an ISO image and run setup.exe, but the ESD files are somewhat smaller in size and are sufficient for purpose.

The ESD files are put into the Azure CDN so the client can download them from there, but we also allow the client the option to get the FU content from a local ConfigMgr distribution point if they are connected to the corporate network locally. Having this option allows considerably quicker content download. Since IIS on the distribution points is not open to anonymous authentication, we use the domain credentials stamped to the registry to access the DP and download the content directly from IIS (credentials are cleaned from the registry after use).
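For illustration, that DP download is just an authenticated http transfer over BITS; something along these lines (the DP FQDN, package ID and credential variables are placeholders):

# Illustrative sketch of pulling FU content from a DP over IIS using the stored credentials
# The DP FQDN, package ID, $CacheFolder and the credential variables are placeholders
$Creds = [PSCredential]::new($StoredUserName,$StoredSecurePassword)   # decrypted from the registry beforehand
$Source = "http://dp01.contoso.com/SMS_DP_SMSPKG$/ABC00123/install.esd"
Start-BitsTransfer -Source $Source -Destination "$CacheFolder\install.esd" -Credential $Creds -Authentication Negotiate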

Status Messages

Similar to how ConfigMgr sends status messages to a management point, this solution also sends status messages at key points during the process. This works by using Azure Event Grid to receive the message sent from the client as an http request. Event Grid sends the message to an Azure Function, and the Azure Function is responsible for updating the Azure SQL database created for this purpose with the current upgrade status of the device. The reason for doing it this way is that sending an http request to Event Grid is very quick and doesn’t hold up the process. Event Grid forwards the message to the Azure Function and can retry the message in case it can’t get through immediately (although I’ve never experienced any failures or dead-lettering in practice). The Azure Function uses a Managed Identity to access the SQL database, which means the SQL database never needs to be exposed outside of its sandbox in Azure, and no credentials are needed to update the database.
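To illustrate the client side of that, a status message is just a small JSON payload posted to the Event Grid topic endpoint. This is only a sketch – the topic endpoint, key and event schema below are placeholders, not the solution's real format:

# Illustrative sketch only: post a 'status message' to an Event Grid custom topic
# The topic endpoint, key and event schema are placeholders
$TopicEndpoint = "https://mytopic.westeurope-1.eventgrid.azure.net/api/events"
$TopicKey = "<topic access key>"
$Event = @{
    id = [Guid]::NewGuid().ToString()
    eventType = "FeatureUpdate.StatusMessage"
    subject = $env:COMPUTERNAME
    eventTime = (Get-Date).ToUniversalTime().ToString("o")
    data = @{ ComputerName = $env:COMPUTERNAME; Status = "PreDownload complete" }
    dataVersion = "1.0"
}
$Body = ConvertTo-Json -InputObject @($Event) -Depth 5
Invoke-RestMethod -Uri $TopicEndpoint -Method Post -Body $Body -ContentType "application/json" -Headers @{ "aeg-sas-key" = $TopicKey }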

We then use PowerBI to report on the data in the database to give visibility of where in the process every device is, if there are any issues that need addressing and all the stats that are useful for understanding whether devices get content from Azure or a local DP, what their approximate bandwidth is, how long downloads took, whether they were wired or wireless, make and model, upgrade time etc.

Preparation Script

After the initial deployment script has run, the entire upgrade process is driven by scheduled tasks on the client. The first task to run is the Preparation script and this attempts to run every hour until successful completion. This script does the following things:

  • Create the registry keys for the upgrade. These keys are stamped with the update progress and the results of the various actions such as pre-req checks, downloads etc. When we send a ‘status message’ we simply read these keys and send them on. Having progress stamped in the local registry is useful if we need to troubleshoot on the device directly.
  • Run readiness checks, such as
    • Checking for client OS
    • Checking disk space
  • Check for internet connectivity
  • Determine the approximate bandwidth to the Azure CDN and measure latency. This is done by downloading a 100MB file from the CDN and timing it, and by using ‘psping.exe’ to measure latency. From this, we can calculate an approximate download time for the main ESD file (see the sketch after this list).
  • Determine if the device is connected by wire or wireless
  • Determine if the device is connected to the corporate network
  • If the device is on the corporate network, we check latency to all the ConfigMgr distribution points to determine which one will be the best DP to get content from
  • Determine whether OS is business or consumer and which language. This helps us figure out which ESD file to use.
  • Download WindowsUpdateBox.exe and verify the hash
  • Download the feature update ESD file and verify the hash
    • Downloads of FU content are done using BITS transfers, as this proved the most reliable method. Code is added to handle BITS transfer errors for extra resilience.
  • Assuming all the above is done successfully, the Preparation task will be disabled and the PreDownload task created.
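Here’s a rough sketch of the bandwidth estimate and hash verification steps described above – the CDN URL, $CacheFolder, $ESDSizeMB and $ExpectedHash are placeholders that would come from the solution’s manifest:

# Rough sketch only – the CDN URL, $CacheFolder, $ESDSizeMB and $ExpectedHash are placeholders
$TestFileURL = "https://mycdn.azureedge.net/FeatureUpdate/100MB.bin"
$Duration = Measure-Command { Invoke-WebRequest -Uri $TestFileURL -OutFile "$env:TEMP\100MB.bin" -UseBasicParsing }
$Mbps = [Math]::Round((100 * 8) / $Duration.TotalSeconds, 2)
$EstimatedESDDownloadMinutes = [Math]::Round((($ESDSizeMB * 8) / $Mbps) / 60, 1)
# Verify a download against the expected hash from the manifest
$Hash = (Get-FileHash -Path "$CacheFolder\WindowsUpdateBox.exe" -Algorithm SHA256).Hash
If ($Hash -ne $ExpectedHash) { throw "Hash mismatch for WindowsUpdateBox.exe" }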

PreDownload Script

The purpose of this script is to run the equivalent of a compatibility assessment. When using the ESD file, this is done with the /PreDownload switch on WindowsUpdateBox.exe. Should the PreDownload fail, the error code will be logged to the registry. Since 2004, we also read the SetupDiag results and stamp these to the registry. We also check the Compat*.xml files to look for any hard blocks and if found, we log the details to the registry.
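Checking the Compat*.xml files for hard blocks can be as simple as something like this (a sketch – the Panther path under $WINDOWS.~BT is where setup normally drops its compatibility XML, but verify in your environment):

# Sketch: scan the compatibility XML output for hard blocks
# $WINDOWS.~BT\Sources\Panther is where setup normally writes its Compat/CompatData XML files
$PantherPath = Join-Path $env:SystemDrive '$WINDOWS.~BT\Sources\Panther'
foreach ($File in (Get-ChildItem -Path $PantherPath -Filter "Compat*.xml" -ErrorAction SilentlyContinue))
{
    [xml]$Xml = Get-Content -Path $File.FullName
    $HardBlocks = @($Xml.CompatReport.Hardware.HardwareItem) + @($Xml.CompatReport.Programs.Program) |
        Where-Object { $_.CompatibilityInfo.BlockingType -eq "Hard" }
    If ($HardBlocks) { Write-Warning "Hard block(s) found in $($File.Name)" }
}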

If the PreDownload failed, we change the schedule of the task to run twice a week. This allows for remediation to be performed on the device before attempting the PreDownload assessment again.

If the PreDownload succeeds, we disable the PreDownload task and create two new ones – a Notification task and an Upgrade task.

We also create a desktop shortcut that the user can use to initiate the upgrade.

Notification Script

The Notification script runs in the user context and displays toast notifications to notify the user that the upgrade is available, what the deadline is and how to upgrade, as already mentioned.

Upgrade Script

When the user clicks the desktop shortcut or the ‘Install now’ button on the toast notification, the upgrade is initiated. Because the upgrade needs to run with administrative privilege, the only thing the desktop shortcut and the toast notification button do is create an entry in the Application event log. The upgrade scheduled task is triggered when this event is created, and the task runs in SYSTEM context. The UI is displayed in the user session with the help of the handy ServiceUI.exe from the MDT toolkit.
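For illustration, the trigger itself is just a single event write – the source name and event ID here are placeholders for whatever the scheduled task’s event trigger is subscribed to:

# Write the Application event that the upgrade task's event trigger is subscribed to
# The source name and event ID are placeholders; the source would be registered in advance by the deployment script (that part needs admin rights)
Write-EventLog -LogName Application -Source "FeatureUpdate" -EventId 1000 -EntryType Information -Message "User initiated the feature update"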

Upgrade UI

The user interface part of the upgrade is essentially a WPF application coded in PowerShell. The UI displays some basic upgrade information for the user and once they click ‘Begin’ we run the upgrade in 3 stages:

  1. PreDownload. Even though we ran this already, we run it again before installing just to make sure nothing has changed since then, and it doesn’t take long to run.
  2. Install. This uses the /Install switch on WindowsUpdateBox.exe and runs the main part of the online phase of the upgrade.
  3. Finalize. This uses the /Finalize switch and finalizes the update in preparation for a computer restart.
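Stripped of the UI, progress polling and error handling, the three stages boil down to something like this (any arguments beyond the phase switches themselves are assumptions, and $CacheFolder is a placeholder):

# Bare-bones sketch of the three stages – no UI, no progress polling, no error handling
$Box = "$CacheFolder\WindowsUpdateBox.exe"   # $CacheFolder is a placeholder
foreach ($Phase in "PreDownload","Install","Finalize")
{
    $Process = Start-Process -FilePath $Box -ArgumentList "/$Phase" -Wait -PassThru
    If ($Process.ExitCode -ne 0)
    {
        Write-Warning ("{0} failed with exit code 0x{1:X8}" -f $Phase,$Process.ExitCode)
        break
    }
}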

The progress of each of these phases is tracked in the registry and displayed in the UI using progress bars. If there is an issue, we notify the user and IT can get involved to remediate.

If successful, the user can restart the computer immediately or at a later point (though we discourage this!). We don’t stop the user from working while the upgrade is running in the online phase, and we allow them to partially hide the upgrade window so the upgrade does not hinder user productivity (similar to how WUfB installs an update in the background).

After the user restarts the computer, the usual Windows Update screens take over until the update has installed and the user is brought to the login screen again.

Drivers and Stuff

We had considered upgrading drivers and even apps with this process, as we did for the 1903 upgrade, however user experience was important for us and we didn’t want the upgrade to take any longer than necessary, so we decided not to chain anything onto the upgrade process itself but handle other things separately. That being said, because this is a custom solution it is perfectly possible to incorporate additional activities into it if desired.

Rollback

In the event that the OS was rolled back during the offline phase, a scheduled task will run that will detect this and raise a toast notification to inform the user. We have a script that will gather logs and data from the device and upload them to a storage account in Azure where an administrator can remotely diagnose the issue. I plan to incorporate that as an automatic part of the process in a future version.

Updater Script

The solution creates an Updater scheduled task which runs once per day. The purpose of this task is to keep the solution up to date. If we want to change something in the process, add some code to a file or whatever is necessary, the Updater will take care of this.

It works by downloading a manifest file from the Azure CDN. This file contains all the files used by the solution with their current versions. If we update something, we upload the new files to the Azure storage account, purge them from the CDN and update the manifest file.

The Updater script will download the current manifest, detect that something has changed and download the required files to the device.
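In rough terms the updater logic looks like this – a sketch that assumes a simple JSON manifest of FileName/Version pairs (the real manifest format isn’t shown in this post, and the CDN URL and $CacheFolder are placeholders):

# Rough sketch of the updater, assuming a JSON manifest of FileName/Version pairs (CDN URL and $CacheFolder are placeholders)
$CDNBase = "https://mycdn.azureedge.net/FeatureUpdate"
$RemoteManifest = Invoke-RestMethod -Uri "$CDNBase/manifest.json"
$LocalManifest = Get-Content -Path "$CacheFolder\manifest.json" -Raw | ConvertFrom-Json
foreach ($File in $RemoteManifest)
{
    $Local = $LocalManifest | Where-Object { $_.FileName -eq $File.FileName }
    If (-not $Local -or [version]$File.Version -gt [version]$Local.Version)
    {
        Invoke-WebRequest -Uri "$CDNBase/$($File.FileName)" -OutFile "$CacheFolder\$($File.FileName)" -UseBasicParsing
    }
}
$RemoteManifest | ConvertTo-Json | Set-Content -Path "$CacheFolder\manifest.json"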

Cleanup Script

A Cleanup task is also created. When this task detects that the OS has been upgraded to the required version, it will remove all the scheduled tasks and cached content to leave no footprint on the device other than the log file and the registry keys.
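The check itself is trivial – something along these lines, where the target build number, task path and cache folder are placeholders:

# Sketch of the cleanup check – target build, task path and $CacheFolder are placeholders
$TargetBuild = 19042
$CurrentBuild = [int](Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' -Name CurrentBuild).CurrentBuild
If ($CurrentBuild -ge $TargetBuild)
{
    Get-ScheduledTask -TaskPath "\FeatureUpdate\" -ErrorAction SilentlyContinue | Unregister-ScheduledTask -Confirm:$false
    Remove-Item -Path $CacheFolder -Recurse -Force -ErrorAction SilentlyContinue
}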

Source Files

You can find a generalised version of the code used in this solution in my GitHub repo as a reference. As mentioned before though, there are many working parts to the solution, including the Azure services, and I haven’t documented their configuration here.

Final Comments

The main benefit of this solution for us is that it is completely customised to our needs. Although it is relatively complex to create, it is relatively easy to maintain and to adapt for new W10 versions. We still take advantage of products like ConfigMgr to allow devices to get content from a local DP if they are corporate connected, ConfigMgr / Update Compliance / Desktop Analytics to help us determine device compatibility, and ConfigMgr or Intune to actually get the deployment script to the device. We also make good use of Azure services for the status messages and the cloud database, as well as PowerBI for reporting. So the solution still utilizes existing Microsoft products while giving us the control and customisations that we need to provide a better upgrade experience for our users.

Windows 10 Feature Update Readiness PowerBI Report (MEMCM version)

Following on from my previous post where I shared a PowerBI report that provides information on Windows 10 feature update blocks using Update Compliance and Desktop Analytics, in this post I will share another report that exposes similar data, but this time built from custom hardware inventory data in MEMCM.

Outside of the Windows setup process, feature update compatibility is assessed on a Windows 10 device by the ‘Microsoft Compatibility Appraiser’ – a scheduled task that runs daily, the results of which form part of the telemetry data that gets uploaded to Microsoft from the device. The specifics of this process are still somewhat shrouded in mystery, but thanks to the dedication of people like Adam Gross we can lift the lid a little bit on understanding compatibility assessment results. I highly recommend reading his blog if you’re interested in a more in-depth understanding.

Some compatibility data is stored in the registry under HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags, and in particular the subkey TargetVersionUpgradeExperienceIndicators contains useful compatibility information for different W10 releases such as the GatedBlockIds, otherwise known as the Safeguard Hold Ids, some of which are published by Microsoft on their Windows 10 Known Issues pages. Since the release of Windows 10 2004, many devices have been prevented from receiving a W10 feature update by these Safeguard holds, so reporting on them can be useful to identify which devices are affected by which blocks, and which devices are not affected and are candidates for upgrade.
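Before setting up inventory, you can eyeball these values on a single device with a few lines of PowerShell. Only value names mentioned in this post are used here, and they may vary between appraiser versions:

# Quick look at the appraiser indicators for each release subkey on a single device
$Root = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\TargetVersionUpgradeExperienceIndicators'
Get-ChildItem -Path $Root | ForEach-Object {
    $Props = Get-ItemProperty -Path $_.PSPath
    [PSCustomObject]@{
        Release      = $_.PSChildName
        GatedBlockId = $Props.GatedBlockId
        UpgEx        = $Props.UpgEx
        RedReason    = $Props.RedReason
    }
}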

By inventorying these registry keys with MEMCM I built the PowerBI report shown above where you can view data such as:

  • Which Safeguard holds are affecting which devices
  • Rank of model count
  • Rank of OS version count
  • The upgrade experience indicators (UE)
  • UE red block reasons

Note that this data alone doesn’t replace a solution like Desktop Analytics which can help identify devices with potential app or driver compatibility issues, but it’s certainly helpful with the Safeguard holds.

You can also use this data to build collections in MEMCM containing devices that are affected by a Safeguard hold. Because this is based on inventory data, when a Safeguard hold is released by Microsoft those devices will naturally move out of those collections.

Understanding the data

Because of the lack of any public documentation around the compatibility appraiser process, we have to take (hopefully!) intelligent guesses as to what the data means.

Under the TargetVersionUpgradeExperienceIndicators registry key, for example, you may find subkeys for 19H1, 20H1, 21H1 or even older Windows 10 versions. I haven’t found any keys for *H2 releases though, and I can only assume it’s because the Safeguard holds for an H1 release are the same for the H2 release. From the Windows 10 Known Issues documentation this seems to be the case.

There is also a UNV subkey – I assume that means Universal and contains data that applies across any feature update.

Under the *H1 keys (I suppose I should call it a branch, really) we can try to understand some of the main keys such as:

  • FailedPrereqs – I haven’t seen any devices yet that actually failed the appraiser’s prerequisites, but I assume the details would be logged here if they were.
  • AppraiserVersion, SdbVer, Version, DateVer* – I assume these indicate the version of the compatibility appraiser database used for the assessment
  • DataExp*, DataRel* – these seem to indicate the release and expiry dates for the Appraiser database so my assumption is a new one will be downloaded at or before expiry
  • GatedBlock* – the Id key in particular gives the Safeguard Hold Id/s that are blocking the device from upgrade
  • Perf – this appears to be a general assessment of the performance of the device. A low performing device will likely take longer to upgrade
  • UpgEx* – these seem to be a traffic-light rating for the ‘upgrade experience’. The UpgExU seems to stand for Upgrade Experience Usage – I don’t know what the difference between the two is. Green is good, right, so a green device is going to be a good upgrade experience, yellow or orange not so great, red is a blocker. I don’t know exactly what defines each colour other than that…
  • RedReason – if you’ve got a red device, it’s blocked from upgrade by something – but this isn’t related to Safeguard holds as far as I can tell. It seems to be related to the keys under HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\CompatMarkers\*H1, such as BlockedByBios, BlockedByCpu for example. The only one I’ve seen in practice is the SystemDriveTooFull block.

Configure Custom Hardware Inventory

Alright, so first we need to configure hardware inventory in MEMCM to include the registry keys we want to use. You can use the RegKey2Mof utility, or you can download the files below to update your Configuration.mof file and your Client Settings / hardware inventory classes. I’ll assume you are familiar with that process.

Configuration.mof.additions

20H1_AppCompat.mof

21H1_AppCompat.mof

If you choose to use different names for the created classes, you’ll need to update the PowerBI report as it uses those names.

Download the PowerBI Report

Download the PBI template from here:

Windows 10 Feature Updates Readiness

Upon opening, you’ll need to add the SQL server and database name for your MEMCM database:

You won’t see any data until devices have started sending hardware inventory that includes the custom classes.

Note that I have included pages for both 20H1 and 21H1, but the latter is just a placeholder for now as no actual compatibility data will be available until that version is released, or close to it.

MEMCM collections

You can also build collections like those shown above by adding a query rule and using the names created by the custom classes – in this case TwentyHOne and TwentyOneHOne. Use the value option to find which GatedBlockIds are present in your environment.

Hope it helps!

PowerBI Reports for Windows 10 Feature Update Compliance

This morning I saw an interesting tweet from Sandy Zeng with a Log Analytics workbook she’d created for W10 feature updates based on Update Compliance data. I’d been meaning to create a similar report for that myself in PowerBI for some time, so I took inspiration from her tweet and got to work on something!

Microsoft’s Update Compliance solution can be used to report on software updates and feature updates status across your estate from Windows telemetry data. If you use Desktop Analytics, you can even combine the data for richer reporting.

Log Analytics allows exporting of queries in Power Query M formula language, which can be imported into PowerBI to create some nice reports.

Here’s a screenshot of what I ended up with. You can filter the data to view devices with Safeguard holds, for example, which since Windows 10 2004 has been a show stopper for many wanting to upgrade…

There are pages for 2004 and 20H2, as well as a page listing some of the known Safeguard hold IDs that have been publicly disclosed by Microsoft.

You can download this report for your own use with the links below. Note I have created two reports – the first assumes you have both Update Compliance and Desktop Analytics using the same Log Analytics workspace. If this is not the case for you, download the second report which doesn’t link to DA data and just uses Update Compliance. The only data I’ve included from DA is the make and model since these can be helpful in analysing devices affected by Safeguard holds.

Windows 10 Feature Update Compliance

Windows 10 Feature Update Compliance (no DA)

To create your own report, you’ll need the latest version of PowerBI desktop installed, with preview support for dynamic M query parameters.

Open PowerBI and go to File > Options and settings > Options > Preview features and enable Dynamic M Query Parameters.

Restart PowerBI and open the downloaded PBI template. Upon opening, you’ll be prompted for the Log Analytics workspace ID. You can find this on the Overview pane of your workspace in the Azure portal.

The reports contain data with a timespan of the last 48 hours, but you can change this if you want by editing the queries in the Advanced editor, and changing the value “P2D”.

You also might want to play around with the LastScan filter so you only get devices with a recent scan date, and avoid duplicates.

Hope it’s helpful!

Get a daily admin Audit Report for MEM / Intune

In an environment where you have multiple admin users it’s useful to audit admin activities so everyone can be aware of changes that others have made. I do this for Endpoint Configuration Manager with a daily email report built from admin status messages, so I decided to create something similar for Intune / MEM.

Admin actions are already audited for you in MEM (Tenant Administration > Audit logs) so it’s simply a case of getting that data into an email report. You can do this with Graph (which gives you more data actually) but I decided to use Log Analytics for this instead.

You need a Log Analytics workspace, and you need to configure Diagnostics settings in the MEM portal to send AuditLogs to the workspace.

Then, in order to automate sending a daily report, create a service principal in Azure AD with just the permissions necessary to read data from the Log Analytics workspace. You can do this easily from the Azure portal using Cloud Shell. In the example below, I’m creating a new service principal with the role “Log Analytics Reader” scoped just to the Log Analytics workspace where the AuditLogs are sent.

$DisplayName = "MEM-Reporting"
$Role = "Log Analytics Reader"
$Scope = "/subscriptions/<subscriptionId>/resourcegroups/<resourcegroupname>/providers/microsoft.operationalinsights/workspaces/<workspacename>"

$sp = New-AzADServicePrincipal -DisplayName $DisplayName -Role $Role -Scope $Scope

With the service principal created, you’ll need to make a note of the ApplicationId:

$sp.ApplicationId

And the secret:

$SP.Secret | ConvertFrom-SecureString -AsPlainText

Of course, if you prefer you can use certificate authentication instead of using the secret key.
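If you do go the certificate route, the connection looks something like this (the thumbprint variable is a placeholder, and the certificate must already be uploaded to the app registration and installed locally):

# Connect as the service principal with a certificate instead of a secret ($CertThumbprint is a placeholder)
Connect-AzAccount -ServicePrincipal -ApplicationId $ApplicationId -Tenant $TenantID -CertificateThumbprint $CertThumbprint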

Below is a PowerShell script that uses the Az PowerShell module to connect to the log analytics workspace as the service principal, query the IntuneAuditLogs for entries in the last 24 hours, then send them in an HTML email report. Run it with your favourite automation tool.

You’ll need the app Id and secret from the service principal, your tenant Id, your log analytics workspace Id, and don’t forget to update the email parameters.

Sample email report
# Script to send a daily audit report for admin activities in MEM/Intune
# Requirements:
# – Log Analytics Workspace
# – Intune Audit Logs saved to workspace
# – Service Principal with 'Log Analytics reader' role in workspace
# – Azure Az PowerShell modules
# Azure resource info
$ApplicationId = "abc73938-0000-0000-0000-9b01316a9123" # Service Principal Application Id
$Secret = "489j49r-0000-0000-0000-e2dc6451123" # Service Principal Secret
$TenantID = "abc894e7-00000-0000-0000-320d0334b123" # Tenant ID
$LAWorkspaceID = "abcc1e47-0000-0000-0000-b7ce2b2bb123" # Log Analytics Workspace ID
$Timespan = (New-TimeSpan -Hours 24)
# Email params
$EmailParams = @{
To = 'trevor.jones@smsagent.blog'
From = 'MEMReporting@smsagent.blog'
Smtpserver = 'smsagent.mail.protection.outlook.com'
Port = 25
Subject = "MEM Audit Report | $(Get-Date Format ddMMMyyyy)"
}
# Html CSS style
$Style = @"
<style>
table {
border-collapse: collapse;
font-family: sans-serif;
font-size: 12px;
}
td, th {
border: 1px solid #ddd;
padding: 6px;
}
th {
padding-top: 8px;
padding-bottom: 8px;
text-align: left;
background-color: #3700B3;
color: #03DAC6
}
</style>
"@
# Connect to Azure with Service Principal
$Creds = [PSCredential]::new($ApplicationId,(ConvertTo-SecureString $Secret -AsPlainText -Force))
Connect-AzAccount -ServicePrincipal -Credential $Creds -Tenant $TenantID
# Run the Log Analytics Query
$Query = "IntuneAuditLogs | sort by TimeGenerated desc"
$Results = Invoke-AzOperationalInsightsQuery -WorkspaceId $LAWorkspaceID -Query $Query -Timespan $Timespan
$ResultsArray = [System.Linq.Enumerable]::ToArray($Results.Results)
# Converts the results to a datatable
$DataTable = New-Object System.Data.DataTable
$Columns = @("Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID")
foreach ($Column in $Columns)
{
[void]$DataTable.Columns.Add($Column)
}
foreach ($result in $ResultsArray)
{
$Properties = $Result.Properties | ConvertFrom-Json
[void]$DataTable.Rows.Add(
$Properties.ActivityDate,
$result.Identity,
$Properties.Actor.ApplicationName,
$result.OperationName,
$result.ResultType,
$Properties.TargetDisplayNames[0],
$Properties.TargetObjectIDs[0]
)
}
# Send an email
If ($DataTable.Rows.Count -ge 1)
{
$HTML = $Datatable |
ConvertTo-Html Property "Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID" Head $Style Body "<h2>MEM Admin Activities in the last 24 hours</h2>" |
Out-String
Send-MailMessage @EmailParams -Body $html -BodyAsHtml
}

Get Program Execution History from a ConfigMgr Client with PowerShell

Have you ever been in the situation where something unexpected happens on a user’s computer and people start pointing their fingers at the ConfigMgr admin, asking “has anyone deployed something with SCCM?” Well, I decided to write a PowerShell script to retrieve the execution history for ConfigMgr programs on a local or remote client. This gives clear visibility of when and which deployments such as applications/programs/task sequences have run on the client, and hopefully acquits you (or proves you guilty!)

Program execution history can be found in the registry but it doesn’t contain the name of the associated package, so I joined that data with software distribution data from WMI to give a better view.

You can run the script against the local machine, or a remote machine if you have PS remoting enabled. You can also run it against multiple machines at the same time and combine the data if desired. I recommend piping the results to Out-GridView.

Get-CMClientExecutionHistory -Computername PC001,PC002 | Out-GridView
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$false,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
[string[]]$ComputerName = $env:COMPUTERNAME
)
Begin
{
$Code = {
# Get Execution History from registry, and package details from WMI
$ExecutionHistoryKey = "HKLM:\SOFTWARE\Microsoft\SMS\Mobile Client\Software Distribution\Execution History"
$ContextKeys = Get-ChildItem $ExecutionHistoryKey | Select -ExpandProperty PSChildName
foreach ($ContextKey in $ContextKeys)
{
If ($ContextKey -eq "System")
{
$ContextKey = "Machine"
}
Else
{
$ContextKey = $ContextKey.Replace('-','_')
}
[array]$SoftwareDistribution += Get-CimInstance -Namespace ROOT\ccm\Policy\$ContextKey -ClassName CCM_SoftwareDistribution
}
# Create a datatable to hold the results
$DataTable = New-Object System.Data.DataTable
[void]$DataTable.Columns.Add("ComputerName")
[void]$DataTable.Columns.Add("PackageName")
[void]$DataTable.Columns.Add("PackageID")
[void]$DataTable.Columns.Add("ProgramName")
[void]$DataTable.Columns.Add("DeploymentStatus")
[void]$DataTable.Columns.Add("Context")
[void]$DataTable.Columns.Add("State")
[void]$DataTable.Columns.Add("RunStartTime")
[void]$DataTable.Columns.Add("SuccessOrFailureCode")
[void]$DataTable.Columns.Add("SuccessOrFailureReason")
foreach ($ContextKey in $ContextKeys)
{
If ($ContextKey -ne "System")
{
# Get user context if applicable
$SID = New-Object Security.Principal.SecurityIdentifier -ArgumentList $ContextKey
$Context = $SID.Translate([System.Security.Principal.NTAccount])
}
Else
{
$Context = "Machine"
}
$SubKeys = Get-ChildItem "$ExecutionHistoryKey\$ContextKey"
Foreach ($SubKey in $SubKeys)
{
$Items = Get-ChildItem $SubKey.PSPath
Foreach ($Item in $Items)
{
$PackageInfo = $SoftwareDistribution | Where {$_.PKG_PackageID -eq $SubKey.PSChildName -and $_.PRG_ProgramName -eq $Item.GetValue("_ProgramID")} | Select -First 1
If ($PackageInfo)
{
$PackageName = $PackageInfo.PKG_Name
$DeploymentStatus = "Active"
}
Else
{
$PackageName = "-Unknown-"
$DeploymentStatus = "No longer targeted"
}
[void]$DataTable.Rows.Add($using:Computer,$PackageName,$SubKey.PSChildName,$Item.GetValue("_ProgramID"),$DeploymentStatus,$Context,$Item.GetValue("_State"),$Item.GetValue("_RunStartTime"),$Item.GetValue("SuccessOrFailureCode"),$Item.GetValue("SuccessOrFailureReason"))
}
}
}
$DataTable.DefaultView.Sort = "RunStartTime DESC"
$DataTable = $DataTable.DefaultView.ToTable()
Return $DataTable
}
}
Process
{
foreach ($Computer in $ComputerName)
{
If ($Computer -eq $env:COMPUTERNAME)
{
$Result = Invoke-Command -ScriptBlock $Code
}
Else
{
$Result = Invoke-Command -ComputerName $Computer -HideComputerName -ScriptBlock $Code -ErrorAction Continue
}
$Result | Select ComputerName,PackageName,PackageID,ProgramName,DeploymentStatus,Context,State,RunStartTime,SuccessOrFailureCode,SuccessOrFailureReason
}
}
End
{
}

Get Previous and Scheduled Evaluation Times for ConfigMgr Compliance Baselines with PowerShell

I was testing a compliance baseline recently and wanted to verify if the schedule defined in the baseline deployment is actually honored on the client. I set the schedule to run every hour, but it was clear that it did not run every hour and that some randomization was being used.

To review the most recent evaluation times and the next scheduled evaluation time, I had to read the scheduler.log in the CCM\Logs directory, because I could only find a single last evaluation time recorded in WMI.
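For reference, that single value is easy enough to read from WMI, but it only gives you the last evaluation (property names below are from memory – verify on your client):

# The single last-evaluation time that WMI exposes (property names from memory, verify on your client)
Get-CimInstance -Namespace ROOT\ccm\dcm -ClassName SMS_DesiredConfiguration |
    Select-Object DisplayName,LastComplianceStatus,LastEvalTime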

The following PowerShell script reads which baselines are currently deployed to the local machine, displays a window for you to choose one, then basically reads the Scheduler log to find when the most recent evaluations were and when the next one is scheduled.

Select a baseline
Baseline evaluations
##############################################################
## ##
## Reads the most recent and next scheduled evaluation time ##
## for deployed Compliance Baselines from the Scheduler.log ##
## ##
##############################################################
#requires -RunAsAdministrator
# Get Baselines from WMI
# Excludes co-management policies
Try
{
$Instances = Get-CimInstance -Namespace ROOT\ccm\dcm -ClassName SMS_DesiredConfiguration -Filter "PolicyType!=1" -OperationTimeoutSec 5 -ErrorAction Stop | Select DisplayName,IsMachineTarget,Name
}
Catch
{
Throw "Couldn't get baseline info from WMI: $_"
}
If ($Instances.Count -eq 0)
{
Throw "No deployed baselines found!"
}
# Datatable to hold the baselines for the WPF window
$DataTable = New-Object System.Data.DataTable
[void]$DataTable.Columns.Add("DisplayName")
[void]$DataTable.Columns.Add("IsMachineTarget")
foreach ($Instance in ($Instances | Sort DisplayName))
{
[void]$DataTable.Rows.Add($Instance.DisplayName,$Instance.IsMachineTarget)
}
# WPF Window for baseline selection
Add-Type -AssemblyName PresentationFramework,PresentationCore,WindowsBase
$Window = New-Object System.Windows.Window
$Window.WindowStartupLocation = [System.Windows.WindowStartupLocation]::CenterScreen
$Window.SizeToContent = [System.Windows.SizeToContent]::WidthAndHeight
$window.ResizeMode = [System.Windows.ResizeMode]::NoResize
$Window.Title = "DOUBLE-CLICK A BASELINE TO SELECT"
$DataGrid = New-Object System.Windows.Controls.DataGrid
$DataGrid.ItemsSource = $DataTable.DefaultView
$DataGrid.CanUserAddRows = $False
$DataGrid.IsReadOnly = $true
$DataGrid.SelectionMode = [System.Windows.Controls.DataGridSelectionMode]::Single
$DataGrid.Height = "NaN"
$DataGrid.MaxHeight = "250"
$DataGrid.Width = "NaN"
$DataGrid.AlternatingRowBackground = "#e6ffcc"
$DataGrid.Add_MouseDoubleClick({
$script:SelectedRow = $This.SelectedValue
$Window.Close()
})
$Window.AddChild($DataGrid)
[void]$Window.ShowDialog()
If (!$SelectedRow)
{
Throw "No baseline was selected!"
}
# If the baseline is user-targeted
If ($SelectedRow.row.IsMachineTarget -eq $false)
{
# Get Logged-on user SID
$LogonUIRegPath = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI"
#Could also use this:
#Get-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\SMS\CurrentUser -Name UserSID -ErrorAction Stop
$Property = "LastLoggedOnUserSID"
$LastLoggedOnUserSID = Get-ItemProperty -Path $LogonUIRegPath -Name $Property | Select -ExpandProperty $Property
$LastLoggedOnUserSIDUnderscore = $LastLoggedOnUserSID.Replace('-','_')
$Namespace = "ROOT\ccm\Policy\$LastLoggedOnUserSIDUnderscore\ActualConfig"
}
Else
{
$Namespace = "ROOT\ccm\Policy\Machine\ActualConfig"
}
# Get assignment info
$BaselineName = $SelectedRow.Row.DisplayName
$Pattern = [Regex]::Escape($BaselineName)
$CIAssignment = Get-CimInstance -Namespace $Namespace -ClassName CCM_DCMCIAssignment | where {$_.AssignmentName -match $Pattern}
$AssignmentIDs = $CIAssignment | Select AssignmentID,AssignmentName
Write-host "Baseline: $BaselineName" ForegroundColor Magenta
foreach ($AssignmentID in $AssignmentIDs)
{
# Read the scheduler log
$Log = "$env:SystemRoot\CCM\Logs\Scheduler.log"
If ($SelectedRow.row.IsMachineTarget -eq $false)
{
$LogEntries = Select-String Path $Log SimpleMatch "$LastLoggedOnUserSID/$($AssignmentID.AssignmentID)"
}
Else
{
$LogEntries = Select-String Path $Log SimpleMatch "Machine/$($AssignmentID.AssignmentID)"
}
If ($LogEntries)
{
# Get the previous evaluations date/time
$Evaluations = New-Object System.Collections.ArrayList
$EvaluationEntries = $LogEntries | where {$_ -match "SMSTrigger"}
Foreach ($Entry in $EvaluationEntries)
{
$Time = $Entry.Line.Split('=')[1]
$Date = $Entry.Line.Split('=')[2]
$a = $Time.Split()[0].trimend().replace('"','')
$b = $Date.Split()[0].trimend().replace('"','').replace('-','/')
$Time = (Get-Date $a).ToLongTimeString()
$Date = [DateTime]"$b $Time"
$LocalDate = Get-Date $date -Format (Get-Culture).DateTimeFormat.RFC1123Pattern
[void]$Evaluations.Add($LocalDate)
}
# Get the next scheduled evaluation date/time
$LastEvaluation = $EvaluationEntries | Select -Last 1
$date = $LastEvaluation.Line.Split()[8]
$time = $LastEvaluation.Line.Split()[9]
$ampm = $LastEvaluation.Line.Split()[10]
$NextEvaluation = [DateTime]"$date $time $ampm"
$NextEvaluationLocal = Get-Date $NextEvaluation -Format (Get-Culture).DateTimeFormat.RFC1123Pattern
# Return the results
Write-Host "Assignment: $($AssignmentID.AssignmentName)" ForegroundColor Green
Write-host "Last Evaluations:"
foreach ($Evaluation in $Evaluations)
{
Write-host " $Evaluation" ForegroundColor Yellow
}
Write-host "Next Scheduled Evaluation:"
Write-Host " $NextEvaluationLocal" ForegroundColor Yellow
}
Else
{
Write-Host "No log entries found!" ForegroundColor Red
}
}

HTML Report for SCCM Site Component Warnings and Errors

Just a quick one 🙂

If you’re like me you are too lazy busy to regularly check the component status of an SCCM Site Server for any issues, so why not get PowerShell to do it for you?

The code below will email an html-formatted report of any site components that are currently in an error or warning status, together with the last few error or warning status messages for each component. Run it as a scheduled task or with your favorite automation tool to keep your eye on any current issues. Whether you get annoyed because you now created more work for yourself, or get happy because you can stay on top of issues in your SCCM environment, I leave to you!

The report will display the components that are marked as either critical or warning with the current number of messages:

It will then display the last x status messages for each component for a quick view of what the current issue/s are:

Run the script either on the site server or somewhere where the SCCM console is installed, and set the required parameters in the script.

#####################################################################################################
## ##
## This script checks for any SCCM Site Server components currently in an error or warning ##
## state and emails it as an html report, including the latest status messages for each component. ##
## ##
#####################################################################################################
################
## PARAMETERS ##
################
# Site server FQDN
$SiteServer = "SCCMServer.Contoso.com"
# Site code
$SiteCode = "ABC"
# Location of the resource dlls in the SCCM admin console path
$script:SMSMSGSLocation = "$env:SMS_ADMIN_UI_PATH\00000409"
# SCCM SQL Server / instance
$script:dataSource = 'SCCMServer'
# SCCM SQL database
$script:database = 'CM_ABC'
# Number of Status messages to report
$SMCount = 5
# Tally interval – see https://docs.microsoft.com/en-us/sccm/develop/core/servers/manage/about-configuration-manager-tally-intervals
$TallyInterval = '0001128000100008'
# Email params
$EmailParams = @{
To = 'joe.bloggs@contoso.com'
From = 'SCCMReports@contoso.com'
Smtpserver = 'contoso-com.mail.protection.outlook.com'
Port = 25
Subject = "SCCM Site Server Component Status Report | $SiteServer | $SiteCode | $(Get-Date Format ddMMMyyyy)"
}
# Html CSS style
$Style = @"
<style>
table {
border-collapse: collapse;
}
td, th {
border: 1px solid #ddd;
padding: 8px;
}
th {
padding-top: 12px;
padding-bottom: 12px;
text-align: left;
background-color: #4286f4;
color: white;
}
h2 {
color: red;
}
</style>
"@
###############
## FUNCTIONS ##
###############
# Function to get data from SQL server
function Get-SQLData {
param($Query)
$connectionString = "Server=$dataSource;Database=$database;Integrated Security=SSPI;"
$connection = New-Object -TypeName System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $Query
$reader = $command.ExecuteReader()
$table = New-Object -TypeName 'System.Data.DataTable'
$table.Load($reader)
# Close the connection
$connection.Close()
return $Table
}
# Function to get the status message description
function Get-StatusMessage {
param (
$MessageID,
[ValidateSet("srvmsgs.dll","provmsgs.dll","climsgs.dll")]$DLL,
[ValidateSet("Informational","Warning","Error")]$Severity,
$InsString1,
$InsString2,
$InsString3,
$InsString4,
$InsString5,
$InsString6,
$InsString7,
$InsString8,
$InsString9,
$InsString10
)
# Set the resources dll
Switch ($DLL)
{
"srvmsgs.dll" { $stringPathToDLL = "$SMSMSGSLocation\srvmsgs.dll" }
"provmsgs.dll" { $stringPathToDLL = "$SMSMSGSLocation\provmsgs.dll" }
"climsgs.dll" { $stringPathToDLL = "$SMSMSGSLocation\climsgs.dll" }
}
# Load Status Message Lookup DLL into memory and get pointer to memory
$ptrFoo = $Win32LoadLibrary::LoadLibrary($stringPathToDLL.ToString())
$ptrModule = $Win32GetModuleHandle::GetModuleHandle($stringPathToDLL.ToString())
# Set severity code
Switch ($Severity)
{
"Informational" { $code = 1073741824 }
"Warning" { $code = 2147483648 }
"Error" { $code = 3221225472 }
}
# Format the message
$result = $Win32FormatMessage::FormatMessage($flags, $ptrModule, $Code -bor $MessageID, 0, $stringOutput, $sizeOfBuffer, $stringArrayInput)
if ($result -gt 0)
{
# Add insert strings to message
$objMessage = New-Object System.Object
$objMessage | Add-Member -MemberType NoteProperty -Name MessageString -Value $stringOutput.ToString().Replace("%11","").Replace("%12","").Replace("%3%4%5%6%7%8%9%10","").Replace("%1",$InsString1).Replace("%2",$InsString2).Replace("%3",$InsString3).Replace("%4",$InsString4).Replace("%5",$InsString5).Replace("%6",$InsString6).Replace("%7",$InsString7).Replace("%8",$InsString8).Replace("%9",$InsString9).Replace("%10",$InsString10)
}
Return $objMessage
}
#################
## MAIN SCRIPT ##
#################
# SQL query for component status
$Query = "
Select
ComponentName,
ComponentType,
Case
when Status = 0 then 'OK'
when Status = 1 then 'Warning'
when Status = 2 then 'Critical'
End as 'Status',
Case
when State = 0 then 'Stopped'
when State = 1 then 'Started'
when State = 2 then 'Paused'
when State = 3 then 'Installing'
when State = 4 then 'Re-installing'
when State = 5 then 'De-installing'
End as 'State',
Case
When AvailabilityState = 0 then 'Online'
When AvailabilityState = 3 then 'Offline'
When AvailabilityState = 4 then 'Unknown'
End as 'AvailabilityState',
Infos,
Warnings,
Errors
from vSMS_ComponentSummarizer
where TallyInterval = N'$TallyInterval'
and MachineName = '$SiteServer'
and SiteCode = '$SiteCode'
and Status in (1,2)
Order by Status,ComponentName
"
$Results = Get-SQLData -Query $Query
# Convert results to HTML
$HTML = $Results |
ConvertTo-Html Property "ComponentName","ComponentType","Status","State","AvailabilityState","Infos","Warnings","Errors" Head $Style Body "<h2>Components in a Warning or Error State</h2>" CssUri "http://www.w3schools.com/lib/w3.css" |
Out-String
$HTML = $HTML + "<h2></h2><h2>Last $SMCount Error or Warning Status Messages for…</h2>"
If ($Results)
{
# Start PInvoke Code
$sigFormatMessage = @'
[DllImport("kernel32.dll")]
public static extern uint FormatMessage(uint flags, IntPtr source, uint messageId, uint langId, StringBuilder buffer, uint size, string[] arguments);
'@
$sigGetModuleHandle = @'
[DllImport("kernel32.dll")]
public static extern IntPtr GetModuleHandle(string lpModuleName);
'@
$sigLoadLibrary = @'
[DllImport("kernel32.dll")]
public static extern IntPtr LoadLibrary(string lpFileName);
'@
$Win32FormatMessage = Add-Type -MemberDefinition $sigFormatMessage -Name "Win32FormatMessage" -Namespace Win32Functions -PassThru -UsingNamespace System.Text
$Win32GetModuleHandle = Add-Type -MemberDefinition $sigGetModuleHandle -Name "Win32GetModuleHandle" -Namespace Win32Functions -PassThru -UsingNamespace System.Text
$Win32LoadLibrary = Add-Type -MemberDefinition $sigLoadLibrary -Name "Win32LoadLibrary" -Namespace Win32Functions -PassThru -UsingNamespace System.Text
#End PInvoke Code
$sizeOfBuffer = [int]16384
$stringArrayInput = {"%1","%2","%3","%4","%5", "%6", "%7", "%8", "%9"}
$flags = 0x00000800 -bor 0x00000200
$stringOutput = New-Object System.Text.StringBuilder $sizeOfBuffer
# Process each resulting component
Foreach ($Result in $Results)
{
# Query SQL for status messages
$Component = $Result.ComponentName
$SMQuery = "
select
top $SMCount
smsgs.RecordID,
CASE smsgs.Severity
WHEN -1073741824 THEN 'Error'
WHEN 1073741824 THEN 'Informational'
WHEN -2147483648 THEN 'Warning'
ELSE 'Unknown'
END As 'SeverityName',
case smsgs.MessageType
WHEN 256 THEN 'Milestone'
WHEN 512 THEN 'Detail'
WHEN 768 THEN 'Audit'
WHEN 1024 THEN 'NT Event'
ELSE 'Unknown'
END AS 'Type',
smsgs.MessageID,
smsgs.Severity,
smsgs.MessageType,
smsgs.ModuleName,
modNames.MsgDLLName,
smsgs.Component,
smsgs.MachineName,
smsgs.Time,
smsgs.SiteCode,
smwis.InsString1,
smwis.InsString2,
smwis.InsString3,
smwis.InsString4,
smwis.InsString5,
smwis.InsString6,
smwis.InsString7,
smwis.InsString8,
smwis.InsString9,
smwis.InsString10
from v_StatusMessage smsgs
join v_StatMsgWithInsStrings smwis on smsgs.RecordID = smwis.RecordID
join v_StatMsgModuleNames modNames on smsgs.ModuleName = modNames.ModuleName
where smsgs.MachineName = '$SiteServer'
and smsgs.Component = '$Component'
and smsgs.Severity in ('-1073741824','-2147483648')
Order by smsgs.Time DESC
"
$StatusMsgs = Get-SQLData -Query $SMQuery
# Put desired fields into an object for each result
$StatusMessages = @()
foreach ($Row in $StatusMsgs)
{
$Params = @{
MessageID = $Row.MessageID
DLL = $Row.MsgDLLName
Severity = $Row.SeverityName
InsString1 = $Row.InsString1
InsString2 = $Row.InsString2
InsString3 = $Row.InsString3
InsString4 = $Row.InsString4
InsString5 = $Row.InsString5
InsString6 = $Row.InsString6
InsString7 = $Row.InsString7
InsString8 = $Row.InsString8
InsString9 = $Row.InsString9
InsString10 = $Row.InsString10
}
$Message = Get-StatusMessage @params
$StatusMessage = New-Object psobject
Add-Member -InputObject $StatusMessage -Name Severity -MemberType NoteProperty -Value $Row.SeverityName
Add-Member -InputObject $StatusMessage -Name Type -MemberType NoteProperty -Value $Row.Type
Add-Member -InputObject $StatusMessage -Name SiteCode -MemberType NoteProperty -Value $Row.SiteCode
Add-Member -InputObject $StatusMessage -Name "Date / Time" -MemberType NoteProperty -Value $Row.Time
Add-Member -InputObject $StatusMessage -Name System -MemberType NoteProperty -Value $Row.MachineName
Add-Member -InputObject $StatusMessage -Name Component -MemberType NoteProperty -Value $Row.Component
Add-Member -InputObject $StatusMessage -Name Module -MemberType NoteProperty -Value $Row.ModuleName
Add-Member -InputObject $StatusMessage -Name MessageID -MemberType NoteProperty -Value $Row.MessageID
Add-Member -InputObject $StatusMessage -Name Description -MemberType NoteProperty -Value $Message.MessageString
$StatusMessages += $StatusMessage
}
# Add to the HTML code
$HTML = $HTML + (
$StatusMessages |
ConvertTo-Html Property "Severity","Date / Time","MessageID","Description" Head $Style Body "<h2>$Component</h2>" CssUri "http://www.w3schools.com/lib/w3.css" |
Out-String
)
}
# Fire the email
Send-MailMessage @EmailParams -Body $Html -BodyAsHtml
}

Create Collections for SCCM Client Installation Failures by Error Code

Ok, so in a perfect SCCM world you would never have any SCCM client installation failures and this post would be totally unnecessary. But in the real world, you are very likely to have a number of systems that fail to install the SCCM client and the reasons can be many.

To identify such systems, it can be helpful to create collections for some of the common client installation failure codes so you can easily see and report on which type of installation failures you are experiencing and the number of systems affected.

To identify the installation failure error codes you have in your environment for Windows systems, run the following SQL query against the SCCM database:

select 
	Count(cdr.Name) as 'Count',
	cdr.CP_LastInstallationError as 'Last Installation Error Code'
from v_CombinedDeviceResources cdr
where
	cdr.CP_LastInstallationError is not null
	and cdr.IsClient = 0
	and cdr.DeviceOS like '%Windows%'
group by cdr.CP_LastInstallationError
order by 'Count' desc
Client installation error counts

Next simply create a collection for each error code using the following WQL query, changing the LastInstallationError value to the relevant error code:

select 
    SYS.ResourceID,
    SYS.ResourceType,
    SYS.Name,
    SYS.SMSUniqueIdentifier,
    SYS.ResourceDomainORWorkgroup,
    SYS.Client 
from SMS_R_System as SYS 
Inner Join SMS_CM_RES_COLL_SMS00001 as COL on SYS.ResourceID = COL.ResourceID  
Where COL.LastInstallationError = 53 
And (SYS.Client = 0  Or SYS.Client is null)

Error codes are all fine and dandy, but unless you have an error code database in your head you’ll want to translate those codes into friendly descriptions. To do that, I use a PowerShell function I created that pulls the description from SrsResources.dll, which you can find in any SCCM console installation. There’s more than one way to translate error codes though – see my blog post here. Better yet, create yourself an error code SQL database which you can join to in your SQL queries and which is super useful for reporting purposes – see my post here.
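As a quick illustration of the SrsResources approach (the dll path may differ depending on where your console is installed):

# Translate an error code using SrsResources.dll from the ConfigMgr admin console
# Adjust the path to wherever SrsResources.dll lives in your console installation
Add-Type -Path "$env:SMS_ADMIN_UI_PATH\SrsResources.dll"
[SrsResources.Localization]::GetErrorMessage(53,"en-US")   # "The network path was not found."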

Anyway, once you’ve translated the error codes, you can name your collections with them for easy reference:

Client installation failure collections

Now comes the hard part – figuring out how to fix those errors and working through all the affected systems 😬