Deploying HP BIOS Updates – a real world example

Not so long ago, HP published a customer advisory listing a number of their models that need to be on the latest BIOS release before they can be upgraded to Windows 10 2004. Since we were getting ready to roll out 20H2, we encountered some affected models in piloting, which prompted me to find that advisory and then get the BIOS updated on affected devices.

To be honest, we’d never pushed out BIOS updates to anyone before, but to get these devices updated to 20H2 we now had no choice. In this post I’m just going to share how we did that. For us, good user experience is critical, but finding the balance between keeping devices secure and up-to-date without being too disruptive to the user can be a challenge!

First off was to create a script that could update the BIOS on any supported HP workstation, without needing to package and distribute BIOS update content. I know other equally handsome community members have shared some great scripts and solutions for doing BIOS updates, but I decided to create my own in this instance to meet our particular requirements and afford a bit more control over the update process. I considered using the HP Client Management Script Library which seems purpose-built for this kind of task and is a great resource, but I preferred not to have the dependency of an external PowerShell module and its requirements.

I published a version of this script on GitHub here. The script does the following:

  • Creates a working directory under ProgramData
  • Disables the IE first run wizard which causes a problem for the Invoke-WebRequest cmdlet when running in a context where IE hasn’t been initialised
  • Checks the HP Image Assistant (HPIA) web page for the latest version and downloads it
  • Extracts and runs the HPIA to identify the latest applicable BIOS update (if any)
  • If a BIOS update is available, downloads and extracts the softpaq
  • Checks whether a BIOS password has been set; if so, creates an encrypted password file as required by the firmware update utility
  • Runs the firmware update utility to stage the update
  • Everything is logged to the working directory and the logs are uploaded to Azure blob storage upon completion, or if something fails, so we can review them without requiring remote access to the user’s computer

It all runs silently without any user interaction and it can be run on any HP model that the HPIA supports.

In Production however, we used a slightly modified version of this script. Since there was the possibility that there could be unknown BIOS passwords in use out there, we decided not to try to flash the BIOS using an encrypted password file, but instead to remove the BIOS password altogether (temporarily!). When the BIOS update is staged, it simply copies the password file to the staging volume – it doesn’t check whether the password is correct or not. If it is not correct, the user would then be asked for the correct password when the BIOS is flashed, and that is not cool! Removing the password meant that the user could never be unexpectedly prompted for the password in the event that the provided password file was incorrect. Of course, to remove the password you also have to know the password, so we tried the ones we knew: if they worked, great; if they didn’t, the script would simply exit out as a failsafe.
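
For illustration, here’s a minimal sketch of that password-removal logic using HP’s WMI BIOS interface (the same ROOT\HP\InstrumentedBIOS namespace the main script queries). The password list is a placeholder, and you should verify the SetBIOSSetting behaviour on your own models – a return code of 0 means success.

# Attempt to remove the BIOS setup password using any of our known passwords
$KnownPasswords = @("OldPassword1","OldPassword2") # placeholders
$Interface = Get-WmiObject -Namespace ROOT\HP\InstrumentedBIOS -Class HP_BIOSSettingInterface
$Removed = $false
foreach ($Password in $KnownPasswords)
{
    # An empty "<utf-16/>" value clears the setup password; the current password must be supplied
    $Result = $Interface.SetBIOSSetting("Setup Password","<utf-16/>","<utf-16/>$Password")
    If ($Result.Return -eq 0)
    {
        $Removed = $true
        break
    }
}
If (-not $Removed)
{
    # None of the known passwords worked - exit as a failsafe rather than risk a prompt at flash time
    Write-Log -Message "Could not remove the BIOS password" -Component "BIOSPassword" -LogLevel 3
    Exit 1
}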

We have a compliance baseline deployed with MEMCM that sets the BIOS password on any managed workstation that does not have one set, so once the machine had rebooted, the BIOS had flashed and the machine had started up again, before long the CB would run and set the password again.

Doing this also meant that we needed to ensure the computer was restarted as soon as possible after the update was staged – and for another reason as well: BitLocker encryption is suspended until the update is applied and the machine restarted.

Because we didn’t want to force the update on users or force a restart on them, we decided to package the script as an application in MEMCM. This meant a couple of things:

  • We could put a nice corporate logo on the app and make it available for install in the Software Center
  • We could handle the return codes with custom actions. In this case, we are expecting the PowerShell script to exit successfully with code 0, and when it does we’ve set that code to be a soft reboot so that MEMCM restart notifications are then displayed to the user.

As a detection method, the following PowerShell code was used. This simply checks whether the last write time of the log file falls within the last hour: if it does, the application is considered installed. Longer than an hour and it’s available for install again if needed.

$Log = Get-ChildItem C:\ProgramData\Contoso\HP_BIOS_Update -Recurse -Include HP_BIOS_Update.log -ErrorAction SilentlyContinue | 
    where {([DateTime]::Now - $_.LastWriteTime).TotalHours -le 1}
If ($Log)
{
    Write-Host "Installed"
}

We then deployed the application with an available deployment. We communicated with the users directly to inform them a BIOS update needed to be installed on their device in order to remain secure and up-to-date, and directed them to the Software Center to install it themselves.

We also prepared collections in SCCM using query-based membership rules to identify the machines that were affected by the HP advisory, and an SQL query to find the same information and pull the full user name and email address from inventoried data.

The script does contain the BIOS password in clear text which, of course, may not meet your security requirements, although for us this password is not really that critical – it’s just there to help prevent the user from making unauthorized changes in the BIOS. In our Production script though, we simply converted the passwords to base64 before adding them to the script to at least provide some masking. For greater security, you could consider storing the password in Azure Key Vault and fetching it at run time with a web request, for example.
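
For reference, the base64 conversion is a one-liner in each direction – and to be clear, this is masking, not encryption:

# Encode the password before embedding it in the script
$Encoded = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("MyPassword"))
# Decode it again at run time
$BIOSPassword = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($Encoded))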

If you wish to use the script in your own environment, you’ll need to change the location of the working directory as you desire. Additionally, if you wish to upload the log files to an Azure storage container, you’ll need to have or create a container and add the URL and the SAS token string to the script, or else just comment out the Upload-LogFilesToAzure function where it’s used. I’m a big fan of sending log files to Azure storage, especially at a time when many are working from home and may not be corporate connected. You can just use Azure Storage Explorer to download the log files, which will open in CMTrace if that’s your default log viewer.

Hope this is helpful to someone! The PS script is below.

#####################
## HP BIOS UPDATER ##
#####################
# Params
$HPIAWebUrl = "https://ftp.hp.com/pub/caps-softpaq/cmit/HPIA.html" # Static web page of the HP Image Assistant
$BIOSPassword = "MyPassword"
$script:ContainerURL = "https://mystorageaccount.blob.core.windows.net/mycontainer" # URL of your Azure blob storage container
$script:FolderPath = "HP_BIOS_Updates" # the subfolder to put logs into in the storage container
$script:SASToken = "mysastoken" # the SAS token string for the container (with write permission)
$ProgressPreference = 'SilentlyContinue' # to speed up web requests
################################
## Create Directory Structure ##
################################
$RootFolder = $env:ProgramData
$ParentFolderName = "Contoso"
$ChildFolderName = "HP_BIOS_Update"
$ChildFolderName2 = Get-Date -Format "yyyy-MMM-dd_HH.mm.ss"
$script:WorkingDirectory = "$RootFolder\$ParentFolderName\$ChildFolderName\$ChildFolderName2"
try
{
[void][System.IO.Directory]::CreateDirectory($WorkingDirectory)
}
catch
{
throw
}
# Function to write to a log file in CMTrace format
Function script:Write-Log {
param (
[Parameter(Mandatory = $true)]
[string]$Message,
[Parameter()]
[ValidateSet(1, 2, 3)] # 1-Info, 2-Warning, 3-Error
[int]$LogLevel = 1,
[Parameter(Mandatory = $true)]
[string]$Component,
[Parameter(Mandatory = $false)]
[object]$Exception
)
$LogFile = "$WorkingDirectory\HP_BIOS_Update.log"
If ($Exception)
{
[String]$Message = "$Message" + "$Exception"
}
$TimeGenerated = "$(Get-Date -Format HH:mm:ss).$((Get-Date).Millisecond)+000"
$Line = '<![LOG[{0}]LOG]!><time="{1}" date="{2}" component="{3}" context="" type="{4}" thread="" file="">'
$LineFormat = $Message, $TimeGenerated, (Get-Date -Format MMddyyyy), $Component, $LogLevel
$Line = $Line -f $LineFormat
# Write to log
Add-Content -Value $Line -Path $LogFile -ErrorAction SilentlyContinue
}
# Function to upload log file to Azure Blob storage
Function Upload-LogFilesToAzure {
$Date = Get-Date -Format "yyyy-MM-dd_HH.mm.ss"
$HpFirmwareUpdRecLog = Get-ChildItem -Path $WorkingDirectory -Include HpFirmwareUpdRec.log -Recurse -ErrorAction SilentlyContinue
$HPBIOSUPDRECLog = Get-ChildItem -Path $WorkingDirectory -Include HPBIOSUPDREC64.log -Recurse -ErrorAction SilentlyContinue
If ($HpFirmwareUpdRecLog)
{
$File = $HpFirmwareUpdRecLog
}
ElseIf ($HPBIOSUPDRECLog)
{
$File = $HPBIOSUPDRECLog
}
If ($File)
{
$Body = Get-Content $($File.FullName) -Raw -ErrorAction SilentlyContinue
If ($Body)
{
$URI = "$ContainerURL/$FolderPath/$($Env:COMPUTERNAME)`_$Date`_$($File.Name)$SASToken"
$Headers = @{
'x-ms-content-length' = $($File.Length)
'x-ms-blob-type' = 'BlockBlob'
}
Invoke-WebRequest -Uri $URI -Method PUT -Headers $Headers -Body $Body -ErrorAction SilentlyContinue
}
}
$File2 = Get-Item $WorkingDirectory\HP_BIOS_Update.log -ErrorAction SilentlyContinue
$Body2 = Get-Content $($File2.FullName) -Raw -ErrorAction SilentlyContinue
If ($Body2)
{
$URI2 = "$ContainerURL/$FolderPath/$($Env:COMPUTERNAME)`_$Date`_$($File2.Name)$SASToken"
$Headers2 = @{
'x-ms-content-length' = $($File2.Length)
'x-ms-blob-type' = 'BlockBlob'
}
Invoke-WebRequest -Uri $URI2 -Method PUT -Headers $Headers2 -Body $Body2 -ErrorAction SilentlyContinue
}
}
Write-Log -Message "#######################" -Component "Preparation"
Write-Log -Message "## Starting BIOS update run ##" -Component "Preparation"
Write-Log -Message "#######################" -Component "Preparation"
#################################
## Disable IE First Run Wizard ##
#################################
# This prevents an error running Invoke-WebRequest when IE has not yet been run in the current context
Write-Log -Message "Disabling IE first run wizard" -Component "Preparation"
$null = New-Item -Path "HKLM:\SOFTWARE\Policies\Microsoft" -Name "Internet Explorer" -Force
$null = New-Item -Path "HKLM:\SOFTWARE\Policies\Microsoft\Internet Explorer" -Name "Main" -Force
$null = New-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\Internet Explorer\Main" -Name "DisableFirstRunCustomize" -PropertyType DWORD -Value 1 -Force
##########################
## Get latest HPIA Info ##
##########################
Write-Log -Message "Finding info for latest version of HP Image Assistant (HPIA)" -Component "DownloadHPIA"
try
{
$HTML = Invoke-WebRequest -Uri $HPIAWebUrl -ErrorAction Stop
}
catch
{
Write-Log -Message "Failed to download the HPIA web page. $($_.Exception.Message)" -Component "DownloadHPIA" -LogLevel 3
Upload-LogFilesToAzure
throw
}
$HPIASoftPaqNumber = ($HTML.Links | Where {$_.href -match "hp-hpia-"}).outerText
$HPIADownloadURL = ($HTML.Links | Where {$_.href -match "hp-hpia-"}).href
$HPIAFileName = $HPIADownloadURL.Split('/')[-1]
Write-Log -Message "SoftPaq number is $HPIASoftPaqNumber" -Component "DownloadHPIA"
Write-Log -Message "Download URL is $HPIADownloadURL" -Component "DownloadHPIA"
###################
## Download HPIA ##
###################
Write-Log -Message "Downloading the HPIA" -Component "DownloadHPIA"
try
{
$ExistingBitsJob = Get-BitsTransfer -Name "$HPIAFileName" -AllUsers -ErrorAction SilentlyContinue
If ($ExistingBitsJob)
{
Write-Log -Message "An existing BITS transfer was found. Cleaning it up." -Component "DownloadHPIA" -LogLevel 2
Remove-BitsTransfer -BitsJob $ExistingBitsJob
}
$BitsJob = Start-BitsTransfer -Source $HPIADownloadURL -Destination $WorkingDirectory\$HPIAFileName -Asynchronous -DisplayName "$HPIAFileName" -Description "HPIA download" -RetryInterval 60 -ErrorAction Stop
do {
Start-Sleep -Seconds 5
$Progress = [Math]::Round((100 * ($BitsJob.BytesTransferred / $BitsJob.BytesTotal)),2)
Write-Log -Message "Downloaded $Progress`%" -Component "DownloadHPIA"
} until ($BitsJob.JobState -in ("Transferred","Error"))
If ($BitsJob.JobState -eq "Error")
{
Write-Log -Message "BITS transfer failed: $($BitsJob.ErrorDescription)" -Component "DownloadHPIA" -LogLevel 3
Upload-LogFilesToAzure
throw
}
Write-Log -Message "Download is finished" -Component "DownloadHPIA"
Complete-BitsTransfer -BitsJob $BitsJob
Write-Log -Message "BITS transfer is complete" -Component "DownloadHPIA"
}
catch
{
Write-Log -Message "Failed to start a BITS transfer for the HPIA: $($_.Exception.Message)" -Component "DownloadHPIA" -LogLevel 3
Upload-LogFilesToAzure
throw
}
##################
## Extract HPIA ##
##################
Write-Log -Message "Extracting the HPIA" -Component "Analyze"
try
{
$Process = Start-Process -FilePath $WorkingDirectory\$HPIAFileName -WorkingDirectory $WorkingDirectory -ArgumentList "/s /f .\HPIA\ /e" -NoNewWindow -PassThru -Wait -ErrorAction Stop
Start-Sleep -Seconds 5
If (Test-Path $WorkingDirectory\HPIA\HPImageAssistant.exe)
{
Write-Log -Message "Extraction complete" -Component "Analyze"
}
Else
{
Write-Log -Message "HPImageAssistant not found!" -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
catch
{
Write-Log -Message "Failed to extract the HPIA: $($_.Exception.Message)" -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
##############################################
## Analyze available BIOS updates with HPIA ##
##############################################
Write-Log -Message "Analyzing system for available BIOS updates" -Component "Analyze"
try
{
$Process = Start-Process -FilePath $WorkingDirectory\HPIA\HPImageAssistant.exe -WorkingDirectory $WorkingDirectory -ArgumentList "/Operation:Analyze /Category:BIOS /Selection:All /Action:List /Silent /ReportFolder:$WorkingDirectory\Report" -NoNewWindow -PassThru -Wait -ErrorAction Stop
If ($Process.ExitCode -eq 0)
{
Write-Log -Message "Analysis complete" -Component "Analyze"
}
elseif ($Process.ExitCode -eq 256)
{
Write-Log -Message "The analysis returned no recommendation. No BIOS update is available at this time" -Component "Analyze" -LogLevel 2
Upload-LogFilesToAzure
Exit 0
}
elseif ($Process.ExitCode -eq 4096)
{
Write-Log -Message "This platform is not supported!" -Component "Analyze" -LogLevel 2
Upload-LogFilesToAzure
throw
}
Else
{
Write-Log -Message "Process exited with code $($Process.ExitCode). Expecting 0." -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
catch
{
Write-Log -Message "Failed to start the HPImageAssistant.exe: $($_.Exception.Message)" -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
# Read the XML report
Write-Log -Message "Reading xml report" -Component "Analyze"
try
{
$XMLFile = Get-ChildItem -Path "$WorkingDirectory\Report" -Recurse -Include *.xml -ErrorAction Stop
If ($XMLFile)
{
Write-Log -Message "Report located at $($XMLFile.FullName)" -Component "Analyze"
try
{
[xml]$XML = Get-Content -Path $XMLFile.FullName -ErrorAction Stop
$Recommendation = $xml.HPIA.Recommendations.BIOS.Recommendation
If ($Recommendation)
{
$CurrentBIOSVersion = $Recommendation.TargetVersion
$ReferenceBIOSVersion = $Recommendation.ReferenceVersion
$DownloadURL = "https://" + $Recommendation.Solution.Softpaq.Url
$SoftpaqFileName = $DownloadURL.Split('/')[-1]
Write-Log -Message "Current BIOS version is $CurrentBIOSVersion" -Component "Analyze"
Write-Log -Message "Recommended BIOS version is $ReferenceBIOSVersion" -Component "Analyze"
Write-Log -Message "Softpaq download URL is $DownloadURL" -Component "Analyze"
}
Else
{
Write-Log -Message "Failed to find a BIOS recommendation in the XML report" -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
catch
{
Write-Log -Message "Failed to parse the XML file: $($_.Exception.Message)" -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
Else
{
Write-Log -Message "Failed to find an XML report." -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
catch
{
Write-Log -Message "Failed to find an XML report: $($_.Exception.Message)" -Component "Analyze" -LogLevel 3
Upload-LogFilesToAzure
throw
}
###############################
## Download the BIOS softpaq ##
###############################
Write-Log -Message "Downloading the Softpaq" -Component "DownloadBIOSUpdate"
try
{
$ExistingBitsJob = Get-BitsTransfer -Name "$SoftpaqFileName" -AllUsers -ErrorAction SilentlyContinue
If ($ExistingBitsJob)
{
Write-Log -Message "An existing BITS transfer was found. Cleaning it up." -Component "DownloadBIOSUpdate" -LogLevel 2
Remove-BitsTransfer -BitsJob $ExistingBitsJob
}
$BitsJob = Start-BitsTransfer -Source $DownloadURL -Destination $WorkingDirectory\$SoftpaqFileName -Asynchronous -DisplayName "$SoftpaqFileName" -Description "BIOS update download" -RetryInterval 60 -ErrorAction Stop
do {
Start-Sleep -Seconds 5
$Progress = [Math]::Round((100 * ($BitsJob.BytesTransferred / $BitsJob.BytesTotal)),2)
Write-Log -Message "Downloaded $Progress`%" -Component "DownloadBIOSUpdate"
} until ($BitsJob.JobState -in ("Transferred","Error"))
If ($BitsJob.JobState -eq "Error")
{
Write-Log -Message "BITS transfer failed: $($BitsJob.ErrorDescription)" -Component "DownloadBIOSUpdate" -LogLevel 3
Upload-LogFilesToAzure
throw
}
Write-Log -Message "Download is finished" -Component "DownloadBIOSUpdate"
Complete-BitsTransfer -BitsJob $BitsJob
Write-Log -Message "BITS transfer is complete" -Component "DownloadBIOSUpdate"
}
catch
{
Write-Log -Message "Failed to start a BITS transfer for the BIOS update: $($_.Exception.Message)" -Component "DownloadBIOSUpdate" -LogLevel 3
Upload-LogFilesToAzure
throw
}
#########################
## Extract BIOS Update ##
#########################
Write-Log -Message "Extracting the BIOS Update" -Component "ExtractBIOSUpdate"
$BIOSUpdateDirectoryName = $SoftpaqFileName.Split('.')[0]
try
{
$Process = Start-Process -FilePath $WorkingDirectory\$SoftpaqFileName -WorkingDirectory $WorkingDirectory -ArgumentList "/s /f .\$BIOSUpdateDirectoryName\ /e" -NoNewWindow -PassThru -Wait -ErrorAction Stop
Start-Sleep -Seconds 5
$HpFirmwareUpdRec = Get-ChildItem -Path $WorkingDirectory -Include HpFirmwareUpdRec.exe -Recurse -ErrorAction SilentlyContinue
$HPBIOSUPDREC = Get-ChildItem -Path $WorkingDirectory -Include HPBIOSUPDREC.exe -Recurse -ErrorAction SilentlyContinue
If ($HpFirmwareUpdRec)
{
$BIOSExecutable = $HpFirmwareUpdRec
}
ElseIf ($HPBIOSUPDREC)
{
$BIOSExecutable = $HPBIOSUPDREC
}
Else
{
Write-Log -Message "BIOS update executable not found!" -Component "ExtractBIOSUpdate" -LogLevel 3
Upload-LogFilesToAzure
throw
}
Write-Log -Message "Extraction complete" -Component "ExtractBIOSUpdate"
}
catch
{
Write-Log -Message "Failed to extract the softpaq: $($_.Exception.Message)" -Component "ExtractBIOSUpdate" -LogLevel 3
Upload-LogFilesToAzure
throw
}
#############################
## Check for BIOS password ##
#############################
try
{
$SetupPwd = (Get-CimInstance -Namespace ROOT\HP\InstrumentedBIOS -ClassName HP_BIOSPassword -Filter "Name='Setup Password'" -ErrorAction Stop).IsSet
If ($SetupPwd -eq 1)
{
Write-Log -Message "The BIOS has a password set" -Component "BIOSPassword"
$BIOSPasswordSet = $true
}
Else
{
Write-Log -Message "No password has been set on the BIOS" -Component "BIOSPassword"
}
}
catch
{
Write-Log -Message "Unable to determine if a BIOS password has been set: $($_.Exception.Message)" -Component "BIOSPassword" -LogLevel 3
Upload-LogFilesToAzure
throw
}
##########################
## Create password file ##
##########################
If ($BIOSPasswordSet)
{
Write-Log -Message "Creating an encrypted password file" -Component "BIOSPassword"
$HpqPswd = Get-ChildItem -Path $WorkingDirectory -Include HpqPswd.exe -Recurse -ErrorAction SilentlyContinue
If ($HpqPswd)
{
try
{
$Process = Start-Process -FilePath $HpqPswd.FullName -WorkingDirectory $WorkingDirectory -ArgumentList "-p""$BIOSPassword"" -f.\password.bin -s" -NoNewWindow -PassThru -Wait -ErrorAction Stop
Start-Sleep -Seconds 5
If (Test-Path $WorkingDirectory\password.bin)
{
Write-Log -Message "File successfully created" -Component "BIOSPassword"
}
Else
{
Write-Log -Message "Encrypted password file could not be found!" -Component "BIOSPassword" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
catch
{
Write-Log -Message "Failed to create an encrypted password file: $($_.Exception.Message)" -Component "BIOSPassword" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
else
{
Write-Log -Message "Failed to locate HP password encryption utility!" -Component "BIOSPassword" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
###########################
## Stage the BIOS update ##
###########################
Write-Log -Message "Staging BIOS firmware update" -Component "BIOSFlash"
try
{
If ($BIOSPasswordSet)
{
$Process = Start-Process -FilePath "$WorkingDirectory\$BIOSUpdateDirectoryName\$BIOSExecutable" -WorkingDirectory $WorkingDirectory -ArgumentList "-s -p.\password.bin -f.\$BIOSUpdateDirectoryName -r -b" -NoNewWindow -PassThru -Wait -ErrorAction Stop
}
Else
{
$Process = Start-Process -FilePath "$WorkingDirectory\$BIOSUpdateDirectoryName\$BIOSExecutable" -WorkingDirectory $WorkingDirectory -ArgumentList "-s -f.\$BIOSUpdateDirectoryName -r -b" -NoNewWindow -PassThru -Wait -ErrorAction Stop
}
If ($Process.ExitCode -eq 3010)
{
Write-Log -Message "The update has been staged. The BIOS will be updated on restart" -Component "BIOSFlash"
}
Else
{
Write-Log -Message "An unexpected exit code was returned: $($Process.ExitCode)" -Component "BIOSFlash" -LogLevel 3
Upload-LogFilesToAzure
throw
}
}
catch
{
Write-Log -Message "Failed to stage BIOS update: $($_.Exception.Message)" -Component "BIOSFlash" -LogLevel 3
Upload-LogFilesToAzure
throw
}
Write-Log -Message "This BIOS update run is complete. Have a nice day!" -Component "Completion"
Upload-LogFilesToAzure

Getting Creative: a Bespoke Solution for Feature Update Deployments

This is the first blog post in what I hope will be a series of posts demonstrating several custom solutions I created for things such as feature update deployments, managing local admin password rotation, provisioning Windows 10 devices, managing drive mappings and more. My reasons for creating these solutions were to overcome some of the current limitations in existing products or processes, make things more cloud-first and independent of existing on-prem infrastructure where possible, and to more exactly meet the requirements of the business.

Although I will try to provide a generalised version of the source code where possible, I am not providing complete solutions that you can go ahead and use as is. Rather my intention is to inspire your own creativity, to give working examples of what could be done if you have the time and resource, and to provide source code as a reference or starting point for your own solutions should you wish to create them!

Someone asked me recently how we deploy feature updates and it was a difficult question to answer other than to say we use a custom-built process. Having used some of the existing methods available (ConfigMgr Software Updates, ConfigMgr custom WaaS process, Windows Update for Business) we concluded there were shortcomings in each of them, and this provided inspiration to create our own, customized process to give us the control, reliability, user experience and reporting capability that we desired. Don’t get me wrong – I am not saying these methods aren’t good – they just couldn’t do things exactly the way we wanted.

So I set out to create a bespoke process – one that we could customize according to our needs, that was largely independent of our existing Configuration Manager infrastructure and that could run on any device with internet access. This required making use of cloud services in Azure as well as a lot of custom scripting! In this blog, I’ll try to cover what I did and how it works.

User Experience

First, let’s look at the end user experience of the feature update installation process – this was key for us: improving the user experience, keeping it simple yet informative, and being able to respond appropriately to any upgrade issues.

Once the update is available to a device, a toast notification is displayed notifying the user that an update is available. Initially, this displays once a day and automatically dismisses after 25 seconds. (I’ve blanked out our corporate branding in all these images)

We use a soft deadline – i.e. the update is never forced on the user. Enforcing compliance is handled by user communications and involvement from our local technicians. With one week left before the deadline, we increase the frequency of the notifications to twice per day.

If the deadline has passed, we take a more aggressive approach with the notifications, modifying the image and text and displaying the notification every 30 minutes; it doesn’t leave the screen unless the user actions it or dismisses it.

The update can be installed via a shortcut on the desktop, or in the last notification it can be initiated from the notification itself.

Once triggered, a custom UI is displayed introducing the user to the update and what to expect.

When the user clicks Begin, we check that a power adapter is connected and no removable USB devices are attached – if they are, we prompt the user to remove them first.

The update runs in three phases or stages – these correspond to the PreDownload, Install and Finalize commands on the update (more on that later). The progress of each stage is polled from the registry, as is the Setup Phase and Setup SubPhase.
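
As a rough sketch, the polling loop looks something like this – it assumes the volatile key that Windows setup maintains during the online phase, so treat the key and value names as something to verify on your own builds:

# Poll the upgrade progress from the volatile MoSetup registry key
$Key = "HKLM:\SYSTEM\Setup\MoSetup\Volatile"
do {
    $Item = Get-ItemProperty -Path $Key -ErrorAction SilentlyContinue
    Write-Host "$($Item.SetupPhase) / $($Item.SetupSubPhase): $($Item.SetupProgress)%"
    Start-Sleep -Seconds 5
} until ($Item.SetupProgress -eq 100)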

Note that the user cannot cancel the update once it starts and this window will remain on the screen and on top of all other windows until the update is complete. The user can click the Hide me button, and this will shrink the window like so:

This little window also cannot be removed from the screen, but it can be moved around and is small enough to be unobtrusive. When the update has finished installing, or when the user clicks Restore, the main window will automatically display again and report the result of the update.

The colour scheme is based on Google’s material design, by the way.

If the update failed during the online phase, the user can still initiate the update from the desktop shortcut but toast notifications will no longer display as reminders. The idea is that IT can attempt to remediate the device and run the update again after.

If successful, the user can click Restart to restart the computer immediately. Then the offline phase of the upgrade runs, where you see the usual light blue screen and white Windows update text reporting that updates are being installed.

Once complete, the user will be brought back to the login screen, and we won’t bother them anymore.

If the update rolled back during the offline phase, we will detect this next time they log in and notify them one time:

Logging and Reporting

The entire update process is logged right from the start to a log file on the local machine. We also send ‘status messages’ at key points during the process and these find their way to an Azure SQL database which becomes the source for reporting on update progress across the estate (more on this later).

A PowerBI report gives visual indicators of update progress as well as a good amount of detail from each machine, including update status, whether it passed or failed readiness checks (and if failed, why), whether it passed the compatibility assessment (with the error code if the assessment or the install failed), whether any hard blocks were found, SetupDiag results (2004 onward), how long the update took to install and a bunch of other stuff we find useful.

Since 2004 though, we have started inventorying certain registry keys using ConfigMgr to give us visibility of devices that won’t upgrade because of a Safeguard hold or other reason, so we can target the upgrade only at devices that aren’t reporting any known compatibility issues.

If a device performs a rollback, we can get it to upload key logs and registry key dumps to an Azure storage account where an administrator can remotely diagnose the issue.

How does it work?

Now let’s dive into the process in more technical detail.

Deployment Script

The update starts life with a simple PowerShell script that does the following:

  • Creates a local directory to use to cache content, scripts and logs etc
  • Temporarily stores some domain credentials in the registry of the local SYSTEM account as encrypted secure strings for accessing content from a ConfigMgr distribution point if necessary (more on this later)
  • Downloads a manifest file that contains a list of all files and file versions that need to be downloaded to run the update. These include scripts, dlls (for the UI), xml definition files for scheduled tasks etc
  • Each file is then downloaded to the cache directory from an Azure CDN
  • 3 scheduled tasks are then registered on the client:
    • A ‘preparer’ task which runs prerequisite actions
    • A ‘file updater’ task which keeps local files up-to-date in case we wish to change something
    • A ‘content cleanup’ task which is responsible for cleaning up in the event the device gets upgraded through any means
  • A ‘status message’ is then sent as an http request, creating a new record for the device in the Azure SQL database

This script can be deployed through any method you wish, including Configuration Manager, Intune or just manually, however it should be run in SYSTEM context.

Content

All content needed for the update process to run is put into a container in a storage account in Azure. This storage account is exposed via an Azure Content Delivery Network (CDN). This means that clients can get all the content they need directly from an internet location with minimal latency no matter where they are in the world.

Feature Update Files

The files for the feature update itself are the ESD file and WindowsUpdateBox.exe that Windows Update uses. You can get these files from Windows Update, WSUS, or as in our case, from Configuration Manager via WSUS. We simply download the feature updates to a deployment package in ConfigMgr and grab the content from there.

You could of course use an ISO image and run setup.exe, but the ESD files are somewhat smaller in size and are sufficient for purpose.

The ESD files are put into the Azure CDN so the client can download them from there, but we also allow the client the option to get the FU content from a local ConfigMgr distribution point if they are connected to the corporate network locally. Having this option allows considerably quicker content download. Since IIS on the distribution points is not open to anonymous authentication, we use the domain credentials stamped to the registry to access the DP and download the content directly from IIS (credentials are cleaned from the registry after use).
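
As a sketch of that DP download, under stated assumptions: the DP hostname, package ID and registry location are placeholders, and the URL follows the standard SMS_DP_SMSPKG$ virtual directory that IIS exposes on a distribution point.

# Rebuild the credential from the encrypted secure string stamped to the registry (same SYSTEM context)
$EncryptedPassword = (Get-ItemProperty -Path "HKLM:\SOFTWARE\Contoso\FeatureUpdate" -Name DPPassword).DPPassword
$Credential = [PSCredential]::new("CONTOSO\svc-dpaccess",(ConvertTo-SecureString $EncryptedPassword))
# Download the ESD directly from the DP's IIS virtual directory
$URL = "https://dp01.contoso.com/SMS_DP_SMSPKG$/ABC00123/install.esd"
Start-BitsTransfer -Source $URL -Destination "$env:ProgramData\Contoso\FeatureUpdate\install.esd" -Credential $Credential -Authentication Negotiate
# Clean the cached credential from the registry after use
Remove-ItemProperty -Path "HKLM:\SOFTWARE\Contoso\FeatureUpdate" -Name DPPassword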

Status Messages

Similar to how ConfigMgr sends status messages to a management point, this solution also sends status messages at key points during the process. This works by using Azure Event Grid to receive the message sent from the client as an http request. The Event Grid sends the message to an Azure Function, and the Azure Function is responsible for updating the Azure SQL database created for this purpose with the current upgrade status of the device. The reason for doing it this way is that sending an http request to Event Grid is very quick and doesn’t hold up the process. Event Grid forwards the message to the Azure Function and can retry the message in case it can’t get through immediately (although I’ve never experienced any failures or dead-lettering in practice). The Azure Function uses a Managed Identity to access the SQL database, which means the SQL database never needs to be exposed outside of its sandbox in Azure, and no credentials are needed to update the database.
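
To give an idea of what a ‘status message’ amounts to, here’s a minimal sketch of posting one to a custom Event Grid topic – the endpoint, key and payload fields are placeholders, not our production schema:

# Post a status message event to the Event Grid topic
$TopicEndpoint = "https://mytopic.westeurope-1.eventgrid.azure.net/api/events"
$TopicKey = "mytopickey"
$Event = @(
    @{
        id = [Guid]::NewGuid().ToString()
        eventType = "FeatureUpdate.StatusMessage"
        subject = $env:COMPUTERNAME
        eventTime = (Get-Date).ToUniversalTime().ToString("o")
        data = @{
            ComputerName = $env:COMPUTERNAME
            UpdateStatus = "PreDownload complete"
        }
        dataVersion = "1.0"
    }
)
Invoke-RestMethod -Uri $TopicEndpoint -Method POST -Headers @{'aeg-sas-key' = $TopicKey} -Body (ConvertTo-Json $Event -Depth 5) -ContentType "application/json"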

We then use PowerBI to report on the data in the database to give visibility of where in the process every device is, if there are any issues that need addressing and all the stats that are useful for understanding whether devices get content from Azure or a local DP, what their approximate bandwidth is, how long downloads took, whether they were wired or wireless, make and model, upgrade time etc.

Preparation Script

After the initial deployment script has run, the entire upgrade process is driven by scheduled tasks on the client. The first task to run is the Preparation script and this attempts to run every hour until successful completion. This script does the following things:

  • Create the registry keys for the upgrade. These keys are stamped with the update progress and the results of the various actions such as pre-req checks, downloads etc. When we send a ‘status message’ we simply read these keys and send them on. Having progress stamped in the local registry is useful if we need to troubleshoot on the device directly.
  • Run readiness checks, such as
    • Checking for client OS
    • Checking disk space
  • Check for internet connectivity
  • Determine the approximate bandwidth to the Azure CDN and measure latency. This is done by downloading a 100MB file from the CDN, timing the download, and using ‘psping.exe’ to measure latency. From this, we can calculate an approximate download time for the main ESD file (see the sketch after this list).
  • Determine if the device is connected by wire or wireless
  • Determine if the device is connected to the corporate network
  • If the device is on the corporate network, we check latency to all the ConfigMgr distribution points to determine which one will be the best DP to get content from
  • Determine whether OS is business or consumer and which language. This helps us figure out which ESD file to use.
  • Download WindowsUpdateBox.exe and verify the hash
  • Download the feature update ESD file and verify the hash
    • Downloads of FU content are done using BITS transfers as this proved the most reliable method. Code is added to handle BITS transfer errors for resilience.
  • Assuming all the above is done successfully, the Preparation task will be disabled and the PreDownload task created.
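
To illustrate the bandwidth estimation step, here’s a rough sketch – the test file URL and the ESD size are placeholders:

# Time the download of a 100MB test file from the CDN to estimate bandwidth
$TestFileURL = "https://mycdn.azureedge.net/content/100MB.bin"
$Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
Invoke-WebRequest -Uri $TestFileURL -OutFile "$env:TEMP\100MB.bin" -UseBasicParsing
$Stopwatch.Stop()
$Mbps = [Math]::Round((100 * 8) / $Stopwatch.Elapsed.TotalSeconds,2)
# Extrapolate an approximate download time for a 3.5GB ESD (hypothetical size)
$EstimatedMinutes = [Math]::Round((3500 * 8 / $Mbps) / 60,1)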

PreDownload Script

The purpose of this script is to run the equivalent of a compatibility assessment. When using the ESD file, this is done with the /PreDownload switch on WindowsUpdateBox.exe. Should the PreDownload fail, the error code will be logged to the registry. Since 2004, we also read the SetupDiag results and stamp these to the registry. We also check the Compat*.xml files to look for any hard blocks and if found, we log the details to the registry.
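
As an illustration of the hard block check, here’s a sketch of scanning the compatibility XML output. The path and element names follow typical setup appraiser output and should be verified against what WindowsUpdateBox.exe produces in your environment:

# Scan the Compat*.xml files for hard blocks and stamp any findings to the registry
$CompatFiles = Get-ChildItem -Path "$env:SystemDrive\`$WINDOWS.~BT\Sources\Panther" -Filter "Compat*.xml" -Recurse -ErrorAction SilentlyContinue
foreach ($File in $CompatFiles)
{
    [xml]$Report = Get-Content -Path $File.FullName
    $HardBlocks = $Report.CompatReport.Hardware.HardwareItem.CompatibilityInfo | Where-Object { $_.BlockingType -eq "Hard" }
    If ($HardBlocks)
    {
        Set-ItemProperty -Path "HKLM:\SOFTWARE\Contoso\FeatureUpdate" -Name HardBlockFound -Value ($HardBlocks | ConvertTo-Json)
    }
}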

If the PreDownload failed, we change the schedule of the task to run twice a week. This allows for remediation to be performed on the device before attempting the PreDownload assessment again.

If the PreDownload succeeds, we disable the PreDownload task and create two new ones – a Notification task and an Upgrade task.

We also create a desktop shortcut that the user can use to initiate the upgrade.

Notification Script

The Notification script runs in the user context and displays toast notifications to notify the user that the upgrade is available, what the deadline is and how to upgrade, as already mentioned.

Upgrade Script

When the user clicks the desktop shortcut or the ‘Install now’ button on the toast notification, the upgrade is initiated. Because the upgrade needs to run with administrative privilege, the only thing the desktop shortcut and the toast notification button do is create an entry in the Application event log. The upgrade scheduled task is triggered when this event is created, and the task runs in SYSTEM context. The UI is displayed in the user session with the help of the handy ServiceUI.exe from the MDT toolkit.
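
A minimal sketch of the trigger mechanics – the event source and event ID here are hypothetical, and the upgrade task’s event trigger is configured to match them:

# The deployment script (running as SYSTEM) creates the event source up front
New-EventLog -LogName Application -Source "FeatureUpdate" -ErrorAction SilentlyContinue
# The desktop shortcut / toast button then simply writes the trigger event in user context
Write-EventLog -LogName Application -Source "FeatureUpdate" -EventId 1985 -EntryType Information -Message "User initiated the feature update"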

Upgrade UI

The user interface part of the upgrade is essentially a WPF application coded in PowerShell. The UI displays some basic upgrade information for the user and once they click ‘Begin’ we run the upgrade in 3 stages:

  1. PreDownload. Even though we ran this already, we run it again before installing just to make sure nothing has changed since, and it doesn’t take long to run.
  2. Install. This uses the /Install switch on WindowsUpdateBox.exe and runs the main part of the online phase of the upgrade.
  3. Finalize. This uses the /Finalize switch and finalizes the update in preparation for a computer restart.

The progress of each of these phases is tracked in the registry and displayed in the UI using progress bars. If there is an issue, we notify the user and IT can get involved to remediate.

If successful, the user can restart the computer immediately or at a later point (though we discourage this!). We don’t stop the user from working while the upgrade is running in the online phase, and we allow them to partially hide the upgrade window so the upgrade does not hinder user productivity (similar to how WUfB installs an update in the background).

After the user restarts the computer, the usual Windows Update screens take over until the update has installed and the user is brought to the login screen again.

Drivers and Stuff

We had considered upgrading drivers and even apps with this process, as we did for the 1903 upgrade, however user experience was important for us and we didn’t want the upgrade to take any longer than necessary, so we decided not to chain anything onto the upgrade process itself but handle other things separately. That being said, because this is a custom solution it is perfectly possible to incorporate additional activities into it if desired.

Rollback

In the event that the OS was rolled back during the offline phase, a scheduled task will run that will detect this and raise a toast notification to inform the user. We have a script that will gather logs and data from the device and upload them to a storage account in Azure where an administrator can remotely diagnose the issue. I plan to incorporate that as an automatic part of the process in a future version.

Updater Script

The solution creates an Updater scheduled task which runs once per day. The purpose of this task is to keep the solution up to date. If we want to change something in the process, add some code to a file or whatever is necessary, the Updater will take care of this.

It works by downloading a manifest file from the Azure CDN. This file contains all the files used by the solution with their current versions. If we update something, we upload the new files to the Azure storage account, purge them from the CDN and update the manifest file.

The Updater script will download the current manifest, detect that something has changed and download the required files to the device.
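
Conceptually the updater does something like this – the manifest format here is a hypothetical JSON shape, purely for illustration:

# Compare the CDN manifest against the local cache and refresh anything that has changed
$CacheDirectory = "$env:ProgramData\Contoso\FeatureUpdate" # hypothetical cache location
$Manifest = Invoke-RestMethod -Uri "https://mycdn.azureedge.net/content/manifest.json"
foreach ($Entry in $Manifest.Files)
{
    $LocalFile = Join-Path $CacheDirectory $Entry.Name
    If (-not (Test-Path $LocalFile) -or (Get-FileHash $LocalFile).Hash -ne $Entry.SHA256)
    {
        Invoke-WebRequest -Uri "$($Manifest.BaseURL)/$($Entry.Name)" -OutFile $LocalFile -UseBasicParsing
    }
}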

Cleanup Script

A Cleanup task is also created. When this task detects that the OS has been upgraded to the required version, it will remove all the scheduled tasks and cached content to leave no footprint on the device other than the log file and the registry keys.

Source Files

You can find a generalised version of the code used in this solution in my Github repo as a reference. As mentioned before though, there are many working parts to the solution including the Azure services and I haven’t documented their configuration here.

Final Comments

The main benefit of this solution for us is that it is completely customised to our needs. Although it is relatively complex to create, it is also relatively easy to maintain and to adapt for new W10 versions. We do still take advantage of products like ConfigMgr to allow devices to get content from a local DP if they are corporate connected; ConfigMgr, Update Compliance and Desktop Analytics to help us determine device compatibility; and ConfigMgr or Intune to actually get the deployment script to the device. We also make good use of Azure services for the status messages and the cloud database, as well as PowerBI for reporting. So the solution still utilizes existing Microsoft products while giving us the control and customisations that we need to provide a better upgrade experience for our users.

PowerBI Reports for Windows 10 Feature Update Compliance

This morning I saw an interesting tweet from Sandy Zeng with a Log Analytics workbook she’d created for W10 feature updates based on Update Compliance data. I’d been meaning to create a similar report for that myself in PowerBI for some time, so I took inspiration from her tweet and got to work on something!

Microsoft’s Update Compliance solution can be used to report on software updates and feature updates status across your estate from Windows telemetry data. If you use Desktop Analytics, you can even combine the data for richer reporting.

Log Analytics allows exporting of queries in Power Query M formula language, which can be imported into PowerBI to create some nice reports.

Here’s a screenshot of what I ended up with. You can filter the data to view devices with Safeguard holds, for example, which since Windows 10 2004 has been a show stopper for many wanting to upgrade…

There are pages for 2004 and 20H2, as well as a page listing some of the known Safeguard hold IDs that have been publicly disclosed by Microsoft.

You can download this report for your own use with the links below. Note I have created two reports – the first assumes you have both Update Compliance and Desktop Analytics using the same Log Analytics workspace. If this is not the case for you, download the second report which doesn’t link to DA data and just uses Update Compliance. The only data I’ve included from DA is the make and model since these can be helpful in analysing devices affected by Safeguard holds.

Windows 10 Feature Update Compliance

Windows 10 Feature Update Compliance (no DA)

To create your own report, you’ll need the latest version of PowerBI desktop installed, with preview support for dynamic M query parameters.

Open PowerBI and go to File > Options and settings > Options > Preview features and enable Dynamic M Query Parameters.

Restart PowerBI and open the downloaded PBI template. Upon opening, you’ll be prompted for the Log Analytics workspace ID. You can find this on the Overview pane of your workspace in the Azure portal.

The reports contain data with a timespan of the last 48 hours, but you can change this if you want by editing the queries in the Advanced editor, and changing the value “P2D”.

You also might want to play around with the LastScan filter so you only get devices with a recent scan date, and avoid duplicates.

Hope it’s helpful!

Get a daily admin Audit Report for MEM / Intune

In an environment where you have multiple admin users it’s useful to audit admin activities so everyone can be aware of changes that others have made. I do this for Endpoint Configuration Manager with a daily email report built from admin status messages, so I decided to create something similar for Intune / MEM.

Admin actions are already audited for you in MEM (Tenant Administration > Audit logs) so it’s simply a case of getting that data into an email report. You can do this with Graph (which gives you more data actually) but I decided to use Log Analytics for this instead.

You need a Log Analytics workspace, and you need to configure Diagnostics settings in the MEM portal to send AuditLogs to the workspace.

Then, in order to automate sending a daily report, create a service principal in Azure AD with just the permissions necessary to read data from the Log Analytics workspace. You can do this easily from the Azure portal using Cloud Shell. In the example below, I’m creating a new service principal with the role “Log Analytics Reader” scoped just to the Log Analytics workspace where the AuditLogs are sent.

$DisplayName = "MEM-Reporting"
$Role = "Log Analytics Reader"
$Scope = "/subscriptions/<subscriptionId>/resourcegroups/<resourcegroupname>/providers/microsoft.operationalinsights/workspaces/<workspacename>"

$sp = New-AzADServicePrincipal -DisplayName $DisplayName -Role $Role -Scope $Scope

With the service principal created, you’ll need to make a note of the ApplicationId:

$sp.ApplicationId

And the secret:

$SP.Secret | ConvertFrom-SecureString -AsPlainText

Of course, if you prefer you can use certificate authentication instead of using the secret key.
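
If you go the certificate route, the connection looks something like this instead, assuming the certificate has been uploaded to the app registration and is installed in the local machine store:

Connect-AzAccount -ServicePrincipal -ApplicationId $ApplicationId -Tenant $TenantID -CertificateThumbprint "<certificate thumbprint>"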

Below is a PowerShell script that uses the Az PowerShell module to connect to the log analytics workspace as the service principal, query the IntuneAuditLogs for entries in the last 24 hours, then send them in an HTML email report. Run it with your favourite automation tool.

You’ll need the app Id and secret from the service principal, your tenant Id, your log analytics workspace Id, and don’t forget to update the email parameters.

Sample email report
# Script to send a daily audit report for admin activities in MEM/Intune
# Requirements:
# – Log Analytics Workspace
# – Intune Audit Logs saved to workspace
# – Service Principal with 'Log Analytics reader' role in workspace
# – Azure Az PowerShell modules
# Azure resource info
$ApplicationId = "abc73938-0000-0000-0000-9b01316a9123" # Service Principal Application Id
$Secret = "489j49r-0000-0000-0000-e2dc6451123" # Service Principal Secret
$TenantID = "abc894e7-00000-0000-0000-320d0334b123" # Tenant ID
$LAWorkspaceID = "abcc1e47-0000-0000-0000-b7ce2b2bb123" # Log Analytics Workspace ID
$Timespan = (New-TimeSpan -Hours 24)
# Email params
$EmailParams = @{
To = 'trevor.jones@smsagent.blog'
From = 'MEMReporting@smsagent.blog'
Smtpserver = 'smsagent.mail.protection.outlook.com'
Port = 25
Subject = "MEM Audit Report | $(Get-Date -Format ddMMMyyyy)"
}
# Html CSS style
$Style = @"
<style>
table {
border-collapse: collapse;
font-family: sans-serif;
font-size: 12px;
}
td, th {
border: 1px solid #ddd;
padding: 6px;
}
th {
padding-top: 8px;
padding-bottom: 8px;
text-align: left;
background-color: #3700B3;
color: #03DAC6
}
</style>
"@
# Connect to Azure with Service Principal
$Creds = [PSCredential]::new($ApplicationId,(ConvertTo-SecureString $Secret -AsPlainText -Force))
Connect-AzAccount -ServicePrincipal -Credential $Creds -Tenant $TenantID
# Run the Log Analytics Query
$Query = "IntuneAuditLogs | sort by TimeGenerated desc"
$Results = Invoke-AzOperationalInsightsQuery -WorkspaceId $LAWorkspaceID -Query $Query -Timespan $Timespan
$ResultsArray = [System.Linq.Enumerable]::ToArray($Results.Results)
# Converts the results to a datatable
$DataTable = New-Object System.Data.DataTable
$Columns = @("Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID")
foreach ($Column in $Columns)
{
[void]$DataTable.Columns.Add($Column)
}
foreach ($result in $ResultsArray)
{
$Properties = $Result.Properties | ConvertFrom-Json
[void]$DataTable.Rows.Add(
$Properties.ActivityDate,
$result.Identity,
$Properties.Actor.ApplicationName,
$result.OperationName,
$result.ResultType,
$Properties.TargetDisplayNames[0],
$Properties.TargetObjectIDs[0]
)
}
# Send an email
If ($DataTable.Rows.Count -ge 1)
{
$HTML = $Datatable |
ConvertTo-Html -Property "Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID" -Head $Style -Body "<h2>MEM Admin Activities in the last 24 hours</h2>" |
Out-String
Send-MailMessage @EmailParams -Body $HTML -BodyAsHtml
}

Collecting ConfigMgr Client Logs to Azure Storage

In the 2002 release of Endpoint Configuration Manager, Microsoft added a nice capability to collect log files from a client to the site server. Whilst this is a cool capability, you might not be on 2002 yet or you might prefer to send logs to a storage account in Azure rather than to the site server. You can do that quite easily using the Run Script feature. This works whether the client is connected on the corporate network or through a Cloud Management Gateway.

To do this you need a storage account in Azure, a container in the account, and a Shared access signature.

I’ll assume you have the first two in place, so let’s create a Shared access signature. In the Storage account in the Azure Portal, click on Shared access signature under Settings.

  • Under Allowed services, check Blob.
  • Under Allowed resource types, check Object.
  • Under Allowed permissions, check Create.

Set an expiry date then click Generate SAS and connection string. Copy the SAS token and keep it safe somewhere.

Below is a PowerShell script that will upload client log files to Azure storage.

## Uploads client logs files to Azure storage
$Logs = Get-ChildItem "$env:SystemRoot\CCM\Logs"
$Date = Get-date -Format "yyyy-MM-dd-HH-mm-ss"
$ContainerURL = "https://mystorageaccount.blob.core.windows.net/mycontainer"
$FolderPath = "ClientLogFiles/$($env:COMPUTERNAME)/$Date"
$SASToken = "?sv=2019-10-10&ss=b&srt=o&sp=c&se=2030-05-01T06:31:59Z&st=2020-04-30T22:31:59Z&spr=https&sig=123456789abcdefg"
$Responses = New-Object System.Collections.ArrayList
$Stopwatch = New-object System.Diagnostics.Stopwatch
$Stopwatch.Start()
foreach ($Log in $Logs)
{
$Body = Get-Content $($Log.FullName) -Raw
$URI = "$ContainerURL/$FolderPath/$($Log.Name)$SASToken"
$Headers = @{
'x-ms-content-length' = $($Log.Length)
'x-ms-blob-type' = 'BlockBlob'
}
$Response = Invoke-WebRequest -Uri $URI -Method PUT -Headers $Headers -Body $Body
[void]$Responses.Add($Response)
}
$Stopwatch.Stop()
Write-host "$(($Responses | Where {$_.StatusCode -eq 201}).Count) log files uploaded in $([Math]::Round($Stopwatch.Elapsed.TotalSeconds,2)) seconds."

Update the following parameters in your script:

  • ContainerURL. This is the URL to the container in your storage account. You can find it by clicking on the container, then Properties > URL.
  • SASToken. This is the SAS token string you created earlier.

Create and approve a new Script in ConfigMgr with this code. You can then run it against any online machine, or collection. When it’s complete, it will output how many log files were uploaded and how long the upload took.

To view the log files, you can either browse them in storage account in the Azure portal looking at the container directly, or using the Storage explorer. My preferred method is to use the standalone Microsoft Azure Storage Explorer app, where you can simply double-click a log file to open it, or easily download the folder containing the log files to your local machine.

Delete Device Records in AD / AAD / Intune / Autopilot / ConfigMgr with PowerShell

I’ve done a lot of testing with Windows Autopilot in recent times. Most of my tests are done in virtual machines, which are ideal as I can simply dispose of them after. But you also need to clean up the device records that were created in Azure Active Directory, Intune, the Autopilot registration service, Microsoft Endpoint Configuration Manager (if you’re using it) and Active Directory in the case of Hybrid-joined devices.

To make this a bit easier, I wrote the following PowerShell script. You simply enter the device name and it’ll go and search for that device in any of the above locations that you specify and delete the device records.

The script assumes you have the appropriate permissions, and requires the Microsoft.Graph.Intune and AzureAD PowerShell modules, as well as the Configuration Manager module if you want to delete from there.

You can delete from all of the above locations with the -All switch, or you can specify any combination, for example -AAD -Intune -ConfigMgr, or -AD -Intune etc.

In the case of the Autopilot device registration, the device must also exist in Intune before you attempt to delete it as the Intune record is used to determine the serial number of the device.

Please test thoroughly before using on any production device!

Examples

Delete-AutopilotedDeviceRecords -ComputerName PC01 -All
@(
    'PC01'
    'PC02'
    'PC03'
) | foreach {
    Delete-AutopilotedDeviceRecords -ComputerName $_ -AAD -Intune
}

Output

Script

[CmdletBinding(DefaultParameterSetName='All')]
Param
(
[Parameter(ParameterSetName='All',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
[Parameter(ParameterSetName='Individual',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
$ComputerName,
[Parameter(ParameterSetName='All')]
[switch]$All = $True,
[Parameter(ParameterSetName='Individual')]
[switch]$AD,
[Parameter(ParameterSetName='Individual')]
[switch]$AAD,
[Parameter(ParameterSetName='Individual')]
[switch]$Intune,
[Parameter(ParameterSetName='Individual')]
[switch]$Autopilot,
[Parameter(ParameterSetName='Individual')]
[switch]$ConfigMgr
)
Set-Location $env:SystemDrive
# Load required modules
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Importing modules…" -NoNewline
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module Microsoft.Graph.Intune -ErrorAction Stop
}
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module AzureAD -ErrorAction Stop
}
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module $env:SMS_ADMIN_UI_PATH.Replace('i386','ConfigurationManager.psd1') -ErrorAction Stop
}
Write-Host "Success" -ForegroundColor Green
}
Catch
{
Write-Host "$($_.Exception.Message)" -ForegroundColor Red
Return
}
}
# Authenticate with Azure
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Authenticating with MS Graph and Azure AD…" -NoNewline
$intuneId = Connect-MSGraph -ErrorAction Stop
$aadId = Connect-AzureAD -AccountId $intuneId.UPN -ErrorAction Stop
Write-Host "Success" -ForegroundColor Green
}
Catch
{
Write-Host "Error!" -ForegroundColor Red
Write-Host "$($_.Exception.Message)" -ForegroundColor Red
Return
}
}
Write-Host "$($ComputerName.ToUpper())" -ForegroundColor Yellow
Write-Host "===============" -ForegroundColor Yellow
# Delete from AD
If ($PSBoundParameters.ContainsKey("AD") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Retrieving " -NoNewline
Write-Host "Active Directory " -ForegroundColor Yellow -NoNewline
Write-Host "computer account…" -NoNewline
$Searcher = [ADSISearcher]::new()
$Searcher.Filter = "(sAMAccountName=$ComputerName`$)"
[void]$Searcher.PropertiesToLoad.Add("distinguishedName")
$ComputerAccount = $Searcher.FindOne()
If ($ComputerAccount)
{
Write-Host "Success" -ForegroundColor Green
Write-Host " Deleting computer account…" -NoNewline
$DirectoryEntry = $ComputerAccount.GetDirectoryEntry()
$Result = $DirectoryEntry.DeleteTree()
Write-Host "Success" -ForegroundColor Green
}
Else
{
Write-Host "Not found!" -ForegroundColor Red
}
}
Catch
{
Write-Host "Error!" -ForegroundColor Red
$_
}
}
# Delete from Azure AD
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Retrieving " -NoNewline
Write-Host "Azure AD " -ForegroundColor Yellow -NoNewline
Write-Host "device record/s…" -NoNewline
[array]$AzureADDevices = Get-AzureADDevice -SearchString $ComputerName -All:$true -ErrorAction Stop
If ($AzureADDevices.Count -ge 1)
{
Write-Host "Success" -ForegroundColor Green
Foreach ($AzureADDevice in $AzureADDevices)
{
Write-Host " Deleting DisplayName: $($AzureADDevice.DisplayName) | ObjectId: $($AzureADDevice.ObjectId) | DeviceId: $($AzureADDevice.DeviceId)" -NoNewline
Remove-AzureADDevice -ObjectId $AzureADDevice.ObjectId -ErrorAction Stop
Write-Host "Success" -ForegroundColor Green
}
}
Else
{
Write-Host "Not found!" -ForegroundColor Red
}
}
Catch
{
Write-Host "Error!" -ForegroundColor Red
$_
}
}
# Delete from Intune
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Retrieving " -NoNewline
Write-Host "Intune " -ForegroundColor Yellow -NoNewline
Write-Host "managed device record/s…" -NoNewline
[array]$IntuneDevices = Get-IntuneManagedDevice -Filter "deviceName eq '$ComputerName'" -ErrorAction Stop
If ($IntuneDevices.Count -ge 1)
{
Write-Host "Success" -ForegroundColor Green
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("All"))
{
foreach ($IntuneDevice in $IntuneDevices)
{
Write-Host " Deleting DeviceName: $($IntuneDevice.deviceName) | Id: $($IntuneDevice.Id) | AzureADDeviceId: $($IntuneDevice.azureADDeviceId) | SerialNumber: $($IntuneDevice.serialNumber)" -NoNewline
Remove-IntuneManagedDevice -managedDeviceId $IntuneDevice.Id -Verbose -ErrorAction Stop
Write-Host "Success" -ForegroundColor Green
}
}
}
Else
{
Write-Host "Not found!" -ForegroundColor Red
}
}
Catch
{
Write-Host "Error!" -ForegroundColor Red
$_
}
}
# Delete Autopilot device
If ($PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
    If ($IntuneDevices.Count -ge 1)
    {
        Try
        {
            Write-host "Retrieving " -NoNewline
            Write-host "Autopilot " -ForegroundColor Yellow -NoNewline
            Write-host "device registration…" -NoNewline
            $AutopilotDevices = New-Object System.Collections.ArrayList
            foreach ($IntuneDevice in $IntuneDevices)
            {
                $URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities?`$filter=contains(serialNumber,'$($IntuneDevice.serialNumber)')"
                $AutopilotDevice = Invoke-MSGraphRequest -Url $URI -HttpMethod GET -ErrorAction Stop
                [void]$AutopilotDevices.Add($AutopilotDevice)
            }
            Write-Host "Success" -ForegroundColor Green
            foreach ($device in $AutopilotDevices)
            {
                Write-host " Deleting SerialNumber: $($device.value.serialNumber) | Model: $($device.value.model) | Id: $($device.value.id) | GroupTag: $($device.value.groupTag) | ManagedDeviceId: $($device.value.managedDeviceId)" -NoNewline
                $URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities/$($device.value.id)"
                $AutopilotDevice = Invoke-MSGraphRequest -Url $URI -HttpMethod DELETE -ErrorAction Stop
                Write-Host "Success" -ForegroundColor Green
            }
        }
        Catch
        {
            Write-host "Error!" -ForegroundColor Red
            $_
        }
    }
}
# Delete from ConfigMgr
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
    Try
    {
        Write-host "Retrieving " -NoNewline
        Write-host "ConfigMgr " -ForegroundColor Yellow -NoNewline
        Write-host "device record/s…" -NoNewline
        $SiteCode = (Get-PSDrive -PSProvider CMSITE -ErrorAction Stop).Name
        Set-Location ("$SiteCode" + ":") -ErrorAction Stop
        [array]$ConfigMgrDevices = Get-CMDevice -Name $ComputerName -Fast -ErrorAction Stop
        Write-Host "Success" -ForegroundColor Green
        foreach ($ConfigMgrDevice in $ConfigMgrDevices)
        {
            Write-host " Deleting Name: $($ConfigMgrDevice.Name) | ResourceID: $($ConfigMgrDevice.ResourceID) | SMSID: $($ConfigMgrDevice.SMSID) | UserDomainName: $($ConfigMgrDevice.UserDomainName)" -NoNewline
            Remove-CMDevice -InputObject $ConfigMgrDevice -Force -ErrorAction Stop
            Write-Host "Success" -ForegroundColor Green
        }
    }
    Catch
    {
        Write-host "Error!" -ForegroundColor Red
        $_
    }
}
Set-Location $env:SystemDrive

The Cost of Running a Personal Windows 10 VM in Azure

As amazing as it may sound for an IT professional of many years, aside from my work laptops, I do not own a Windows computer! For my personal computing requirements, I use a £200 Chromebook to connect to the internet and do almost everything I need in the cloud.

My dear wife, on the other hand, is not a fan of the Chromebook and likes a ‘good old-fashioned’ Windows computer. But since her computing requirements are minimal, I decided to investigate the cost of running a Windows 10 VM in Azure instead of buying a new Windows laptop that she won’t use very often. It turns out it’s quite a cost-effective way to run a Windows 10 OS 🙂 The only things you really pay for are the disk storage and VM compute time. To access the VM, we simply use a remote desktop app on the Chromebook.

The first thing you need is an Azure subscription. Currently you get some credits for the first month, then some services that are free for the first 12 months. Even after that, if you have no resources in the subscription you won’t pay a penny for it.

You can create a Windows 10 VM from the Azure Marketplace. Creating the VM will create a few resources in a resource group, such as a NIC, an NSG (network security group), a disk and of course the VM itself. To save on cost, I didn’t add any data disk and just used the 127GiB disk that comes with the OS. I also used the Basic SKU for the public IP address, and I didn’t make it static – I simply added a DNS name. You’ll get charged for a static IP (something like £0.06 p/day) but if you use a dynamic IP with a DNS name you won’t get charged anything.

For the disk, I somehow assumed that I would need a Premium SSD to get good performance, as that’s what I would typically use for corporate VMs, but as this is for home use and I’m not really concerned about SLAs, I experimented with the Standard SSD and the Standard HDD as well. I was surprised to find that the Standard HDD was perfectly adequate for everyday use and I didn’t really notice much difference in performance with either of the SSD options. Of course you do get fewer IOPS with an HDD, but that hasn’t been an issue. Since an HDD is much cheaper than an SSD, it made sense to use one.

For the VM size, I used an F4s_v2, which is compute optimized, has 4 vCPUs and 8GB RAM, and runs great. You could certainly get away with a smaller size though and shave your compute costs – something like a D2s_v3 will still run just fine.

In the tables below I summarized the actual costs of running the VM and also compared the costs of using a Premium SSD, Standard SSD and Standard HDD. These costs are in GBP (£), apply to the UK South Azure region, and are true at the time of writing – prices will vary based on region, currency and VM compute hours. The costs are also taken from actual invoiced costs, not from the Azure price calculator. The price calculator can give a good ballpark figure, but in my experience the actual cost will be different…

Note: there are also data egress costs, i.e. data coming out of Azure. Downloads etc. within the VM are ingress and don’t generally get charged. But even for egress you get the first 5GB free anyway (see here).

Time period   Cost (£ GBP)
Per hour      0.16
Per day       3.84
Per month     116.8
Per year      1401.6

Compute costs for F4s_v2 VM

Time period   Premium SSD (£ GBP)   Standard SSD (£ GBP)   Standard HDD (£ GBP)
Per day       0.57                  0.25                   0.12
Per month     17.34                 7.6                    3.65
Per year      208.05                91.25                  43.8

Disk storage costs

So the base cost for owning the VM is £3.65 p/month using a Standard HDD. On top of that is the compute time. For example, if I use the VM for 20 hours in a month, the compute cost is £3.20 (20 × £0.16). Add that to the base cost, and it’s £6.85 for the month. That’s not bad 🙂
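If it helps, here is that sum as a quick PowerShell sketch, simply using the rates from the tables above:

# Worked example using the rates from the tables above
$hourlyRate  = 0.16    # £ per hour of F4s_v2 compute
$diskMonthly = 3.65    # £ per month for the Standard HDD
$hoursUsed   = 20      # compute hours used this month
$total = ($hoursUsed * $hourlyRate) + $diskMonthly
"Total for the month: £{0}" -f $total    # Total for the month: £6.85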

Some key things to remember: you always pay for the disk storage whether you use the VM or not, but you only pay for compute time when you actually turn on the VM and use it. Always remember to actually stop your VM when finished (not just shut down the OS) so that the resources are de-allocated and you are not charged for unnecessary compute time. Use the Auto-shutdown feature to ensure the VM gets stopped every night. Also, since you have a public IP address, it’s a good idea to use some NSG rules to restrict who can access your VM, from where, and on which ports.
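If you prefer to stop the VM from PowerShell rather than the portal, a minimal sketch with the Az module might look like this (the resource group and VM names are hypothetical examples; Stop-AzVM deallocates the VM by default):

# Minimal sketch, assuming the Az module is installed and you're signed in with Connect-AzAccount
# The resource group and VM names below are hypothetical examples

# Stop AND deallocate the VM so compute billing stops (-Force skips the confirmation prompt)
Stop-AzVM -ResourceGroupName "rg-personal" -Name "win10-vm" -Force

# Start it again when needed
Start-AzVM -ResourceGroupName "rg-personal" -Name "win10-vm"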

Using an Azure VM for personal computing needs is a great option – you benefit from the elasticity and scalability of the cloud, and you only pay for what you use. You can scale up or scale down at any time according to your needs (see the sketch below), and you can set a budget and keep an eye on how much you’re spending using Azure Cost Management.
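Scaling, for instance, is just a property change on the VM; a minimal sketch with the Az module, again with hypothetical names:

# Minimal sketch: resize an existing VM (note: resizing a running VM reboots it)
$vm = Get-AzVM -ResourceGroupName "rg-personal" -Name "win10-vm"
$vm.HardwareProfile.VmSize = "Standard_D2s_v3"    # the target size
Update-AzVM -ResourceGroupName "rg-personal" -VM $vm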

Setting the Computer Description During Windows Autopilot

I’ve been getting to grips with Windows Autopilot recently and, having a long history working with SCCM, I’ve found it hard not to compare it with the power of traditional OSD using a task sequence. In fact, one of my goals was to basically try to reproduce what I’m doing in OSD with Autopilot in order to end up with the same result – and it’s been a challenge.

I like the general concept of Autopilot and don’t get me wrong – it’s getting better all the time – but it still has its shortcomings that require a bit of creativity to work around. One of the things I do during OSD is to set the computer description in AD. That’s fairly easy to do in a task sequence; you can just script it and run the step using credentials that have the permission to make that change.

In Autopilot however (hybrid AAD join scenario), although you can run PowerShell scripts too, they will only run in the SYSTEM context during the Autopilot process. That means you either need to give computer accounts the permission to change their own properties in AD, or you have to find a way to run that code using alternate credentials. You can run scripts in the context of the logged-on user, but I don’t want to do that – in fact I disable the user ESP – I want to use a specific account that has those permissions.

You could use SCCM to do it post-deployment if you are co-managing the device, but ideally I want everything to be native to Autopilot where possible, and move away from the hybrid mentality of ‘do what you can with Intune, and use SCCM for the rest’.

It is possible to execute code in another user context from the SYSTEM context, but when making changes in AD the DirectoryEntry operation kept erroring with “An operations error occurred”. After researching, I realized this is because AD does not accept the authentication token when it is passed indirectly a second time. I tried creating a separate PowerShell process, a background job, a runspace with specific credentials – nothing would play ball. Anyway, I found a way to get around that by using the System.DirectoryServices.AccountManagement .NET classes, which allow you to create a context using specific credentials.

In this example, I’m setting the computer description based on the model and serial number of the device. You need to provide the username and password for the account that will perform the AD operation. I’ve put the password in clear text in this example, but in the real world we store the credentials in an Azure Key Vault and load them in dynamically at runtime with some PowerShell code, to avoid storing them in the script (a sketch of that follows below). I hope in the future we will be able to run PowerShell scripts with Intune in a specific user context, as you can with steps in an SCCM task sequence.
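For illustration only, loading the password from a Key Vault could look something like the following sketch. The vault and secret names are hypothetical, it assumes the Az.KeyVault module is available to the running identity, and the -AsPlainText switch requires a recent version of that module:

# Hypothetical sketch: load the AD account password from Azure Key Vault at runtime
# Assumes the Az.KeyVault module and an identity with read access to the secret
$ADPassword = Get-AzKeyVaultSecret -VaultName "kv-autopilot" -Name "ADAccountPassword" -AsPlainText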

# Set credentials
$ADAccount = "mydomain\myADaccount"
$ADPassword = 'Pa$$w0rd' # single-quoted so the $ characters are not expanded

# Set initial description from the device model and serial number
$Model = Get-WMIObject -Class Win32_ComputerSystem -Property Model -ErrorAction Stop | Select -ExpandProperty Model
$SerialNumber = Get-WMIObject -Class Win32_BIOS -Property SerialNumber -ErrorAction Stop | Select -ExpandProperty SerialNumber
$Description = "$Model - $SerialNumber"

# Set some type accelerators
Add-Type -AssemblyName System.DirectoryServices.AccountManagement -ErrorAction Stop
$Accelerators = [PowerShell].Assembly.GetType("System.Management.Automation.TypeAccelerators")
$Accelerators::Add("PrincipalContext",[System.DirectoryServices.AccountManagement.PrincipalContext])
$Accelerators::Add("ContextType",[System.DirectoryServices.AccountManagement.ContextType])
$Accelerators::Add("Principal",[System.DirectoryServices.AccountManagement.ComputerPrincipal])
$Accelerators::Add("IdentityType",[System.DirectoryServices.AccountManagement.IdentityType])

# Connect to AD and set the computer description
$Domain = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
$PrincipalContext = [PrincipalContext]::new([ContextType]::Domain,$Domain,$ADAccount,$ADPassword)
$Account = [Principal]::FindByIdentity($PrincipalContext,[IdentityType]::Name,$env:COMPUTERNAME)
$LDAPObject = $Account.GetUnderlyingObject()
If ($LDAPObject.Properties["description"][0])
{
    $LDAPObject.Properties["description"][0] = $Description
}
Else
{
    [void]$LDAPObject.Properties["description"].Add($Description)
}
$LDAPObject.CommitChanges()
$Account.Dispose()

Querying for Devices in Azure AD and Intune with PowerShell and Microsoft Graph

Recently I needed to get a list of devices in both Azure Active Directory and Intune and I found that using the online portals I could not filter devices by the parameters that I needed. So I turned to Microsoft Graph to get the data instead. You can use the Microsoft Graph Explorer to query via the Graph REST API, however, the query capabilities of the API are still somewhat limited. To find the data I needed, I had to query the Graph REST API using PowerShell, where I can take advantage of the greater filtering capabilities of PowerShell’s Where-Object.

To use the Graph API, you need to authenticate first. A cool guy named Dave Falkus has published a number of PowerShell scripts on GitHub that use the Graph API with Intune, and these contain some code to authenticate with the API. Rather than re-invent the wheel, we can use his functions to get the authentication token that we need.

First, we need the AzureRM or Azure AD module installed, as we use the authentication libraries that are included with them.

Next, open one of the scripts that Dave has published on GitHub, for example here, and copy the function Get-AuthToken into your script.

Then also copy the Authentication code region into your script, i.e. the section between the following:


#region Authentication
...
#endregion

If you run this code it’ll ask you for an account name to authenticate with from your Azure AD. Once authenticated, we have a token we can use with the Graph REST API saved as a globally-scoped variable $authToken.
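Putting that together, the authentication call looks something like this minimal sketch (the UPN is a placeholder, and Get-AuthToken is the function copied from Dave’s script):

# Hypothetical example: acquire a Graph token with the copied Get-AuthToken function
$User = "admin@contoso.onmicrosoft.com"   # an admin UPN in your tenant (example value)
$global:authToken = Get-AuthToken -User $User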

Get Devices from Azure AD

To get devices from Azure AD, we can use the following function, which I take no credit for as I have simply modified a function written by Dave.


Function Get-AzureADDevices(){

    [cmdletbinding()]
    param()

    $graphApiVersion = "v1.0"
    $Resource = "devices"
    $QueryParams = ""

    try {

        $uri = "https://graph.microsoft.com/$graphApiVersion/$($Resource)$QueryParams"
        Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get
    }

    catch {

        # Read the response stream for the error detail, then report it and bail out
        $ex = $_.Exception
        $errorResponse = $ex.Response.GetResponseStream()
        $reader = New-Object System.IO.StreamReader($errorResponse)
        $reader.BaseStream.Position = 0
        $reader.DiscardBufferedData()
        $responseBody = $reader.ReadToEnd()
        Write-Host "Response content:`n$responseBody" -ForegroundColor Red
        Write-Error "Request to $uri failed with HTTP Status $($ex.Response.StatusCode) $($ex.Response.StatusDescription)"
        break

    }

}

In the $graphApiVersion variable, you can specify the current version of the API.

Now we can run the following code, which will use the API to return all devices in your Azure AD and save them to a hash table which organizes the results by operating system version.


# Return the data
$ADDeviceResponse = Get-AzureADDevices
$ADDevices = $ADDeviceResponse.Value
$NextLink = $ADDeviceResponse.'@odata.nextLink'
# Need to loop the requests because only 100 results are returned each time
While ($NextLink -ne $null)
{
    $ADDeviceResponse = Invoke-RestMethod -Uri $NextLink -Headers $authToken -Method Get
    $NextLink = $ADDeviceResponse.'@odata.nextLink'
    $ADDevices += $ADDeviceResponse.Value
}

Write-Host "Found $($ADDevices.Count) devices in Azure AD" -ForegroundColor Yellow
$ADDevices.operatingSystem | group -NoElement

$DeviceTypes = $ADDevices.operatingSystem | group -NoElement | Select -ExpandProperty Name
$AzureADDevices = @{}
Foreach ($DeviceType in $DeviceTypes)
{
    $AzureADDevices.$DeviceType = $ADDevices | where {$_.operatingSystem -eq "$DeviceType"} | Sort displayName
}

Write-host "Devices have been saved to a variable. Enter '`$AzureADDevices' to view."

It will tell you how many devices it found, and how many devices there are by operating system version / device type.


We can now use the $AzureADDevices hash table to query the data as we wish.

For example, here I search for an iPhone that belongs to a particular user:


$AzureADDevices.Iphone | where {$_.displayName -match 'nik'}

Here I am looking for the count of Windows devices that are hybrid Azure AD joined, and display the detail in the GridView.


($AzureADDevices.Windows | where {$_.trustType -eq 'ServerAd'}).Count
$AzureADDevices.Windows | where {$_.trustType -eq 'ServerAd'} | Out-GridView

And here I’m looking for all MacOS devices that are not compliant with policy.


($AzureADDevices.MacOS | where {$_.isCompliant -ne "True"}) | Out-GridView

Get Devices from Intune

To get devices from Intune, we can take a similar approach. Again, no credit for this function as it’s modified from Dave’s code.


Function Get-IntuneDevices(){

    [cmdletbinding()]
    param()

    # Defining Variables
    $graphApiVersion = "v1.0"
    $Resource = "deviceManagement/managedDevices"

    try {

        $uri = "https://graph.microsoft.com/$graphApiVersion/$Resource"
        (Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get).Value

    }

    catch {

        # Read the response stream for the error detail, then report it and bail out
        $ex = $_.Exception
        $errorResponse = $ex.Response.GetResponseStream()
        $reader = New-Object System.IO.StreamReader($errorResponse)
        $reader.BaseStream.Position = 0
        $reader.DiscardBufferedData()
        $responseBody = $reader.ReadToEnd()
        Write-Host "Response content:`n$responseBody" -ForegroundColor Red
        Write-Error "Request to $uri failed with HTTP Status $($ex.Response.StatusCode) $($ex.Response.StatusDescription)"
        break

    }

}

Running the following code will return all devices in Intune and save them to a hash table, again organised by operating system.


$MDMDevices = Get-IntuneDevices

Write-Host "Found $($MDMDevices.Count) devices in Intune" -ForegroundColor Yellow
$MDMDevices.operatingSystem | group -NoElement

$IntuneDeviceTypes = $MDMDevices.operatingSystem | group -NoElement | Select -ExpandProperty Name
$IntuneDevices = @{}
Foreach ($IntuneDeviceType in $IntuneDeviceTypes)
{
    $IntuneDevices.$IntuneDeviceType = $MDMDevices | where {$_.operatingSystem -eq "$IntuneDeviceType"} | Sort displayName
}

Write-host "Devices have been saved to a variable. Enter '`$IntuneDevices' to view."

Now we can query data using the $IntuneDevices variable.

Here I am querying for the count of compliant and non-compliant iOS devices.


$IntuneDevices.iOS | group complianceState -NoElement

Here I am querying for all non-compliant iOS devices, specifying the columns I want to see, sorting the results and outputting them in table format.


$IntuneDevices.iOS |
    where {$_.complianceState -eq "noncompliant"} |
    Select userDisplayName,deviceName,imei,managementState,complianceGracePeriodExpirationDateTime |
    Sort userDisplayName |
    ft

All Windows devices sorted by username:


$IntuneDevices.Windows | Select userDisplayName,deviceName | Sort userDisplayName

Windows devices managed by SCCM:


$IntuneDevices.Windows | where {$_.managementAgent -eq "ConfigurationManagerClientMdm"} | Out-GridView

Windows devices enrolled using Windows auto enrollment:


$IntuneDevices.Windows | where {$_.deviceEnrollmentType -eq "windowsAutoEnrollment"} | Out-GridView

Windows devices enrolled by SCCM co-management:


$IntuneDevices.Windows | where {$_.deviceEnrollmentType -eq "windowsCoManagement"} | Out-GridView

You can, of course, expand this into users and other resource types, not just devices. You just need the right URL construct for the data type you want to query.
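As a final sketch, here is the same pattern pointed at the users endpoint, reusing the $authToken from earlier:

# Minimal sketch: query users with the same token and pattern as the device functions
$uri = "https://graph.microsoft.com/v1.0/users"
$Response = Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get
$Response.Value | Select displayName,userPrincipalName
# As with devices, follow $Response.'@odata.nextLink' to page past the first batch of results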