Get a daily admin Audit Report for MEM / Intune

In an environment where you have multiple admin users it’s useful to audit admin activities so everyone can be aware of changes that others have made. I do this for Endpoint Configuration Manager with a daily email report built from admin status messages, so I decided to create something similar for Intune / MEM.

Admin actions are already audited for you in MEM (Tenant Administration > Audit logs), so it’s simply a case of getting that data into an email report. You can do this with Graph (which actually gives you more data), but I decided to use Log Analytics for this instead.
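
If you do want to go the Graph route, a minimal sketch using the Microsoft.Graph.Intune module (covered later in this post) might look like the following. Treat it as an assumption to verify: the auditEvents endpoint lives in the beta schema and its filter behaviour may change.

# Sketch only: pull the last 24 hours of Intune audit events directly from Graph (beta)
Connect-MSGraph
$Since = (Get-Date).AddDays(-1).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ")
$URI = "https://graph.microsoft.com/beta/deviceManagement/auditEvents?`$filter=activityDateTime%20gt%20$Since"
# Paging (@odata.nextLink) is ignored here for brevity
$AuditEvents = (Invoke-MSGraphRequest -HttpMethod GET -Url $URI).value
$AuditEvents | Select-Object activityDateTime, activity, activityResult | Sort-Object activityDateTime -Descending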

You need a Log Analytics workspace, and you need to configure Diagnostics settings in the MEM portal to send AuditLogs to the workspace.

Then, to automate sending a daily report, create a service principal in Azure AD with just the permissions necessary to read data from the Log Analytics workspace. You can do this easily from the Azure portal using Cloud Shell. In the example below, I’m creating a new service principal with the role “Log Analytics Reader”, scoped to just the Log Analytics workspace the AuditLogs are sent to.

$DisplayName = "MEM-Reporting"
$Role = "Log Analytics Reader"
$Scope = "/subscriptions/<subscriptionId>/resourcegroups/<resourcegroupname>/providers/microsoft.operationalinsights/workspaces/<workspacename>"

$sp = New-AzADServicePrincipal -DisplayName $DisplayName -Role $Role -Scope $Scope

With the service principal created, you’ll need to make a note of the ApplicationId:

$sp.ApplicationId

And the secret:

$sp.Secret | ConvertFrom-SecureString -AsPlainText

Of course, if you prefer you can use certificate authentication instead of using the secret key.
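
For example, with a certificate uploaded to the app registration and installed in the local machine store, the connection later in the script could look something like this (the IDs and thumbprint are placeholders):

# Connect as the service principal with a certificate instead of a client secret
Connect-AzAccount -ServicePrincipal -Tenant "<tenantId>" -ApplicationId "<applicationId>" -CertificateThumbprint "<certificateThumbprint>"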

Below is a PowerShell script that uses the Az PowerShell module to connect to the Log Analytics workspace as the service principal, query the IntuneAuditLogs table for entries from the last 24 hours, then send them in an HTML email report. Run it with your favourite automation tool.

You’ll need the app Id and secret from the service principal, your tenant Id, your log analytics workspace Id, and don’t forget to update the email parameters.

Sample email report
# Script to send a daily audit report for admin activities in MEM/Intune
# Requirements:
# – Log Analytics Workspace
# – Intune Audit Logs saved to workspace
# – Service Principal with 'Log Analytics reader' role in workspace
# – Azure Az PowerShell modules
# Azure resource info
$ApplicationId = "abc73938-0000-0000-0000-9b01316a9123" # Service Principal Application Id
$Secret = "489j49r-0000-0000-0000-e2dc6451123" # Service Principal Secret
$TenantID = "abc894e7-00000-0000-0000-320d0334b123" # Tenant ID
$LAWorkspaceID = "abcc1e47-0000-0000-0000-b7ce2b2bb123" # Log Analytics Workspace ID
$Timespan = (New-TimeSpan -Hours 24)
# Email params
$EmailParams = @{
To = 'trevor.jones@smsagent.blog'
From = 'MEMReporting@smsagent.blog'
Smtpserver = 'smsagent.mail.protection.outlook.com'
Port = 25
Subject = "MEM Audit Report | $(Get-Date -Format ddMMMyyyy)"
}
# Html CSS style
$Style = @"
<style>
table {
border-collapse: collapse;
font-family: sans-serif;
font-size: 12px;
}
td, th {
border: 1px solid #ddd;
padding: 6px;
}
th {
padding-top: 8px;
padding-bottom: 8px;
text-align: left;
background-color: #3700B3;
color: #03DAC6
}
</style>
"@
# Connect to Azure with Service Principal
$Creds = [PSCredential]::new($ApplicationId,(ConvertTo-SecureString $Secret -AsPlainText -Force))
Connect-AzAccount -ServicePrincipal -Credential $Creds -Tenant $TenantID
# Run the Log Analytics Query
$Query = "IntuneAuditLogs | sort by TimeGenerated desc"
$Results = Invoke-AzOperationalInsightsQuery -WorkspaceId $LAWorkspaceID -Query $Query -Timespan $Timespan
$ResultsArray = [System.Linq.Enumerable]::ToArray($Results.Results)
# Converts the results to a datatable
$DataTable = New-Object System.Data.DataTable
$Columns = @("Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID")
foreach ($Column in $Columns)
{
[void]$DataTable.Columns.Add($Column)
}
foreach ($result in $ResultsArray)
{
$Properties = $Result.Properties | ConvertFrom-Json
[void]$DataTable.Rows.Add(
$Properties.ActivityDate,
$result.Identity,
$Properties.Actor.ApplicationName,
$result.OperationName,
$result.ResultType,
$Properties.TargetDisplayNames[0],
$Properties.TargetObjectIDs[0]
)
}
# Send an email
If ($DataTable.Rows.Count -ge 1)
{
$HTML = $Datatable |
ConvertTo-Html -Property "Date","Initiated by (actor)","Application Name","Activity","Operation Status","Target Name","Target ObjectID" -Head $Style -Body "<h2>MEM Admin Activities in the last 24 hours</h2>" |
Out-String
Send-MailMessage @EmailParams -Body $html -BodyAsHtml
}
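
If you don’t have an automation platform handy, a scheduled task on a utility server is one simple option. A rough sketch, assuming the script is saved locally (the path and time below are just examples):

# Sketch: run the report script daily at 07:00 as a scheduled task
$Action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\MEM-AuditReport.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 7am
Register-ScheduledTask -TaskName "MEM Audit Report" -Action $Action -Trigger $Trigger -User "SYSTEM" -RunLevel Highest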

Forcing a Full Hardware Inventory Report to be Sent Immediately on a ConfigMgr Client

Sometimes you might want to force a ConfigMgr client to send a full hardware inventory report immediately for whatever reason. Typically you would simply clean out the WMI instance for the InventoryAction then trigger the schedule. But sometimes there may already be a scheduled action in the queue, for example the hardware inventory cycle has been triggered on the normal schedule but it runs with randomization so it doesn’t run immediately when it’s triggered. In this case, you get a message in the InventoryAgent.log that looks like this:

Inventory: Message [Type=InventoryAction, ActionID={00000000-0000-0000-0000-000000000001}, Report=Delta] already in queue. Message ignored.

It ignores your request if there’s already a request queued.

You can still force it to run immediately though by clearing the queue. To do this you can simply delete the InventoryAgent queue folder but you can’t do this while the SMS Agent host service is running, you have to stop the service first.

Below is a script that will attempt to trigger a full HWI report and check the InventoryAgent.log to see if the request was ignored – if so, it clears the queue and tries again.

# Invoke a full (resync) HWI report
$Instance = Get-CimInstance -NameSpace ROOT\ccm\InvAgt -Query "SELECT * FROM InventoryActionStatus WHERE InventoryActionID='{00000000-0000-0000-0000-000000000001}'"
$Instance | Remove-CimInstance
Invoke-CimMethod -Namespace ROOT\ccm -ClassName SMS_Client -MethodName TriggerSchedule -Arguments @{ sScheduleID = "{00000000-0000-0000-0000-000000000001}"}
Start-Sleep -Seconds 5

# Check InventoryAgent log for ignored message
$Log = "$env:SystemRoot\CCM\Logs\InventoryAgent.Log"
$LogEntries = Select-String -Path $Log -SimpleMatch "{00000000-0000-0000-0000-000000000001}" | Select -Last 1
If ($LogEntries -match "already in queue. Message ignored.")
{
    # Clear the message queue
    # WARNING: This restarts the SMS Agent host service
    Stop-Service -Name CcmExec -Force
    Remove-Item -Path C:\Windows\CCM\ServiceData\Messaging\EndpointQueues\InventoryAgent -Recurse -Force -Confirm:$false
    Start-Service -Name CcmExec

    # Invoke a full (resync) HWI report
    Start-Sleep -Seconds 5
    $Instance = Get-CimInstance -NameSpace ROOT\ccm\InvAgt -Query "SELECT * FROM InventoryActionStatus WHERE InventoryActionID='{00000000-0000-0000-0000-000000000001}'"
    $Instance | Remove-CimInstance
    Invoke-CimMethod -Namespace ROOT\ccm -ClassName SMS_Client -MethodName TriggerSchedule -Arguments @{ sScheduleID = "{00000000-0000-0000-0000-000000000001}"}
}

Collecting ConfigMgr Client Logs to Azure Storage

In the 2002 release of Endpoint Configuration Manager, Microsoft added a nice capability to collect log files from a client to the site server. Whilst this is a cool capability, you might not be on 2002 yet or you might prefer to send logs to a storage account in Azure rather than to the site server. You can do that quite easily using the Run Script feature. This works whether the client is connected on the corporate network or through a Cloud Management Gateway.

To do this you need a storage account in Azure, a container in the account, and a Shared access signature.

I’ll assume you have the first two in place, so let’s create a Shared access signature. In the Storage account in the Azure Portal, click on Shared access signature under Settings.

  • Under Allowed services, check Blob.
  • Under Allowed resource types, check Object.
  • Under Allowed permissions, check Create.

Set an expiry date then click Generate SAS and connection string. Copy the SAS token and keep it safe somewhere.
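
If you’d rather script the token than click through the portal, something like the following with the Az.Storage module should produce an equivalent account SAS (the account name and key are placeholders, and you may prefer a shorter expiry):

# Sketch: generate an account SAS with Blob service, Object resource type and Create permission
$Context = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<accountKey>"
New-AzStorageAccountSASToken -Service Blob -ResourceType Object -Permission "c" -ExpiryTime (Get-Date).AddYears(1) -Context $Context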

Below is a PowerShell script that will upload client log files to Azure storage.

## Uploads client logs files to Azure storage
$Logs = Get-ChildItem "$env:SystemRoot\CCM\Logs"
$Date = Get-date -Format "yyyy-MM-dd-HH-mm-ss"
$ContainerURL = "https://mystorageaccount.blob.core.windows.net/mycontainer"
$FolderPath = "ClientLogFiles/$($env:COMPUTERNAME)/$Date"
$SASToken = "?sv=2019-10-10&ss=b&srt=o&sp=c&se=2030-05-01T06:31:59Z&st=2020-04-30T22:31:59Z&spr=https&sig=123456789abcdefg"
$Responses = New-Object System.Collections.ArrayList
$Stopwatch = New-object System.Diagnostics.Stopwatch
$Stopwatch.Start()
foreach ($Log in $Logs)
{
$Body = Get-Content $($Log.FullName) -Raw
$URI = "$ContainerURL/$FolderPath/$($Log.Name)$SASToken"
$Headers = @{
'x-ms-content-length' = $($Log.Length)
'x-ms-blob-type' = 'BlockBlob'
}
$Response = Invoke-WebRequest -Uri $URI -Method PUT -Headers $Headers -Body $Body
[void]$Responses.Add($Response)
}
$Stopwatch.Stop()
Write-host "$(($Responses | Where {$_.StatusCode -eq 201}).Count) log files uploaded in $([Math]::Round($Stopwatch.Elapsed.TotalSeconds,2)) seconds."

Update the following parameters in your script:

  • ContainerURL. This is the URL to the container in your storage account. You can find it by clicking on the container, then Properties > URL.
  • SASToken. This is the SAS token string you created earlier.

Create and approve a new Script in ConfigMgr with this code. You can then run it against any online machine, or collection. When it’s complete, it will output how many log files were uploaded and how long the upload took.

To view the log files, you can either browse the container directly in the storage account in the Azure portal, or use Storage Explorer. My preferred method is the standalone Microsoft Azure Storage Explorer app, where you can simply double-click a log file to open it, or easily download the folder containing the log files to your local machine.
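
If you’d rather pull the logs down with PowerShell, a quick sketch using the Az.Storage module (the account name, key, container and destination are placeholders, and the destination folder must already exist):

# Sketch: download all uploaded logs for a given computer from the container
$Context = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<accountKey>"
Get-AzStorageBlob -Container "mycontainer" -Prefix "ClientLogFiles/PC001" -Context $Context |
    Get-AzStorageBlobContent -Destination "C:\Temp\ClientLogs" -Force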

Installing and Configuring Additional Languages during Windows Autopilot

I was experimenting with different ways to get additional languages installed and configured during Windows Autopilot and it proved to be an interesting challenge. The following is what I settled on in the end and what produced the results that I wanted.

Here were my particular requirements, but you can customize this per your own need:

  • The primary language should be English (United Kingdom)
  • An additional secondary language of English (United States)
  • Display language should be English (United Kingdom)
  • Default input override should be English (United Kingdom)
  • System locale should be English (United Kingdom)
  • The administrative defaults for the Welcome screen and New user accounts must have a display language, input language, format and location matching the primary language (UK / UK English)
  • All optional features for the primary language should be installed (handwriting, optical character recognition, etc)

To achieve this, I basically created three elements:

  1. Installed the Local Experience Pack for English (United Kingdom)
  2. Deployed a PowerShell script running in administrative context that sets the administrative language defaults and system locale
  3. Deployed a PowerShell script running in user context that sets the correct order in the user preferred languages list

This was deployed during Autopilot to a Windows 10 1909 (United States) base image.

Local Experience Packs

Local Experience Packs (LXPs) are the modern way to go for installing additional languages since Windows 10 1803. These are published to the Microsoft Store and are automatically updated. They also install more quickly than the traditional cab language packs that you would install with DISM.

LXPs are available in the Microsoft Store for Business, so they can be synced with Intune and deployed as apps. However, the problem with using LXPs as apps during Autopilot is the order of things. The LXP needs to be installed before the PowerShell script that configures the language defaults runs, and since PowerShell scripts are not currently tracked in the ESP, and apps are the last thing to install in the device setup phase, the scripts will very likely run before the app is installed.

To get around that, I decided to get the LXP from the Volume Licensing Service Center (VLSC) instead. Then I uploaded this to a storage account in Azure, where it gets downloaded and installed by the PowerShell script. This way I can control the order and be sure the LXP is installed before making configuration changes.

When downloading from the VLSC, be sure to select the Multilanguage option:

Then get the highlighted ISO. The 1903 LXPs work for 1909 also.

Get the applicable appx file and the license file from the ISO, zip them, and upload the zip file into an Azure Storage account.

When uploading the zip file, be sure to choose the Account Key authentication type:

Once uploaded, click on the blob and go to the Generate SAS page. Choose Read permissions, set an appropriate expiry date, then copy the Blob SAS URL. You will need this to download the file with PowerShell.

Administrative PowerShell Script

Now let’s create a PowerShell script that will:

  • Download and install the Local Experience Pack
  • Install any optional features for the language
  • Configure language and regional settings and defaults

Here’s the script I’m using for that.

# Admin-context script to set the administrative language defaults, system locale and install optional features for the primary language
# Language codes
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"
$SecondaryInputCode = "0409:00000409"
$PrimaryGeoID = "242"
# Enable side-loading
# Required for appx/msix prior to build 18956 (1909 insider)
New-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\AppModelUnlock -Name AllowAllTrustedApps -Value 1 -PropertyType DWORD -Force
# Provision Local Experience Pack
$BlobURL = "https://mystorageaccount.blob.core.windows.net/mycontainer/en-gb.zip?sp=r&st=2020-03-26T18:02:28Z&se=2050-03-27T00:02:28Z&spr=https&sv=2019-02-02&sr=b&sig=91234567890OYr%2BI0RcryhGFy1DNMlzhfIWbQ%3D"
$DownloadedFile = "$env:LOCALAPPDATA\en-GB.zip"
Try
{
$WebClient = New-Object System.Net.WebClient
$WebClient.DownloadFile($BlobURL, $DownloadedFile)
Unblock-File -Path $DownloadedFile -ErrorAction SilentlyContinue
Expand-Archive -Path $DownloadedFile -DestinationPath $env:LOCALAPPDATA -Force -ErrorAction Stop
Add-AppxProvisionedPackage -Online -PackagePath "$env:LOCALAPPDATA\en-gb\LanguageExperiencePack.en-gb.Neutral.appx" -LicensePath "$env:LOCALAPPDATA\en-gb\License.xml" -ErrorAction Stop
Remove-Item -Path $DownloadedFile -Force -ErrorAction SilentlyContinue
}
Catch
{
Write-Host "Failed to install Local Experience Pack: $_"
}
# Install optional features for primary language
$UKCapabilities = Get-WindowsCapability -Online | Where {$_.Name -match "$PrimaryLanguage" -and $_.State -ne "Installed"}
$UKCapabilities | foreach {
Add-WindowsCapability -Online -Name $_.Name
}
# Apply custom XML to set administrative language defaults
$XML = @"
<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">
<!-- user list -->
<gs:UserList>
<gs:User UserID="Current" CopySettingsToDefaultUserAcct="true" CopySettingsToSystemAcct="true"/>
</gs:UserList>
<!-- GeoID -->
<gs:LocationPreferences>
<gs:GeoID Value="$PrimaryGeoID"/>
</gs:LocationPreferences>
<gs:MUILanguagePreferences>
<gs:MUILanguage Value="$PrimaryLanguage"/>
<gs:MUIFallback Value="$SecondaryLanguage"/>
</gs:MUILanguagePreferences>
<!-- system locale -->
<gs:SystemLocale Name="$PrimaryLanguage"/>
<!-- input preferences -->
<gs:InputPreferences>
<gs:InputLanguageID Action="add" ID="$PrimaryInputCode" Default="true"/>
<gs:InputLanguageID Action="add" ID="$SecondaryInputCode"/>
</gs:InputPreferences>
<!-- user locale -->
<gs:UserLocale>
<gs:Locale Name="$PrimaryLanguage" SetAsCurrent="true" ResetAllSettings="false"/>
</gs:UserLocale>
</gs:GlobalizationServices>
"@
New-Item -Path $env:TEMP -Name "en-GB.xml" -ItemType File -Value $XML -Force
$Process = Start-Process -FilePath Control.exe -ArgumentList "intl.cpl,,/f:""$env:Temp\en-GB.xml""" -NoNewWindow -PassThru -Wait
$Process.ExitCode

A quick walkthrough:

First, I’ve entered the locale IDs for the primary and secondary languages, as well as the keyboard layout hex codes, and finally the Geo location ID for the primary language as variables.

Then we set a registry key to allow side-loading (required for older W10 versions for the install of appx/msix).

Next we download and install the LXP. You’ll need to enter the URL you copied earlier for the Azure blob, and update the zip filename as required, as well as the LXP filename.

Then we install any optional features for the primary language that aren’t already installed.

Then we define the content of an XML file that will be used to set the language and locale preferences. Obviously customize that per your requirement.

Then we save that content to a file and apply it.

Create the PowerShell script in Intune, make sure you don’t run it using the logged on credentials, and deploy it to your Autopilot AAD group.

User PowerShell Script

Now we need to create a very simple script that will run in the user context. This script simply makes sure that the list of preferred languages is in the correct order, as by default it will look like this:

This script will run for each user that logs in. It won’t run immediately so the order may be wrong when you first log in, but it doesn’t take long before it runs. Create the script in Intune, remember to run it using the logged on credentials, and deploy it to your Autopilot AAD group.

# User-context script to set the Language List
# Language codes
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"
$SecondaryInputCode = "0409:00000409"
# Set preferred languages
$NewLanguageList = New-WinUserLanguageList -Language $PrimaryLanguage
$NewLanguageList.Add([Microsoft.InternationalSettings.Commands.WinUserLanguage]::new($SecondaryLanguage))
$NewLanguageList[1].InputMethodTips.Clear()
$NewLanguageList[1].InputMethodTips.Add($PrimaryInputCode)
$NewLanguageList[1].InputMethodTips.Add($SecondaryInputCode)
Set-WinUserLanguageList $NewLanguageList -Force

The Result

After running the Autopilot deployment and logging in, everything checks out 🙂
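
If you want to spot-check the result on a deployed machine, the built-in International module cmdlets are a quick way to do it:

# Quick verification of the language and locale configuration
Get-WinSystemLocale           # expect en-GB
Get-WinHomeLocation           # expect United Kingdom (GeoId 242)
Get-WinUserLanguageList       # expect en-GB first, en-US second
Get-SystemPreferredUILanguage # expect en-GB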

Managing Intune PowerShell Scripts with Microsoft Graph

In this blog I’ll cover how to list, get, create, update, delete and assign PowerShell scripts in Intune using Microsoft Graph and PowerShell.

Although you can use the Invoke-WebRequest or Invoke-RestMethod cmdlets when working with MS Graph, I prefer to use the Microsoft.Graph.Intune module, aka the Intune PowerShell SDK, as it handles getting an auth token for us and we don’t have to create any request headers, so get that module installed.

In the Graph API, PowerShell scripts live under the deviceManagementScript resource type and these are still only available in the beta schema so they are subject to change.

Connect to MS Graph

First off, let’s connect to MS Graph and set the schema to beta:

If ((Get-MSGraphEnvironment).SchemaVersion -ne "beta")
{
    $null = Update-MSGraphEnvironment -SchemaVersion beta
}
$Graph = Connect-MSGraph

List PowerShell Scripts

Now we can list the PowerShell scripts we have in Intune:

$URI = "deviceManagement/deviceManagementScripts"
$IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
If ($IntuneScripts.value)
{
    $IntuneScripts = $IntuneScripts.value
}

If we take a look at the results, we’ll see that the script content is not included when we list scripts. It is included when we get a single script, as we’ll see next.
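
For a quick overview you can just pick out a few of the top-level properties (the property names follow the beta deviceManagementScript resource, so treat them as subject to change):

# Summarise the returned scripts (scriptContent is not part of the list response)
$IntuneScripts | Select-Object id, displayName, fileName, runAsAccount, lastModifiedDateTime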

Get a PowerShell Script

To get a specific script, we need to know its Id. To get that, first let’s create a simple function where we can pass a script name and use the Get method to retrieve the script details.

Function Get-IntunePowerShellScript {
    Param($ScriptName)
    $URI = "deviceManagement/deviceManagementScripts" 
    $IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
    If ($IntuneScripts.value)
    {
        $IntuneScripts = $IntuneScripts.value
    }
    $IntuneScript = $IntuneScripts | Where {$_.displayName -eq "$ScriptName"}
    Return $IntuneScript
}

Now we can use this function to get the script Id and then call Get again adding the script Id to the URL:

$ScriptName = "Escrow Bitlocker Recovery Keys to AAD"
$Script = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($Script.id)"
$IntuneScript = Invoke-MSGraphRequest -HttpMethod GET -Url $URI

If we look at the result, we can see that the script content is now returned, albeit in binary form:

View Script Content

To view the script, we simply need to convert it:

$Base64 = [Convert]::FromBase64String($IntuneScript.scriptContent)
[System.Text.Encoding]::UTF8.GetString($Base64)
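
If you want to keep a copy, you can write the decoded content back out to a .ps1 file (the path here is just an example):

# Save the decoded script content to disk
$Decoded = [System.Text.Encoding]::UTF8.GetString($Base64)
$Decoded | Out-File -FilePath "C:\temp\$($IntuneScript.fileName)" -Encoding utf8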

Create a Script

Now let’s create a new script. To create a script, we read in a script file and convert it to base64, then add it together with the other required parameters into some JSON before posting the request.

When reading and converting the script content, use UTF8. Other character sets may not decode properly at run time on the client side and can result in script execution failure.

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD"
    RunAsAccount = "system" # or user
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts"
$Response = Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

We can now see our script in the portal:

Update a Script

To update an existing script, we follow a similar process to creating a new one: we create some JSON containing the updated parameters, then call the PATCH method to apply it. But first we need the Id of the script we want to update, using our previously created function:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

In this example I have updated the content in the source script file so I need to read it in again, as well as updating the description of the script:

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD (Updated 2020-03-19)"
    RunAsAccount = "system"
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.id)"
$Response = Invoke-MSGraphRequest -HttpMethod PATCH -Url $URI -Content $Json

We can call Get on the script again and check the lastModifiedDateTime entry to verify that the script was updated, or check in the portal.
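
For example, using the helper function from earlier:

# Verify the update took effect
$IntuneScript = Get-IntunePowerShellScript -ScriptName "Escrow Bitlocker Recovery Keys"
$IntuneScript.description
$IntuneScript.lastModifiedDateTime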

Add an Assignment

Before the script will execute anywhere, it needs to be assigned to a group. To do that, we need the objectId of the AAD group we want to assign it to. To work with AAD groups I prefer to use the AzureAD module, so install that before continuing.

We need to again get the script that we want to assign:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

Then get the Azure AD group:

$AzureAD = Connect-AzureAD -AccountId $Graph.UPN
$GroupName = "Intune - [Test] Bitlocker Key Escrow"
$Group = Get-AzureADGroup -SearchString $GroupName

Then we prepare the necessary JSON and post the assignment:

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($Group.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

To replace the current assignment with a new assignment, simply change the group name and run the same code again. To add an additional assignment or multiple assignments, you’ll need to post all the assignments at the same time, for example:

$GroupNameA = "Intune - [Test] Bitlocker Key Escrow"
$GroupNameB = "Intune - [Test] Autopilot SelfDeploying Provisioning"
$GroupA = Get-AzureADGroup -SearchString $GroupNameA
$GroupB = Get-AzureADGroup -SearchString $GroupNameB

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupA.ObjectId)"
        },
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupB.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

Delete an Assignment

I haven’t yet figured out how to delete an assignment – the current documentation appears to be incorrect. If you can figure this out please let me know!

Delete a Script

To delete a script, we simply get the script Id and call the Delete method on it:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)"
Invoke-MSGraphRequest -HttpMethod DELETE -Url $URI 
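
Listing the scripts again should confirm the deletion; the helper function from earlier now returns nothing for that name:

# Expect no result once the script has been deleted
Get-IntunePowerShellScript -ScriptName $ScriptName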

Delete Device Records in AD / AAD / Intune / Autopilot / ConfigMgr with PowerShell

I’ve done a lot of testing with Windows Autopilot in recent times. Most of my tests are done in virtual machines, which are ideal as I can simply dispose of them afterwards. But you also need to clean up the device records that were created in Azure Active Directory, Intune, the Autopilot registration service, Endpoint Configuration Manager (if you’re using it) and Active Directory in the case of Hybrid-joined devices.

To make this a bit easier, I wrote the following PowerShell script. You simply enter the device name and it’ll go and search for that device in any of the above locations that you specify and delete the device records.

The script assumes you have the appropriate permissions, and requires the Microsoft.Graph.Intune and AzureAD PowerShell modules, as well as the Configuration Manager module if you want to delete from there.

You can delete from all of the above locations with the -All switch, or you can specify any combination, for example -AAD -Intune -ConfigMgr, or -AD -Intune etc.

In the case of the Autopilot device registration, the device must also exist in Intune before you attempt to delete it as the Intune record is used to determine the serial number of the device.

Please test thoroughly before using on any production device!

Examples

Delete-AutopilotedDeviceRecords -ComputerName PC01 -All
@(
    'PC01'
    'PC02'
    'PC03'
) | foreach {
    Delete-AutopilotedDeviceRecords -ComputerName $_ -AAD -Intune
}

Output

Script

[CmdletBinding(DefaultParameterSetName='All')]
Param
(
[Parameter(ParameterSetName='All',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
[Parameter(ParameterSetName='Individual',Mandatory=$true,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
$ComputerName,
[Parameter(ParameterSetName='All')]
[switch]$All = $True,
[Parameter(ParameterSetName='Individual')]
[switch]$AD,
[Parameter(ParameterSetName='Individual')]
[switch]$AAD,
[Parameter(ParameterSetName='Individual')]
[switch]$Intune,
[Parameter(ParameterSetName='Individual')]
[switch]$Autopilot,
[Parameter(ParameterSetName='Individual')]
[switch]$ConfigMgr
)
Set-Location $env:SystemDrive
# Load required modules
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Importing modules…" NoNewline
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module Microsoft.Graph.Intune ErrorAction Stop
}
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module AzureAD ErrorAction Stop
}
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Import-Module $env:SMS_ADMIN_UI_PATH.Replace('i386','ConfigurationManager.psd1') ErrorAction Stop
}
Write-host "Success" ForegroundColor Green
}
Catch
{
Write-host "$($_.Exception.Message)" ForegroundColor Red
Return
}
}
# Authenticate with Azure
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-Host "Authenticating with MS Graph and Azure AD…" NoNewline
$intuneId = Connect-MSGraph ErrorAction Stop
$aadId = Connect-AzureAD AccountId $intuneId.UPN ErrorAction Stop
Write-host "Success" ForegroundColor Green
}
Catch
{
Write-host "Error!" ForegroundColor Red
Write-host "$($_.Exception.Message)" ForegroundColor Red
Return
}
}
Write-host "$($ComputerName.ToUpper())" ForegroundColor Yellow
Write-Host "===============" ForegroundColor Yellow
# Delete from AD
If ($PSBoundParameters.ContainsKey("AD") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Active Directory " ForegroundColor Yellow NoNewline
Write-host "computer account…" NoNewline
$Searcher = [ADSISearcher]::new()
$Searcher.Filter = "(sAMAccountName=$ComputerName`$)"
[void]$Searcher.PropertiesToLoad.Add("distinguishedName")
$ComputerAccount = $Searcher.FindOne()
If ($ComputerAccount)
{
Write-host "Success" ForegroundColor Green
Write-Host " Deleting computer account…" NoNewline
$DirectoryEntry = $ComputerAccount.GetDirectoryEntry()
$Result = $DirectoryEntry.DeleteTree()
Write-Host "Success" ForegroundColor Green
}
Else
{
Write-host "Not found!" ForegroundColor Red
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
# Delete from Azure AD
If ($PSBoundParameters.ContainsKey("AAD") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Azure AD " ForegroundColor Yellow NoNewline
Write-host "device record/s…" NoNewline
[array]$AzureADDevices = Get-AzureADDevice SearchString $ComputerName All:$true ErrorAction Stop
If ($AzureADDevices.Count -ge 1)
{
Write-Host "Success" ForegroundColor Green
Foreach ($AzureADDevice in $AzureADDevices)
{
Write-host " Deleting DisplayName: $($AzureADDevice.DisplayName) | ObjectId: $($AzureADDevice.ObjectId) | DeviceId: $($AzureADDevice.DeviceId) …" NoNewline
Remove-AzureADDevice ObjectId $AzureADDevice.ObjectId ErrorAction Stop
Write-host "Success" ForegroundColor Green
}
}
Else
{
Write-host "Not found!" ForegroundColor Red
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
# Delete from Intune
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Intune " ForegroundColor Yellow NoNewline
Write-host "managed device record/s…" NoNewline
[array]$IntuneDevices = Get-IntuneManagedDevice Filter "deviceName eq '$ComputerName'" ErrorAction Stop
If ($IntuneDevices.Count -ge 1)
{
Write-Host "Success" ForegroundColor Green
If ($PSBoundParameters.ContainsKey("Intune") -or $PSBoundParameters.ContainsKey("All"))
{
foreach ($IntuneDevice in $IntuneDevices)
{
Write-host " Deleting DeviceName: $($IntuneDevice.deviceName) | Id: $($IntuneDevice.Id) | AzureADDeviceId: $($IntuneDevice.azureADDeviceId) | SerialNumber: $($IntuneDevice.serialNumber) …" NoNewline
Remove-IntuneManagedDevice managedDeviceId $IntuneDevice.Id Verbose ErrorAction Stop
Write-host "Success" ForegroundColor Green
}
}
}
Else
{
Write-host "Not found!" ForegroundColor Red
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
# Delete Autopilot device
If ($PSBoundParameters.ContainsKey("Autopilot") -or $PSBoundParameters.ContainsKey("All"))
{
If ($IntuneDevices.Count -ge 1)
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "Autopilot " ForegroundColor Yellow NoNewline
Write-host "device registration…" NoNewline
$AutopilotDevices = New-Object System.Collections.ArrayList
foreach ($IntuneDevice in $IntuneDevices)
{
$URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities?`$filter=contains(serialNumber,'$($IntuneDevice.serialNumber)')"
$AutopilotDevice = Invoke-MSGraphRequest Url $uri HttpMethod GET ErrorAction Stop
[void]$AutopilotDevices.Add($AutopilotDevice)
}
Write-Host "Success" ForegroundColor Green
foreach ($device in $AutopilotDevices)
{
Write-host " Deleting SerialNumber: $($Device.value.serialNumber) | Model: $($Device.value.model) | Id: $($Device.value.id) | GroupTag: $($Device.value.groupTag) | ManagedDeviceId: $($device.value.managedDeviceId) …" NoNewline
$URI = "https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities/$($device.value.Id)"
$AutopilotDevice = Invoke-MSGraphRequest Url $uri HttpMethod DELETE ErrorAction Stop
Write-Host "Success" ForegroundColor Green
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
}
# Delete from ConfigMgr
If ($PSBoundParameters.ContainsKey("ConfigMgr") -or $PSBoundParameters.ContainsKey("All"))
{
Try
{
Write-host "Retrieving " NoNewline
Write-host "ConfigMgr " ForegroundColor Yellow NoNewline
Write-host "device record/s…" NoNewline
$SiteCode = (Get-PSDrive PSProvider CMSITE ErrorAction Stop).Name
Set-Location ("$SiteCode" + ":") ErrorAction Stop
[array]$ConfigMgrDevices = Get-CMDevice Name $ComputerName Fast ErrorAction Stop
Write-Host "Success" ForegroundColor Green
foreach ($ConfigMgrDevice in $ConfigMgrDevices)
{
Write-host " Deleting Name: $($ConfigMgrDevice.Name) | ResourceID: $($ConfigMgrDevice.ResourceID) | SMSID: $($ConfigMgrDevice.SMSID) | UserDomainName: $($ConfigMgrDevice.UserDomainName) …" NoNewline
Remove-CMDevice InputObject $ConfigMgrDevice Force ErrorAction Stop
Write-Host "Success" ForegroundColor Green
}
}
Catch
{
Write-host "Error!" ForegroundColor Red
$_
}
}
Set-Location $env:SystemDrive

Get Program Execution History from a ConfigMgr Client with PowerShell

Have you ever been in the situation where something unexpected happens on a user’s computer and people start pointing their fingers at the ConfigMgr admin, asking “has anyone deployed something with SCCM?” Well, I decided to write a PowerShell script to retrieve the execution history for ConfigMgr programs on a local or remote client. This gives clear visibility of when and which deployments such as applications/programs/task sequences have run on the client, and hopefully acquits you (or proves you guilty!)

Program execution history can be found in the registry but it doesn’t contain the name of the associated package, so I joined that data with software distribution data from WMI to give a better view.

You can run the script against the local machine, or a remote machine if you have PS remoting enabled. You can also run it against multiple machines at the same time and combine the data if desired. I recommend piping the results to grid view.

Get-CMClientExecutionHistory -Computername PC001,PC002 | Out-GridView
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$false,ValueFromPipelineByPropertyName=$true,ValueFromPipeline=$true)]
[string[]]$ComputerName = $env:COMPUTERNAME
)
Begin
{
$Code = {
# Get Execution History from registry, and package details from WMI
$ExecutionHistoryKey = "HKLM:\SOFTWARE\Microsoft\SMS\Mobile Client\Software Distribution\Execution History"
$ContextKeys = Get-ChildItem $ExecutionHistoryKey | Select -ExpandProperty PSChildName
foreach ($ContextKey in $ContextKeys)
{
If ($ContextKey -eq "System")
{
$ContextKey = "Machine"
}
Else
{
$ContextKey = $ContextKey.Replace('-','_')
}
[array]$SoftwareDistribution += Get-CimInstance -Namespace ROOT\ccm\Policy\$ContextKey -ClassName CCM_SoftwareDistribution
}
# Create a datatable to hold the results
$DataTable = New-Object System.Data.DataTable
[void]$DataTable.Columns.Add("ComputerName")
[void]$DataTable.Columns.Add("PackageName")
[void]$DataTable.Columns.Add("PackageID")
[void]$DataTable.Columns.Add("ProgramName")
[void]$DataTable.Columns.Add("DeploymentStatus")
[void]$DataTable.Columns.Add("Context")
[void]$DataTable.Columns.Add("State")
[void]$DataTable.Columns.Add("RunStartTime")
[void]$DataTable.Columns.Add("SuccessOrFailureCode")
[void]$DataTable.Columns.Add("SuccessOrFailureReason")
foreach ($ContextKey in $ContextKeys)
{
If ($ContextKey -ne "System")
{
# Get user context if applicable
$SID = New-Object Security.Principal.SecurityIdentifier -ArgumentList $ContextKey
$Context = $SID.Translate([System.Security.Principal.NTAccount])
}
Else
{
$Context = "Machine"
}
$SubKeys = Get-ChildItem "$ExecutionHistoryKey\$ContextKey"
Foreach ($SubKey in $SubKeys)
{
$Items = Get-ChildItem $SubKey.PSPath
Foreach ($Item in $Items)
{
$PackageInfo = $SoftwareDistribution | Where {$_.PKG_PackageID -eq $SubKey.PSChildName -and $_.PRG_ProgramName -eq $Item.GetValue("_ProgramID")} | Select -First 1
If ($PackageInfo)
{
$PackageName = $PackageInfo.PKG_Name
$DeploymentStatus = "Active"
}
Else
{
$PackageName = "-Unknown-"
$DeploymentStatus = "No longer targeted"
}
[void]$DataTable.Rows.Add($using:Computer,$PackageName,$SubKey.PSChildName,$Item.GetValue("_ProgramID"),$DeploymentStatus,$Context,$Item.GetValue("_State"),$Item.GetValue("_RunStartTime"),$Item.GetValue("SuccessOrFailureCode"),$Item.GetValue("SuccessOrFailureReason"))
}
}
}
$DataTable.DefaultView.Sort = "RunStartTime DESC"
$DataTable = $DataTable.DefaultView.ToTable()
Return $DataTable
}
}
Process
{
foreach ($Computer in $ComputerName)
{
If ($Computer -eq $env:COMPUTERNAME)
{
$Result = Invoke-Command -ScriptBlock $Code
}
Else
{
$Result = Invoke-Command -ComputerName $Computer -HideComputerName -ScriptBlock $Code -ErrorAction Continue
}
$Result | Select ComputerName,PackageName,PackageID,ProgramName,DeploymentStatus,Context,State,RunStartTime,SuccessOrFailureCode,SuccessOrFailureReason
}
}
End
{
}

Get Previous and Scheduled Evaluation Times for ConfigMgr Compliance Baselines with PowerShell

I was testing a compliance baseline recently and wanted to verify if the schedule defined in the baseline deployment is actually honored on the client. I set the schedule to run every hour, but it was clear that it did not run every hour and that some randomization was being used.

To review the most recent evaluation times and the next scheduled evaluation time, I had to read the scheduler.log in the CCM\Logs directory, because I could only find a single last evaluation time recorded in WMI.

The following PowerShell script reads which baselines are currently deployed to the local machine, displays a window for you to choose one, then basically reads the Scheduler log to find when the most recent evaluations were and when the next one is scheduled.

Select a baseline
Baseline evaluations
##############################################################
##                                                          ##
## Reads the most recent and next scheduled evaluation time ##
## for deployed Compliance Baselines from the Scheduler.log ##
##                                                          ##
##############################################################
#requires -RunAsAdministrator
# Get Baselines from WMI
# Excludes co-management policies
Try
{
$Instances = Get-CimInstance -Namespace ROOT\ccm\dcm -ClassName SMS_DesiredConfiguration -Filter "PolicyType!=1" -OperationTimeoutSec 5 -ErrorAction Stop | Select DisplayName,IsMachineTarget,Name
}
Catch
{
Throw "Couldn't get baseline info from WMI: $_"
}
If ($Instances.Count -eq 0)
{
Throw "No deployed baselines found!"
}
# Datatable to hold the baselines for the WPF window
$DataTable = New-Object System.Data.DataTable
[void]$DataTable.Columns.Add("DisplayName")
[void]$DataTable.Columns.Add("IsMachineTarget")
foreach ($Instance in ($Instances | Sort DisplayName))
{
[void]$DataTable.Rows.Add($Instance.DisplayName,$Instance.IsMachineTarget)
}
# WPF Window for baseline selection
Add-Type -AssemblyName PresentationFramework,PresentationCore,WindowsBase
$Window = New-Object System.Windows.Window
$Window.WindowStartupLocation = [System.Windows.WindowStartupLocation]::CenterScreen
$Window.SizeToContent = [System.Windows.SizeToContent]::WidthAndHeight
$window.ResizeMode = [System.Windows.ResizeMode]::NoResize
$Window.Title = "DOUBLE-CLICK A BASELINE TO SELECT"
$DataGrid = New-Object System.Windows.Controls.DataGrid
$DataGrid.ItemsSource = $DataTable.DefaultView
$DataGrid.CanUserAddRows = $False
$DataGrid.IsReadOnly = $true
$DataGrid.SelectionMode = [System.Windows.Controls.DataGridSelectionMode]::Single
$DataGrid.Height = "NaN"
$DataGrid.MaxHeight = "250"
$DataGrid.Width = "NaN"
$DataGrid.AlternatingRowBackground = "#e6ffcc"
$DataGrid.Add_MouseDoubleClick({
$script:SelectedRow = $This.SelectedValue
$Window.Close()
})
$Window.AddChild($DataGrid)
[void]$Window.ShowDialog()
If (!$SelectedRow)
{
Throw "No baseline was selected!"
}
# If the baseline is user-targetted
If ($SelectedRow.row.IsMachineTarget -eq $false)
{
# Get Logged-on user SID
$LogonUIRegPath = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI"
#Could also use this:
#Get-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\SMS\CurrentUser -Name UserSID -ErrorAction Stop
$Property = "LastLoggedOnUserSID"
$LastLoggedOnUserSID = Get-ItemProperty -Path $LogonUIRegPath -Name $Property | Select -ExpandProperty $Property
$LastLoggedOnUserSIDUnderscore = $LastLoggedOnUserSID.Replace('-','_')
$Namespace = "ROOT\ccm\Policy\$LastLoggedOnUserSIDUnderscore\ActualConfig"
}
Else
{
$Namespace = "ROOT\ccm\Policy\Machine\ActualConfig"
}
# Get assignment info
$BaselineName = $SelectedRow.Row.DisplayName
$Pattern = [Regex]::Escape($BaselineName)
$CIAssignment = Get-CimInstance -Namespace $Namespace -ClassName CCM_DCMCIAssignment | where {$_.AssignmentName -match $Pattern}
$AssignmentIDs = $CIAssignment | Select AssignmentID,AssignmentName
Write-host "Baseline: $BaselineName" -ForegroundColor Magenta
foreach ($AssignmentID in $AssignmentIDs)
{
# Read the scheduler log
$Log = "$env:SystemRoot\CCM\Logs\Scheduler.log"
If ($SelectedRow.row.IsMachineTarget -eq $false)
{
$LogEntries = Select-String -Path $Log -SimpleMatch "$LastLoggedOnUserSID/$($AssignmentID.AssignmentID)"
}
Else
{
$LogEntries = Select-String -Path $Log -SimpleMatch "Machine/$($AssignmentID.AssignmentID)"
}
If ($LogEntries)
{
# Get the previous evaluations date/time
$Evaluations = New-Object System.Collections.ArrayList
$EvaluationEntries = $LogEntries | where {$_ -match "SMSTrigger"}
Foreach ($Entry in $EvaluationEntries)
{
$Time = $Entry.Line.Split('=')[1]
$Date = $Entry.Line.Split('=')[2]
$a = $Time.Split()[0].trimend().replace('"','')
$b = $Date.Split()[0].trimend().replace('"','').replace('-','/')
$Time = (Get-Date $a).ToLongTimeString()
$Date = [DateTime]"$b $Time"
$LocalDate = Get-Date $date -Format (Get-Culture).DateTimeFormat.RFC1123Pattern
[void]$Evaluations.Add($LocalDate)
}
# Get the next scheduled evaluation date/time
$LastEvaluation = $EvaluationEntries | Select -Last 1
$date = $LastEvaluation.Line.Split()[8]
$time = $LastEvaluation.Line.Split()[9]
$ampm = $LastEvaluation.Line.Split()[10]
$NextEvaluation = [DateTime]"$date $time $ampm"
$NextEvaluationLocal = Get-Date $NextEvaluation -Format (Get-Culture).DateTimeFormat.RFC1123Pattern
# Return the results
Write-Host "Assignment: $($AssignmentID.AssignmentName)" ForegroundColor Green
Write-host "Last Evaluations:"
foreach ($Evaluation in $Evaluations)
{
Write-host " $Evaluation" ForegroundColor Yellow
}
Write-host "Next Scheduled Evaluation:"
Write-Host " $NextEvaluationLocal" ForegroundColor Yellow
}
Else
{
Write-Host "No log entries found!" ForegroundColor Red
}
}

[Unsupported] Getting / triggering ConfigMgr Client Programs using Software Center Code

An odd title perhaps, but I recently had a requirement to retrieve the deadline for a deployed task sequence on the client side, in the user context, using PowerShell. You can find this info in WMI, using the CCM_Program class of the ROOT\ccm\ClientSDK namespace. The problem is, standard users do not have access to that.

I tried deploying a script in SYSTEM context to get the deadline from WMI and stamp it to a registry location where it could be read in the user context, however curiously the CCM_Program namespace is not accessible in SYSTEM context. A quick Google search assured me I was not alone scratching my head over that one.

I found a way to do it using a Software Center dll, which I’m sure is not supported, but it works at least. Run the following PowerShell code as the logged-on user to find the deadline for a deployed program (could be a classic package/program or task sequence).

$PackageID = "ABC0012B"
Add-Type -Path $env:windir\CCM\SCClient.data.dll
$Connector = [Microsoft.SoftwareCenter.Client.Data.ClientConnectionFactory]::CreateDataConnector()
$Package = $Connector.AllProgramApplications | where {$_.PackageId -eq $PackageID}
$Connector.Dispose()
If ($Package)
{
    $Deadline = Get-Date $Package.DeadlineDisplayValue
}

You can do some other nice things with that Software Center data connector class, for example, trigger a task sequence to run. But you didn’t hear that from me 😉

$PackageID = "ABC0012B"
Add-Type -Path $env:windir\CCM\SCClient.data.dll
$Connector = [Microsoft.SoftwareCenter.Client.Data.ClientConnectionFactory]::CreateDataConnector()
$Package = $Connector.AllProgramApplications | where {$_.PackageId -eq $PackageID}
$Connector.InstallApplication($Package,$false,$false)
$Connector.Dispose()

Setting the Computer Description During Windows Autopilot

I’ve been getting to grips with Windows Autopilot recently and, having a long history working with SCCM, I’ve found it hard not to compare it with the power of traditional OSD using a task sequence. In fact, one of my goals was to basically try to reproduce what I’m doing in OSD with Autopilot in order to end up with the same result – and it’s been a challenge.

I like the general concept of Autopilot and don’t get me wrong – it’s getting better all the time – but it still has its shortcomings that require a bit of creativity to work around. One of the things I do during OSD is to set the computer description in AD. That’s fairly easy to do in a task sequence; you can just script it and run the step using credentials that have the permission to make that change.

In Autopilot however (hybrid AAD join scenario), although you can run PowerShell scripts too, they will only run in SYSTEM context during the Autopilot process. That means you either need to give computer accounts the permission to change their own properties in AD, or you have to find a way to run that code using alternate credentials. You can run scripts in the context of the logged-on user, but I don’t want to do that – in fact I disable the user ESP – I want to use a specific account that has those permissions.

You could use SCCM to do it post-deployment if you are co-managing the device, but ideally I want everything to be native to Autopilot where possible, and move away from the hybrid mentality of do what you can with Intune, and use SCCM for the rest.

It is possible to execute code in another user context from SYSTEM context, but when making changes in AD the DirectoryEntry operation kept failing with “An operations error occurred”. After researching, I realized it was due to AD not accepting the authentication token, as it’s being passed a second time rather than directly. I tried creating a separate PowerShell process, a background job, a runspace with specific credentials – nothing would play ball. Anyway, I found a way to get around that by using the AccountManagement .NET class, which allows you to create a context using specific credentials.

In this example, I’m setting the computer description based on the model and serial number of the device. You need to provide the username and password for the account you will perform the AD operation with. I’ve put the password in clear text in this example, but in the real world we store the credentials in an Azure Key Vault and load them in dynamically at runtime with some PowerShell code, to avoid storing them in the script. I hope in the future we will be able to run PowerShell scripts with Intune in a specific user context, as you can with steps in an SCCM task sequence.
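
As a rough sketch of that Key Vault approach (the vault and secret names are placeholders, and it assumes the Az.KeyVault module plus an identity that is already authenticated to Azure and allowed to read the secret):

# Sketch: pull the AD account password from Azure Key Vault at runtime instead of hard-coding it
# -AsPlainText requires a recent Az.KeyVault version; older versions return a SecureString to convert
$ADAccount = "mydomain\myADaccount"
$ADPassword = Get-AzKeyVaultSecret -VaultName "my-keyvault" -Name "ComputerDescriptionAccount" -AsPlainText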

# Set credentials
$ADAccount = "mydomain\myADaccount"
$ADPassword = "Pa$$w0rd"

# Set initial description
$Model = Get-WMIObject -Class Win32_ComputerSystem -Property Model -ErrorAction Stop| Select -ExpandProperty Model
$SerialNumber = Get-WMIObject -Class Win32_BIOS -Property SerialNumber -ErrorAction Stop | Select -ExpandProperty SerialNumber
$Description = "$Model - $SerialNumber"

# Set some type accelerators
Add-Type -AssemblyName System.DirectoryServices.AccountManagement -ErrorAction Stop
$Accelerators = [PowerShell].Assembly.GetType("System.Management.Automation.TypeAccelerators")
$Accelerators::Add("PrincipalContext",[System.DirectoryServices.AccountManagement.PrincipalContext])
$Accelerators::Add("ContextType",[System.DirectoryServices.AccountManagement.ContextType])
$Accelerators::Add("Principal",[System.DirectoryServices.AccountManagement.ComputerPrincipal])
$Accelerators::Add("IdentityType",[System.DirectoryServices.AccountManagement.IdentityType])

# Connect to AD and set the computer description
$Domain = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
$PrincipalContext = [PrincipalContext]::new([ContextType]::Domain,$Domain,$ADAccount,$ADPassword)
$Account = [Principal]::FindByIdentity($PrincipalContext,[IdentityType]::Name,$env:COMPUTERNAME)
$LDAPObject = $Account.GetUnderlyingObject()
If ($LDAPObject.Properties["description"][0])
{
    $LDAPObject.Properties["description"][0] = $Description
}
Else
{
    [void]$LDAPObject.Properties["description"].Add($Description)
}
$LDAPObject.CommitChanges()
$Account.Dispose()