Collecting ConfigMgr Client Logs to Azure Storage

In the 2002 release of Endpoint Configuration Manager, Microsoft added a nice capability to collect log files from a client to the site server. Whilst this is a cool capability, you might not be on 2002 yet or you might prefer to send logs to a storage account in Azure rather than to the site server. You can do that quite easily using the Run Script feature. This works whether the client is connected on the corporate network or through a Cloud Management Gateway.

To do this you need a storage account in Azure, a container in the account, and a Shared access signature.

I’ll assume you have the first two in place, so let’s create a Shared access signature. In the Storage account in the Azure Portal, click on Shared access signature under Settings.

  • Under Allowed services, check Blob.
  • Under Allowed resource types, check Object.
  • Under Allowed permissions, check Create.

Set an expiry date then click Generate SAS and connection string. Copy the SAS token and keep it safe somewhere.

Below is a PowerShell script that will upload client log files to Azure storage.

Update the following parameters in your script:

  • ContainerURL. This is the URL to the container in your storage account. You can find it by clicking on the container, then Properties > URL.
  • SASToken. This is the SAS token string you created earlier.
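As a minimal sketch of the approach (assuming the logs are read from the default client log path and each file is uploaded with a PUT request against the Azure Blob REST API), the core of such a script might look like this:

$ContainerURL = "https://mystorageaccount.blob.core.windows.net/clientlogs" # placeholder: your container URL
$SASToken = "?sv=..." # placeholder: your SAS token
$LogPath = "$env:SystemRoot\CCM\Logs"

# Upload each log file as a block blob into a folder named after the computer
$Logs = Get-ChildItem -Path $LogPath -Filter *.log
$Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
foreach ($Log in $Logs)
{
    $URI = "$ContainerURL/$($env:COMPUTERNAME)/$($Log.Name)$SASToken"
    $Headers = @{'x-ms-blob-type' = 'BlockBlob'}
    $null = Invoke-WebRequest -Uri $URI -Method PUT -Headers $Headers -InFile $Log.FullName -UseBasicParsing
}
$Stopwatch.Stop()
Write-Output "Uploaded $($Logs.Count) log files in $([math]::Round($Stopwatch.Elapsed.TotalSeconds,1)) seconds"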

Create and approve a new Script in ConfigMgr with this code. You can then run it against any online machine or collection. When it completes, it will output how many log files were uploaded and how long the upload took.

To view the log files, you can either browse the container directly in the storage account in the Azure portal, or use Storage Explorer. My preferred method is the standalone Microsoft Azure Storage Explorer app, where you can simply double-click a log file to open it, or easily download the folder containing the log files to your local machine.

Installing and Configuring Additional Languages during Windows Autopilot

I was experimenting with different ways to get additional languages installed and configured during Windows Autopilot and it proved to be an interesting challenge. The following is what I settled on in the end and what produced the results that I wanted.

Here were my particular requirements, but you can customize these per your own needs:

  • The primary language should be English (United Kingdom)
  • An additional secondary language of English (United States)
  • Display language should be English (United Kingdom)
  • Default input override should be English (United Kingdom)
  • System locale should be English (United Kingdom)
  • The administrative defaults for the Welcome screen and New user accounts must have a display language, input language, format and location matching the primary language (UK / UK English)
  • All optional features for the primary language should be installed (handwriting, optical character recognition, etc)

To achieve this, I basically created three elements:

  1. Installed the Local Experience Pack for English (United Kingdom)
  2. Deployed a PowerShell script running in administrative context that sets the administrative language defaults and system locale
  3. Deployed a PowerShell script running in user context that sets the correct order in the user's preferred languages list

This was deployed during Autopilot to a Windows 10 1909 (United States) base image.

Local Experience Packs

Local Experience Packs (LXPs) have been the modern way to install additional languages since Windows 10 1803. They are published to the Microsoft Store and are automatically updated. They also install more quickly than the traditional cab language packs that you would install with DISM.

LXPs are available in the Microsoft Store for Business, so they can be synced with Intune and deployed as apps. However, the problem with using LXPs as apps during Autopilot is the order of things. The LXP needs to be installed before the PowerShell script that configures the language defaults runs, and since PowerShell scripts are not currently tracked in the ESP, and apps are the last thing to install in the device setup phase, the scripts will very likely run before the app is installed.

To get around that, I decided to get the LXP from the Volume Licensing Service Center (VLSC) instead. Then I uploaded it to a storage account in Azure, from where it gets downloaded and installed by the PowerShell script. This way I can control the order and be sure the LXP is installed before making configuration changes.

When downloading from the VLSC, be sure to select the Multilanguage option.

Then download the Local Experience Pack ISO. The 1903 LXPs also work for 1909.

Get the applicable appx file and the license file from the ISO, zip them, and upload the zip file into an Azure Storage account.

When uploading the zip file, be sure to choose the Account Key authentication type.

Once uploaded, click on the blob and go to the Generate SAS page. Choose Read permissions, set an appropriate expiry date, then copy the Blob SAS URL. You will need this to download the file with PowerShell.

Administrative PowerShell Script

Now let's create a PowerShell script that will:

  • Download and install the Local Experience Pack
  • Install any optional features for the language
  • Configure language and regional settings and defaults

Here’s the script I’m using for that.

A quick walkthrough:

First, I’ve entered the locale IDs for the primary and secondary languages, as well as the keyboard layout hex codes, and finally the Geo location ID for the primary language as variables.

Then we set a registry key to allow side-loading (required for older W10 versions for the install of appx/msix).

Next we download and install the LXP. You’ll need to enter the URL you copied earlier for the Azure blob, and update the zip filename as required, as well as the LXP filename.

Then we install any optional features for the primary language that aren’t already installed.

Then we define the content of an XML file that will be used to set the language and locale preferences. Obviously customize that per your requirement.

Then we save that content to a file and apply it.
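Putting those steps together, a condensed sketch of the script might look like the following (the blob URL, SAS token and file names are placeholders, and the XML is trimmed to the settings described above):

# Locale IDs, keyboard layout codes and home location geo ID
$PrimaryLanguage = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode = "0809:00000809"   # en-GB keyboard layout
$SecondaryInputCode = "0409:00000409" # en-US keyboard layout
$PrimaryGeoID = 242                   # United Kingdom

# Allow sideloading of appx packages (required on older W10 versions)
$RegPath = "HKLM:\SOFTWARE\Policies\Microsoft\Windows\Appx"
If (-not (Test-Path $RegPath)) { $null = New-Item -Path $RegPath -Force }
$null = New-ItemProperty -Path $RegPath -Name "AllowAllTrustedApps" -Value 1 -PropertyType DWORD -Force

# Download the LXP zip from the Azure blob and extract it
$BlobURL = "https://mystorageaccount.blob.core.windows.net/lxp/en-GB.zip?sv=..." # placeholder: your Blob SAS URL
$ZipFile = "$env:TEMP\en-GB.zip"
Invoke-WebRequest -Uri $BlobURL -OutFile $ZipFile -UseBasicParsing
Expand-Archive -Path $ZipFile -DestinationPath "$env:TEMP\LXP" -Force

# Install the Local Experience Pack with its license file
Add-AppxProvisionedPackage -Online -PackagePath "$env:TEMP\LXP\LanguageExperiencePack.en-GB.Neutral.appx" -LicensePath "$env:TEMP\LXP\License.xml" | Out-Null

# Install any optional features (handwriting, OCR etc) for the primary language that aren't already installed
Get-WindowsCapability -Online | Where {$_.Name -match "$PrimaryLanguage" -and $_.State -ne "Installed"} | foreach {
    $null = Add-WindowsCapability -Online -Name $_.Name
}

# Language and locale defaults, copied to the welcome screen, new user accounts and the system account
$XML = @"
<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">
    <gs:UserList>
        <gs:User UserID="Current" CopySettingsToDefaultUserAcct="true" CopySettingsToSystemAcct="true"/>
    </gs:UserList>
    <gs:UserLocale>
        <gs:Locale Name="$PrimaryLanguage" SetAsCurrent="true" ResetAllSettings="false"/>
    </gs:UserLocale>
    <gs:InputPreferences>
        <gs:InputLanguageID Action="add" ID="$PrimaryInputCode" Default="true"/>
        <gs:InputLanguageID Action="add" ID="$SecondaryInputCode"/>
    </gs:InputPreferences>
    <gs:SystemLocale Name="$PrimaryLanguage"/>
    <gs:LocationPreferences>
        <gs:GeoID Value="$PrimaryGeoID"/>
    </gs:LocationPreferences>
    <gs:MUILanguagePreferences>
        <gs:MUILanguage Value="$PrimaryLanguage"/>
        <gs:MUIFallback Value="$SecondaryLanguage"/>
    </gs:MUILanguagePreferences>
</gs:GlobalizationServices>
"@

# Save the XML and apply it with intl.cpl
$XMLPath = "$env:TEMP\LanguageDefaults.xml"
$XML | Out-File -FilePath $XMLPath -Encoding utf8
& "$env:SystemRoot\System32\control.exe" "intl.cpl,,/f:""$XMLPath"""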

Create the PowerShell script in Intune, make sure you don’t run it using the logged on credentials, and deploy it to your Autopilot AAD group.

User PowerShell Script

Now we need to create a very simple script that will run in the user context. This script simply makes sure that the list of preferred languages is in the correct order, since by default the base image language, English (United States), will appear first in the list.
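A minimal sketch, assuming the same en-GB primary / en-US secondary requirement as above:

# Build the preferred languages list in the desired order and apply it
$LanguageList = New-WinUserLanguageList -Language "en-GB"
$LanguageList.Add("en-US")
Set-WinUserLanguageList -LanguageList $LanguageList -Force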

This script will run for each user that logs in. It won’t run immediately so the order may be wrong when you first log in, but it doesn’t take long before it runs. Create the script in Intune, remember to run it using the logged on credentials, and deploy it to your Autopilot AAD group.

The Result

After running the Autopilot deployment and logging in, everything checks out 🙂

Managing Intune PowerShell Scripts with Microsoft Graph

In this blog I’ll cover how to list, get, create, update, delete and assign PowerShell scripts in Intune using Microsoft Graph and PowerShell.

Although you can use the Invoke-WebRequest or Invoke-RestMethod cmdlets when working with MS Graph, I prefer to use the Microsoft.Graph.Intune module, aka the Intune PowerShell SDK, as it handles getting an auth token more gracefully and we don't have to create any request headers, so get that module installed.
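For example:

Install-Module -Name Microsoft.Graph.Intune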

In the Graph API, PowerShell scripts live under the deviceManagementScript resource type and these are still only available in the beta schema so they are subject to change.

Connect to MS Graph

First off, let’s connect to MS Graph and set the schema to beta:

If ((Get-MSGraphEnvironment).SchemaVersion -ne "beta")
{
    $null = Update-MSGraphEnvironment -SchemaVersion beta
}
$Graph = Connect-MSGraph

List PowerShell Scripts

Now we can list the PowerShell scripts we have in Intune:

$URI = "deviceManagement/deviceManagementScripts"
$IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
If ($IntuneScripts.value)
{
    $IntuneScripts = $IntuneScripts.value
}

If we take a look at the results, we’ll see that the script content is not included when we list scripts. It is included when we get a single script, as we’ll see next.

Get a PowerShell Script

To get a specific script, we need to know its Id. To get that, first let’s create a simple function where we can pass a script name and use the Get method to retrieve the script details.

Function Get-IntunePowerShellScript {
    Param($ScriptName)
    $URI = "deviceManagement/deviceManagementScripts" 
    $IntuneScripts = Invoke-MSGraphRequest -HttpMethod GET -Url $URI
    If ($IntuneScripts.value)
    {
        $IntuneScripts = $IntuneScripts.value
    }
    $IntuneScript = $IntuneScripts | Where {$_.displayName -eq "$ScriptName"}
    Return $IntuneScript
}

Now we can use this function to get the script Id and then call Get again adding the script Id to the URL:

$ScriptName = "Escrow Bitlocker Recovery Keys to AAD"
$Script = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($Script.id)"
$IntuneScript = Invoke-MSGraphRequest -HttpMethod GET -Url $URI

If we look at the result, we can see that the script content is now returned, albeit Base64-encoded:

View Script Content

To view the script, we simply need to convert it:

$Bytes = [Convert]::FromBase64String($IntuneScript.scriptContent)
[System.Text.Encoding]::UTF8.GetString($Bytes)

Create a Script

Now let's create a new script. To do that, we read in a script file and convert it to Base64, then add it together with the other required parameters into some JSON before POSTing the request.

When reading and converting the script content, use UTF-8. Other character sets may not decode properly at run time on the client side, resulting in script execution failure.

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD"
    RunAsAccount = "system" # or user
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts"
$Response = Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

We can now see our script in the portal:

Update a Script

To update an existing script, we follow a similar process to creating a new one: we build some JSON containing the updated parameters, then call the PATCH method. But first we need to get the Id of the script we want to update, using our previously created function:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

In this example I have updated the content in the source script file so I need to read it in again, as well as updating the description of the script:

$ScriptPath = "C:\temp"
$ScriptName = "Escrow-BitlockerRecoveryKeys.ps1"
$Params = @{
    ScriptName = $ScriptName
    ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "$ScriptPath\$ScriptName" -Raw -Encoding UTF8)))
    DisplayName = "Escrow Bitlocker Recovery Keys"
    Description = "Backup Bitlocker Recovery key for OS volume to AAD (Updated 2020-03-19)"
    RunAsAccount = "system"
    EnforceSignatureCheck = "false"
    RunAs32Bit = "false"
}
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "$($params.DisplayName)",
    "description": "$($Params.Description)",
    "scriptContent": "$($Params.ScriptContent)",
    "runAsAccount": "$($Params.RunAsAccount)",
    "enforceSignatureCheck": $($Params.EnforceSignatureCheck),
    "fileName": "$($Params.ScriptName)",
    "runAs32Bit": $($Params.RunAs32Bit)
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.id)"
$Response = Invoke-MSGraphRequest -HttpMethod PATCH -Url $URI -Content $Json

We can call Get on the script again and check the lastModifiedDateTime entry to verify that the script was updated, or check in the portal.
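For example:

$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.id)"
(Invoke-MSGraphRequest -HttpMethod GET -Url $URI).lastModifiedDateTime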

Add an Assignment

Before the script will execute anywhere it needs to be assigned to a group. To do that, we need the objectId of the AAD group we want to assign it to. To work with AAD groups I prefer to use the AzureAD module, so install that before continuing.
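If you don't have it installed already:

Install-Module -Name AzureAD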

We need to again get the script that we want to assign:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName

Then get the Azure AD group:

$AzureAD = Connect-AzureAD -AccountId $Graph.UPN
$GroupName = "Intune - [Test] Bitlocker Key Escrow"
$Group = Get-AzureADGroup -SearchString $GroupName

Then we prepare the necessary JSON and post the assignment:

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($Group.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

To replace the current assignment with a new assignment, simply change the group name and run the same code again. To add an additional assignment or multiple assignments, you’ll need to post all the assignments at the same time, for example:

$GroupNameA = "Intune - [Test] Bitlocker Key Escrow"
$GroupNameB = "Intune - [Test] Autopilot SelfDeploying Provisioning"
$GroupA = Get-AzureADGroup -SearchString $GroupNameA
$GroupB = Get-AzureADGroup -SearchString $GroupNameB

$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupA.ObjectId)"
        },
        {
          "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
          "id": "$($IntuneScript.Id)",
          "targetGroupId": "$($GroupB.ObjectId)"
        }
      ]
}
"@
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)/assign"
Invoke-MSGraphRequest -HttpMethod POST -Url $URI -Content $Json

Delete an Assignment

I haven’t yet figured out how to delete an assignment – the current documentation appears to be incorrect. If you can figure this out please let me know!

Delete a Script

To delete a script, we simply get the script Id and call the Delete method on it:

$ScriptName = "Escrow Bitlocker Recovery Keys"
$IntuneScript = Get-IntunePowerShellScript -ScriptName $ScriptName
$URI = "deviceManagement/deviceManagementScripts/$($IntuneScript.Id)"
Invoke-MSGraphRequest -HttpMethod DELETE -Url $URI 

Delete Device Records in AD / AAD / Intune / Autopilot / ConfigMgr with PowerShell

I’ve done a lot of testing with Windows Autopilot in recent times. Most of my tests are done in virtual machines, which are ideal as I can simply dispose of them afterwards. But you also need to clean up the device records that were created in Azure Active Directory, Intune, the Autopilot registration service, Microsoft Endpoint Configuration Manager (if you’re using it) and, in the case of Hybrid-joined devices, Active Directory.

To make this a bit easier, I wrote the following PowerShell script. You simply enter the device name and it’ll go and search for that device in any of the above locations that you specify and delete the device records.

The script assumes you have the appropriate permissions, and requires the Microsoft.Graph.Intune and AzureAD PowerShell modules, as well as the Configuration Manager module if you want to delete from there.

You can delete from all of the above locations with the -All switch, or you can specify any combination, for example -AAD -Intune -ConfigMgr, or -AD -Intune etc.

In the case of the Autopilot device registration, the device must also exist in Intune before you attempt to delete it, as the Intune record is used to determine the serial number of the device.
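Conceptually, the Autopilot part of the cleanup looks something like this (a simplified sketch against the beta Graph endpoints, not the full script; error handling omitted):

$ComputerName = "PC01"
# Find the Intune record for the device and read its serial number
$URI = "deviceManagement/managedDevices?`$filter=deviceName eq '$ComputerName'"
$IntuneDevice = (Invoke-MSGraphRequest -HttpMethod GET -Url $URI).value
# Use the serial number to locate the Autopilot identity, then delete it
$URI = "deviceManagement/windowsAutopilotDeviceIdentities?`$filter=contains(serialNumber,'$($IntuneDevice.serialNumber)')"
$AutopilotDevice = (Invoke-MSGraphRequest -HttpMethod GET -Url $URI).value
Invoke-MSGraphRequest -HttpMethod DELETE -Url "deviceManagement/windowsAutopilotDeviceIdentities/$($AutopilotDevice.id)"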

Please test thoroughly before using on any production device!

Examples

Delete-AutopilotedDeviceRecords -ComputerName PC01 -All

@(
    'PC01'
    'PC02'
    'PC03'
) | foreach {
    Delete-AutopilotedDeviceRecords -ComputerName $_ -AAD -Intune
}

Windows 10 Splash Screen Issue Fixed for W10 1909 / ConfigMgr Task Sequence

In August last year, I posted an updated version of a custom Windows 10-style splash screen I created for use in a ConfigMgr upgrade task sequence. Since Windows 10 1909 came on the scene, a few people have commented that the splash screens appear for a few seconds and then disappear when running in a task sequence. I was able to reproduce the issue and have updated the scripts to correct the problem.

You can find the updated files in my GitHub repo.

One additional script has been added (Create-Runspaces.ps1) and the Show-OSUpgradeBackground.ps1 code has changed, but you only need to update your package content – the way you call the scripts in a task sequence remains unchanged.

Thanks to Gary Blok for pushing me on this!

Technical stuff – if you care 🙂

The problem occurred because the PowerShell process which creates the runspaces that display the splash screens only stays alive long enough for the runspaces to be created. These runspaces run in separate processes. I don’t know why the behaviour is different in W10 1909 or if it’s specific to a particular version of ConfigMgr, but when the first process ends, the spawned processes are also closed down.

To work around that, the Show-OSUpgradeBackground script now creates an additional PowerShell process which calls the script Create-Runspaces, and this script does what the first script did previously – create the runspace/s to display the splash screen/s.
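Conceptually, the hand-off looks something like this (a simplified illustration, not the actual code from the repo):

# Launch a second PowerShell process to own the runspaces so they survive after this script exits
Start-Process -FilePath "PowerShell.exe" -WindowStyle Hidden `
    -ArgumentList "-ExecutionPolicy Bypass -WindowStyle Hidden -File ""$PSScriptRoot\Create-Runspaces.ps1"""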

Introducing the additional process means the first process can close down without affecting anything, and the task sequence is not blocked from continuing while the splash screens display.

The Cost of Running a Personal Windows 10 VM in Azure

As amazing as it may sound for an IT professional of many years, aside from my work laptops, I do not own a Windows computer! For my personal computing requirements, I use a £200 Chromebook to connect to the internet and do most everything I need in the cloud.

My dear wife, on the other hand, is not a fan of the Chromebook and likes a ‘good old-fashioned’ Windows computer. But since her computing requirements are minimal, I decided to investigate the cost of running a Windows 10 VM in Azure instead of buying a new Windows laptop that she won’t use very often. Turns out, it’s quite a cost-effective way to run a Windows 10 OS 🙂 The only things you really pay for are the disk storage and VM compute time. To access the VM, we simply use a remote desktop app on the Chromebook.

The first thing you need is an Azure subscription. Currently you get some credits for the first month, then some services that are free for the first 12 months. Even after that, if you have no resources in the subscription you won’t pay a penny for it.

You can create a Windows 10 VM from the Azure Marketplace. Creating the VM will create a few resources in a resource group, such as a NIC, an NSG (network security group), a disk and of course the VM itself. To save on cost, I didn’t add any data disk and just used the 127 GiB disk that comes with the OS. I also used the basic SKU for the NSG, and I didn’t assign a static public IP address; I simply added a DNS name. You’ll get charged for a static IP (something like £0.06 per day), but if you use a dynamic IP with a DNS name you won’t get charged anything.

For the disk, I somehow assumed that I would need a Premium SSD to get good performance, as that’s what I would typically use for corporate VMs, but as this is for home use and I’m not really concerned about SLAs, I experimented with the Standard SSD and the Standard HDD as well. I was surprised to find that the Standard HDD was perfectly adequate for everyday use, and I didn’t really notice much difference in performance with either of the SSD options. Of course you do get fewer IOPS with an HDD, but that hasn’t been an issue. Since an HDD is much cheaper than an SSD, it made sense to use one.

For the VM size, I used an F4s_v2, which is compute optimized, has 8 GB RAM and 4 vCPUs, and runs great. You could certainly get away with a smaller size though and shave your compute costs; something like a D2s_v3 will still run just fine.

In the tables below I’ve summarized the actual costs of running the VM and compared the costs of Premium SSD, Standard SSD and Standard HDD disks. These costs are in GBP (£) for the UK South Azure region and are accurate at the time of writing; prices will vary based on region, currency and VM compute hours. The costs also come from actual invoiced amounts, not from the Azure price calculator. The calculator can give a good ball-park figure, but in my experience the actual cost will be different…

Note: there are also data egress costs, i.e. data coming out of Azure. Downloads etc. within the VM are ingress and don’t generally get charged. Even for egress, you get the first 5GB free anyway (see here).

Time period     Cost (£ GBP)
Per hour        0.16
Per day         3.84
Per month       116.80
Per year        1,401.60

Compute costs for F4s_v2 VM

Time period     Premium SSD (£)     Standard SSD (£)     Standard HDD (£)
Per day         0.57                0.25                 0.12
Per month       17.34               7.60                 3.65
Per year        208.05              91.25                43.80

Disk storage costs

So the base cost for owning the VM is £3.65 p/month using a Standard HDD. On top of that is the compute time. For example, if I use the VM for 20 hours in a month, the compute cost is £3.20 for the month. Add that to the base cost, and it’s £6.85 for the month. That’s not bad 🙂

Some key things to remember: you always pay for the disk storage whether you use the VM or not, but you only pay for compute time when you actually turn on the VM and use it. Always remember to actually stop your VM when finished (not just shut down the OS) so that the resources are deallocated and you are not charged for unnecessary compute time. Use the Auto-shutdown feature to ensure the VM gets stopped every night. Also, since the VM has a public IP address, it’s a good idea to use some NSG rules to restrict who can access it, from where, and on which ports.
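If you manage the VM with the Az PowerShell module, stopping it properly looks like this (the resource names are hypothetical):

# Stops and deallocates the VM so compute billing stops; shutting down inside the OS is not enough
Stop-AzVM -ResourceGroupName "rg-personal" -Name "win10-vm" -Force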

Using an Azure VM for personal computing needs is a great option – you benefit from the elasticity and scalability of the cloud, and you only pay for what you use. You can scale up or scale down at any time according to your needs and you can set a budget and keep your eye on how much you’re spending using Azure cost management.