In the 2002 release of Endpoint Configuration Manager, Microsoft added a nice capability to collect log files from a client to the site server. Whilst this is a cool capability, you might not be on 2002 yet or you might prefer to send logs to a storage account in Azure rather than to the site server. You can do that quite easily using the Run Script feature. This works whether the client is connected on the corporate network or through a Cloud Management Gateway.
To do this you need a storage account in Azure, a container in the account, and a Shared access signature.
I’ll assume you have the first two in place, so let’s create a Shared access signature. In the Storage account in the Azure Portal, click on Shared access signature under Settings.
Under Allowed services, check Blob.
Under Allowed resource types, check Object.
Under Allowed permissions, check Create.
Set an expiry date then click Generate SAS and connection string. Copy the SAS token and keep it safe somewhere.
Below is a PowerShell script that will upload client log files to Azure storage.
ContainerURL. This is the URL to the container in your storage account. You can find it by clicking on the container, then Properties > URL.
SASToken. This is the SAS token string you created earlier.
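A minimal sketch of such an upload script is shown below. It uses the Azure Blob storage REST API (the Put Blob operation) via Invoke-RestMethod, so no Azure modules are needed on the client. The container URL and SAS token values are placeholders for your own, and the log location assumes a default ConfigMgr client install.

```powershell
# Placeholders - substitute your own container URL and SAS token
$ContainerURL = "https://mystorageaccount.blob.core.windows.net/clientlogs"
$SASToken = "?sv=2019-02-02&ss=b&srt=o&sp=c&se=...&sig=..."

$LogPath = "$env:SystemRoot\CCM\Logs"
$Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
$LogFiles = Get-ChildItem -Path $LogPath -Filter *.log -File

foreach ($File in $LogFiles)
{
    # Prefix blobs with the computer name so logs from multiple clients don't collide
    $Uri = "$ContainerURL/$($env:COMPUTERNAME)/$($File.Name)$SASToken"
    $Headers = @{
        'x-ms-blob-type' = 'BlockBlob'   # required header for the Put Blob operation
        'x-ms-date'      = (Get-Date).ToUniversalTime().ToString('R')
    }
    Invoke-RestMethod -Uri $Uri -Method Put -Headers $Headers -InFile $File.FullName | Out-Null
}

$Stopwatch.Stop()
Write-Output "Uploaded $($LogFiles.Count) log files in $([Math]::Round($Stopwatch.Elapsed.TotalSeconds,1)) seconds"
```

Because the Create permission on the SAS only allows writing new blobs, a compromised token cannot be used to read or delete anything in the container.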
Create and approve a new Script in ConfigMgr with this code. You can then run it against any online machine, or collection. When it’s complete, it will output how many log files were uploaded and how long the upload took.
To view the log files, you can either browse the container directly in the storage account in the Azure portal, or use Storage Explorer. My preferred method is the standalone Microsoft Azure Storage Explorer app, where you can simply double-click a log file to open it, or easily download the folder containing a client's log files to your local machine.
I was experimenting with different ways to get additional languages installed and configured during Windows Autopilot and it proved to be an interesting challenge. The following is what I settled on in the end and what produced the results that I wanted.
Here were my particular requirements, but you can customize this per your own need:
The primary language should be English (United Kingdom)
An additional secondary language of English (United States)
Display language should be English (United Kingdom)
Default input override should be English (United Kingdom)
System locale should be English (United Kingdom)
The administrative defaults for the Welcome screen and New user accounts must have a display language, input language, format and location matching the primary language (UK / UK English)
All optional features for the primary language should be installed (handwriting, optical character recognition, etc)
To achieve this, I put three things in place:
Installed the Local Experience Pack for English (United Kingdom)
Deployed a PowerShell script running in administrative context that sets the administrative language defaults and system locale
Deployed a PowerShell script running in user context that sets the correct order in the user's preferred languages list
This was deployed during Autopilot to a Windows 10 1909 (United States) base image.
Local Experience Packs
Local Experience Packs (LXPs) are the modern way to go for installing additional languages since Windows 10 1803. These are published to the Microsoft Store and are automatically updated. They also install more quickly than the traditional .cab language packs that you would install with DISM.
LXPs are available in the Microsoft Store for Business, so they can be synced with Intune and deployed as apps. However, the problem with using LXPs as apps during Autopilot is the order of things. The LXP needs to be installed before the PowerShell script that configures the language defaults runs, and since PowerShell scripts are not currently tracked in the ESP, and apps are the last thing to install in the device setup phase, the scripts will very likely run before the app is installed.
To get around that, I decided to get the LXP from the Volume Licensing Service Center (VLSC) instead. Then I uploaded it to a storage account in Azure, from where it gets downloaded and installed by the PowerShell script. This way I can control the order of operations and be sure the LXP is installed before making configuration changes.
When downloading from the VLSC, be sure to select the Multilanguage option:
Then get the highlighted ISO. The 1903 LXPs work for 1909 also.
Get the applicable appx file and the license file from the ISO, zip them, and upload the zip file into an Azure Storage account.
When uploading the zip file, be sure to choose the Account Key authentication type:
Once uploaded, click on the blob and go to the Generate SAS page. Choose Read permissions, set an appropriate expiry date, then copy the Blob SAS URL. You will need this to download the file with PowerShell.
Administrative PowerShell Script
Now let's create a PowerShell script that will:
Download and install the Local Experience Pack
Install any optional features for the language
Configure language and regional settings and defaults
Here’s the script I’m using for that.
# Admin-context script to set the administrative language defaults, system locale and install optional features for the primary language
# Language codes
# Enable side-loading
# Required for appx/msix prior to build 18956 (1909 insider)
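A sketch of that script, following the comment headings above, might look like the following. The blob SAS URL and the appx/license filenames inside the zip are placeholders you'd substitute with your own, and the language codes reflect my en-GB/en-US requirement.

```powershell
# Language codes (en-GB primary, en-US secondary)
$PrimaryLanguage   = "en-GB"
$SecondaryLanguage = "en-US"
$PrimaryInputCode   = "0809:00000809"  # en-GB keyboard layout
$SecondaryInputCode = "0409:00000409"  # en-US keyboard layout
$PrimaryGeoID = "242"                  # United Kingdom

# Enable side-loading
# Required for appx/msix prior to build 18956 (1909 insider)
New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\AppModelUnlock" -Name AllowAllTrustedApps -Value 1 -PropertyType DWord -Force | Out-Null

# Download and extract the LXP from the Azure blob (URL and filenames are placeholders)
$BlobURL = "<your Blob SAS URL>"
$DownloadedFile = "$env:TEMP\en-gb.zip"
Invoke-WebRequest -Uri $BlobURL -OutFile $DownloadedFile -UseBasicParsing
Expand-Archive -Path $DownloadedFile -DestinationPath "$env:TEMP\en-gb" -Force

# Install the Local Experience Pack
Add-AppxProvisionedPackage -Online -PackagePath "$env:TEMP\en-gb\LanguageExperiencePack.en-gb.neutral.appx" -LicensePath "$env:TEMP\en-gb\License.xml" | Out-Null

# Install any optional features (handwriting, OCR etc) for the primary language not already present
Get-WindowsCapability -Online |
    Where-Object { $_.Name -match "^Language\..*~~~$PrimaryLanguage~" -and $_.State -ne "Installed" } |
    ForEach-Object { Add-WindowsCapability -Online -Name $_.Name | Out-Null }

# Define the administrative language defaults as XML
$XML = @"
<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">
    <gs:UserList>
        <gs:User UserID="Current" CopySettingsToDefaultUserAcct="true" CopySettingsToSystemAcct="true"/>
    </gs:UserList>
    <gs:UserLocale>
        <gs:Locale Name="$PrimaryLanguage" SetAsCurrent="true"/>
    </gs:UserLocale>
    <gs:InputPreferences>
        <gs:InputLanguageID Action="add" ID="$PrimaryInputCode" Default="true"/>
        <gs:InputLanguageID Action="add" ID="$SecondaryInputCode"/>
    </gs:InputPreferences>
    <gs:MUILanguagePreferences>
        <gs:MUILanguage Value="$PrimaryLanguage"/>
        <gs:MUIFallback Value="$SecondaryLanguage"/>
    </gs:MUILanguagePreferences>
    <gs:SystemLocale Name="$PrimaryLanguage"/>
    <gs:LocationPreferences>
        <gs:GeoID Value="$PrimaryGeoID"/>
    </gs:LocationPreferences>
</gs:GlobalizationServices>
"@

# Save the XML to a file and apply it with intl.cpl
$XMLFile = "$env:TEMP\LanguageSettings.xml"
$XML | Out-File -FilePath $XMLFile -Encoding ascii
Start-Process -FilePath "$env:SystemRoot\System32\control.exe" -ArgumentList "intl.cpl,, /f:`"$XMLFile`"" -Wait
```

The CopySettingsToDefaultUserAcct and CopySettingsToSystemAcct attributes are what push the administrative defaults to the Welcome screen and new user accounts.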
First, I’ve entered the locale IDs for the primary and secondary languages, as well as the keyboard layout hex codes, and finally the Geo location ID for the primary language as variables.
Then we set a registry key to allow side-loading (required for older W10 versions for the install of appx/msix).
Next we download and install the LXP. You’ll need to enter the URL you copied earlier for the Azure blob, and update the zip filename as required, as well as the LXP filename.
Then we install any optional features for the primary language that aren’t already installed.
Then we define the content of an XML file that will be used to set the language and locale preferences. Obviously customize that per your requirement.
Then we save that content to a file and apply it.
Create the PowerShell script in Intune, make sure you don’t run it using the logged on credentials, and deploy it to your Autopilot AAD group.
User PowerShell Script
Now we need to create a very simple script that will run in the user context. This script simply makes sure that the list of preferred languages is in the correct order, as by default it will look like this:
This script will run for each user that logs in. It won’t run immediately so the order may be wrong when you first log in, but it doesn’t take long before it runs. Create the script in Intune, remember to run it using the logged on credentials, and deploy it to your Autopilot AAD group.
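For my en-GB/en-US requirement, the user-context script can be as simple as this:

```powershell
# Rebuild the user's preferred language list in the desired order: en-GB first, en-US second
Set-WinUserLanguageList -LanguageList "en-GB","en-US" -Force
```

The -Force switch suppresses the confirmation prompt, which is necessary since the script runs non-interactively.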
In this blog I’ll cover how to list, get, create, update, delete and assign PowerShell scripts in Intune using Microsoft Graph and PowerShell.
Although you can use the Invoke-WebRequest or Invoke-RestMethod cmdlets when working with MS Graph, I prefer to use the Microsoft.Graph.Intune module (aka the Intune PowerShell SDK), as it handles getting an auth token more gracefully and we don't have to create any headers, so get that module installed first.
In the Graph API, PowerShell scripts live under the deviceManagementScript resource type and these are still only available in the beta schema so they are subject to change.
Connect to MS Graph
First off, let’s connect to MS Graph and set the schema to beta:
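Using the Microsoft.Graph.Intune module, that looks like this:

```powershell
# Install the module if you don't already have it
Install-Module -Name Microsoft.Graph.Intune -Scope CurrentUser -Force

# Connect, switch to the beta schema, then reconnect so the change takes effect
Connect-MSGraph
Update-MSGraphEnvironment -SchemaVersion 'beta'
Connect-MSGraph
```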
Now let's create a new script. To create one, we read in a script file, convert its content to base64, and add it together with the other required parameters into some JSON before posting the request.
When reading and converting the script content, use UTF8. Other character sets may not decode properly at run-time on the client side, resulting in script execution failure.
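Putting that together, a sketch of the create operation might look like this (the script path and display name are placeholders):

```powershell
# Read the script file and convert its content to base64 using UTF8
$ScriptPath = "C:\Scripts\Set-Something.ps1"   # placeholder path to your .ps1
$ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path $ScriptPath -Raw)))

# Build the request body with the required parameters
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "My PowerShell Script",
    "description": "Created via Graph",
    "scriptContent": "$ScriptContent",
    "runAsAccount": "system",
    "enforceSignatureCheck": false,
    "fileName": "Set-Something.ps1",
    "runAs32Bit": false
}
"@

# POST the request to the deviceManagementScripts resource
Invoke-MSGraphRequest -HttpMethod POST -Url "deviceManagement/deviceManagementScripts" -Content $Json
```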
To update an existing script, we follow a similar process: we create some JSON containing the updated parameters, then call the PATCH method to apply it. But first we need to get the Id of the script we want to update, using our previously created function:
We can call Get on the script again and check the lastModifiedDateTime entry to verify that the script was updated, or check in the portal.
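As a sketch, assuming the script's Id has been retrieved into $ScriptId (here via a simple GET and filter on displayName rather than the helper function), the update looks like this:

```powershell
# Find the Id of the script to update (display name is a placeholder)
$ScriptId = (Invoke-MSGraphRequest -HttpMethod GET -Url "deviceManagement/deviceManagementScripts").value |
    Where-Object { $_.displayName -eq "My PowerShell Script" } |
    Select-Object -ExpandProperty id

# Re-encode the updated script content from UTF8
$ScriptContent = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes((Get-Content -Path "C:\Scripts\Set-Something.ps1" -Raw)))

# PATCH only the properties that are changing
$Json = @"
{
    "@odata.type": "#microsoft.graph.deviceManagementScript",
    "displayName": "My PowerShell Script",
    "scriptContent": "$ScriptContent"
}
"@
Invoke-MSGraphRequest -HttpMethod PATCH -Url "deviceManagement/deviceManagementScripts/$ScriptId" -Content $Json
```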
Add an Assignment
Before the script will execute anywhere it needs to be assigned to a group. To do that, we need the objectId of the AAD group we want to assign it to. To work with AAD groups I prefer to use the AzureAD module, so install that before continuing.
We need to again get the script that we want to assign:
To replace the current assignment with a new assignment, simply change the group name and run the same code again. To add an additional assignment or multiple assignments, you’ll need to post all the assignments at the same time, for example:
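A sketch of posting two group assignments in one call (the group names are placeholders for your own AAD groups):

```powershell
$ScriptId = "<script Id>"
$GroupId1 = (Get-AzureADGroup -SearchString "Intune Users").ObjectId
$GroupId2 = (Get-AzureADGroup -SearchString "Autopilot Devices").ObjectId

# Both assignments must be posted together in the same request body
$Json = @"
{
    "deviceManagementScriptGroupAssignments": [
        {
            "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
            "targetGroupId": "$GroupId1"
        },
        {
            "@odata.type": "#microsoft.graph.deviceManagementScriptGroupAssignment",
            "targetGroupId": "$GroupId2"
        }
    ]
}
"@
Invoke-MSGraphRequest -HttpMethod POST -Url "deviceManagement/deviceManagementScripts/$ScriptId/assign" -Content $Json
```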
I’ve done a lot of testing with Windows Autopilot in recent times. Most of my tests are done in virtual machines, which are ideal as I can simply dispose of them afterwards. But you also need to clean up the device records that were created in Azure Active Directory, Intune, the Autopilot registration service, Microsoft Endpoint Configuration Manager (if you’re using it), and Active Directory in the case of Hybrid-joined devices.
To make this a bit easier, I wrote the following PowerShell script. You simply enter the device name and it’ll go and search for that device in any of the above locations that you specify and delete the device records.
The script assumes you have the appropriate permissions, and requires the Microsoft.Graph.Intune and AzureAD PowerShell modules, as well as the Configuration Manager module if you want to delete from there.
You can delete from all of the above locations with the -All switch, or you can specify any combination, for example -AAD -Intune -ConfigMgr, or -AD -Intune etc.
In the case of the Autopilot device registration, the device must also exist in Intune before you attempt to delete it as the Intune record is used to determine the serial number of the device.
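As a simplified sketch of the AAD and Intune portions, assuming the required modules are loaded and you are already connected:

```powershell
$DeviceName = "DESKTOP-ABC123"   # placeholder device name

# Azure Active Directory: find the device record(s) by name and delete
Get-AzureADDevice -SearchString $DeviceName | Remove-AzureADDevice

# Intune: find the managed device record by name and delete
Get-IntuneManagedDevice -Filter "deviceName eq '$DeviceName'" | Remove-IntuneManagedDevice
```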
Please test thoroughly before using on any production device!
In August last year, I posted an updated version of a custom Windows 10-style splash screen I created for use in a ConfigMgr upgrade task sequence. Since Windows 10 1909 came on the scene a few have commented that the splash screens will appear for a few seconds then disappear when running in a task sequence. I was able to reproduce the issue and have updated the scripts to correct that problem.
One additional script has been added (Create-Runspaces.ps1) and the Show-OSUpgradeBackground.ps1 code has changed, but you only need to update your package content – the way you call the scripts in a task sequence remains unchanged.
The problem occurred because the PowerShell process that creates the runspaces displaying the splash screens only stays alive long enough for the runspaces to be created, and these runspaces run in separate spawned processes. I don’t know why the behaviour is different in W10 1909, or whether it’s specific to a particular version of ConfigMgr, but when the first process ends, the spawned processes are also closed down.
To work around that, the Show-OSUpgradeBackground script now creates an additional PowerShell process which calls the script Create-Runspaces, and this script does what the first script did previously – create the runspace/s to display the splash screen/s.
Introducing the additional process means the first process can close down without affecting anything, and the task sequence is not blocked from continuing while the splash screens display.
As amazing as it may sound for an IT professional of many years, aside from my work laptops, I do not own a Windows computer! For my personal computing requirements, I use a £200 Chromebook to connect to the internet and do most everything I need in the cloud.
My dear wife, on the other hand, is not a fan of the Chromebook and likes a ‘good old-fashioned’ Windows computer. But since her computing requirements are minimal, I decided to investigate the cost of running a Windows 10 VM in Azure instead of buying a new Windows laptop that she won’t use very often. Turns out, it’s quite a cost effective way to run a Windows 10 OS 🙂 The only things you really pay for are the disk storage and VM compute time. To access the VM, we simply use a remote desktop app on the Chromebook.
The first thing you need is an Azure subscription. Currently you get some credits for the first month, then some services that are free for the first 12 months. Even after that, if you have no resources in the subscription you won’t pay a penny for it.
You can create a Windows 10 VM from the Azure Marketplace. Creating the VM will create a few resources in a resource group such as a NIC, an NSG (network security group), a disk and of course the VM itself. To save on cost, I didn’t add any data disk and just used the 127GiB disk that comes with the OS. I also used the basic sku for the NSG, and I didn’t assign a static public IP address – I simply added a DNS name. You’ll get charged for a static IP (something like £0.06 p/day) but if you use a dynamic IP with a DNS name you won’t get charged anything.
For the disk, I somehow assumed that I would need a Premium SSD to get good performance, as that’s what I would typically use for corporate VMs, but as this is for home use and I’m not really concerned about SLAs, I experimented with the Standard SSD and the Standard HDD as well. I was surprised to find that the Standard HDD was perfectly adequate for everyday use, and I didn’t really notice much difference in performance with either of the SSD options. Of course you do get fewer IOPS with an HDD, but that hasn’t been an issue. Since an HDD is much cheaper than an SSD, it made sense to use one.
For the VM size, I used an F4s_v2, which is compute-optimized, has 8GB RAM and 4 vCPUs, and runs great. You could certainly get away with a smaller size and shave your compute costs – something like a D2s_v3 will still run just fine.
In the tables below I summarized the actual costs of running the VM and also compared the costs of using Premium SSD/Standard SSD/Standard HDD. These costs are in GBP (£) and are in the UK South Azure region and are true at the time of writing – prices will vary based on region and currency and VM compute hours. The costs are also from actual invoiced costs – not from the Azure price calculator. The price calculator can give a good ball-park figure but in my experience the actual cost will be different…
Note: there are also data egress costs, ie data coming out of Azure. Downloads etc within the VM are ingress and don’t generally get charged. But even for egress you get the first 5GB free anyway (see here).
[Table: compute costs for the F4s_v2 VM, in £ GBP]
[Table: disk storage costs for Premium SSD, Standard SSD and Standard HDD, in £ GBP]
So the base cost for owning the VM is £3.65 p/month using a Standard HDD. On top of that is the compute time. For example, if I use the VM for 20 hours in a month, the compute cost is £3.20 for the month. Add that to the base cost, and it’s £6.85 for the month. That’s not bad 🙂
Some key things to remember: you always pay for the disk storage whether you use the VM or not, but you only pay for compute time when you actually turn on the VM and use it. Always remember to actually stop your VM when finished (not just shut down the OS) so that the resources are de-allocated and you are not charged for unnecessary compute time, and use the Auto-shutdown feature to ensure the VM gets stopped every night. Also, since the VM has a public IP address, it’s a good idea to use NSG rules to restrict where it can be accessed from and on which ports.
Using an Azure VM for personal computing needs is a great option – you benefit from the elasticity and scalability of the cloud, and you only pay for what you use. You can scale up or scale down at any time according to your needs and you can set a budget and keep your eye on how much you’re spending using Azure cost management.
So I was preparing an OSD task sequence in ConfigMgr to deploy Windows 10 1909 and I wanted to know if any of the HP workstations I would be deploying to had updated driver packs available for 1909, since I had simply copied the task sequence used for 1903.
A while ago I posted a blog with a script to download the latest driver packs from Dell using a web-scraping method, so I decided to take a similar approach for HP driver packs. HP publish a list of the driver packs available for various models and OS versions on the web in a tabular format, so I decided to try to convert that HTML table into data that could also be displayed in table format in PowerShell as well as being queryable, and the script below is the result.
This kind of data displays well in PowerShell’s gridview, like so:
Get-HPDriverPacks | Out-GridView
In the first column you find the models and the next columns contain the driver pack version, release date and download URL for the various OS versions. You can then use the gridview’s native filtering capabilities to find something specific:
By default, the script will get the 64-bit driver packs, but you can also get 32-bit for Windows 7 etc (you’re not still using Windows 7 are you?!):
Get-HPDriverPacks -Architecture '32-bit'
I can also query the data within PowerShell directly, for example:
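For instance, something like the following (the model name and column header here are assumptions based on the gridview output above):

```powershell
# Find the Windows 10 1909 64-bit driver pack details for a specific model
Get-HPDriverPacks |
    Where-Object { $_.Model -match "EliteBook 840 G5" } |
    Select-Object -ExpandProperty "Windows 10 64-bit, 1909"
```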
Have you ever been in the situation where something unexpected happens on a user’s computer and people start pointing their fingers at the ConfigMgr admin, asking “has anyone deployed something with SCCM?” Well, I decided to write a PowerShell script to retrieve the execution history for ConfigMgr programs on a local or remote client. This gives clear visibility of when and which deployments such as applications/programs/task sequences have run on the client, and will hopefully acquit you (or prove you guilty!).
Program execution history can be found in the registry but it doesn’t contain the name of the associated package, so I joined that data with software distribution data from WMI to give a better view.
You can run the script against the local machine, or a remote machine if you have PS remoting enabled. You can also run it against multiple machines at the same time and combine the data if desired. I recommend piping the results to grid view.
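A sketch of the core idea, for the local machine and machine-targeted deployments only, looks like this:

```powershell
# Execution history lives under this registry key, one subkey per package, but has no package name
$HistoryKey = "HKLM:\SOFTWARE\Microsoft\SMS\Mobile Client\Software Distribution\Execution History\System"

$Executions = foreach ($PackageKey in (Get-ChildItem -Path $HistoryKey))
{
    foreach ($Entry in (Get-ChildItem -Path $PackageKey.PSPath))
    {
        $Properties = Get-ItemProperty -Path $Entry.PSPath
        # Join with software distribution policy in WMI to recover the package name
        $Package = Get-CimInstance -Namespace ROOT\ccm\Policy\Machine\ActualConfig -ClassName CCM_SoftwareDistribution |
            Where-Object { $_.PKG_PackageID -eq $PackageKey.PSChildName } |
            Select-Object -First 1
        [PSCustomObject]@{
            PackageID   = $PackageKey.PSChildName
            PackageName = $Package.PKG_Name
            Program     = $Properties._ProgramID
            RunTime     = $Properties._RunStartTime
            State       = $Properties._State
        }
    }
}
$Executions | Out-GridView
```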
I was testing a compliance baseline recently and wanted to verify if the schedule defined in the baseline deployment is actually honored on the client. I set the schedule to run every hour, but it was clear that it did not run every hour and that some randomization was being used.
To review the most recent evaluation times and the next scheduled evaluation time, I had to read the scheduler.log in the CCM\Logs directory, because I could only find a single last evaluation time recorded in WMI.
The following PowerShell script reads which baselines are currently deployed to the local machine, displays a window for you to choose one, then basically reads the Scheduler log to find when the most recent evaluations were and when the next one is scheduled.
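The first part of that, listing the deployed baselines and their single recorded last evaluation time, can be sketched with a one-liner against WMI:

```powershell
# List baselines deployed to the local machine, with the last evaluation time WMI records
Get-CimInstance -Namespace ROOT\ccm\dcm -ClassName SMS_DesiredConfiguration |
    Select-Object DisplayName, LastEvalTime, LastComplianceStatus
```

Everything beyond that single timestamp has to come from parsing Scheduler.log.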
An odd title perhaps, but I recently had a requirement to retrieve the deadline for a deployed task sequence on the client side in the user context using PowerShell. You can find this info in WMI, using the CCM_Program class of the ROOT\ccm\ClientSDK namespace. Problem is, standard users do not have access to that.
I tried deploying a script in SYSTEM context to get the deadline from WMI and stamp it to a registry location where it could be read in the user context; however, curiously, the CCM_Program class is not accessible in the SYSTEM context either. A quick Google search assured me I was not alone in scratching my head over that one.
I found a way to do it using a Software Center dll, which I’m sure is not supported, but it works at least. Run the following PowerShell code as the logged-on user to find the deadline for a deployed program (could be a classic package/program or task sequence).