HTML Report for SCCM Site Component Warnings and Errors

Just a quick one 🙂

If you’re like me, you’re too busy (read: lazy) to regularly check the component status of an SCCM Site Server for issues, so why not get PowerShell to do it for you?

The code below will email an HTML-formatted report of any site components that are currently in an error or warning status, together with the last few error or warning status messages for each component. Run it as a scheduled task or with your favorite automation tool to keep an eye on any current issues. Whether you get annoyed because you’ve now created more work for yourself, or happy because you can stay on top of issues in your SCCM environment, I leave to you!

The report will display the components that are currently marked as either critical or warning, together with the current number of messages:

It will then display the last x status messages for each component, for a quick view of what the current issue(s) are:

Run the script either on the site server or on a machine where the SCCM console is installed, and set the required parameters in the script.
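The full script is more involved, but the core approach looks something like this minimal sketch – it assumes the SMS Provider is local, and the site code, tally interval and SMTP details are placeholders to adjust for your environment:


$SiteCode = "ABC"
$Namespace = "root\SMS\site_$SiteCode"

# SMS_ComponentSummarizer holds the component status tallies shown in the console.
# The tally interval here is an assumption - filter to the interval you want.
$Components = Get-WmiObject -Namespace $Namespace -Query "
    SELECT ComponentName,MachineName,Status,Errors,Warnings
    FROM SMS_ComponentSummarizer
    WHERE TallyInterval='0001128000100008' AND Status > 0"

# Convert the results to an HTML table and email them
$Html = $Components |
    Select-Object ComponentName,MachineName,
        @{n='State';e={Switch ($_.Status) {1 {'Warning'} 2 {'Critical'}}}},
        Errors,Warnings |
    ConvertTo-Html -As Table -PreContent '<h2>Site Component Status</h2>' |
    Out-String
Send-MailMessage -To 'sccm-admins@contoso.com' -From 'siteserver@contoso.com' `
    -Subject 'SCCM Component Status Report' -Body $Html -BodyAsHtml -SmtpServer 'smtp.contoso.com'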

Create Collections for SCCM Client Installation Failures by Error Code

OK, so in a perfect SCCM world you would never have any SCCM client installation failures and this post would be totally unnecessary. But in the real world, you are very likely to have a number of systems that fail to install the SCCM client, and the reasons can be many.

To identify such systems, it can be helpful to create collections for some of the common client installation failure codes so you can easily see and report on which type of installation failures you are experiencing and the number of systems affected.

To identify the installation failure error codes you have in your environment for Windows systems, run the following SQL query against the SCCM database:

select 
	Count(cdr.Name) as 'Count',
	cdr.CP_LastInstallationError as 'Last Installation Error Code'
from v_CombinedDeviceResources cdr
where
	cdr.CP_LastInstallationError is not null
	and cdr.IsClient = 0
	and cdr.DeviceOS like '%Windows%'
group by cdr.CP_LastInstallationError
order by [Count] desc
Client installation error counts

Next, simply create a collection for each error code using the following WQL query, changing the LastInstallationError value to the relevant error code:

select 
    SYS.ResourceID,
    SYS.ResourceType,
    SYS.Name,
    SYS.SMSUniqueIdentifier,
    SYS.ResourceDomainORWorkgroup,
    SYS.Client 
from SMS_R_System as SYS 
Inner Join SMS_CM_RES_COLL_SMS00001 as COL on SYS.ResourceID = COL.ResourceID  
Where COL.LastInstallationError = 53 
And (SYS.Client = 0  Or SYS.Client is null)

Error codes are all fine and dandy, but unless you have an error code database in your head, you’ll want to translate those codes into friendly descriptions. To do that, I use a PowerShell function I created that pulls the description from SrsResources.dll, which you can find in any SCCM console installation. There’s more than one way to translate error codes though – see my blog post here. Better yet, create yourself an error code SQL database, which you can join to in your SQL queries and is super useful for reporting purposes – see my post here.
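As an illustration of the SrsResources.dll technique, here’s a hedged sketch – the DLL path via the SMS_ADMIN_UI_PATH environment variable is an assumption that holds for a default console install:


function Get-CMErrorDescription {
    param([int]$ErrorCode)
    # SMS_ADMIN_UI_PATH is set by the console installer and points at ...\bin\i386
    $Dll = Join-Path $env:SMS_ADMIN_UI_PATH 'SrsResources.dll'
    [void][System.Reflection.Assembly]::LoadFrom($Dll)
    [SrsResources.Localization]::GetErrorMessage($ErrorCode,'en-US')
}

# Error 53, for example, translates to "The network path was not found."
Get-CMErrorDescription -ErrorCode 53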

Anyway, once you’ve translated the error codes, you can name your collections with them for easy reference:

Client installation failure collections

Now comes the hard part – figuring out how to fix those errors and working through all the affected systems 😬

New Tool: Delivery Optimization Monitor

Delivery Optimization Monitor is a tool for viewing Delivery Optimization data on the local or a remote PC.

It is based on the built-in Delivery Optimization UI in Windows 10 but allows you to view data graphically from remote computers as well.

The tool uses the Delivery Optimization PowerShell cmdlets built into Windows 10 to retrieve and display DO data, including stats and charts for the current month, performance snapshot data, and data on any current DO jobs.
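For reference, these are the kinds of built-in cmdlets the tool wraps – illustrative commands only, not the tool’s own code:


# Current and recent DO jobs, with bytes sourced from peers vs HTTP
Get-DeliveryOptimizationStatus | Select-Object FileId,Status,BytesFromPeers,BytesFromHttp

# Performance snapshot data
Get-DeliveryOptimizationPerfSnap

# The same data from a remote computer, via PS Remoting
Invoke-Command -ComputerName 'PC001' -ScriptBlock { Get-DeliveryOptimizationPerfSnap }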

Requirements

  • A supported version of Windows 10 (1703 onward)
  • PowerShell 5 minimum
  • .Net Framework 4.6.2 minimum
  • PS Remoting enabled to view data from remote computers.

This WPF tool is coded in XAML and PowerShell and uses the MahApps.Metro and LiveCharts open-source libraries.

Download

Download the tool from the TechNet Gallery.

Use

To use the tool, extract the ZIP file, right-click Delivery Optimization Monitor.ps1 and select Run with PowerShell.

To run against the local machine, you must run the tool elevated. To do so, create a shortcut to the ps1 file. Edit the properties of the shortcut and change the target to read:

PowerShell.exe -ExecutionPolicy Bypass -File "<pathtoPS1file>"

Right-click the shortcut and run as administrator, or edit the shortcut properties (under Advanced) to run as administrator.

As a finishing touch, you can also change the icon of the shortcut to the icon file included in the bin directory.

Delivery Optimization Statistics

There are three tabs – the first displays DO data for the current month together with charts for download and upload statistics.

The second tab displays PerfSnap data and the third displays any current DO jobs.

Shout Out

Shout out to Kevin Rahetilahy over at dev4sys.com for blogging about LiveCharts in PowerShell.

Source Code

Source code can be found on GitHub.

Create Disk Usage Reports with PowerShell and WizTree

Recently I discovered a neat little utility called WizTree, which can be used to report on the space used by files and folders on a drive. There are many utilities out there that can do that, but this one supports running on the command line, which makes it very useful for scripting scenarios. It also works extremely quickly, because it reads the Master File Table on disk instead of using the slower Windows / .NET methods.

I wanted to create a disk usage report for systems that have less than 20GB of free space – the recommended minimum for a Windows 10 in-place upgrade – so that I can easily review it and identify files/folders that could potentially be deleted to free up space on the disk. I wanted to script it so that it can run in the background, be deployed via ConfigMgr, and copy the resulting reports to a server share for review.

The following script does just that.

First, it runs WizTree on the command line and generates two CSV reports, one for all files and one for all folders on the drive. Next, since the generated CSV files contain sizes in bytes, the script imports the CSVs, converts the size data to include KB, MB and GB, then outputs two new CSV files.
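In outline, those first steps look something like this sketch (not the full script – the /export and /admin switches are per WizTree’s command-line documentation at the time of writing, and the CSV column names are assumptions to check against your version’s output):


$Temp = 'C:\Temp\DiskUsage'

# Export everything on C: to CSV (WizTree reports sizes in bytes)
Start-Process -FilePath "$PSScriptRoot\WizTree64.exe" `
    -ArgumentList "C: /export=`"$Temp\Files.csv`" /admin=1" -Wait

# Re-export with the sizes converted to MB and GB
Import-Csv "$Temp\Files.csv" |
    Select-Object 'File Name',
        @{n='Size (MB)';e={[math]::Round($_.Size / 1MB,2)}},
        @{n='Size (GB)';e={[math]::Round($_.Size / 1GB,2)}} |
    Export-Csv "$Temp\Files_Converted.csv" -NoTypeInformation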

The script then generates two custom HTML reports that contain a list of the largest 100 files and folders, sorted by size.

Next, it generates an HTML summary report that shows visually how much space is used on the disk, and tells you how much space you need to free up to reach the minimum of 20GB free.
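The summary calculation itself is straightforward – something along these lines (a sketch; the variable names are hypothetical):


$Disk = Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DeviceID='C:'"
$FreeGB = [math]::Round($Disk.FreeSpace / 1GB,2)
$ToFreeGB = [math]::Round((20GB - $Disk.FreeSpace) / 1GB,2)
"C: has $FreeGB GB free; at least $ToFreeGB GB must be freed to reach the 20GB minimum"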

Finally, it copies those reports to a server share, which will look like this:

[Screenshot: report files on the server share]

The Disk Usage Summary report will look something like this:

[Screenshot: Disk Usage Summary report]

And here’s a snippet from the large directories and files reports:

[Screenshot: Largest Directories report]

[Screenshot: Largest Files report]

There are also CSV reports which contain the entire list of files and directories on the drive:

[Screenshot: CSV report of all files and directories]

To use the script, simply download the WizTree portable app, extract WizTree64.exe and place it in the same location as the script (assuming a 64-bit OS). Set the run location in the script (i.e. $PSScriptRoot if calling the script, or the directory location if running in the ISE), the temporary location where it can create files, and the server share where you want to copy the reports. Then just run the script in an admin context.

Getting Data from the Intune Data Warehouse with PowerShell

The Intune Data Warehouse is a great addition to the Microsoft Intune service, allowing visibility of historical data for reporting, data and trend analysis for your Microsoft MDM environment. It comes with an OData feed that allows you to connect to the data with PowerBI, Microsoft’s reporting and data visualization service.

The Data Warehouse RESTful API (currently in Beta) can be used to get data from the warehouse using a REST client. I decided to explore how to do this with PowerShell so I can run some ad hoc queries and analyse trends in the data.

To get data from the Intune Data Warehouse we need to do three main things:

  1. Create a native app in Azure and give it access to the Intune Data Warehouse
  2. Authenticate with Azure using OAuth 2.0 and get an access token
  3. Invoke the RESTful web service over HTTP

Create a Native App in Azure

In your Azure portal, go to Azure Active Directory > App registrations. Click New application registration.

Give it a name, make sure it is a Native app (do not use Web app / API) and use the redirect URI https://login.live.com/oauth20_desktop.srf.

[Screenshot: new app registration in the Azure portal]

Click Create.

Once created, make a note of the Application ID as we will need this later.

Now, in the app’s Settings blade, click Required permissions > Add > Select an API, and select the Microsoft Intune API.

In the Add API access blade click Select permissions and grant the delegated permission Get data warehouse information from Microsoft Intune.

[Screenshot: granting the Intune data warehouse permission]

Save your changes.

Authenticate with Azure

To authenticate with Azure I wrote the following function in PowerShell:
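A condensed sketch of the approach (not the full function – it assumes the ADAL 3.x library has already been downloaded via NuGet, and the ClientId/TenantId values are placeholders):


function New-IntuneDataWarehouseAccessToken {
    param(
        [string]$ClientId = '<application ID of your native app>',
        [string]$TenantId = '<yourtenant>.onmicrosoft.com',
        [string]$RedirectUri = 'https://login.live.com/oauth20_desktop.srf'
    )
    # Load the ADAL library previously downloaded from NuGet (this path is an assumption)
    Add-Type -Path "$env:LOCALAPPDATA\ADAL\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"

    $Authority = "https://login.microsoftonline.com/$TenantId"
    $Resource = 'https://api.manage.microsoft.com/'  # the Intune API resource
    $AuthContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext($Authority)
    $PlatformParams = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters([Microsoft.IdentityModel.Clients.ActiveDirectory.PromptBehavior]::Auto)
    $Result = $AuthContext.AcquireTokenAsync($Resource,$ClientId,[uri]$RedirectUri,$PlatformParams).Result

    # Save the token to the script scope so the query function can reuse it
    $script:AccessToken = $Result.AccessToken
}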

What the code does…

To authenticate with Azure we need to use the Active Directory Authentication Library (ADAL) from Microsoft. This library is actually included in resources like the ConfigMgr client and the AzureRM PowerShell module, but these do not have the latest versions of the library, and the methods it contains have changed over time. So I prefer to use the latest version of the library (3.17.2 at the time of writing), which is available as a NuGet package.

The function will download NuGet and use it to download the latest version of the ADAL library to your user profile area. Once we have the library, we load it in PowerShell. Then we acquire an access token.

The access token expires after an hour, so once created, the token will probably be good for your current session. If it expires, simply run the function again. The access token is saved to a variable in the script scope and will be used by the function that queries the data warehouse.

How to use it…

Make sure the function’s parameters – the application ID of the native app you created, your tenant, and the redirect URI – contain the required values. I recommend that you store those in the function directly so you don’t have to add them every time.

Simply run the function:


New-IntuneDataWarehouseAccessToken

If you have not previously authenticated with Azure in your current session you will be prompted to sign in to your Azure account:

[Screenshot: Azure sign-in prompt]

The first time you use the native app you created, you will also be prompted for permission:

[Screenshot: permission consent prompt]

Invoke the Web Service using the OData feed

Now that we have an access token, we can invoke the web service using http. I wrote the following PowerShell function to do that:
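In condensed form, the idea looks like this sketch (not the original function – this version uses Invoke-RestMethod rather than creating an HTTP client, and assumes the feed URL ends with its api-version query string):


function Get-IntuneDataWarehouseData {
    param(
        [string]$WarehouseUrl = '<your custom OData feed URL>',
        [string]$DataEntity,
        [string]$Filter,
        [string]$Select,
        [int]$Top,
        [int]$Skip,
        [string]$OrderBy,
        [switch]$ListDataEntities
    )
    $Headers = @{ Authorization = "Bearer $script:AccessToken" }
    if ($ListDataEntities) {
        # The feed URL itself returns the OData service document
        return (Invoke-RestMethod -Uri $WarehouseUrl -Headers $Headers).value.name
    }
    # The feed URL has the form https://.../DataWarehouseFEService?api-version=beta
    $BaseUrl,$ApiVersion = $WarehouseUrl.Split('?')
    $Query = @($ApiVersion)
    if ($Filter)  { $Query += '$filter='  + $Filter }
    if ($Select)  { $Query += '$select='  + $Select }
    if ($Top)     { $Query += '$top='     + $Top }
    if ($Skip)    { $Query += '$skip='    + $Skip }
    if ($OrderBy) { $Query += '$orderby=' + $OrderBy }
    (Invoke-RestMethod -Uri "$BaseUrl/$DataEntity`?$($Query -join '&')" -Headers $Headers).value
}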

What the code does…

The function uses the custom OData feed URL for your tenant, creates an HTTP client to invoke the web service, and gets data for the data entity (i.e. collection, or table) that you specify. The results are then returned to the PowerShell console.

You can read more about the data model for the warehouse and get a reference for the various data entities and their relationships on the Microsoft Docs site.

How to use it…

Make sure the following parameter is set in the function:

  • WarehouseUrl

This URL is the custom feed URL for your tenant, and you can find it from the Intune blade in Azure: on the Overview blade, on the right, you’ll find Other tasks, and underneath it, Set up Intune Data Warehouse.

[Screenshot: Set up Intune Data Warehouse link in the Intune blade]

To list the data entities that are available to query use the ListDataEntities switch:


Get-IntuneDataWarehouseData -ListDataEntities

[Screenshot: list of available data entities]

To return the data from a specific data entity, use the DataEntity parameter. This example returns data from the devices table.


Get-IntuneDataWarehouseData -DataEntity devices

[Screenshot: data returned from the devices entity]

Working with the Data

The API supports a few query parameters from the OData protocol v4.0, so rather than returning all the results in the data entity, you can narrow them down. However, I have noticed that the query parameters do not always work as expected when they are combined, at least in the beta version of the API.

For example, you can use the Filter parameter to return only matching results. This query finds a specific device:


Get-IntuneDataWarehouseData -DataEntity devices -Filter "deviceName eq 'SW-IT-LT-AZURE1'"

You can select only specific properties to be returned in the results using the Select parameter:


Get-IntuneDataWarehouseData -DataEntity devices -Select 'deviceName,serialNumber,lastContact'

Select the top 5 results:


Get-IntuneDataWarehouseData -DataEntity devices -Top 5

Skip 10 results and return the rest:


Get-IntuneDataWarehouseData -DataEntity devices -Skip 10

Sort results by a particular property:


Get-IntuneDataWarehouseData -DataEntity devices -OrderBy 'lastContact'

You can learn more about how to use query parameters in the OData protocol from the Microsoft Developer site for the Graph API.

These query parameters are certainly helpful, but for relational queries where you want to reference more than one table or entity, the API falls short and more complex PowerShell code is required.

For example, here I am searching for the Ethernet MAC address from the most recent hardware inventory for a device, and even with query parameters applied I still need to do some filtering and sorting in PowerShell. Remember that the data warehouse only contains snapshots of historic data, so for current data you could use the Graph API instead – but this serves as an example.


$DeviceName = "SW-IT-LT-AZURE1"
Get-IntuneDataWarehouseData -DataEntity 'mdmDeviceInventoryHistories' -Select 'deviceKey,datekey,ethernetMac' |
    Where {$_.devicekey -eq ((Get-IntuneDataWarehouseData -DataEntity 'devices' -Filter "deviceName eq '$DeviceName'").devicekey)} |
    Sort datekey -Descending |
    Select -First 1 |
    Select -ExpandProperty ethernetMac 

This code takes a while to run, however, because it queries the data warehouse more than once to get the data. Another way to do this is to first load the required data entities into memory; then I can query them more quickly and run other queries against this cached data.

The following code will load just those two entities into a hash table:


$DataEntities = "mdmDeviceInventoryHistories","devices"
$DataHash = @{}
foreach ($DataEntity in $DataEntities)
{
    Write-host "Loading $DataEntity"
    [void]$DataHash.Add($DataEntity,(Get-IntuneDataWarehouseData -DataEntity $DataEntity))
}

Then I can run the following code to get the Ethernet MAC address, and it returns the result instantly:


$DeviceName = "SW-IT-LT-AZURE1"
$DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select -First 1 |
    Select -ExpandProperty ethernetMac 

You could load the entire data warehouse into memory using the following code, then simply work with the resulting hashtable:


$DataEntities = Get-IntuneDataWarehouseData -ListDataEntities
$DataHash = @{}
foreach ($DataEntity in $DataEntities)
{
    Write-host "Loading $DataEntity"
    [void]$DataHash.Add($DataEntity,(Get-IntuneDataWarehouseData -DataEntity $DataEntity))
}

The benefit of a data warehouse, of course, is that you can review snapshots of data over a period of time and analyse the data for trends, or identify when things changed. The following example uses the data hashtable to report the device inventory history of a specific device over time – the Intune Data Warehouse keeps up to 90 days of historic data. In particular, I want to see how much the free space on disk is changing over time.


$DeviceName = "SW-IT-LT-158"
$Results = $DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select dateKey, deviceName,
        softwareVersion,
        @{e={$([math]::Round(($_.storageFree / 1GB),2))};l="storageFree (GB)"},
        @{e={$([math]::Round(($_.storageTotal / 1GB),2))};l="storageTotal (GB)"}
foreach ($Result in $results){
    $Result | Add-Member -MemberType NoteProperty -Name date -Value (($DataHash['dates'] | Where {$_.dateKey -eq $Result.dateKey}).fullDate | Get-Date -Format "dd MMM yyyy")
}
$Results | Select deviceName,date,softwareVersion,'storageFree (GB)','storageTotal (GB)' | ft

You can readily see that it’s necessary to manipulate the data quite a bit to get the results I want: for example, in order to do something equivalent to a ‘join’ in SQL I am using Where-Object, and in order to add the data from another table to my results I am using Add-Member. I am also converting the values of the storage data into GB and formatting the date using a UK-style short date format.

[Screenshot: device inventory history results]

The results are returned in an array object, but for data like this it can also be useful to use a datatable, as you would for SQL data, for example. Then you can add/remove columns, change column order, set the datatype for a column, change headers, etc.

This code does exactly the same thing as the last example, but uses a datatable for the results.


$DeviceName = "sw-it-lt-158"
$Datatable = New-Object System.Data.DataTable
[void]$Datatable.Columns.AddRange(@('deviceName','date','softwareVersion','storageFree (GB)','storageTotal (GB)'))
$Results = $DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select dateKey,
        deviceName,
        softwareVersion,
        @{e={$([math]::Round(($_.storageFree / 1GB),2))};l="storageFree (GB)"},
        @{e={$([math]::Round(($_.storageTotal / 1GB),2))};l="storageTotal (GB)"}
foreach ($Result in $results){
    [datetime]$Date = ($DataHash['dates'] | Where {$_.dateKey -eq $Result.dateKey}).fullDate
    [void]$DataTable.Rows.Add($Result.deviceName,$Date.ToShortDateString(),$Result.softwareVersion,$Result.'storageFree (GB)', $Result.'storageTotal (GB)')
}
$Datatable | ft

Reviewing the results, I can see that the available disk space is decreasing slightly over time. It would be nice to see that data represented graphically, and of course this is where the integration with PowerBI will shine, but we can also generate graphical charts in PowerShell, so let’s give that a go.

Here is a function I wrote that will generate a spline chart using the .NET chart controls and display it in a WPF window. It takes a single series of data, and you need to provide a title, a data object as an array, and the X and Y axis names (which must match the property names in the data object).
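A condensed sketch of such a function – it builds the chart with the WinForms chart control (System.Windows.Forms.DataVisualization) and hosts it in a WPF window via WindowsFormsHost, with sizing and styling trimmed for brevity:


function New-SingleSeriesSplineChart {
    param([string]$Title,[array]$Data,[string]$AxisX,[string]$AxisY)
    Add-Type -AssemblyName System.Windows.Forms,System.Windows.Forms.DataVisualization,PresentationFramework,WindowsFormsIntegration

    # Build a chart containing a single spline series from the supplied data
    $Chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
    $Chart.Dock = [System.Windows.Forms.DockStyle]::Fill
    $Area = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
    $Area.AxisX.Title = $AxisX
    $Area.AxisY.Title = $AxisY
    $Chart.ChartAreas.Add($Area)
    [void]$Chart.Titles.Add($Title)
    $Series = $Chart.Series.Add($AxisY)
    $Series.ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Spline
    foreach ($Point in $Data) { [void]$Series.Points.AddXY($Point.$AxisX,$Point.$AxisY) }

    # Host the WinForms chart control in a WPF window
    $FormsHost = New-Object System.Windows.Forms.Integration.WindowsFormsHost
    $FormsHost.Child = $Chart
    $Window = New-Object System.Windows.Window
    $Window.Title = $Title
    $Window.Width = 800
    $Window.Height = 450
    $Window.Content = $FormsHost
    [void]$Window.ShowDialog()
}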

To generate the chart, I will use the results from my previous example (the array, not the datatable), sort them by date, select the last 20 data snapshots, select the X and Y axis data into a new object, and provide this to the chart function:


$Data = $Results | Sort dateKey | Select date,'storageFree (GB)' | Select -Last 20
New-SingleSeriesSplineChart -Title "Trend of Available Free Storage on SW-IT-LT-158" -Data $Data -AxisX "date" -AxisY "storageFree (GB)"

Now I have a nice graphical view 🙂

[Screenshot: spline chart of the free storage trend]

I have focused just on devices in this post, but there is lots of data available in the Intune Data Warehouse – including users, policies, compliance, configurations, MAM data, etc. – all of which can provide valuable insights into your MDM estate. Whether you use PowerShell, PowerBI, Excel or whichever tool you prefer, the ability to view and analyse historic data is a welcome improvement to the ever-evolving Intune service.

Query for 32-bit or 64-bit Versions of Microsoft Office with ConfigMgr

Quick post – I needed to query for 64-bit versions of Microsoft Office installed on our clients. Usually the 32-bit version gets installed, as this is Microsoft’s recommendation due to add-in compatibility etc., but in some cases the 64-bit version is required to take advantage of additional RAM. I couldn’t find anything useful online for distinguishing between the x86 and x64 versions of Office using inventory data from ConfigMgr, but I found that the product code GUID of the Office product actually contains this information (see here).

So I put together the following SQL query, which extracts some additional data about Office from the GUID, including the release version (i.e. RTM, SP1, etc.), release type (i.e. Volume License, Trial, etc.) and the bit version.

This should work for any version of Office, but only for the MSI installer (i.e. not Click-to-Run).

Some example results:

[Screenshot: example query results]


Select
  sys.Name0 as 'Device Name',
  sof.ProductName0 as 'Product Name',
  sof.ProductVersion0 as 'Product Version',
  sof.InstallDate0 as 'Installation Date',
  sof.Language0 as 'Language Code',
  sof.SoftwareCode0 as 'Software Code',
  'Release Version' =
    Case substring(sof.SoftwareCode0,2,1)
      When '0' Then 'Prior to Beta 1'
      When '1' Then 'Beta 1'
      When '2' Then 'Beta 2'
      When '3' Then 'RC0'
      When '4' Then 'RC1 / OEM Preview'
      When '9' Then 'RTM'
      When 'A' Then 'SP1'
      When 'B' Then 'SP2'
      When 'C' Then 'SP3'
      Else 'Unknown'
    End,
  'Release Type' =
    Case substring(sof.SoftwareCode0,3,1)
      When '0' Then 'Volume License'
      When '1' Then 'Retail / OEM'
      When '2' Then 'Trial'
      When '5' Then 'Download'
      Else 'Unknown'
    End,
  substring(sof.SoftwareCode0,4,2) as 'Major Version',
  substring(sof.SoftwareCode0,6,4) as 'Minor Version',
  'Bit Version' =
    Case substring(sof.SoftwareCode0,21,1)
      When '0' Then '32-bit'
      When '1' Then '64-bit'
      Else 'Unknown'
    End
from v_GS_INSTALLED_SOFTWARE sof
inner join v_R_System sys on sof.ResourceID = sys.ResourceID
where sof.SoftwareCode0 like '%0ff1ce%'
-- Querying for Office Professional Plus --
and sof.ProductName0 like '%Professional Plus%'
-- Querying for 64-bit Office (0 = x86, 1 = x64) --
and substring(sof.SoftwareCode0,21,1) = '1'
Order by sys.Name0