Create Collections for SCCM Client Installation Failures by Error Code

Ok, so in a perfect SCCM world you would never have any SCCM client installation failures and this post would be totally unnecessary. But in the real world, you are very likely to have a number of systems that fail to install the SCCM client and the reasons can be many.

To identify such systems, it can be helpful to create collections for some of the common client installation failure codes so you can easily see and report on which type of installation failures you are experiencing and the number of systems affected.

To identify the installation failure error codes you have in your environment for Windows systems, run the following SQL query against the SCCM database:

select 
	Count(cdr.Name) as 'Count',
	cdr.CP_LastInstallationError as 'Last Installation Error Code'
from v_CombinedDeviceResources cdr
where
	cdr.CP_LastInstallationError is not null
	and cdr.IsClient = 0
	and cdr.DeviceOS like '%Windows%'
group by cdr.CP_LastInstallationError
order by [Count] desc

[Screenshot: client installation error counts]

Next, simply create a collection for each error code using the following WQL query, changing the LastInstallationError value to the relevant error code (SMS00001 is the default All Systems collection referenced in the join):

select 
    SYS.ResourceID,
    SYS.ResourceType,
    SYS.Name,
    SYS.SMSUniqueIdentifier,
    SYS.ResourceDomainORWorkgroup,
    SYS.Client 
from SMS_R_System as SYS 
Inner Join SMS_CM_RES_COLL_SMS00001 as COL on SYS.ResourceID = COL.ResourceID  
Where COL.LastInstallationError = 53 
And (SYS.Client = 0  Or SYS.Client is null)

Error codes are all fine and dandy, but unless you have an error code database in your head you’ll want to translate those codes to friendly descriptions. To do that, I use a PowerShell function I created that pulls the description from SrsResources.dll, which you can find in any SCCM console installation. There’s more than one way to translate error codes though – see my blog post here. Better yet, create yourself an error code SQL database which you can join to in your SQL queries – super useful for reporting purposes – see my post here.
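
As a quick illustration, the translation can be as simple as loading the DLL and calling its GetErrorMessage method. This is a minimal sketch, assuming the SCCM console is installed locally and its SMS_ADMIN_UI_PATH environment variable is set (the path derivation is an assumption to verify in your environment):

# Load SrsResources.dll from the console installation (the bin folder is one level above SMS_ADMIN_UI_PATH)
$dll = Join-Path (Split-Path $env:SMS_ADMIN_UI_PATH -Parent) 'SrsResources.dll'
Add-Type -Path $dll
# Translate error code 53 -> "The network path was not found."
[SrsResources.Localization]::GetErrorMessage(53,'en-US')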

Anyway, once you’ve translated the error codes, you can name your collections with them for easy reference:

[Screenshot: client installation failure collections]

Now comes the hard part – figuring out how to fix those errors and working through all the affected systems 😬

New Tool: Delivery Optimization Monitor

Delivery Optimization Monitor is a tool for viewing Delivery Optimization data on the local or a remote PC.

It is based on the built-in Delivery Optimization UI in Windows 10 but allows you to view data graphically from remote computers as well.

The tool uses the Delivery Optimization PowerShell cmdlets built in to Windows 10 to retrieve and display DO data, including stats and charts for the current month, performance snapshot data and data on any current DO jobs.
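
Under the hood these are standard cmdlets you can run yourself. For example (cmdlet and property names per the DeliveryOptimization module in Windows 10; exact availability varies by build, and the computer name below is a placeholder):

# Stats for current/completed DO jobs, including bytes from peers vs HTTP
Get-DeliveryOptimizationStatus | Select-Object FileId, Status, BytesFromPeers, BytesFromHttp

# Performance snapshot data
Get-DeliveryOptimizationPerfSnap

# The same data from a remote computer, assuming PS Remoting is enabled
Invoke-Command -ComputerName 'PC001' -ScriptBlock { Get-DeliveryOptimizationStatus }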

Requirements

  • A supported version of Windows 10 (1703 onward)
  • PowerShell 5 minimum
  • .Net Framework 4.6.2 minimum
  • PS Remoting enabled to view data from remote computers.

This WPF tool is coded in Xaml and PowerShell and uses the MahApps.Metro and LiveCharts open source libraries.

Download

Download the tool from the Technet Gallery.

Use

To use the tool, extract the ZIP file, right-click Delivery Optimization Monitor.ps1 and select Run with PowerShell.

To run against the local machine, you must run the tool elevated. To do so, create a shortcut to the ps1 file. Edit the properties of the shortcut and change the target to read:

PowerShell.exe -ExecutionPolicy Bypass -File "<pathtoPS1file>"

Right-click the shortcut and run as administrator, or edit the shortcut properties (under Advanced) to run as administrator.

For completeness, you can also change the icon of the shortcut to the icon file included in the bin directory.

Delivery Optimization Statistics

There are 3 tabs – the first displays DO data for the current month together with charts for download and upload statistics.

The second tab displays PerfSnap data and the third displays any current DO jobs.

Shout Out

Shout out to Kevin Rahetilahy over at dev4sys.com for blogging about LiveCharts in PowerShell.

Source Code

Source code can be found on GitHub.

Create Disk Usage Reports with PowerShell and WizTree

Recently I discovered a neat little utility called WizTree, which can be used to report on space used by files and folders on a drive. There are many utilities out there that can do that, but this one supports running on the command line which makes it very useful for scripting scenarios. It also works extremely quickly because it uses the Master File Table on disk instead of the slower Windows / .Net methods.

I wanted to create a disk usage report for systems that have less than 20GB of free space – the recommended minimum for doing a Windows 10 in-place upgrade – so that I can easily review it and identify files / folders that could potentially be deleted to free space on the disk. I wanted to script it so that it can be run in the background and deployed via ConfigMgr, and the resulting reports copied to a server share for review.

The following script does just that.

First, it runs WizTree on the command line and generates two CSV reports, one for all files and one for all folders on the drive. Next, since the generated CSV files contain sizes in bytes, the script imports the CSVs, converts the size data to include KB, MB and GB, then outputs two new CSV files.
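
The export step looks something like this – a sketch only, since the WizTree switch names (/export, /admin, /exportfolders, /exportfiles) and the CSV header names are assumptions to verify against your version:

# Export folder and file lists to CSV (WizTree writes sizes in bytes)
$Exe = Join-Path $PSScriptRoot 'WizTree64.exe'
Start-Process -FilePath $Exe -ArgumentList 'C: /export="C:\Temp\Folders.csv" /admin=1 /exportfiles=0' -Wait
Start-Process -FilePath $Exe -ArgumentList 'C: /export="C:\Temp\Files.csv" /admin=1 /exportfolders=0' -Wait

# Add friendlier size columns and save to a new CSV
Import-Csv 'C:\Temp\Files.csv' |
    Select-Object 'File Name',
        @{l='Size (KB)';e={[math]::Round($_.Size / 1KB,2)}},
        @{l='Size (MB)';e={[math]::Round($_.Size / 1MB,2)}},
        @{l='Size (GB)';e={[math]::Round($_.Size / 1GB,2)}} |
    Export-Csv 'C:\Temp\Files_Sizes.csv' -NoTypeInformation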

The script then generates 2 custom HTML reports that contain a list of the largest 100 files and folders, sorted by size.

Next it generates an HTML summary report that shows visually how much space is used on the disk and tells you how much space you need to free up to meet the minimum 20GB free-space requirement.
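
The calculation behind that summary is simple enough – something like this, using the 20GB threshold from this post:

# How much needs to be freed to reach the 20GB free-space minimum?
$Disk = Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DeviceID='C:'"
$FreeGB = [math]::Round($Disk.FreeSpace / 1GB,2)
$ToFreeGB = [math]::Round((20GB - $Disk.FreeSpace) / 1GB,2)
"Free space: $FreeGB GB. Space to free up: $ToFreeGB GB."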

Finally, it copies those reports to a server share, which will look like this:

[Screenshot: report files copied to the server share]

The Disk Usage Summary report will look something like this:

[Screenshot: Disk Usage Summary report]

And here’s a snippet from the large directories and files reports:

[Screenshot: large directories report]

[Screenshot: large files report]

There are also CSV reports which contain the entire list of files and directories on the drive:

[Screenshot: full file and directory CSV reports]

To use the script, simply download the WizTree Portable app, extract WizTree64.exe and place it in the same location as the script (assuming a 64-bit OS). Set the run location in the script (ie $PSScriptRoot if calling the script, or the directory location if running in the ISE), the temporary location where it can create files, and the server share where you want to copy the reports to. Then just run the script in admin context.

Getting Data from the Intune Data Warehouse with PowerShell

The Intune Data Warehouse is a great addition to the Microsoft Intune service allowing visibility of historical data for reporting, data and trend analysis for your Microsoft MDM environment. It comes with an OData feed that allows you to connect to the data with PowerBI, Microsoft’s reporting and data visualization service.

The Data Warehouse RESTful API (currently in Beta) can be used to get data from the warehouse using a REST client. I decided to explore how to do this with PowerShell so I can run some ad-hoc queries and analyse trends in the data.

To get data from the Intune Data Warehouse we need to do three main things:

  1. Create a native App in Azure and give it access to the Intune Data Warehouse
  2. Authenticate with Azure using OAuth 2.0 and get an access token
  3. Invoke the RESTful web service using http

Create a Native App in Azure

In your Azure portal, go to Azure Active Directory > App registrations. Click New application registration.

Give it a name, make sure it is a Native app (do not use Web app / API) and use the redirect URI https://login.live.com/oauth20_desktop.srf.

Click Create.

Once created, make a note of the Application ID as we will need this later.

Now, in the app’s Settings blade, click Required permissions > Add > Select an API, and select Microsoft Intune API.

In the Add API access blade click Select permissions and grant the delegated permission Get data warehouse information from Microsoft Intune.

Save your changes.

Authenticate with Azure

To authenticate with Azure I wrote a PowerShell function, New-IntuneDataWarehouseAccessToken.

What the code does…

To authenticate with Azure we need to use the Active Directory Authentication Library from Microsoft. This library is actually included in resources like the ConfigMgr client and the AzureRM PowerShell module, but these do not have the latest versions of the library and the methods it contains have changed over time. So I prefer to use the latest version of the library (3.17.2 at the time of writing) which is available as a NuGet package.

The function will download NuGet and use it to download the latest version of the ADAL library to your user profile area. Once we have the library, we load it in PowerShell and then acquire an access token.

The access token expires after an hour so once created, the token will probably be good for your current session. If it expires, simply run the function again. The access token will be saved to a variable in the script scope and will be used by the function that queries the data warehouse.
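
The original function is too long to reproduce here, but a minimal sketch of the approach might look like this (method names per ADAL 3.x; the tenant, application ID and DLL path are placeholder assumptions):

function New-IntuneDataWarehouseAccessToken {
    param(
        [string]$Tenant = 'mytenant.onmicrosoft.com',                 # your Azure AD tenant
        [string]$ClientId = '<application ID of the native app>',    # noted earlier
        [string]$RedirectUri = 'https://login.live.com/oauth20_desktop.srf',
        [string]$AdalPath = "$env:USERPROFILE\ADAL\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
    )
    # Load the ADAL library (the real function downloads it via NuGet first)
    Add-Type -Path $AdalPath
    $Authority = "https://login.microsoftonline.com/$Tenant"
    $Resource = 'https://api.manage.microsoft.com/'   # Microsoft Intune API resource
    $AuthContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext($Authority)
    $PlatformParams = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters([Microsoft.IdentityModel.Clients.ActiveDirectory.PromptBehavior]::Auto)
    # Pops the Azure sign-in dialog on first use
    $Result = $AuthContext.AcquireTokenAsync($Resource,$ClientId,[uri]$RedirectUri,$PlatformParams).GetAwaiter().GetResult()
    # Save to the script scope for use by Get-IntuneDataWarehouseData
    $script:AccessToken = $Result.AccessToken
}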

How to use it…

Make sure the function’s parameters (such as the tenant name and the application ID you noted earlier) contain the required values. I recommend that you store those in the function directly so you don’t have to add them every time.

Simply run the function:


New-IntuneDataWarehouseAccessToken

If you have not previously authenticated with Azure in your current session you will be prompted to sign in to your Azure account:

[Screenshot: Azure sign-in prompt]

The first time you use the native app you created, you will also be prompted for permission:

[Screenshot: permissions consent prompt]

Invoke the Web Service using the OData feed

Now that we have an access token, we can invoke the web service using http. I wrote a PowerShell function, Get-IntuneDataWarehouseData, to do that.

What the code does…

The function uses the custom OData feed URL for your tenant, creates an http client to invoke the web service and gets data for the data entity (ie collection, or table) that you specify. The results are then returned to the PowerShell console.
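
A condensed sketch of such a function is below – not the original, which uses an http client; this one leans on Invoke-RestMethod and assumes $script:AccessToken was set by New-IntuneDataWarehouseAccessToken:

function Get-IntuneDataWarehouseData {
    param(
        [string]$WarehouseUrl = 'https://fef.<tenantcode>.manage.microsoft.com/ReportingService/DataWarehouseFEService?api-version=beta',
        [string]$DataEntity,
        [switch]$ListDataEntities,
        [string]$Filter,
        [string]$Select,
        [int]$Top,
        [int]$Skip,
        [string]$OrderBy
    )
    $Headers = @{ Authorization = "Bearer $script:AccessToken" }
    # Split the feed URL into the service root and the api-version query string
    $Service,$ApiVersion = $WarehouseUrl.Split('?')
    if ($ListDataEntities)
    {
        # The service root document lists the available entity sets
        return (Invoke-RestMethod -Uri $WarehouseUrl -Headers $Headers).value.name | Sort-Object
    }
    # Assemble any OData query options that were supplied
    $Options = @($ApiVersion)
    if ($Filter)  { $Options += "`$filter=$Filter" }
    if ($Select)  { $Options += "`$select=$Select" }
    if ($Top)     { $Options += "`$top=$Top" }
    if ($Skip)    { $Options += "`$skip=$Skip" }
    if ($OrderBy) { $Options += "`$orderby=$OrderBy" }
    $Uri = "$Service/${DataEntity}?$($Options -join '&')"
    (Invoke-RestMethod -Uri $Uri -Headers $Headers).value
}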

You can read more about the data model for the warehouse and get a reference for the various data entities and their relationships on the Microsoft Docs site.

How to use it…

Make sure the following parameter is set in the function:

  • WarehouseUrl

This URL is the custom feed URL for your tenant and you can find it in the Intune blade in Azure. On the Overview blade, on the right you’ll find Other tasks, and underneath, Set up Intune Data Warehouse.

To list the data entities that are available to query use the ListDataEntities switch:


Get-IntuneDataWarehouseData -ListDataEntities

To return the data from a specific data entity, use the DataEntity parameter. This example returns data from the devices table.


Get-IntuneDataWarehouseData -DataEntity devices

Working with the Data

The API supports a few query parameters in the OData protocol v4.0, so rather than returning all the results in the data entity, you can narrow them down. However, I have noticed that the query parameters do not always work as expected when they are combined, at least in the beta version of the API.

For example, you can use the Filter parameter to return only matching results. This query finds a specific device:


Get-IntuneDataWarehouseData -DataEntity devices -Filter "deviceName eq 'SW-IT-LT-AZURE1'"

You can select only specific properties to be returned in the results using the Select parameter:


Get-IntuneDataWarehouseData -DataEntity devices -Select 'deviceName,serialNumber,lastContact'

Select the top 5 results:


Get-IntuneDataWarehouseData -DataEntity devices -Top 5

Skip 10 results and return the rest:


Get-IntuneDataWarehouseData -DataEntity devices -Skip 10

Sort results by a particular property:


Get-IntuneDataWarehouseData -DataEntity devices -OrderBy 'lastContact'

You can learn more about how to use query parameters in the OData protocol from the Microsoft Developer site for the Graph API.

These query parameters are certainly helpful, but for relational queries where you want to reference more than one table or entity, the API falls short and more complex PowerShell code is required.

For example, here I am searching for the ethernet MAC address from the most recent hardware inventory for a device, and even with query parameters applied I still need to do some filtering and sorting in PowerShell. Remember that the data warehouse only contains snapshots of historic data, so for current data you could use the Graph API instead – this is just an example.


$DeviceName = "SW-IT-LT-AZURE1"
Get-IntuneDataWarehouseData -DataEntity 'mdmDeviceInventoryHistories' -Select 'deviceKey,datekey,ethernetMac' |
    Where {$_.devicekey -eq ((Get-IntuneDataWarehouseData -DataEntity 'devices' -Filter "deviceName eq '$DeviceName'").devicekey)} |
    Sort datekey -Descending |
    Select -First 1 |
    Select -ExpandProperty ethernetMac 

This code takes a while to run, however, because it queries the data warehouse more than once to get the data. Another way to do this is to first load the required device entities into memory; then I can query them more quickly and run other queries against this cached data.

The following code will load just those two entities into a hash table:


$DataEntities = "mdmDeviceInventoryHistories","devices"
$DataHash = @{}
foreach ($DataEntity in $DataEntities)
{
    Write-host "Loading $DataEntity"
    [void]$DataHash.Add($DataEntity,(Get-IntuneDataWarehouseData -DataEntity $DataEntity))
}

Then I can run the following code to get the ethernet MAC address and it returns the result instantly:


$DeviceName = "SW-IT-LT-AZURE1"
$DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select -First 1 |
    Select -ExpandProperty ethernetMac 

You could load the entire data warehouse into memory using the following code, then you can simply work with the resultant hashtable:


$DataEntities = Get-IntuneDataWarehouseData -ListDataEntities
$DataHash = @{}
foreach ($DataEntity in $DataEntities)
{
    Write-host "Loading $DataEntity"
    [void]$DataHash.Add($DataEntity,(Get-IntuneDataWarehouseData -DataEntity $DataEntity))
}

The benefit of a data warehouse, of course, is that you can review snapshots of data over a period of time and analyse them for trends or identify when things changed. The following example uses the data hashtable to report the inventory history of a specific device over time (the Intune Data Warehouse keeps up to 90 days of historic data). In particular, I want to see how much the free space on disk is changing over time.


$DeviceName = "SW-IT-LT-158"
$Results = $DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select dateKey, deviceName,
        softwareVersion,
        @{e={$([math]::Round(($_.storageFree / 1GB),2))};l="storageFree (GB)"},
        @{e={$([math]::Round(($_.storageTotal / 1GB),2))};l="storageTotal (GB)"}
foreach ($Result in $results){
    $Result | Add-Member -MemberType NoteProperty -Name date -Value (($DataHash['dates'] | Where {$_.dateKey -eq $Result.dateKey}).fullDate | Get-Date -Format "dd MMM yyyy")
}
$Results | Select deviceName,date,softwareVersion,'storageFree (GB)','storageTotal (GB)' | ft

You can readily see that it’s necessary to manipulate the data quite a bit to get the results I want to see: to do something equivalent to a ‘join’ in SQL I am using Where-Object, and to add the data from another table to my results I am using Add-Member. I am also converting the values of the storage data into GB and formatting the date using the UK short date format.

[Screenshot: device inventory history results]

The results are returned in an array object, but for data like this it can also be useful to use a datatable, as you would for SQL data. Then you can add or remove columns, change column order, set the datatype for a column, change headers and so on.

This code does exactly the same thing as the last example, but using a datatable for the results.


$DeviceName = "sw-it-lt-158"
$Datatable = New-Object System.Data.DataTable
[void]$Datatable.Columns.AddRange(@('deviceName','date','softwareVersion','storageFree (GB)','storageTotal (GB)'))
$Results = $DataHash['mdmDeviceInventoryHistories'] |
    Where {$_.devicekey -eq (($DataHash['devices'] | where {$_.deviceName -eq $DeviceName}).devicekey)} |
    Sort datekey -Descending |
    Select dateKey,
        deviceName,
        softwareVersion,
        @{e={$([math]::Round(($_.storageFree / 1GB),2))};l="storageFree (GB)"},
        @{e={$([math]::Round(($_.storageTotal / 1GB),2))};l="storageTotal (GB)"}
foreach ($Result in $results){
    [datetime]$Date = ($DataHash['dates'] | Where {$_.dateKey -eq $Result.dateKey}).fullDate
    [void]$DataTable.Rows.Add($Result.deviceName,$Date.ToShortDateString(),$Result.softwareVersion,$Result.'storageFree (GB)', $Result.'storageTotal (GB)')
}
$Datatable | ft

Reviewing the results I can see that the available disk space is decreasing slightly over time. It would be nice to see that data represented graphically, and of course this is where the integration with PowerBI will shine, but we can also generate graphical charts in PowerShell, so let’s give that a go.

Here is a function I wrote that will generate a spline chart using the .Net chart controls and display it in a WPF window. It takes a single series of data: you provide a title, a data object as an array, and the X and Y axis names (which must match the header names in the data object).
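
The original function is fairly long; a rough sketch of the same idea is below, hosted in a Windows Form for brevity rather than the WPF window the tool uses:

function New-SingleSeriesSplineChart {
    param(
        [string]$Title,
        [object[]]$Data,
        [string]$AxisX,
        [string]$AxisY
    )
    Add-Type -AssemblyName System.Windows.Forms
    Add-Type -AssemblyName System.Windows.Forms.DataVisualization
    $Chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
    $Chart.Dock = 'Fill'
    $Chart.ChartAreas.Add((New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea))
    [void]$Chart.Titles.Add($Title)
    $Series = $Chart.Series.Add($AxisY)
    $Series.ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Spline
    foreach ($Point in $Data)
    {
        # Column names in the data object must match the axis names
        [void]$Series.Points.AddXY($Point.$AxisX,$Point.$AxisY)
    }
    $Form = New-Object System.Windows.Forms.Form -Property @{ Width = 800; Height = 450; Text = $Title }
    $Form.Controls.Add($Chart)
    [void]$Form.ShowDialog()
}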

To generate the chart, I will use the results from my previous example (not the datatable but the array), sort them by date, select the last 20 data snapshots, select the X and Y axis data into a new object and provide this to the chart function:


$Data = $Results | Sort dateKey | Select date,'storageFree (GB)' | Select -Last 20
New-SingleSeriesSplineChart -Title "Trend of Available Free Storage on SW-IT-LT-158" -Data $Data -AxisX "date" -AxisY "storageFree (GB)"

Now I have a nice graphical view 🙂

[Screenshot: spline chart of the free storage trend]

I have focused just on devices in this blog, but there is lots of data available in the Intune Data Warehouse – users, policies, compliance, configurations, MAM data etc – all of which can provide valuable insights into your MDM estate. Whether you use PowerShell, PowerBI, Excel or whichever tool you prefer, the ability to view and analyse historic data is a welcome improvement to the ever-evolving Intune service.

Query for 32-bit or 64-bit Versions of Microsoft Office with ConfigMgr

Quick post – I needed to query for 64-bit versions of Microsoft Office installed on our clients. Usually, the 32-bit version gets installed as this is Microsoft’s recommendation due to add-in compatibility etc. But in some cases, the 64-bit version is required to take advantage of additional RAM. I couldn’t find anything useful online to distinguish between the x86 and x64 versions of Office using inventory data from ConfigMgr, but I found that the product code GUID of the Office product actually contains this information (see here).

So I put together the following SQL query which extracts some additional data about Office from the GUID, including the Release Version (ie RTM, SP1 etc), Release Type (ie Volume License, Trial etc) and the Bit Version.

This should work for any version of Office, but only for the MSI installer (ie not Click-to-Run).

Some example results:

[Screenshot: example query results]


Select
  sys.Name0 as 'Device Name',
  sof.ProductName0 as 'Product Name',
  sof.ProductVersion0 as 'Product Version',
  sof.InstallDate0 as 'Installation Date',
  sof.Language0 as 'Language Code',
  sof.SoftwareCode0 as 'Software Code',
  'Release Version' =
    Case substring(sof.SoftwareCode0,2,1)
      When '0' Then 'Prior to Beta 1'
      When '1' Then 'Beta 1'
      When '2' Then 'Beta 2'
      When '3' Then 'RC0'
      When '4' Then 'RC1 / OEM Preview'
      When '9' Then 'RTM'
      When 'A' Then 'SP1'
      When 'B' Then 'SP2'
      When 'C' Then 'SP3'
      Else 'Unknown'
    End,
  'Release Type' =
    Case substring(sof.SoftwareCode0,3,1)
      When '0' Then 'Volume License'
      When '1' Then 'Retail / OEM'
      When '2' Then 'Trial'
      When '5' Then 'Download'
      Else 'Unknown'
    End,
  substring(sof.SoftwareCode0,4,2) as 'Major Version',
  substring(sof.SoftwareCode0,6,4) as 'Minor Version',
  'Bit Version' =
    Case substring(sof.SoftwareCode0,21,1)
      When '0' Then '32-bit'
      When '1' Then '64-bit'
      Else 'Unknown'
    End
from v_GS_INSTALLED_SOFTWARE sof
inner join v_R_System sys on sof.ResourceID = sys.ResourceID
where sof.SoftwareCode0 like '%0ff1ce%'
-- Querying for Office Professional Plus --
and sof.ProductName0 like '%Professional Plus%'
-- Querying for 64-bit Office (0 = x86, 1 = x64) --
and substring(sof.SoftwareCode0,21,1) = 1
Order by sys.Name0 

Inventory Local Administrator Privileges with PowerShell and ConfigMgr

Any security-conscious enterprise will want to have visibility of which users have local administrator privilege on any given system, and if you are an SCCM administrator then the job of gathering this information will likely be handed to you!

However, this task may not be as simple as it seems. Gathering the membership of the local administrators group is one thing, but perhaps more important to know is whether the primary user of a system has administrator privileges. If that user is a member of a group that has been added to the local administrators group, then it isn’t immediately obvious whether they actually have administrator rights without also checking the membership of that group. And what if there are further nested groups – ie the user is a member of a group that’s a member of a group that’s a member of the local administrators group?! Obviously things can get complicated here, making reporting and compliance checking a challenge.

Thankfully, PowerShell can handle complication quite nicely, and ConfigMgr is more than capable as both a delivery vehicle and a reporting mechanism, so the good news is – we can do this!

The following solution uses PowerShell to gather local administrator information and stamp it to the local registry. A compliance item in SCCM is used as the delivery vehicle for the script, and then RegKeyToMOF is used to update the hardware inventory classes in SCCM to gather this information from the client’s registry into the SCCM database, where we can query and report on it.

Gathering Local Administrator Information with PowerShell

To start with, let’s have a look at some of the PowerShell code and the information we will gather with it.

First, we need to identify who is the primary user of the system. Since the script is running locally on the client computer, we will not use User Device Affinity. True, UDA information is stored in WMI in the CCM_UserAffinity class, in the ROOT\CCM\Policy\Machine\ActualConfig namespace. But this class can contain multiple instances, so you can’t always determine the primary user that way.

A better way is to use the SMS_SystemConsoleUsage class in the ROOT\cimv2\sms namespace and query the TopConsoleUser property. This will give you the user account who has had the most interactive logons on the system and for the most part will indicate who the primary user is.


$TopConsoleUser = Get-WmiObject -Namespace ROOT\cimv2\sms -Class SMS_SystemConsoleUsage -Property TopConsoleUser -ErrorAction Stop | Select -ExpandProperty TopConsoleUser

Next, to find if the user is a local admin or not, we will not simply query the local administrator group membership and check if the user is in there. Instead we will create a WindowsIdentity object in .Net and run a method called HasClaim(). I describe this more in a previous blog, but using this method we can determine if the user has local administrator privilege whether through direct membership or through a nested group.


$ID = New-Object Security.Principal.WindowsIdentity -ArgumentList $TopConsoleUser
$IsLocalAdmin = $ID.HasClaim('http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid','S-1-5-32-544')
$ID.Dispose()

The SID for the local administrators group (S-1-5-32-544) is used as this is the same across all systems. This will only work for domain accounts, as it uses Kerberos to create the identity.

Now we will also get the local administrator group membership using the following code (more .Net stuff), and filter just the SamAccountNames.


Add-Type -AssemblyName System.DirectoryServices.AccountManagement -ErrorAction Stop
$ContextType = [System.DirectoryServices.AccountManagement.ContextType]::Machine
$PrincipalContext = New-Object -TypeName System.DirectoryServices.AccountManagement.PrincipalContext -ArgumentList $ContextType, $($env:COMPUTERNAME) -ErrorAction Stop
$IdentityType = [System.DirectoryServices.AccountManagement.IdentityType]::Name
$GroupPrincipal = [System.DirectoryServices.AccountManagement.GroupPrincipal]::FindByIdentity($PrincipalContext, $IdentityType, "Administrators")
$LocalAdminMembers = $GroupPrincipal.Members | select -ExpandProperty SamAccountName | Sort-Object
$PrincipalContext.Dispose()
$GroupPrincipal.Dispose()

Next, if the user is a local admin through nested group membership, I call a custom function which checks the user’s nested group membership within the local admin group. Let’s say that Group B is a member of Group A, which is a member of the local administrators group. We will check the membership of both Groups B and A to see which ones the user is a member of, and therefore which group/s is effectively giving the user administrator privilege. We do this by querying the $GroupPrincipal object created in the previous code. The custom function will query nested membership up to 3 levels deep.
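
A hypothetical sketch of such a function is below (the real one is in the downloadable script): it walks the groups nested inside the Administrators group and returns the names of those that contain the user, recursing up to 3 levels deep.

function Get-NestedAdminGroupMembership {
    param(
        [System.DirectoryServices.AccountManagement.GroupPrincipal]$Group,
        [string]$UserSamAccountName,
        [int]$Depth = 1
    )
    foreach ($Member in $Group.Members)
    {
        if ($Member -is [System.DirectoryServices.AccountManagement.GroupPrincipal])
        {
            # Does this nested group (or anything beneath it) contain the user?
            if ($Member.GetMembers($true).SamAccountName -contains $UserSamAccountName)
            {
                $Member.Name
            }
            if ($Depth -lt 3)
            {
                Get-NestedAdminGroupMembership -Group $Member -UserSamAccountName $UserSamAccountName -Depth ($Depth + 1)
            }
        }
    }
}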

Now I will query the install date of the operating system, since on a newly built machine the TopConsoleUser may not yet be the primary user of the system but, for example, the admin who built it. This date helps to identify any such systems.


[datetime]$InstallDate = [System.Management.ManagementDateTimeConverter]::ToDateTime($(Get-WmiObject win32_OperatingSystem -Property InstallDate -ErrorAction Stop | Select -ExpandProperty InstallDate)) | Get-date -Format 'yyyy-MM-dd HH:mm:ss'

Now we gather all this information into a datatable, and call another custom function to write it to the local registry. I use the following registry key, but you can change this in the script if you wish:

HKLM:SOFTWARE\IT_Local\LocalAdminInfo

The script will create the key if it doesn’t exist.
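
The stamping itself is straightforward; a condensed sketch (the value names match the example data shown below):

# Create the key if needed and stamp the gathered values
$Key = 'HKLM:\SOFTWARE\IT_Local\LocalAdminInfo'
if (-not (Test-Path $Key)) { New-Item -Path $Key -Force | Out-Null }
Set-ItemProperty -Path $Key -Name 'TopConsoleUser' -Value $TopConsoleUser
Set-ItemProperty -Path $Key -Name 'TopConsoleUserIsAdmin' -Value "$IsLocalAdmin"
Set-ItemProperty -Path $Key -Name 'LocalAdminGroupMembership' -Value ($LocalAdminMembers -join ', ')
Set-ItemProperty -Path $Key -Name 'LastUpdated' -Value (Get-Date -Format 'yyyy-MM-dd HH:mm:ss')
# ...plus the remaining properties (NestedGroupMembership, OSInstallDate etc)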

Here’s an example of the kind of data that will be gathered:

[Screenshot: LocalAdminInfo registry values]

You can see in this example, that my user account is a local administrator both by direct membership and through nested groups. The actual groups that grant this right are listed in the NestedGroupMembership property.

Create a Compliance Item

Now let’s go ahead and create a compliance item in SCCM to run this script.

In the Console, navigate Assets and Compliance > Compliance Settings > Configuration Items.

Click Create Configuration Item

Click Next and select which OSs you will target. Remember that Windows XP and Server 2003 may not have PowerShell installed.

Click Next again, then click New to create a new setting.

Choose Script as the setting type, and String as the data type.

Now we need to add the scripts.  You can download both the discovery and remediation scripts from my Github repo here:

https://github.com/SMSAgentSoftware/ConfigMgr/tree/master/PowerShell%20Scripts/Compliance%20Settings/LocalAdministratorInfo

Click Add Script and paste or open the relevant script for each. Make sure Windows PowerShell is selected as the script language.

The discovery script simply checks whether the remediation script has run in the last 15 minutes and, if not, returns non-compliant. This allows the script to run according to the schedule you define for it, ie once a day or once a week etc, to keep the information in the registry up to date.
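
In essence, the discovery logic looks something like this sketch (the actual script is in the repo above):

# Compliant only if the registry data was stamped within the last 15 minutes
$Item = Get-ItemProperty -Path 'HKLM:\SOFTWARE\IT_Local\LocalAdminInfo' -Name 'LastUpdated' -ErrorAction SilentlyContinue
if ($Item -and [datetime]$Item.LastUpdated -gt (Get-Date).AddMinutes(-15))
{
    'Compliant'
}
else
{
    'Non-Compliant'
}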

The remediation script does the hard work 🙂

Click OK to close the Create Setting window.

Click Next, then click New to create a new Compliance Rule as follows:

[Screenshot: compliance rule settings]

Click OK to close, then Next, Next and Close to finish.

Create a Configuration Baseline

Click on Configuration Baselines and Create Configuration Baseline to create a new baseline.

Give it a name, click Add and add the Configuration Item you just created.

Click OK to close.

Deploy the Baseline

Right-click the baseline and choose Deploy. Make sure to remediate noncompliance and select the collection you wish to target.

Update SCCM Hardware Inventory

Creating the MOF Files

For this part you will need the excellent RegKeyToMOF utility, which you can download from here:

https://gallery.technet.microsoft.com/RegKeyToMof-28e84c28

You will also need to do this on a machine that has either run the remediation script to create the registry keys, or has run the configuration baseline.

Open RegKeyToMOF and browse to the registry key:

HKLM:SOFTWARE\IT_Local\LocalAdminInfo

You can deselect the ‘Enable 64bits …’ option as the registry key is not located in the WOW6432Node.

Click Save MOF to save the required files.

Copy the SMSDEF.mof and the CM12Import.mof to your SCCM site server.

Update Client Settings

In the SCCM console, navigate Administration > Site Configuration > Client Settings. Open your default client settings and go to the Hardware Inventory page.

Click Set Classes…, then Import…

Browse to the CM12Import.mof and click Import.

Close the Client Settings windows.

Update Configuration.mof

Now open your configuration.mof file at <ConfigMgr Installation Directory>\inboxes\clifiles.src\hinv.

In the section at the bottom for adding extensions, which starts like this…

//========================
// Added extensions start
//========================

…paste the contents of the SMSDEF.mof file. Save and close the file.

Reporting

Now that you’ve deployed the configuration item and updated the SCCM hardware inventory, a new view called dbo.v_GS_LocalAdminInfo0 has been added to the SCCM database. Note that initially there will be no data here until your clients have updated their policies, run the configuration baseline and run the hardware inventory cycle.

You can query it using the Queries node in the SCCM console, or create yourself a custom SCCM report, create an Excel report with a SQL data connection, query the SCCM database with PowerShell – whatever method you need or prefer.

Here is a sample SQL query that will query the view and add some client health data and the chassis type to help distinguish between desktop, laptops, servers etc.


Select
  ComputerName0 as 'ComputerName',
  Case When enc.ChassisTypes0 = 1 then 'Other'
    when enc.ChassisTypes0 = 2 then 'Unknown'
    when enc.ChassisTypes0 = 3 then 'Desktop'
    when enc.ChassisTypes0 = 4 then 'Low Profile Desktop'
    when enc.ChassisTypes0 = 5 then 'Pizza Box'
    when enc.ChassisTypes0 = 6 then 'Mini Tower'
    when enc.ChassisTypes0 = 7 then 'Tower'
    when enc.ChassisTypes0 = 8 then 'Portable'
    when enc.ChassisTypes0 = 9 then 'Laptop'
    when enc.ChassisTypes0 = 10 then 'Notebook'
    when enc.ChassisTypes0 = 11 then 'Hand Held'
    when enc.ChassisTypes0 = 12 then 'Docking Station'
    when enc.ChassisTypes0 = 13 then 'All in One'
    when enc.ChassisTypes0 = 14 then 'Sub Notebook'
    when enc.ChassisTypes0 = 15 then 'Space-Saving'
    when enc.ChassisTypes0 = 16 then 'Lunch Box'
    when enc.ChassisTypes0 = 17 then 'Main System Chassis'
    when enc.ChassisTypes0 = 18 then 'Expansion Chassis'
    when enc.ChassisTypes0 = 19 then 'SubChassis'
    when enc.ChassisTypes0 = 20 then 'Bus Expansion Chassis'
    when enc.ChassisTypes0 = 21 then 'Peripheral Chassis'
    when enc.ChassisTypes0 = 22 then 'Storage Chassis'
    when enc.ChassisTypes0 = 23 then 'Rack Mount Chassis'
    when enc.ChassisTypes0 = 24 then 'Sealed-Case PC'
    else 'Unknown'
  End as 'Chassis Type',
  TopConsoleUser0 as 'Primary User',
  TopConsoleUserIsAdmin0 as 'Primary User is Admin?',
  AdminGroupMembershipType0 as 'Primary User Local Admin Group Membership Type',
  LocalAdminGroupMembership0 as 'Local Admin Group Membership',
  NestedGroupMembership0 as 'Primary User Local Admin Nested Group Membership',
  OSAgeInDays0 as 'OS Age (days)',
  OSInstallDate0 as 'OS Installation Date',
  LastUpdated0 as 'Last Updated Date',
  la.TimeStamp as 'HW Inventory Date',
  ch.ClientStateDescription,
  ch.LastActiveTime
from dbo.v_GS_LocalAdminInfo0 la
join v_R_System sys on la.ComputerName0 = sys.Name0
left join v_GS_SYSTEM_ENCLOSURE enc on sys.ResourceID = enc.ResourceID
left join v_CH_ClientSummary ch on sys.ResourceID = ch.ResourceID
where ComputerName0 is not null
  and enc.ChassisTypes0 <> 12