PowerBI Reports for Windows 10 Feature Update Compliance

This morning I saw an interesting tweet from Sandy Zeng with a Log Analytics workbook she’d created for W10 feature updates based on Update Compliance data. I’d been meaning to create a similar report for that myself in PowerBI for some time, so I took inspiration from her tweet and got to work on something!

Microsoft’s Update Compliance solution can be used to report on software updates and feature updates status across your estate from Windows telemetry data. If you use Desktop Analytics, you can even combine the data for richer reporting.

Log Analytics allows exporting of queries in Power Query M formula language, which can be imported into PowerBI to create some nice reports.

Here’s a screenshot of what I ended up with. You can filter the data to view devices with Safeguard holds, for example, which since Windows 10 2004 have been a showstopper for many wanting to upgrade…

There are pages for 2004 and 20H2, as well as a page listing some of the known Safeguard hold IDs that have been publicly disclosed by Microsoft.

You can download this report for your own use with the links below. Note I have created two reports – the first assumes you have both Update Compliance and Desktop Analytics using the same Log Analytics workspace. If this is not the case for you, download the second report which doesn’t link to DA data and just uses Update Compliance. The only data I’ve included from DA is the make and model since these can be helpful in analysing devices affected by Safeguard holds.

Windows 10 Feature Update Compliance

Windows 10 Feature Update Compliance (no DA)

To create your own report, you’ll need the latest version of PowerBI desktop installed, with preview support for dynamic M query parameters.

Open PowerBI and go to File > Options and settings > Options > Preview features and enable Dynamic M Query Parameters.

Restart PowerBI and open the downloaded PBI template. Upon opening, you’ll be prompted for the Log Analytics workspace ID. You can find this on the Overview pane of your workspace in the Azure portal.

The reports contain data with a timespan of the last 48 hours, but you can change this if you want by editing the queries in the Advanced editor, and changing the value “P2D”.

You also might want to play around with the LastScan filter so you only get devices with a recent scan date, and avoid duplicates.

Hope it’s helpful!

Prevent Users from Disabling Toast Notifications – Can it be Done?

Another toast notifications post – this time to deal with an issue where users have turned off toast notifications. In my deployment of Windows 10 feature updates for example, I use toast notifications to inform users an update is available. Once we hit the installation deadline, the notifications become more aggressive and display more frequently and do not leave the screen unless the user actions or dismisses them. But we found that some users turn off toast notifications altogether – perhaps they just don’t like any notifications, or perhaps they don’t like being reminded to install the feature update.

In any case, since toast notifications are a key communications channel with our users, it’s important for us that they stay enabled.

Users can disable toast notifications in Settings > System > Notifications & actions – simply turn off the setting Get notifications from apps and other senders.

There is also a group policy setting that can disable toast notifications and lock the setting so the user can’t turn it back on.

However, I was surprised to find no setting to do the opposite – turn notifications on and lock the setting so the user can’t turn them off.

What I did find is a registry key that enables or disables toast notifications in the user context, but it doesn’t take effect without restarting a service called Windows Push Notifications User Service.

Here’s the registry key. Setting it to 1 enables notifications and 0 disables.
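
It’s a per-user value – the same one the detection and remediation scripts below work with:

Key: HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\PushNotifications
Name: ToastEnabled (REG_DWORD)
Value: 1 = notifications enabled, 0 = disabled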

Because this is not being done by group policy, you can’t lock the setting unfortunately. But what you can do is use a Configuration Manager compliance baseline, or even Proactive remediations in MEM, to detect when a user has turned notifications off and turn them back on. It needs to run with sufficient frequency to be effective.

Here is a detection script for MEMCM that will check the registry key and, if it exists and is set to zero, flag non-compliance.

$ToastEnabled = Get-ItemProperty -Path "HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\PushNotifications" -Name ToastEnabled -ErrorAction SilentlyContinue | Select -ExpandProperty ToastEnabled
If ($ToastEnabled -eq 0)
{
    Write-host "Not compliant"
}
Else
{
    Write-host "Compliant"
}

And here’s a remediation script that will set the registry key to the ‘enabled’ value, and restart the push notifications service.

Set-ItemProperty -Path "HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\PushNotifications" -Name ToastEnabled -Value 1 -Force
Get-Service -Name WpnUserService* | Restart-Service -Force
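
If you go the Proactive remediations route instead, bear in mind that detection scripts there signal status with exit codes rather than just the output string. Here’s a hedged variant of the same detection logic:

# Hedged Proactive remediations variant: exit 1 = issue detected (run remediation), exit 0 = compliant
$ToastEnabled = Get-ItemProperty -Path "HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\PushNotifications" -Name ToastEnabled -ErrorAction SilentlyContinue | Select -ExpandProperty ToastEnabled
If ($ToastEnabled -eq 0)
{
    Write-host "Not compliant"
    Exit 1
}
Else
{
    Write-host "Compliant"
    Exit 0
}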

Remember to run these in the user context and allow remediation.

With this active, we can’t completely prevent users from turning off notifications altogether, but if they do, we’ll turn them back on. If they want to fight with the remediation, that’s on them 🙂

Get Previous and Scheduled Evaluation Times for ConfigMgr Compliance Baselines with PowerShell

I was testing a compliance baseline recently and wanted to verify if the schedule defined in the baseline deployment is actually honored on the client. I set the schedule to run every hour, but it was clear that it did not run every hour and that some randomization was being used.

To review the most recent evaluation times and the next scheduled evaluation time, I had to read the scheduler.log in the CCM\Logs directory, because I could only find a single last evaluation time recorded in WMI.

The following PowerShell script reads which baselines are currently deployed to the local machine, displays a window for you to choose one, then reads the Scheduler.log to find when the most recent evaluations were and when the next one is scheduled.

Select a baseline
Baseline evaluations
##############################################################
## ##
## Reads the most recent and next scheduled evaluation time ##
## for deployed Compliance Baselines from the Scheduler.log ##
## ##
##############################################################
#requires -RunAsAdministrator
# Get Baselines from WMI
# Excludes co-management policies
Try
{
$Instances = Get-CimInstance -Namespace ROOT\ccm\dcm -ClassName SMS_DesiredConfiguration -Filter "PolicyType!=1" -OperationTimeoutSec 5 -ErrorAction Stop | Select DisplayName,IsMachineTarget,Name
}
Catch
{
Throw "Couldn't get baseline info from WMI: $_"
}
If ($Instances.Count -eq 0)
{
Throw "No deployed baselines found!"
}
# Datatable to hold the baselines for the WPF window
$DataTable = New-Object System.Data.DataTable
[void]$DataTable.Columns.Add("DisplayName")
[void]$DataTable.Columns.Add("IsMachineTarget")
foreach ($Instance in ($Instances | Sort DisplayName))
{
[void]$DataTable.Rows.Add($Instance.DisplayName,$Instance.IsMachineTarget)
}
# WPF Window for baseline selection
Add-Type -AssemblyName PresentationFramework,PresentationCore,WindowsBase
$Window = New-Object System.Windows.Window
$Window.WindowStartupLocation = [System.Windows.WindowStartupLocation]::CenterScreen
$Window.SizeToContent = [System.Windows.SizeToContent]::WidthAndHeight
$window.ResizeMode = [System.Windows.ResizeMode]::NoResize
$Window.Title = "DOUBLE-CLICK A BASELINE TO SELECT"
$DataGrid = New-Object System.Windows.Controls.DataGrid
$DataGrid.ItemsSource = $DataTable.DefaultView
$DataGrid.CanUserAddRows = $False
$DataGrid.IsReadOnly = $true
$DataGrid.SelectionMode = [System.Windows.Controls.DataGridSelectionMode]::Single
$DataGrid.Height = "NaN"
$DataGrid.MaxHeight = "250"
$DataGrid.Width = "NaN"
$DataGrid.AlternatingRowBackground = "#e6ffcc"
$DataGrid.Add_MouseDoubleClick({
$script:SelectedRow = $This.SelectedValue
$Window.Close()
})
$Window.AddChild($DataGrid)
[void]$Window.ShowDialog()
If (!$SelectedRow)
{
Throw "No baseline was selected!"
}
# If the baseline is user-targeted
If ($SelectedRow.row.IsMachineTarget -eq $false)
{
# Get Logged-on user SID
$LogonUIRegPath = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI"
#Could also use this:
#Get-ItemProperty -Path HKLM:\SOFTWARE\Microsoft\SMS\CurrentUser -Name UserSID -ErrorAction Stop
$Property = "LastLoggedOnUserSID"
$LastLoggedOnUserSID = Get-ItemProperty -Path $LogonUIRegPath -Name $Property | Select -ExpandProperty $Property
$LastLoggedOnUserSIDUnderscore = $LastLoggedOnUserSID.Replace('-','_')
$Namespace = "ROOT\ccm\Policy\$LastLoggedOnUserSIDUnderscore\ActualConfig"
}
Else
{
$Namespace = "ROOT\ccm\Policy\Machine\ActualConfig"
}
# Get assignment info
$BaselineName = $SelectedRow.Row.DisplayName
$Pattern = [Regex]::Escape($BaselineName)
$CIAssignment = Get-CimInstance -Namespace $Namespace -ClassName CCM_DCMCIAssignment | where {$_.AssignmentName -match $Pattern}
$AssignmentIDs = $CIAssignment | Select AssignmentID,AssignmentName
Write-host "Baseline: $BaselineName" ForegroundColor Magenta
foreach ($AssignmentID in $AssignmentIDs)
{
# Read the scheduler log
$Log = "$env:SystemRoot\CCM\Logs\Scheduler.log"
If ($SelectedRow.row.IsMachineTarget -eq $false)
{
$LogEntries = Select-String -Path $Log -SimpleMatch "$LastLoggedOnUserSID/$($AssignmentID.AssignmentID)"
}
Else
{
$LogEntries = Select-String -Path $Log -SimpleMatch "Machine/$($AssignmentID.AssignmentID)"
}
If ($LogEntries)
{
# Get the previous evaluations date/time
$Evaluations = New-Object System.Collections.ArrayList
$EvaluationEntries = $LogEntries | where {$_ -match "SMSTrigger"}
Foreach ($Entry in $EvaluationEntries)
{
$Time = $Entry.Line.Split('=')[1]
$Date = $Entry.Line.Split('=')[2]
$a = $Time.Split()[0].trimend().replace('"','')
$b = $Date.Split()[0].trimend().replace('"','').replace('-','/')
$Time = (Get-Date $a).ToLongTimeString()
$Date = [DateTime]"$b $Time"
$LocalDate = Get-Date $date -Format (Get-Culture).DateTimeFormat.RFC1123Pattern
[void]$Evaluations.Add($LocalDate)
}
# Get the next scheduled evaluation date/time
$LastEvaluation = $EvaluationEntries | Select -Last 1
$date = $LastEvaluation.Line.Split()[8]
$time = $LastEvaluation.Line.Split()[9]
$ampm = $LastEvaluation.Line.Split()[10]
$NextEvaluation = [DateTime]"$date $time $ampm"
$NextEvaluationLocal = Get-Date $NextEvaluation -Format (Get-Culture).DateTimeFormat.RFC1123Pattern
# Return the results
Write-Host "Assignment: $($AssignmentID.AssignmentName)" ForegroundColor Green
Write-host "Last Evaluations:"
foreach ($Evaluation in $Evaluations)
{
Write-host " $Evaluation" ForegroundColor Yellow
}
Write-host "Next Scheduled Evaluation:"
Write-Host " $NextEvaluationLocal" ForegroundColor Yellow
}
Else
{
Write-Host "No log entries found!" ForegroundColor Red
}
}

Inventory Local Administrator Privileges with PowerShell and ConfigMgr

Any security-conscious enterprise will want to have visibility of which users have local administrator privilege on any given system, and if you are an SCCM administrator then the job of gathering this information will likely be handed to you!

However, this task may not be as simple as it seems. Gathering the membership of the local administrators group is one thing, but perhaps more important to know is whether the primary user of a system has administrator privileges. If that user is a member of a group that has been added to the local administrators group, then it isn’t immediately obvious whether they actually have administrator rights without also checking the membership of that group. And what if there are further nested groups – ie the user is a member of a group that’s a member of a group that’s a member of the local administrators group?! Obviously things can get complicated here, making reporting and compliance checking a challenge.

Thankfully, PowerShell can handle complication quite nicely, and ConfigMgr is more than capable as both a delivery vehicle and a reporting mechanism, so the good news is – we can do this!

The following solution uses PowerShell to gather local administrator information and stamp it to the local registry. A Compliance item in SCCM is used as the delivery vehicle for the script and then RegKeyToMof is used to update the hardware inventory classes in SCCM to gather this information from the client’s registry into the SCCM database, where we can query and report on it.

Gathering Local Administrator Information with PowerShell

To start with, let’s have a look at some of the PowerShell code and the information we will gather with it.

First, we need to identify who is the primary user of the system. Since the script is running locally on the client computer, we will not use User Device Affinity. UDA information is stored in WMI in the CCM_UserAffinity class, in the ROOT\CCM\Policy\Machine\ActualConfig namespace, but this class can contain multiple instances, so you can’t always determine the primary user that way.

A better way is to use the SMS_SystemConsoleUsage class in the ROOT\cimv2\sms namespace and query the TopConsoleUser property. This will give you the user account who has had the most interactive logons on the system and for the most part will indicate who the primary user is.


$TopConsoleUser = Get-WmiObject -Namespace ROOT\cimv2\sms -Class SMS_SystemConsoleUsage -Property TopConsoleUser -ErrorAction Stop | Select -ExpandProperty TopConsoleUser

Next, to find if the user is a local admin or not, we will not simply query the local administrator group membership and check if the user is in there. Instead we will create a WindowsIdentity object in .Net and run a method called HasClaim(). I describe this more in a previous blog, but using this method we can determine if the user has local administrator privilege whether through direct membership or through a nested group.


$ID = New-Object Security.Principal.WindowsIdentity -ArgumentList $TopConsoleUser
$IsLocalAdmin = $ID.HasClaim('http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid','S-1-5-32-544')
$ID.Dispose()

The SID for the local admin group (S-1-5-32-544) is used as this is the same across all systems. This will only work for domain accounts as it uses Kerberos to create the identity.

Now we will also get the local administrator group membership using the following code (more .Net stuff), and filter just the SamAccountNames.


Add-Type -AssemblyName System.DirectoryServices.AccountManagement -ErrorAction Stop
$ContextType = [System.DirectoryServices.AccountManagement.ContextType]::Machine
$PrincipalContext = New-Object -TypeName System.DirectoryServices.AccountManagement.PrincipalContext -ArgumentList $ContextType, $($env:COMPUTERNAME) -ErrorAction Stop
$IdentityType = [System.DirectoryServices.AccountManagement.IdentityType]::Name
$GroupPrincipal = [System.DirectoryServices.AccountManagement.GroupPrincipal]::FindByIdentity($PrincipalContext, $IdentityType, "Administrators")
$LocalAdminMembers = $GroupPrincipal.Members | select -ExpandProperty SamAccountName | Sort-Object
$PrincipalContext.Dispose()
$GroupPrincipal.Dispose()

Next, if the user is a local admin through nested group membership, I call a custom function which checks which nested groups within the local admin group the user account belongs to. Let’s say that Group B is a member of Group A, which is a member of the local administrators group. We will check the membership of both Groups B and A to see which ones the user is a member of, and therefore which group/s is effectively giving the user administrator privilege. We do this by querying the $GroupPrincipal object created in the previous code. The custom function will query nested membership up to 3 levels deep.
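
As an illustration only, here’s a simplified sketch of that check (not the exact function from the download) using the $GroupPrincipal object created above, assuming $TopConsoleUser is in DOMAIN\username format:

# Simplified sketch: find which groups nested in the local Administrators group contain the user
$UserName = $TopConsoleUser.Split('\')[-1]
$NestedGroups = @()
foreach ($Member in $GroupPrincipal.Members)
    {
        # Only interested in member objects that are themselves groups
        If ($Member -is [System.DirectoryServices.AccountManagement.GroupPrincipal])
            {
                # GetMembers($true) expands the group's membership recursively
                If ($Member.GetMembers($true) | Where-Object { $_.SamAccountName -eq $UserName })
                    {
                        $NestedGroups += $Member.SamAccountName
                    }
            }
    }
$NestedGroups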

Now I will query the Install Date for the operating system, since in some cases where a machine is newly built, the TopConsoleUser may not yet be the primary user of the system but rather the admin who built the machine, for example. This date helps to identify any such systems.


[datetime]$InstallDate = [System.Management.ManagementDateTimeConverter]::ToDateTime($(Get-WmiObject win32_OperatingSystem -Property InstallDate -ErrorAction Stop | Select -ExpandProperty InstallDate)) | Get-date -Format 'yyyy-MM-dd HH:mm:ss'

Now we gather all this information into a datatable, and call another custom function to write it to the local registry. I use the following registry key, but you can change this in the script if you wish:

HKLM:SOFTWARE\IT_Local\LocalAdminInfo

The script will create the key if it doesn’t exist.
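
For illustration, here’s a rough sketch of that registry-stamping step (the value names are assumptions based on the inventory view shown later – the actual function is in the full script):

# Rough sketch: stamp the gathered values to the registry, creating the key first if needed
$RegKey = "HKLM:\SOFTWARE\IT_Local\LocalAdminInfo"
If (!(Test-Path $RegKey))
    {
        New-Item -Path $RegKey -Force | Out-Null
    }
Set-ItemProperty -Path $RegKey -Name TopConsoleUser -Value $TopConsoleUser -Force
Set-ItemProperty -Path $RegKey -Name TopConsoleUserIsAdmin -Value $IsLocalAdmin.ToString() -Force
Set-ItemProperty -Path $RegKey -Name LastUpdated -Value (Get-Date -Format 'yyyy-MM-dd HH:mm:ss') -Force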

Here’s an example of the kind of data that will be gathered:

localadmin

You can see in this example, that my user account is a local administrator both by direct membership and through nested groups. The actual groups that grant this right are listed in the NestedGroupMembership property.

Create a Compliance Item

Now lets go ahead and create a compliance item in SCCM to run this script.

In the Console, navigate Assets and Compliance > Compliance Settings > Configuration Items.

Click Create Configuration Item

config1

Click Next and select which operating systems you will target. Remember that Windows XP and Server 2003 may not have PowerShell installed.

Click Next again, then click New to create a new setting.

Choose Script as the setting type, and String as the data type.

config2

Now we need to add the scripts.  You can download both the discovery and remediation scripts from my Github repo here:

https://github.com/SMSAgentSoftware/ConfigMgr/tree/master/PowerShell%20Scripts/Compliance%20Settings/LocalAdministratorInfo

Click Add Script and paste or open the relevant script for each. Make sure Windows Powershell is selected as the script language.

The discovery script simply checks whether the script has been run in the last 15 minutes, and if not returns non-compliant.  This allows the script to run according to the schedule you define for it, e.g. once a day or once a week, to keep the information up-to-date in the registry.
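
Something along these lines (a hedged sketch – the actual discovery script is in the GitHub repo above, and the LastUpdated value name is assumed from the inventory view shown later):

# Hedged sketch of the discovery logic: compliant only if the data was refreshed in the last 15 minutes
$LastUpdated = Get-ItemProperty -Path "HKLM:\SOFTWARE\IT_Local\LocalAdminInfo" -Name LastUpdated -ErrorAction SilentlyContinue | Select -ExpandProperty LastUpdated
If ($LastUpdated -and ([datetime]$LastUpdated -gt (Get-Date).AddMinutes(-15)))
    {
        Write-Host "Compliant"
    }
Else
    {
        Write-Host "Not compliant"
    }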

The remediation script does the hard work 🙂

Click OK to close the Create Setting window.

Click Next, then click New to create a new Compliance Rule as follows:

config3

Click OK to close, then Next, Next and Close to finish.

Create a Configuration Baseline

Click on Configuration Baselines and Create Configuration Baseline to create a new baseline.

Give it a name, click Add and add the Configuration Item you just created.

config4

Click OK to close.

Deploy the Baseline

Right-click the baseline and choose Deploy. Make sure to remediate noncompliance and select the collection you wish to target.

config5

Update SCCM Hardware Inventory

Creating the MOF Files

For this part you will need the excellent RegKeyToMOF utility, which you can download from here:

https://gallery.technet.microsoft.com/RegKeyToMof-28e84c28

You will also need to do this on a machine that has either run the remediation script to create the registry keys, or has run the configuration baseline.

Open RegKeyToMOF and browse to the registry key:

HKLM:SOFTWARE\IT_Local\LocalAdminInfo

You can deselect the ‘Enable 64bits …’ option as the registry key is not located in the WOW6432Node.

Click Save MOF to save the required files.

regkey

Copy the SMSDEF.mof and the CM12Import.mof to your SCCM site server.

Update Client Settings

In the SCCM console, navigate Administration > Site Configuration > Client Settings. Open your default client settings and go to the Hardware Inventory page.

Click Set Classes…, then Import…

Browse to the CM12Import.mof and click Import.

import

Close the Client Settings windows.

Update Configuration.mof

Now open your configuration.mof file at <ConfigMgr Installation Directory>\inboxes\clifiles.src\hinv.

In the section at the bottom for adding extensions, which starts like this…

//========================
// Added extensions start
//========================

…paste the contents of the SMSDEF.mof file.  Save and close the file.

Reporting

Now that you’ve deployed the configuration item and updated the SCCM hardware inventory, a new view called dbo.v_GS_LocalAdminInfo0 has been added to the SCCM database. Note that initially there will be no data here until your clients have updated their policies, run the configuration baseline, and run the hardware inventory cycle.

You can query using the Queries node in the SCCM console…

query

…or create yourself a custom SCCM report, create an Excel report with a SQL data connection, query the SCCM database with PowerShell – whatever method you need or prefer.
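
For example, here’s a quick sketch of the PowerShell route (assuming the SqlServer module is installed and you have db_datareader access to the site database; the server and database names are placeholders):

# Hedged sketch: query the new inventory view directly with PowerShell
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance "mysqlserver\inst_sccm" -Database "CM_ABC" -Query @"
SELECT ComputerName0, TopConsoleUser0, TopConsoleUserIsAdmin0, LocalAdminGroupMembership0
FROM dbo.v_GS_LocalAdminInfo0
WHERE ComputerName0 IS NOT NULL
"@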

Here is a sample SQL query that will query the view and add some client health data and the chassis type to help distinguish between desktop, laptops, servers etc.


Select
  ComputerName0 as 'ComputerName',
  Case When enc.ChassisTypes0 = 1 then 'Other'
    when enc.ChassisTypes0 = 2 then 'Unknown'
    when enc.ChassisTypes0 = 3 then 'Desktop'
    when enc.ChassisTypes0 = 4 then 'Low Profile Desktop'
    when enc.ChassisTypes0 = 5 then 'Pizza Box'
    when enc.ChassisTypes0 = 6 then 'Mini Tower'
    when enc.ChassisTypes0 = 7 then 'Tower'
    when enc.ChassisTypes0 = 8 then 'Portable'
    when enc.ChassisTypes0 = 9 then 'Laptop'
    when enc.ChassisTypes0 = 10 then 'Notebook'
    when enc.ChassisTypes0 = 11 then 'Hand Held'
    when enc.ChassisTypes0 = 12 then 'Docking Station'
    when enc.ChassisTypes0 = 13 then 'All in One'
    when enc.ChassisTypes0 = 14 then 'Sub Notebook'
    when enc.ChassisTypes0 = 15 then 'Space-Saving'
    when enc.ChassisTypes0 = 16 then 'Lunch Box'
    when enc.ChassisTypes0 = 17 then 'Main System Chassis'
    when enc.ChassisTypes0 = 18 then 'Expansion Chassis'
    when enc.ChassisTypes0 = 19 then 'SubChassis'
    when enc.ChassisTypes0 = 20 then 'Bus Expansion Chassis'
    when enc.ChassisTypes0 = 21 then 'Peripheral Chassis'
    when enc.ChassisTypes0 = 22 then 'Storage Chassis'
    when enc.ChassisTypes0 = 23 then 'Rack Mount Chassis'
    when enc.ChassisTypes0 = 24 then 'Sealed-Case PC'
    else 'Unknown'
  End as 'Chassis Type',
  TopConsoleUser0 as 'Primary User',
  TopConsoleUserIsAdmin0 as 'Primary User is Admin?',
  AdminGroupMembershipType0 as 'Primary User Local Admin Group Membership Type',
  LocalAdminGroupMembership0 as 'Local Admin Group Membership',
  NestedGroupMembership0 as 'Primary User Local Admin Nested Group Membership',
  OSAgeInDays0 as 'OS Age (days)',
  OSInstallDate0 as 'OS Installation Date',
  LastUpdated0 as 'Last Updated Date',
  la.TimeStamp as 'HW Inventory Date',
  ch.ClientStateDescription,
  ch.LastActiveTime
from dbo.v_GS_LocalAdminInfo0 la
join v_R_System sys on la.ComputerName0 = sys.Name0
left join v_GS_SYSTEM_ENCLOSURE enc on sys.ResourceID = enc.ResourceID
left join v_CH_ClientSummary ch on sys.ResourceID = ch.ResourceID
where ComputerName0 is not null
  and enc.ChassisTypes0 <> 12

 

New Free Tool: ConfigMgr Remote Compliance

Remote Compliance

Today I released a new free tool for ConfigMgr administrators and support staff.

ConfigMgr Remote Compliance can be used to view, evaluate and report on System Center Configuration Manager Compliance Baselines on a remote computer. It provides similar functionality to the Configurations tab of the Configuration Manager Control Panel, but for remote computers. It is a useful troubleshooting tool for remotely viewing client compliance, evaluating baselines, viewing the evaluation report or opening DCM log files from the client, without needing to access the client computer directly.

ConfigMgr Remote Compliance can be downloaded from here.

Source code for this application is available on GitHub and code contributions are welcome.

Deploying Custom Microsoft Office Templates with System Center Configuration Manager

Some time ago I wrote a blog describing a way to deploy custom templates for Microsoft Office applications using SCCM Compliance Settings. Since then, I have re-written the solution into something much more manageable, as the previous incarnation was not very clearly defined in how to update templates and involved considerable admin overhead. This updated solution is much improved and better manages the lifecycle of your custom templates, including updating, adding and retiring templates. Much of this process is now automated using PowerShell, and I have removed the need to manually specify all the template file names in the scripts, so it is also much easier to set up and deploy.

It is relatively detailed, so instead of writing a blog I put this into a free PDF guide which you can download from here:

Deploying Custom Microsoft Office Templates with System Center Configuration Manager

pdfimg

Export / Backup Compliance Setting Scripts with PowerShell

In my SCCM environment I have a number of Compliance Settings that use custom scripts for discovery and remediation, and recently it dawned on me that a lot of time has been spent on these and it would be good to create a backup of those scripts. It would also be useful to be able to export the scripts so they could be edited and tested before being updated in the Configuration Item itself. So I put together this PowerShell script which does just that!

The Configuration Item scripts are stored in an XML definition, and this can be read from the SCCM database directly and parsed with PowerShell, so that’s what this script does. It will load all the Configuration Items into a datatable from a SQL query, then go through each one looking for any settings that have scripts defined. These scripts will be exported in their native file format.

You could then edit those scripts, or add the export location to your file/folder backup for an extra level of protection for your hard work!

Here you can see an example of the output for my “Java Settings” Configuration item. A subdirectory is created for the current package version, then subdirectories under that for each Configuration setting, then the discovery and remediation scripts for that setting.

cis

Exported Configuration Item Scripts

Note that the script will only process Configuration Items with a CIType_ID of 3, which equates to the Operating System type you will see in the SCCM console for the Configuration Item – the type that may use a script as its discovery source.

Export-CMConfigurationItemScripts.ps1

<#
.Synopsis
Exports all scripts (discovery and remediation) used in all SCCM Compliance Setting Configuration Items
.DESCRIPTION
This script connects to the SCCM database to retrieve all Compliance Setting Configuration Items. It then processes each item looking for
discovery and remediation scripts for the current (latest) version. It will export any script found into a directory structure.
.NOTES
Requirements – 'db_datareader' permission to the SCCM SQL database with the account running this script.
Parameters – set the parameters below as required
#>
################
## PARAMETERS ##
################
# Root directory to export the scripts to
$RootDirectory = "C:\temp"
# Name of the subdirectory to create
$SubDirectory = "Compliance_Settings_CI_Scripts"
# SCCM SQL Server (and instance where applicable)
$SQLServer = 'mysqlserver\inst_sccm'
# SCCM Database name
$Database = 'CM_ABC'
##################
## SCRIPT START ##
##################
# Create the subdirectory if doesn't exist
If (!(Test-Path "$RootDirectory\$SubDirectory"))
{
New-Item -Path "$RootDirectory" -Name "$SubDirectory" -ItemType container | Out-Null
}
# Define the SQL query
$Query = "
Select * from dbo.v_ConfigurationItems
where CIType_ID = 3
and IsLatest = 'true'"
# Run the SQL query
$connectionString = "Server=$SQLServer;Database=$Database;Integrated Security=SSPI"
$connection = New-Object -TypeName System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $Query
$reader = $command.ExecuteReader()
$ComplianceItems = New-Object -TypeName 'System.Data.DataTable'
$ComplianceItems.Load($reader)
$connection.Close()
# Process each compliance item returned
$ComplianceItems | foreach {
# Set some variables
$PackageVersion = "v $($_.SDMPackageVersion)"
[xml]$Digest = $_.SDMPackageDigest
$CIName = $Digest.ChildNodes.OperatingSystem.Annotation.DisplayName.Text
# Create subdirectory structure if doesn't exist: configuration item name > current package version
If (!(Test-Path "$RootDirectory\$SubDirectory\$CIName"))
{
New-Item -Path "$RootDirectory\$SubDirectory" -Name "$CIName" -ItemType container | Out-Null
}
If (!(Test-Path "$RootDirectory\$SubDirectory\$CIName\$PackageVersion"))
{
New-Item -Path "$RootDirectory\$SubDirectory\$CIName" -Name "$PackageVersion" -ItemType container | Out-Null
}
# Put each compliance item setting in XML format into an arraylist for quick processing
$Settings = New-Object System.Collections.ArrayList
$Digest.DesiredConfigurationDigest.OperatingSystem.Settings.RootComplexSetting.SimpleSetting | foreach {
[void]$Settings.Add([xml]$_.OuterXml)
}
# Process each compliance item setting
$Settings | foreach {
# Only process if this setting has a script source
If ($_.SimpleSetting.ScriptDiscoverySource)
{
# Set some variables
$SettingName = $_.SimpleSetting.Annotation.DisplayName.Text
$DiscoveryScriptType = $_.SimpleSetting.ScriptDiscoverySource.DiscoveryScriptBody.ScriptType
$DiscoveryScript = $_.SimpleSetting.ScriptDiscoverySource.DiscoveryScriptBody.'#text'
$RemediationScriptType = $_.SimpleSetting.ScriptDiscoverySource.RemediationScriptBody.ScriptType
$RemediationScript = $_.SimpleSetting.ScriptDiscoverySource.RemediationScriptBody.'#text'
# Create the subdirectory for this setting if doesn't exist
If (!(Test-Path "$RootDirectory\$SubDirectory\$CIName\$PackageVersion\$SettingName"))
{
New-Item "$RootDirectory\$SubDirectory\$CIName\$PackageVersion" Name $SettingName ItemType container Force | Out-Null
}
# If a discovery script is found
If ($DiscoveryScript)
{
# Set the file extension based on the script type
Switch ($DiscoveryScriptType)
{
Powershell { $Extension = "ps1" }
JScript { $Extension = "js" }
VBScript { $Extension = "vbs" }
}
# Export the script to a file
New-Item -Path "$RootDirectory\$SubDirectory\$CIName\$PackageVersion\$SettingName" -Name "Discovery.$Extension" -ItemType file -Value $DiscoveryScript -Force | Out-Null
}
# If a remediation script is found
If ($RemediationScript)
{
# Set the file extension based on the script type
Switch ($RemediationScriptType)
{
Powershell { $Extension = "ps1" }
JScript { $Extension = "js" }
VBScript { $Extension = "vbs" }
}
# Export the script to a file
New-Item -Path "$RootDirectory\$SubDirectory\$CIName\$PackageVersion\$SettingName" -Name "Remediation.$Extension" -ItemType file -Value $RemediationScript -Force | Out-Null
}
}
}
}
<# For reference: CIType_IDs
1 Software Updates
2 Baseline
3 OS
4 General
5 Application
6 Driver
7 Uninterpreted
8 Software Updates Bundle
9 Update List
10 Application Model
11 Global Settings
13 Global Expression
14 Supported Platform
21 Deployment Type
24 Intend Install Policy
25 DeploymentTechnology
26 HostingTechnology
27 InstallerTechnology
28 AbstractConfigurationItem
60 Virtual Environment
#>

Disabling Java Content in all Browsers with ConfigMgr Compliance Settings

Some organisations like to disable Java applets from running in a web browser for tighter security.  This can be done with group policy, but in our organisation I already manage Java settings across the enterprise with Configuration Manager’s Compliance Settings (as documented in my solution guide for Java), so I decided to use a Compliance Setting for this also.

The best way to disable Java in the browser is simply to deselect the “Enable Java content in the browser” setting in the Java Control Panel:

Capture

Doing that will change a fair number of registry keys, too many to manage or set individually. Thankfully, you can achieve the same result using the following command from your Java installation files:

"C:\Program Files\Java\<java version>\bin\ssvagent.exe" -disablewebjava

When Java in the browser is disabled, the following key is set in the registry, which we can use as a way of programmatically detecting whether Java in the browser has been enabled or not:

Key: HKLM:\SOFTWARE\Oracle\JavaDeploy
Name: WebDeployJava
Value: disabled

Now I create a new setting in my “Java Settings” configuration item, which I’ll call “Java WebDeploy”:

Capture

In this setting I use two PowerShell scripts, one for discovery, and one for remediation, which you can find below.  The discovery script will use the registry key above to determine whether Java has been disabled in the browser or not, and the remediation script will run the command that disables web Java for all installed Java versions.

Capture

For my compliance rule, I simply use the value “Compliant” which is output by the script:

Capture

Once this setting has been deployed in a baseline to your computers, Java will be disabled in web browsers for each machine that has Java installed in the default locations.  Should a user manually enable it from the Java control panel, the ConfigMgr client will disable it again according to the compliance evaluation schedule you have defined.

Discovery Script


$key = "HKLM:\SOFTWARE\Oracle\JavaDeploy"
if (Test-Path $key)
    {
        if ((Get-ItemProperty -Path $key -Name WebDeployJava -ErrorAction SilentlyContinue | Select -ExpandProperty WebDeployJava) -ne "disabled")
            {
                Write-Host "Not Compliant"
            }
        Else {write-host "Compliant"}
    }
else {write-host "Compliant"}

Remediation Script


$JavaInstallPaths = @()

if (${env:ProgramFiles(x86)})
    {
        $path = "${env:ProgramFiles(x86)}\Java"
        if (Test-Path $path)
            {
                $JavaInstall = Get-ChildItem -Path "${env:ProgramFiles(x86)}\Java" | select -ExpandProperty FullName
                $JavaInstallPaths += $JavaInstall
            }
    }

if ($env:ProgramFiles)
    {
        $path = "$env:ProgramFiles\Java"
        if (Test-Path $path)
            {
                $JavaInstall = Get-ChildItem -Path "$env:ProgramFiles\Java" | select -ExpandProperty FullName
                $JavaInstallPaths += $JavaInstall
            }
    }

$JavaInstallPaths

foreach ($JavaInstallPath in $JavaInstallPaths)
    {
        Start-Process -FilePath "$JavaInstallPath\bin\ssvagent.exe" -ArgumentList "-disablewebjava" -Wait -Verb Runas
    }

Create a Database of Error Codes and Descriptions for Windows and ConfigMgr

In a recent post, I described different ways to translate error codes for Windows and Configuration Manager into their friendly descriptions.  In this post, I will show you how to create a SQL database of known error codes and descriptions that you can join to in your SQL queries, to help simplify your troubleshooting, and I will also give some example queries you can use with Configuration Manager.

Windows and system error codes are standard and are published by Microsoft on MSDN, but there is no published resource of error codes for Configuration Manager 2012 onwards that I know of.  To have a database of all these codes is quite useful as the descriptions are not stored either in WMI or in the ConfigMgr database – only the error codes themselves are stored.  These codes are translated to their descriptions by the ConfigMgr console and the ConfigMgr SSRS reports, probably utilizing dll files.
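
For reference, a single code can be translated on demand with the DLL that ships with the console – a hedged sketch, assuming the console is installed locally (the path, class and method here are illustrative and may vary by console version):

# Hedged sketch: translate an error code to its description using the console's SrsResources.dll
# Adjust the path (and file name if needed) for your console installation
$dll = "C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\SrsResources.dll"
[System.Reflection.Assembly]::LoadFrom($dll) | Out-Null
[SrsResources.Localization]::GetErrorMessage(-2016410844,"en-US")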

I extracted a list of 11,839 error codes and descriptions using the SrsResource.dll, as described in the previous post, and exported them into a csv file.  Using the PowerShell function below, I converted each error code to give the hex and decimal codes for each.  In Configuration Manager, the log files and reports tend to use the hexadecimal value or the ‘signed integer’ decimal value for the error code; however, WMI stores the codes as ‘unsigned integers’ (always positive or zero), therefore I have included all three for easy referencing.


function Convert-Number {
[CmdletBinding()]
    param
        (
        [Parameter(Mandatory=$True)]
            $Number,
        [Parameter(Mandatory=$True,ParameterSetName='Binary')]
            [switch]$ToBinary,
        [Parameter(Mandatory=$True,ParameterSetName='Hex')]
            [switch]$ToHexadecimal,
        [Parameter(Mandatory=$True,ParameterSetName='Signed')]
            [switch]$ToSignedInteger,
        [Parameter(Mandatory=$True,ParameterSetName='Unsigned')]
            [switch]$ToUnSignedInteger
        )

$binary = [Convert]::ToString($Number,2)

if ($ToBinary)
    {
        $binary
    }

if ($ToHexadecimal)
    {
        $hex = "0x" + [Convert]::ToString($Number,16)
        $hex
    }

if ($ToSignedInteger)
    {
        $int32 = [Convert]::ToInt32($binary,2)
        $int32
    }
if ($ToUnSignedInteger)
    {
        $Uint64 = [Convert]::ToUInt64($binary,2)
        $Uint64
    }
}

Using this function, you can convert between binary, hexadecimal, signed and unsigned integers:
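
For example (using an arbitrary error code value purely as an illustration):

# Example: convert an unsigned error code value to its other forms
Convert-Number -Number 2278556452 -ToHexadecimal     # 0x87d00324
Convert-Number -Number 2278556452 -ToSignedInteger   # -2016410844
Convert-Number -Number 2278556452 -ToBinary          # 10000111110100000000001100100100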

Capture

To import those codes into a SQL database, first download the attached XLSX file which contains all the codes, and save it in CSV format.  The error descriptions have had any line breaks removed so that they will import correctly.

ErrorCodes_Final.xlsx

Now run the following T-SQL code against your SQL instance.  It will create a new database called ‘ErrorCodes’ and import all the entries from the CSV into a new table called ‘WindowsErrorCodes’.  Change the path to the CSV file as needed.

I’m using the same SQL instance as my Configuration Manager database so I can easily reference the two.


Create Database ErrorCodes
Go
USE ErrorCodes;
CREATE TABLE WindowsErrorCodes (
Hexadecimal VARCHAR(10) NOT NULL,
SignedInteger BIGINT NOT NULL,
UnSignedInteger BIGINT NOT NULL,
ErrorDescription NVARCHAR(MAX)
);

BULK
INSERT WindowsErrorCodes
FROM '\\<mycomputer>\C$\temp\ErrorCodes_Final.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO

Now let’s run a quick query to find a Configuration Manager error description:

Capture3

If I want to query for application deployment errors, similar to the PowerShell script in my last post, then I can use the following query, entering the AssignmentID of the application deployment, which you can find in the ConfigMgr Console via the additional columns.  I will join the app deployment errors by their error code to my new database to return the error descriptions for each.  Join the ErrorCode field from the ConfigMgr database views with the SignedInteger field from the error code database.


select  app.ApplicationName, ass.CollectionName,
sys.Name0 as 'Computer Name',
det.ResourceID, det.CIVersion, det.ErrorCode, det.Errortype,
err.Hexadecimal, err.ErrorDescription
from v_CIErrorDetails det
inner join V_R_System sys on det.ResourceID = sys.ResourceID
inner join v_CIAssignmentToCI ci on det.CI_ID = ci.CI_ID
inner join v_CIAssignment ass on ci.AssignmentID = ass.AssignmentID
inner join v_ApplicationAssignment app on ci.AssignmentID = app.AssignmentID
left join ErrorCodes.dbo.WindowsErrorCodes err on det.ErrorCode = err.SignedInteger
where ci.AssignmentID = 16777540
order by sys.Name0

Results:

capture2

Cool 🙂

I can also get summary data categorized by the error code, for that deployment, again using the AssignmentID:


Select
sum.CollectionName,
sum.Description,
err.DTCI,
err.StatusType,
err.EnforcementState,
err.ErrorCode,
code.Hexadecimal,
code.ErrorDescription,
err.Total
from vAppDeploymentErrorStatus err
inner join v_CIAssignment ass on err.AssignmentUniqueID = ass.Assignment_UniqueID
inner join vAppDTDeploymentSummary sum on err.DTCI = sum.DTCI
left join ErrorCodes.dbo.WindowsErrorCodes code on err.ErrorCode = code.SignedInteger
where err.assignmentID = 16777540
and sum.assignmentID = err.AssignmentID
and err.ErrorCode <> 0
order by Description, Total desc

Results:

capture5

Both of these queries together roughly equate to what you can see in the ConfigMgr Console > Deployments node:

capture4

Since we now don’t depend on the Console or the SSRS reports to translate the error descriptions for us, we can go ahead and more easily create custom reports or SQL queries or PowerShell scripts to report this information for us 🙂