Find the Full Windows Build Number with PowerShell

Much to my surprise I discovered that the full build number for a Windows OS is not stored in WMI in the usual Win32_OperatingSystem class.

In Windows 10 at least, the full build number is a useful piece of information: it includes the “UBR” (Update Build Revision), which is essentially the cumulative update patch level of the build.

Open Settings > System > About on a Windows 10 box and you’ll find the OS Build value, in my case 15063.183.


If I run the usual WMI query to get the build number I just get 15063:

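The usual query being something like:


(Get-WmiObject -Class Win32_OperatingSystem).BuildNumber
# 15063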

Same if I query the environment:

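For example:


[System.Environment]::OSVersion.Version.ToString()
# 10.0.15063.0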

To find the full number I have to query the registry in the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion branch.
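The UBR is stored there as a separate value, so locally the full build can be assembled with something like:


$CV = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion'
"$($CV.CurrentBuild).$($CV.UBR)"
# e.g. 15063.183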

So I put together a PowerShell script that can be used to get the Windows version for a local or remote computer (or group of computers) which includes the Edition, Version and full OS Build values.

Query the local system like this:


Get-WindowsVersion

Or query remote computers:


Get-WindowsVersion -ComputerName PC001


Get-WindowsVersion -ComputerName @("PC001","PC002","SRV001","SRV002")

Result:


The script
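The full script is published on the TechNet Gallery and isn’t reproduced here, but a minimal sketch of the approach, reading the CurrentVersion key via the remote registry API (parameter names are illustrative, and remote queries assume the Remote Registry service is reachable), looks something like this:


function Get-WindowsVersion {
    param([string[]]$ComputerName = $env:COMPUTERNAME)
    foreach ($Computer in $ComputerName) {
        # Open the (remote) registry and read the CurrentVersion key
        $Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $Computer)
        $Key = $Reg.OpenSubKey('SOFTWARE\Microsoft\Windows NT\CurrentVersion')
        [PSCustomObject]@{
            ComputerName = $Computer
            Edition      = $Key.GetValue('ProductName')
            Version      = $Key.GetValue('ReleaseId')
            OSBuild      = "$($Key.GetValue('CurrentBuild')).$($Key.GetValue('UBR'))"
        }
        $Reg.Close()
    }
}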

Automatically Set SQL MaxServerMemory on Cluster Failover with PowerShell

On a two-node Windows Failover Cluster, I have two SQL instances installed. Each instance runs on its own node in the cluster so that it can make maximum use of the available memory on that server. When a failover occurs, however, the maximum server memory setting must be reduced for both instances so that they can share the available memory on the one server. Rather than do that manually, I decided to automate the process using PowerShell and the Windows Task Scheduler, and here’s how.

Note: the script will work for two SQL instances in a cluster. For additional instances, the script will need to be updated accordingly.

  1. Save the PowerShell script (download from Technet Gallery) to each server in the cluster.
  2. Update the PowerShell script setting the required variables in the parameters section, such as the log file location, the SQL instance names, the SQL service names, the path to the SQL SMO dll, the maximum server memory limit you want to set, and the timeout period.
  3. Create a scheduled task on each server, running as an account that has the appropriate permissions on each instance. Add 2 triggers – one for each SQL instance – and use the event trigger. You can use event ID 101, for example, to identify when a SQL instance becomes active on that node. As the source, use the “SQLAgent$<instancename>”.


As the task action, use Start a program:

  • Program: Powershell.exe
  • Arguments: -ExecutionPolicy Bypass -WindowStyle Hidden -File “<PathToScript>\Set-SQLClusterMaximumMemory.ps1”

When a SQL instance fails over to the other node, the script will be triggered and will set the maximum server memory limit for both instances on that node.
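(For context, the core of the script is just the SMO call that changes the max server memory value on each instance. A stripped-down sketch, with placeholder instance names, DLL path and memory value, looks roughly like this:)


# Load SMO: the DLL path below is a placeholder and varies by SQL version
Add-Type -Path 'C:\Program Files\Microsoft SQL Server\130\SDK\Assemblies\Microsoft.SqlServer.Smo.dll'

foreach ($Instance in "$env:COMPUTERNAME\INST01", "$env:COMPUTERNAME\INST02")
{
    $Server = New-Object Microsoft.SqlServer.Management.Smo.Server $Instance
    # Reduce max server memory (in MB) so both instances can share the node
    $Server.Configuration.MaxServerMemory.ConfigValue = 16384
    $Server.Configuration.Alter()
}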


The script will also log the process:


Automation. Gotta love it 🙂

Finding the ‘LastLogon’ Date from all Domain Controllers with PowerShell

In an Active Directory environment, probably the most reliable way to query the last logon time of a computer is to use the Last-Logon attribute.  The Last-Logon-Timestamp attribute could be used, but it is unlikely to be up to date because of replication lag.  If you are using PowerShell, the LastLogonDate attribute can also be used; however, this is also a replicated attribute and suffers from the same delay and potential inaccuracy.

The Last-Logon attribute is not replicated, however; it is only stored on the DC that the computer authenticated against.  If you have multiple domain controllers, you will get different values for this attribute depending on which DC the computer has authenticated with and when.

To find the Last-Logon date from the DC that the computer has most recently authenticated with, you need to query all domain controllers for this attribute, then select the most recent.
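Conceptually, and ignoring the parallelism for a moment, that boils down to something like the following (assuming the ActiveDirectory module and an illustrative computer name):


Import-Module ActiveDirectory
Get-ADDomainController -Filter * | ForEach-Object {
    $Comp = Get-ADComputer -Identity 'PC001' -Server $_.HostName -Properties LastLogon
    [PSCustomObject]@{
        'Domain Controller' = $_.HostName
        'Last Logon'        = [datetime]::FromFileTime($Comp.LastLogon)
    }
} | Sort-Object 'Last Logon' -Descending | Select-Object -First 1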

Following is a PowerShell script I wrote that will read a list of domain controllers from an Active Directory OU, query each one, then return the most recent Last-Logon value.  It uses parallel processing to return the result more quickly than processing each DC in turn, which is useful in a multi-DC environment.

To use the script, simply pass the computer name, and optionally the AD OU containing your domain controllers, to the function.  You can hard-code the ‘DomainControllersOU’ parameter in the script if you prefer, so you don’t need to pass it each time.  You need the Active Directory module installed to use this.

Example


Get-ADLastLogon -ComputerName PC001 -DomainControllersOU "OU=Domain Controllers,DC=contoso,DC=com"


Get-ADLastLogon


function Get-ADLastLogon {

    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$false,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [string]$DomainControllersOU = "OU=Domain Controllers,DC=contoso,DC=com",

        [string]
        $ComputerName
    )

    # Multithreading function
    function Invoke-InParallel {
        [CmdletBinding()]
        param(
            [parameter(Mandatory = $True,ValueFromPipeline=$true,Position = 0)]
            $InputObject,
            [parameter(Mandatory = $True)]
            [ScriptBlock]$Scriptblock,
            [string]$ComputerName,
            [parameter()]
            $ThrottleLimit = 32,
            [parameter()]
            [switch]$ShowProgress
        )

        Begin
        {
            # Create runspacepool, add code and parameters and invoke Powershell
                $SessionState = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
                $script:RunspacePool = [runspacefactory]::CreateRunspacePool(1,$ThrottleLimit,$SessionState,$host)
                $RunspacePool.Open()

            # Function to start a runspace job
            function Start-RSJob
            {
                param(
                    [parameter(Mandatory = $True,Position = 0)]
                    [ScriptBlock]$Code,
                    [parameter()]
                    $Arguments
                )
                # Wait for a runspace to become available in the pool
                while ($RunspacePool.GetAvailableRunspaces() -lt 1)
                    {
                        Start-Sleep -Milliseconds 100
                    }

                $PowerShell = [powershell]::Create()
                $PowerShell.runspacepool = $RunspacePool
                [void]$PowerShell.AddScript($Code)
                foreach ($Argument in $Arguments)
                {
                    [void]$PowerShell.AddArgument($Argument)
                }
                $job = $PowerShell.BeginInvoke()

                # Add the job and PS instance to the arraylist
                $temp = '' | Select-Object -Property PowerShell, Job
                $temp.PowerShell = $PowerShell
                $temp.Job = $job
                [void]$Runspaces.Add($temp)  

            }

        # Start a 'timer'
        $Start = Get-Date

        # Define an arraylist to add the runspaces to
        $script:Runspaces = New-Object -TypeName System.Collections.ArrayList
        }

        Process
        {
            # Start an RS job for each computer
            $InputObject | ForEach-Object -Process {
                Start-RSJob -Code $Scriptblock -Arguments $_, $ComputerName
            }
        }

        End
        {
            # Wait for each script to complete
            foreach ($item in $Runspaces)
            {
                # Poll until the asynchronous job has completed
                while (-not $item.Job.IsCompleted)
                {
                    Start-Sleep -Milliseconds 100
                }
            }

            # Grab the output from each script, and dispose the runspaces
            $return = $Runspaces | ForEach-Object -Process {
                $_.powershell.EndInvoke($_.Job)
                $_.PowerShell.Dispose()
            }
            $Runspaces.clear()
            [void]$RunspacePool.Close()
            [void]$RunspacePool.Dispose()

            # Stop the 'timer'
            $End = Get-Date
            $TimeTaken = [math]::Round(($End - $Start).TotalSeconds,2)

            # Return the results
            $return
        }
    }

    # Get list of domain controllers from OU
    try {
    Import-Module ActiveDirectory | out-null
    $DomainControllers = Get-ADComputer -Filter * -SearchBase $DomainControllersOU -Properties Name -ErrorAction Stop | Select -ExpandProperty Name | Sort
    }
    catch { Write-Warning "Could not retrieve domain controllers from '$DomainControllersOU': $_" }

    # Define Code to run in each parallel runspace
    $Code = {
        param($DC,$ComputerName)
        Import-Module ActiveDirectory | out-null
        $Date = [datetime]::FromFileTime((Get-ADComputer -Identity $ComputerName -Server $DC -Properties LastLogon | select -ExpandProperty LastLogon))
        $Result = '' | Select 'Domain Controller','Last Logon'
        $Result.'Domain Controller' = $DC
        $Result.'Last Logon' = $Date
        Return $Result
    }

    # Run code in parallel
    $Result = Invoke-InParallel -InputObject $DomainControllers -Scriptblock $Code -ComputerName $ComputerName -ThrottleLimit 64

    # Return most recent logon date
    return $Result | sort 'Last Logon' -Descending | select -First 1
}

 

Translating Error Codes for Windows and Configuration Manager

As a Windows and Configuration Manager administrator, I often come across error codes that need translating into their more friendly descriptions.  In Configuration Manager, sometimes these codes are translated for you in the log files, reports and the ConfigMgr console, but sometimes they are not.  Sometimes they will be in decimal format, and sometimes hexadecimal.  For Windows error codes, there are a number of methods to return the friendly descriptions, for example the “net helpmsg” command:
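A typical call looks like this:


net helpmsg 5
# Access is denied.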


But it can only handle decimal codes:


In PowerShell, there is the .NET class System.ComponentModel.Win32Exception, which can handle both decimal and hex:
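For instance (the codes here are just common examples):


(New-Object System.ComponentModel.Win32Exception 5).Message      # decimal 5: "Access is denied"
(New-Object System.ComponentModel.Win32Exception 0x643).Message  # hex 0x643 (1603): "Fatal error during installation"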


Common Windows error codes are also documented in MSDN:

https://msdn.microsoft.com/en-us/library/windows/desktop/ms681381(v=vs.85).aspx
https://msdn.microsoft.com/en-us/library/cc231199.aspx

However, for error codes that are specific to Configuration Manager, you can use the handy CMTRACE utility in the Configuration Manager toolkit, which has an error lookup.  This returns error descriptions for both Windows and Configuration Manager, supports decimal and hex, and supports error codes from more sources too, including WMI and Windows Update Agent:




But if you are scripting and want to translate an error code, how can you do that?  Well, there is a handy little DLL called SrsResources.dll that comes with the Configuration Manager console installation and can be found here: %ProgramFiles(x86)%\Microsoft Configuration Manager\AdminConsole\bin\SrsResources.dll.  Using this DLL, we can translate error codes for Windows, Configuration Manager, WMI etc., and even translate status message IDs.  It will call other DLLs when it needs to in order to find the error string.

Using PowerShell, we can create the following simple function which will use the SrsResources.dll to translate a decimal or hex error code for us:


function Get-CMErrorMessage {
[CmdletBinding()]
    param
        (
        [Parameter(Mandatory=$True,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
            [int64]$ErrorCode
        )

[void][System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\SrsResources.dll")
[SrsResources.Localization]::GetErrorMessage($ErrorCode,"en-US")
}
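Usage is then as simple as passing an error code to the function; for example (the codes shown are common ones, descriptions paraphrased):


Get-CMErrorMessage -ErrorCode 5            # Windows: access denied
Get-CMErrorMessage -ErrorCode 2278556452   # ConfigMgr 0x87D00324: application not detected after installation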


To take it further, we can export a list of error codes, for example here we will use the same function to enumerate all decimal codes between 0 and 50, and also output the equivalent hex codes:


$errorcodes = @()
$i = -1
Do
    {
        $i ++
        $description = Get-CMErrorMessage -ErrorCode $i
        if ($description -notlike "Unknown Error*")
            {
                $hex = '{0:x}' -f $i
                $errorcode = New-Object psobject
                Add-Member -InputObject $errorcode -MemberType NoteProperty -Name DecimalErrorCode -Value $i
                Add-Member -InputObject $errorcode -MemberType NoteProperty -Name HexErrorCode -Value ("0x" + $hex)
                Add-Member -InputObject $errorcode -MemberType NoteProperty -Name ErrorDescription -Value $description
                $errorcodes += $errorcode
            }

    }
Until ($i -eq 50)
$errorcodes | ft -AutoSize

Pretty cool 🙂  Using SrsResources.dll creates a log file in your %TEMP% directory called SCCMReporting.log, and this log quickly increases in size, so if you use it a lot, check the size of this log file from time to time.  The logging can be useful for identifying which DLL was used to find the error string.

To convert between decimal and hexadecimal and vice-versa, we can use this simple function. With PowerShell, you can convert to decimal natively in the console just by entering the hexadecimal code,  but using this function allows us to convert both ways, and is more useful for scripts.


function Convert-ErrorCode {
[CmdletBinding()]
    param
        (
        [Parameter(Mandatory=$True,ParameterSetName='Decimal')]
            [int64]$DecimalErrorCode,
        [Parameter(Mandatory=$True,ParameterSetName='Hex')]
            $HexErrorCode
        )
if ($DecimalErrorCode)
    {
        $hex = '{0:x}' -f $DecimalErrorCode
        $hex = "0x" + $hex
        $hex
    }

if ($HexErrorCode)
    {
        $DecErrorCode = $HexErrorCode.ToString()
        $DecErrorCode
    }
}
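For example, converting the Windows Installer error 1603 in both directions:


Convert-ErrorCode -DecimalErrorCode 1603   # returns 0x643
Convert-ErrorCode -HexErrorCode 0x643      # returns 1603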

Finally, wrapping all this together, here is a script that uses both functions we created earlier and returns all the machines that are in an error state for a ConfigMgr application deployment, with the error code and description.  Because we filter on the current application revision, this actually returns more accurate results than the ConfigMgr console > Deployments node, as that data includes previous application revisions where no data is available for the current revision, which produces misleading results.

First, we query WMI on the ConfigMgr site server for the list of applications and choose the one we want:


Then we query for the deployments and deployment types for that application, and choose the one we want.  The number of errors is returned, but as previously mentioned, this may not be completely accurate at this stage.

Then we return the results, translating the error codes into their descriptions.


Cool 🙂

Note that WMI stores the error codes as unsigned integers, but the ConfigMgr console displays errors as signed integers, so we do a conversion and include both in our results.
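(As an aside, if you want to see the relationship between the two representations outside of these functions, one way, shown purely for illustration, is to round-trip the value through its bytes:)


$unsigned = [uint32]2278556452                                              # as stored in WMI
$signed = [BitConverter]::ToInt32([BitConverter]::GetBytes($unsigned), 0)   # -2016410844, as the console displays it
'0x{0:X8}' -f $unsigned                                                     # 0x87D00324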

In the next blog, I describe how to create a SQL database of these error codes for easy referencing in SQL queries: Create a database of error codes and descriptions for Windows and Configmgr

Here’s the complete script:


<#

.SYNOPSIS
    Returns the error code and error descriptions for all computers in an error state for an application deployment

.DESCRIPTION
    This script asks you to choose a ConfigMgr application, then choose a deployment / deployment type for that application, then returns all the computers that are in an error state for that
    deployment, with the error code and error description.
    Must be run on a computer with the ConfigMgr console installed, and the path to the SrsResources.dll needs to be specified in the "Get-CMErrorMessage" function.  You may also
    need to change the localization in this function to your region, eg "en-US".

.PARAMETER SiteServer
    The name of the ConfigMgr Site server

.PARAMETER SiteCode
    The ConfigMgr Site Code

.NOTES
    Script name: Get-CMAppDeploymentErrors.ps1
    Author:      Trevor Jones
    Contact:     @trevor_smsagent
    DateCreated: 2015-06-17
    Link:        https://smsagent.wordpress.com

#>

[CmdletBinding(SupportsShouldProcess=$True)]
    param
        (
        [Parameter(Mandatory=$False)]
            [string]$SiteServer="sccmserver-01",
        [Parameter(Mandatory=$False)]
            [string]$SiteCode="ABC"
        )

function Get-CMErrorMessage {
[CmdletBinding()]
    param
        (
        [Parameter(Mandatory=$True,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
            [int64]$ErrorCode
        )

[void][System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\SrsResources.dll")
[SrsResources.Localization]::GetErrorMessage($ErrorCode,"en-US")
}

function Convert-ErrorCode {
[CmdletBinding()]
    param
        (
        [Parameter(Mandatory=$True,ParameterSetName='Decimal')]
            [int64]$DecimalErrorCode,
        [Parameter(Mandatory=$True,ParameterSetName='Hex')]
            $HexErrorCode
        )
if ($DecimalErrorCode)
    {
        $hex = '{0:x}' -f $DecimalErrorCode
        $hex = "0x" + $hex
        $hex
    }

if ($HexErrorCode)
    {
        $DecErrorCode = $HexErrorCode.ToString()
        $DecErrorCode
    }
}

# Get Application
$App = Get-WmiObject -ComputerName $SiteServer -Namespace ROOT\sms\Site_$SiteCode -Class SMS_ApplicationLatest |
    Sort LocalizedDisplayName |
    Select LocalizedDisplayName,SDMPackageVersion,ModelName |
    Out-GridView -Title "Choose an Application" -OutputMode Single

# Get Deployment Types and Deployments for Application
$DT = Get-WmiObject -ComputerName $SiteServer -Namespace ROOT\sms\Site_$SiteCode -query "Select * from SMS_AppDTDeploymentSummary where AppModelName = '$($App.ModelName)'" |
    Select Description,CollectionName,CollectionID,NumberErrors,AssignmentID |
    Out-GridView -Title "Choose a Deployment / Deployment Type" -OutputMode Single

# Get Errors
$Errors = Get-WmiObject -ComputerName $SiteServer -Namespace ROOT\sms\Site_$SiteCode -query "Select * from SMS_AppDeploymentErrorAssetDetails where AssignmentID = '$($DT.AssignmentID)' and DTName = '$($DT.Description)' and Revision = '$($App.SDMPackageVersion)' and Errorcode <> 0" |
    Sort Machinename |
    Select MachineName,Username,Starttime,Errorcode

if ($Errors -ne $null)
{
    # Create new object with error descriptions in
    $AllErrors = @()
    foreach ($item in $Errors)
        {
            $errordescription = Get-CMErrorMessage -ErrorCode $item.Errorcode
            $hex = Convert-ErrorCode -DecimalErrorCode $item.Errorcode
            $int = [int]$hex
            $obj = New-Object psobject
            Add-Member -InputObject $obj -MemberType NoteProperty -Name ComputerName -Value $item.MachineName
            Add-Member -InputObject $obj -MemberType NoteProperty -Name UserName -Value $item.Username
            Add-Member -InputObject $obj -MemberType NoteProperty -Name StartTime -Value $([management.managementDateTimeConverter]::ToDateTime($item.Starttime))
            Add-Member -InputObject $obj -MemberType NoteProperty -Name UnsignedIntErrorCode -Value $item.Errorcode
            Add-Member -InputObject $obj -MemberType NoteProperty -Name SignedIntErrorCode -Value $int
            Add-Member -InputObject $obj -MemberType NoteProperty -Name HexErrorCode -Value $hex
            Add-Member -InputObject $obj -MemberType NoteProperty -Name ErrorDescription -Value $errordescription
            $AllErrors += $obj
        }
    # Return results
    write-host "Application: $($App.LocalizedDisplayName)"
    write-host "DeploymentType: $($DT.Description)"
    write-host "TargetedCollection: $($DT.CollectionName)"
    $AllErrors | ft -AutoSize
}
Else {Write-host "No results returned."}

 

Creating Dynamic Reports for Configuration Manager with Microsoft Excel

Today I reached 10,000 hits on my blog, very cool, and what better way to celebrate than with a new post 🙂  Thanks to everyone who has stopped by, I hope that you have found something helpful and continue to do so!

In today’s post, I want to look at how to create dynamic reports for ConfigMgr using Microsoft Excel.  Excel has the ability to create and store a SQL server connection, so we can use it to talk directly with the ConfigMgr database and pull data into our worksheets, where we can summarize, combine, perform functions, display charts etc.  In addition, the SQL connection can be refreshed meaning that the report can always have the most recent data from the database. Excel is a very handy tool to use to create reports from the simple to the complex, and is a great option for use with Configuration Manager.

What about SSRS?

The built-in reporting capability of Configuration Manager is of course very good.  The SQL Server Reporting Services reports are all predefined for you and allow you to get some very useful information.  You can also create custom reports, subscriptions, export data and share reports with interested parties.  But I find there are still some limitations that can make Excel an attractive alternative when creating custom reports.  For one thing, finding the data you want can sometimes mean drilling through more than one report and selecting various options.  And then I often find that the data I want is not readily available in a single report and I have to create a custom report to get it.  But creating custom reports with SSRS is a bit of a slow and cumbersome process.

Excel has some advantages here worth considering:

  • You can create multiple worksheets and use multiple SQL queries to return your results into a single workbook
  • Creating charts to visually display the data is quick and easy
  • Data can be persisted to create a ‘snapshot’ report, or dynamically refreshed to keep the report always-up-to-date
  • You can leverage the power of both SQL language and Excel formulas to manipulate your data

I have used Excel for reporting to great effect, and if time permits I will share some of the reports I’ve created in future posts, including a spreadsheet to dynamically monitor SCCM OS Deployments and a single report that will get the current deployment status of any ConfigMgr Application.

In this post, I will focus on the fundamentals and create a simple report that gives me a dynamic inventory of my servers with various data such as make, model, OS version, IP address, MAC address, RAM, disk space, serial tag, etc.

Security

The first thing to do is to decide which account will be used to get access to the ConfigMgr database.  If you are creating a report that is for your own use only, then there is no need to use specific account credentials, you can use the built-in Windows authentication option (assuming you have access to the database with your account).  If your report will be made available to other users, then those users will need at least read access (db_datareader) to the SQL database so the SQL query that runs in the background has permission to do so in their user context.  There is, however, a viable alternative if you don’t want to assign SQL permissions to all the users who need to view the report – you can create a local SQL account that has db_datareader access to the database, and embed these credentials in the report.  There is a security consideration here, as the local SQL credentials will be stored in clear text within the workbook, but since the permission level is read-only that should not be a major concern, and this is the method I will use in this post.

Create a Data Connection

Open Microsoft Excel (I’m using 2013, but it’s almost the same in 2010) and go to the Data ribbon. In the Get External Data section, click From Other Sources, then From SQL Server.
In the Data Connection Wizard window, enter the name of your SQL server and instance that hosts the ConfigMgr database, and enter the SQL credentials to connect to it.
Select the ConfigMgr database in the drop-down list.  At this point we cannot enter a specific SQL query, so we have to pull in any random table for now, then we’ll update the connection later.

Click Next.  You can change the File Name and Friendly Name of the data connection to make it more obvious as to what data it is getting, and to distinguish it from other data connections you may create in the workbook.


Click Finish.

Edit the Connection Properties

At the Import Data window, click on Properties.

In the Connection Properties window, on the Usage tab, make sure that Enable background refresh is selected, and also Refresh data when opening the file.

This allows us to run the SQL queries in the data connections automatically every time we open the Workbook, and also perform a manual refresh when desired, from the Data ribbon. Optionally, you can also refresh the data regularly using the Refresh every .. minutes option.

On the Definition tab, check the option Save password.  You will be prompted to allow the saving of the password unencrypted in the Excel file.  Click Yes.

In the Command Type field, change to SQL, then enter your SQL query in the Command text field.  In this example, I’m using the following query which gives me an inventory of all my servers (change the sys.Name0 like ‘%srv%’ statement in the where clause to suit your environment):


select sys.Name0 as 'Server Name',
 case when comp.DomainRole0 = 3 then 'Server'
 when comp.DomainRole0 = 4 then 'Domain Controller'
 when comp.DomainRole0 = 5 then 'Primary Domain Controller'
 End as 'Server Type',
comp.Manufacturer0 as 'Manufacturer', comp.Model0 as 'Model', comp.Roles0 as Roles, os.Caption0 as 'OS', os.CSDVersion0 as 'SP Level',
comp.SystemType0 as 'Architecture',
os.Description0 as Description, nic.IPAddress0 as 'IP Address', nic.MACAddress0 as 'MAC Address', cpu.Name0 as 'CPU', comp.NumberOfProcessors0 as 'No. of Processors',
CAST(CAST(comp.TotalPhysicalMemory0 AS NUMERIC) / 1024 / 1024 as DECIMAL(10,2)) as 'Total Physical Memory (GB)',
disk.Partitions0 as 'No. of Disk Partitions (Primary Disk)', CAST(CAST(disk.Size0 as NUMERIC) / 1024 AS DECIMAL(10, 2)) as 'Total Disk Capacity (GB) (Primary Disk)',
bios.SerialNumber0 as 'Serial Tag', os.InstallDate0 as 'OS Install Date', os.LastBootupTime0 as 'Last Bootup Time',
sys.Last_Logon_Timestamp0 as 'Last Logon Timestamp', sys.User_Name0 as 'Last Logged On User'
from v_R_System sys
inner join v_GS_Operating_System os on sys.ResourceID = os.ResourceID
inner join v_GS_Computer_System comp on sys.ResourceID = comp.ResourceID
inner join v_GS_PC_BIOS bios on sys.ResourceID = bios.ResourceID
inner join v_GS_Network_Adapter_Configuration nic on sys.ResourceID = nic.ResourceID
inner join v_GS_Processor cpu on sys.ResourceID = cpu.ResourceID
inner join v_GS_Disk disk on sys.ResourceID = disk.ResourceID
where os.ProductType0 in ('2','3') and nic.IPAddress0 is not null and cpu.DeviceID0 = 'CPU0' and sys.Name0 like '%srv%' and disk.MediaType0 = 'Fixed hard disk media' and disk.DeviceID0 = '\\.\PHYSICALDRIVE0'
order by 'Server Name'

You can expand the window if you have a large query and need to see more of it.


Click OK, and you will be prompted that the connection has now changed in the Workbook, click Yes to accept.

Back in the Import Data window, make sure that Table is selected and the cell range =$A$1 in the Existing worksheet, then click OK.


The query will  run and pull the data into the worksheet.  Now I have a nice inventory of my servers:


You can see that it places a filter on all the column headers, making it easy to filter your data.

Edit the SQL Query

If you need to edit your SQL query, you can do so from the Data ribbon.  Click Connections, select the data connection and click Properties.

You can now go to the Definition tab and edit your SQL query as required.

Microsoft Query

An alternative way to get data from ConfigMgr into Excel is to use Microsoft Query.  This is a kind of ‘gui’ way to construct a query and may be useful if you are not familiar with creating SQL queries directly, although I find it easier to create the queries myself!

We won’t cover Microsoft Query in this post, but to get started with it, simply click on the Data ribbon in Excel, and choose Get External Data > From Other Sources > Microsoft Query.

Querying the SQL Database

Your report will only be as good as your SQL querying skills, your Excel skills and your knowledge of the ConfigMgr database of course, but on the latter Microsoft has provided some nice documentation (here) to help you find your way around the available SQL views.  In most cases you will query the views not the tables as they are designed for that, so take a look through the Microsoft documentation to help you identify where to find the data you need.  If you are familiar with querying WMI for ConfigMgr, then you should already find the database friendly as many of the views have similar names to the classes used in WMI.

Complex Reports

This was a simple example of a report that is not much different than running a WQL query in ConfigMgr directly.  However, you can begin to understand the potential here if you create additional worksheets with their own data connections, and you have multiple SQL queries pulling data into one report.  You can then create a summary sheet, for example, that summarises data from each of the worksheets, and populates graphs for a visually pleasing display.  You can store this report in any network share and control access by NTFS permissions, or make it available in a document repository such as Sharepoint.  The next time someone asks you for some data from ConfigMgr, you can simply send them a link.

Happy reporting!

 

 

 

Monitoring Disk Space and Content Directories on ConfigMgr Distribution Points

Recently I noticed that the available disk space on my ConfigMgr distribution points was getting low.  Since they are mostly VMs using SAN storage, it’s fairly easy to increase the disk space, but it got me thinking about how I could monitor the DPs for available free space, as well as the sizes of the key directories that ConfigMgr uses to store large amounts of data: the SCCMContentLib, the legacy package share (SMSPKGx$) and, in my case, the WSUSUpdates folder, as I also use our DPs as WSUS servers for patching during OSD.  These directories will certainly grow over time as new content is distributed and new patches are downloaded.

There are, of course, many ways to monitor the overall free disk space, including WMI, SNMP, various 3rd party software, your corporate monitoring tool and even ConfigMgr itself.  But monitoring the ConfigMgr/WSUS directory sizes can be a little trickier.  Rather than purchase some new software to do it, I wanted to build a ‘free’ solution using tools already available to me in the Enterprise.  I also wanted to have all this data for all our distributions points displayed graphically in a report.

Since Microsoft Excel is my favourite reporting tool, I decided to build a workbook that has a data connection to a SQL database where I will store the disk and directory information from the DPs.  Then I run a PowerShell script on each of the DPs as a scheduled task that will get the information I need and insert it into the SQL database.

The result is a nice graphical report that displays all the data I want and keeps itself up-to-date automatically.  You can download a sample report here.

(The sample report includes an overall disk space summary chart plus individual charts such as AvailableSpace and SCCMContentLib.)

Here is a step-by-step guide to create such a report.

Create a SQL Database and Table

First, we will create a SQL database where we will store this data.  You could, of course, use an existing database as we only need a single table to store this data, but just for good organisation I will create a new one for this purpose. Here is a SQL script that will create the database, create a table, and create records for each of my distribution points with blank data values for now.  Edit the script with the hostnames of your distribution points, and run the script in your SQL instance.


CREATE DATABASE SCCM_Server_Data
GO
USE SCCM_Server_Data;
CREATE TABLE DiskSpace (
ID INT NOT NULL,
Server VARCHAR (20),
WSUS_Updates DECIMAL(6,2),
SMSPKGx DECIMAL(6,2),
SCCMContentLib DECIMAL(6,2),
DiskCapacity DECIMAL(6,2),
AvailableSpace DECIMAL(6,2),
PRIMARY KEY (ID)
);

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (1, 'SCCMDP-01', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (2, 'SCCMDP-02', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (3, 'SCCMDP-03', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (4, 'SCCMDP-04', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (5, 'SCCMDP-05', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (6, 'SCCMDP-06', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (7, 'SCCMDP-07', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (8, 'SCCMDP-08', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (9, 'SCCMDP-09', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (10, 'SCCMDP-10', 0, 0, 0, 0, 0 );

INSERT INTO DiskSpace (ID,Server,WSUS_Updates,SMSPKGx,SCCMContentLib,DiskCapacity,AvailableSpace)
VALUES (11, 'SCCMDP-11', 0, 0, 0, 0, 0 );

SQL Permissions

I recommend creating a SQL login for the database you created that has only ‘db_datareader’ permission.  This way you can include the credentials required to get data from the database in the report, and not have to give SQL database permissions to everyone you want to share the report with.  If your SQL instance is not using mixed-mode authentication, you can easily change this.  I created a SQL login called ‘ConfigMgrDB_Read’ for this purpose (I use the same login to access my ConfigMgr database for reporting purposes).

Note: The SQL credentials will be stored in clear text in the Excel data connection file, but since they are read-only permission, it shouldn’t be a security concern.

Create a PowerShell Script and Scheduled Task on each DP

Next, we need to create a PowerShell script that will get the directory sizes of the following locations:

  • SCCMContentLib (SCCM Content Library)
  • SMSPKGx$ (SCCM Legacy Package Share)
  • WSUS_Updates (WSUS Update files location)

We will also get the disk capacity and amount of free space remaining.

We will put this data into a simple formatted text table that can be sent in an email.  We will then create a connection to the SQL database where we will store this data, and update the records with the new data values.

I recommend running this script as a scheduled task on each DP, rather than running it against multiple DPs from one location, especially if your DPs are across a WAN, as the Get-ChildItem commands run much more quickly locally than remotely.

The account running this script must have permission to input data to the SQL database.

You will need to change some things in the script for your environment:

  • Line 4: Enter the hostname of the server you are running the script on and for
  • Lines 7-11: Enter the email details
  • Lines 16,19,22: Enter the locations of the directories you want to get the size for
  • Line 25: Change the drive letter for your DP data drive
  • Lines 47,48: Enter the SQL Instance and database name

## Variables

# Enter ComputerName
$ComputerName = "SCCMDP-01"

# Enter email details
$SendEmail = "Yes"
$smtpServer = "mysmtpserver.mydomain.com"
$smtpFrom = "$ComputerName@mydomain.com"
$smtpTo = "MyRecipients@mydomain.com"
$messageSubject = "Drive Space Report for $ComputerName"

## Get and email the data

# Get drive and directory data
$colItems = (Get-ChildItem G:\WSUS_Updates -recurse | Measure-Object -property length -sum)
$WSUS_Updates = "{0:N2}" -f ($colItems.sum / 1GB)

$colItems = (Get-ChildItem G:\SMSPKGG$ -recurse | Measure-Object -property length -sum)
$SMSPKGG = "{0:N2}" -f ($colItems.sum / 1GB)

$colItems = (Get-ChildItem G:\SCCMContentLib -recurse | Measure-Object -property length -sum)
$SCCMContentLib = "{0:N2}" -f ($colItems.sum / 1GB)

$data = Get-WmiObject -Class Win32_Volume -ComputerName $ComputerName | Where-Object {$_.DriveLetter -eq 'G:'} | Select Capacity,FreeSpace,SystemName,DriveLetter

$capacity = "{0:N2}" -f ($data.capacity / 1GB)
$freespace = "{0:N2}" -f ($data.freespace / 1GB)
$systemname = $data.SystemName
$driveletter = $data.DriveLetter

# Add data to a simple text 'table'
$table = "All values in GB`nServer:`t`t`t$systemname`nWSUS_Updates:`t$WSUS_Updates`nSMSPKGx:`t`t$SMSPKGG`nSCCMContentLib:`t$SCCMContentLib`nDiskCapacity:`t`t$capacity`nAvailableSpace:`t`t$freespace"

# Send data as an email
if ($SendEmail -eq "Yes")
{
$message = New-Object System.Net.Mail.MailMessage $smtpfrom, $smtpto
$message.Subject = $messageSubject
$message.Body = $table
$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$smtp.Send($message)
}
# SQL Stuff

# Database info
$dataSource = "MySQLServer\INST_SCCM"
$database = "SCCM_Server_Data"

# Open a connection
cls
Write-host "Opening a connection to '$database' on '$dataSource'"
$connectionString = "Server=$dataSource;Database=$database;Integrated Security=SSPI;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()

# Set and run the SQL Command
$update = "UPDATE DiskSpace SET WSUS_Updates = '$WSUS_Updates', SMSPKGx = '$SMSPKGG', SCCMContentLib = '$SCCMContentLib', DiskCapacity = '$capacity', AvailableSpace = '$freespace' WHERE SERVER = '$ComputerName'"
$command = $connection.CreateCommand()
$command.CommandText = $update
$command.ExecuteNonQuery()

# Close the connection
$connection.Close()

Note: if you don’t want to receive the email from each DP, simply put ‘No’ in the $SendEmail variable, although it’s a useful way to verify that the Scheduled Task has run.

Now create a scheduled task on the DP with the schedule of your choice and point to the location of your Powershell script:

  • Action: Start a program
  • Program/script: Powershell
  • Add arguments: -ExecutionPolicy Bypass -File “C:\LocalScripts\DP-DiskSpaceScript.ps1”

You can manually start the task to verify that it works, you should receive an email like below, and data should be entered into the SQL table in the record for that server.  In this case, I now have lots of free space.  Nice 🙂
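(For reference, assuming the task is named ‘DriveSpaceReport’ as in the trigger script further down, you can start it locally with something like:)


Start-ScheduledTask -TaskName 'DriveSpaceReport'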


Create an Excel Workbook

Create the Data Connection

Open a new blank workbook in Microsoft Excel (I’m using Office 2013), right-click the tab for Sheet1 and rename it Data.

On the Data ribbon, in the Get External Data section, click From Other Sources and choose From SQL Server.

In the Data Connection Wizard, enter the name of the Instance that contains your database, and enter the SQL account credentials.  Click Next.


Select the SCCM_Server_Data database from the drop-down list, then simply click next.  The database only contains one table, and we will pull the entire table into the workbook, so no need for a specific SQL query.


Rename the File Name and Friendly Name of the data connection file if you wish, then click Finish.


In the Import Data window, make sure to choose Table and the cell range =$A$1 in the existing worksheet, then click Ok.


You will now see the data from the database in the worksheet:


Finally, go to the Data ribbon again and click Connections.  Make sure your data connection is selected, then click Properties.  Check the option to Refresh data when opening the file.  This will make sure that you always get the current data from the database every time the file is opened.


On the Definition tab, check the box Save password, click Yes at the prompt to save without encryption, then click OK to save and close the Connection Properties.


Close the Workbook Connections window.

Create a Summary Chart

Now that we have the data, let’s create some nice charts to display it.

First, select all the populated cells in the Data sheet, except for the ‘ID’ column, then right-click the Data tab and choose Insert…  Choose Chart and click Ok.

You should see a chart automatically added for you.  Let’s make it look a bit nicer.  (These customizations are just my own suggestions, of course.  Do what you like :-))


Rename the Chart tab to Disk Space Summary.

Click on the Chart, then choose Change Chart Type from the Chart ribbon.  In the Column charts, select 3D Column and click Ok.

Double-click the chart title and change it to SCCM DP Disk Space Summary

Cool, looking better to my eye.


Create Additional Charts

Now let’s create some additional charts so we can compare the data on their own scales.

Go back to the Data sheet and select all the populated cells in the Server column, and all the populated cells in the WSUSUpdates column, including the headers.


Right-click on the Data tab and choose Insert…  Choose Chart and click Ok.  Change the chart type again, this time choosing 3D Clustered Column.

Rename the chart tab to WSUSUpdates.

Job done.


Now do the same for each of the remaining data columns in the Data sheet, selecting the populated cells in the Server column, and the next data column, and adding charts.

There you have it.  Now you can keep your eye on those SCCM DPs, know the sizes of the key content directories, observe how they are growing over time, and make sure there is plenty of free disk space available!

Note: you could also create an SSRS report, if you prefer, with the SQL data and add the report to your existing ConfigMgr reports folder.

Tip

If you want the most up-to-date data in your report, then you can manually trigger all the scheduled tasks to run immediately using PowerShell.  If you have PowerShell remoting available, you can run the following script which will trigger the Task on all your distribution points.

You need to enter your DP names in the $servers variable, and it assumes that your scheduled tasks all have the same name on each DP.


<#

This script manually triggers the 'DriveSpaceReport' Scheduled Task on each of the DPs listed in the $servers variable

#>

$servers = "sccmsrvdp-01", `
"sccmsrvdp-02, `
"sccmsrvdp-03", `
"sccmsrvdp-04", `
"sccmsrvdp-05", `
"sccmsrvdp-06", `
"sccmsrvdp-07", `
"sccmsrvdp-08", `
"sccmsrvdp-09", `
"sccmsrvdp-10", `
"sccmsrvdp-11", `
"sccmsrvdp-12"

cls
write-host "######################################################"
write-host "Trigger the DriveSpaceReport Scheduled Task on all DPs"
write-host "######################################################"
write-host ""

# Trigger the Scheduled tasks

write-host "Get task status and trigger the task for each server" -ForegroundColor Green
write-host ""

foreach ($Server in $servers)
{
write-host "Processing $server" -ForegroundColor Yellow
Write-Host "Getting Scheduled Task Details for 'DriveSpaceReport'" -ForegroundColor Cyan
schtasks /query /S $server /TN DriveSpaceReport /FO List
Write-Host ""
Write-Host "Run the task..." -ForegroundColor Cyan
schtasks /run /S $server /TN DriveSpaceReport /I
Write-Host ""
}

# Report task status

Write-Host ""
write-host "Get updated task status for each server" -ForegroundColor Green

foreach ($Server in $servers)
{
Write-Host ""
$Task = schtasks /query /S $server /TN DriveSpaceReport /V /FO CSV | ConvertFrom-Csv
$Name = $Task.TaskName
$Status = $Task.Status
$LastRunTime = $Task.'Last Run Time'
$Result = $Task.'Last Result'
Write-host $server -ForegroundColor Cyan
Write-host "TaskName: $Name"
Write-host "Status: $Status"
write-host "Last Run Time: $LastRunTime"
Write-host "Last Result: $Result"
}

Installing and Configuring WSUS with Powershell

In setting up our SCCM 2012 infrastructure, I decided to patch our OS deployments using WSUS instead of SCCM Software Updates.  Since we have multiple distribution points in different geographical areas, I decided to install a WSUS server in each location where we do deployments.  Granted, installing and configuring WSUS is not the most technically challenging thing in the world, but when you have to do it multiple times, it begs for automation!  So I fired up my trusty Powershell ISE to see what could be done.

I wrote this script for my own environment, but it should be flexible enough to be used by anyone.  It’s tested on Windows Server 2012 R2 and designed to run in the PowerShell ISE, so it doesn’t take any parameters; just set the variables as required.  You can also change any of the WSUS configuration, such as Products and Classifications, by editing the relevant section of the script.

Download from Technet Gallery.

What does the script do?

First, we install .Net Framework 3.5 if it isn’t already installed, as this is a requirement for WSUS.  Next, we download and install Microsoft Report Viewer 2008 SP1, which is required for viewing WSUS reports.  If you choose the ‘SQLExpress’ installation, we download SQL Server 2012 Express SP1 with tools and run an unattended installation using default parameters.  Then we install WSUS and run the post-installation tasks with wsusutil.exe.

Now, we do a basic configuration, which is equivalent to running the WSUS Configuration Wizard.  We set the location to sync updates from, the update language/s, run a metadata sync to get available Products and Classifications, set which Products and Classifications we want to sync, and enable the automatic sync schedule.  Then we do a full sync.

Once the sync is completed, we decline certain updates that we don’t want, such as all ‘itanium’ updates, configure and enable the Default Automatic Approval Rule, then run it so the updates will start downloading.

Most of these activities are optional and are activated using variables which you must set before you run the script, so if you want to use WID or an existing SQL instance, you can.  You can skip the configuration entirely and do it manually, or just do the bare minimum, and of course you can customise the configuration in the script.

Step by Step Walkthrough

First, we set the variables, such as the WSUS installation type, the location for Updates, things to configure etc.


###############
## Variables ##
###############

##//INSTALLATION//##

# Do you want to install .NET FRAMEWORK 3.5? If true, provide a location for the Windows OS media in the next variable
    $DotNet = $True
# Location of Windows sxs for .Net Framework 3.5 installation
    $WindowsSXS = "D:\sources\sxs"
# Do you want to download and install MS Report Viewer 2008 SP1 (required for WSUS Reports)?
    $RepViewer = $True
# WSUS Installation Type.  Enter "WID" (for Windows Internal Database), "SQLExpress" (to download and install a local SQLExpress), or "SQLRemote" (for an existing SQL Instance).
    $WSUSType = "SQLRemote"
# If using an existing SQL server, provide the Instance name below
    $SQLInstance = "MyServer\MyInstance"
# Location to store WSUS Updates (will be created if doesn't exist)
    $WSUSDir = "C:\WSUS_Updates"
# Temporary location for installation files (will be created if doesn't exist)
    $TempDir = "C:\temp"

##//CONFIGURATION//##

# Do you want to configure WSUS (equivalent of WSUS Configuration Wizard, plus some additional options)?  If $false, no further variables apply.
# You can customise the configurations, such as Products and Classifications etc, in the "Begin Initial Configuration of WSUS" section of the script.
    $ConfigureWSUS = $True
# Do you want to decline some unwanted updates?
    $DeclineUpdates = $True
# Do you want to configure and enable the Default Approval Rule?
    $DefaultApproval = $True
# Do you want to run the Default Approval Rule after configuring?
    $RunDefaultRule = $False

We install .Net Framework 3.5 if required


# Install .Net Framework 3.5 from media
if($DotNet -eq $true)
{
write-host 'Installing .Net Framework 3.5'
Install-WindowsFeature -name NET-Framework-Core -Source $WindowsSXS
}

We install the Report Viewer from Microsoft for viewing WSUS reports. We start a BITS job to download it, then we do a silent install.

# Download MS Report Viewer 2008 SP1 for WSUS reports

if ($RepViewer -eq $True)
{
write-host "Downloading Microsoft Report Viewer 2008 SP1...please wait"
$URL = "http://download.microsoft.com/download/3/a/e/3aeb7a63-ade6-48c2-9b6a-d3b6bed17fe9/ReportViewer.exe"
Start-BitsTransfer $URL $TempDir -RetryInterval 60 -RetryTimeout 180 -ErrorVariable err
if ($err)
{
write-host "Microsoft Report Viewer 2008 SP1 could not be downloaded!" -ForegroundColor Red
write-host 'Please download and install it manually to use WSUS Reports.' -ForegroundColor Red
write-host 'Continuing anyway...' -ForegroundColor Magenta
}

# Install MS Report Viewer 2008 SP1

write-host 'Installing Microsoft Report Viewer 2008 SP1...'
$setup=Start-Process "$TempDir\ReportViewer.exe" -verb RunAs -ArgumentList '/q' -Wait -PassThru
if ($setup.exitcode -eq 0)
{
write-host "Successfully installed"
}
else
{
write-host 'Microsoft Report Viewer 2008 SP1 did not install correctly.' -ForegroundColor Red
write-host 'Please download and install it manually to use WSUS Reports.' -ForegroundColor Red
write-host 'Continuing anyway...' -ForegroundColor Magenta
}
}

I prefer to use WSUS with a local SQL Express installation so I have some access to the database if I need to.  If chosen, we download and install SQL Server Express 2012 SP1 with admin tools using an unattended installation. We use the ‘ALLFEATURES_WITHDEFAULTS’ role, and add the local administrators group to the SQL sysadmin accounts.

# Download SQL 2012 Express SP1 with tools

if ($WSUSType -eq 'SQLExpress')
{
write-host "Downloading SQL 2012 Express SP1 with Tools...please wait"
Start-Sleep -Seconds 10 # wait 10 seconds in case of BITS overload error
$URL = "http://download.microsoft.com/download/5/2/9/529FEF7B-2EFB-439E-A2D1-A1533227CD69/SQLEXPRWT_x64_ENU.exe"
Start-BitsTransfer $URL $TempDir -RetryInterval 60 -RetryTimeout 180 -ErrorVariable err
if ($err)
{
write-host "Microsoft SQL 2012 Express SP1 could not be downloaded!  Please check internet availability." -ForegroundColor Red
write-host 'The script will stop now.' -ForegroundColor Red
break
}

# Install SQL 2012 Express with defaults

write-host 'Installing SQL Server 2012 SP1 Express with Tools...'
$setup=Start-Process "$TempDir\SQLEXPRWT_x64_ENU.exe" -verb RunAs -ArgumentList '/QUIETSIMPLE /IACCEPTSQLSERVERLICENSETERMS /ACTION=INSTALL /ROLE=ALLFEATURES_WITHDEFAULTS /INSTANCENAME=SQLEXPRESS /SQLSYSADMINACCOUNTS="BUILTIN\ADMINISTRATORS" /UPDATEENABLED=TRUE /UPDATESOURCE="MU"' -Wait -PassThru

if ($setup.exitcode -eq 0)
{
write-host "Successfully installed"
}
else
{
write-host 'SQL Server 2012 SP1 Express did not install correctly.' -ForegroundColor Red
write-host 'Please check the Summary.txt log at C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log' -ForegroundColor Red
write-host 'The script will stop now.' -ForegroundColor Red
break
}
}

Next we install WSUS

# Install WSUS (WSUS Services, SQL Database, Management tools)

if ($WSUSType -eq 'WID')
{
write-host 'Installing WSUS for WID (Windows Internal Database)'
Install-WindowsFeature -Name UpdateServices -IncludeManagementTools
}
if ($WSUSType -eq 'SQLExpress' -Or $WSUSType -eq 'SQLRemote')
{
write-host 'Installing WSUS for SQL Database'
Install-WindowsFeature -Name UpdateServices-Services,UpdateServices-DB -IncludeManagementTools
}

Then we run the post-install configuration tasks using the wsusutil.exe

# Run WSUS Post-Configuration

if ($WSUSType -eq 'WID')
{
sl "C:\Program Files\Update Services\Tools"
.\wsusutil.exe postinstall CONTENT_DIR=$WSUSDir
}
if ($WSUSType -eq 'SQLExpress')
{
sl "C:\Program Files\Update Services\Tools"
.\wsusutil.exe postinstall SQL_INSTANCE_NAME="%COMPUTERNAME%\SQLEXPRESS" CONTENT_DIR=$WSUSDir
}
if ($WSUSType -eq 'SQLRemote')
{
sl "C:\Program Files\Update Services\Tools"
.\wsusutil.exe postinstall SQL_INSTANCE_NAME=$SQLInstance CONTENT_DIR=$WSUSDir
}

Now we begin to configure WSUS. We connect to the WSUS server and get the configuration. We tell it to sync from Microsoft Update, then set the updates language to English.

# Get WSUS Server Object
$wsus = Get-WSUSServer

# Connect to WSUS server configuration
$wsusConfig = $wsus.GetConfiguration()

# Set to download updates from Microsoft Updates
Set-WsusServerSynchronization -SyncFromMU

# Set Update Languages to English and save configuration settings
$wsusConfig.AllUpdateLanguagesEnabled = $false
$wsusConfig.SetEnabledUpdateLanguages("en")
$wsusConfig.Save()

We do an initial sync to get the available products and categories from Microsoft Update

# Get WSUS Subscription and perform initial synchronization to get latest categories
$subscription = $wsus.GetSubscription()
$subscription.StartSynchronizationForCategoryOnly()
write-host 'Beginning first WSUS Sync to get available Products etc' -ForegroundColor Magenta
write-host 'Will take some time to complete'
While ($subscription.GetSynchronizationStatus() -ne 'NotProcessing') {
    Write-Host "." -NoNewline
    Start-Sleep -Seconds 5
}
write-host ' '
Write-Host "Sync is done." -ForegroundColor Green

We tell WSUS which Products we want to sync. It’s very important to get these right, otherwise you will download a lot of updates that you don’t need and fill up your disk space! Obviously you’ll want to customise these for your environment.

# Configure the Platforms that we want WSUS to receive updates
write-host 'Setting WSUS Products'
Get-WsusProduct | where-Object {
    $_.Product.Title -in (
    'Report Viewer 2005',
    'Report Viewer 2008',
    'Report Viewer 2010',
    'Visual Studio 2005',
    'Visual Studio 2008',
    'Visual Studio 2010 Tools for Office Runtime',
    'Visual Studio 2010',
    'Visual Studio 2012',
    'Visual Studio 2013',
    'Microsoft Lync 2010',
    'Microsoft SQL Server 2008 R2 - PowerPivot for Microsoft Excel 2010',
    'Dictionary Updates for Microsoft IMEs',
    'New Dictionaries for Microsoft IMEs',
    'Office 2003',
    'Office 2010',
    'Office 2013',
    'Silverlight',
    'System Center 2012 - Orchestrator',
    'Windows 7',
    'Windows 8.1 Drivers',
    'Windows 8.1 Dynamic Update',
    'Windows 8',
    'Windows Dictionary Updates',
    'Windows Server 2008 R2',
    'Windows Server 2008',
    'Windows Server 2012 R2',
    'Windows Server 2012',
    'Windows XP 64-Bit Edition Version 2003',
    'Windows XP x64 Edition',
    'Windows XP')
} | Set-WsusProduct

We do the same for the Update Classifications

# Configure the Classifications
write-host 'Setting WSUS Classifications'
Get-WsusClassification | Where-Object {
    $_.Classification.Title -in (
    'Critical Updates',
    'Definition Updates',
    'Feature Packs',
    'Security Updates',
    'Service Packs',
    'Update Rollups',
    'Updates')
} | Set-WsusClassification

I guess it’s a bug, but it seems WSUS sometimes enables the entire parent Product when adding them by script this way, so we pause the script and prompt to check in the WSUS console that the correct Products are selected before continuing.

# Prompt to check products are set correctly
write-host 'Before continuing, please open the WSUS Console, cancel the WSUS Configuration Wizard,' -ForegroundColor Red
write-host 'Go to Options > Products and Classifications, and check that the Products are set correctly.' -ForegroundColor Red
write-host 'Pausing script' -ForegroundColor Yellow
$Shell = New-Object -ComObject "WScript.Shell"
$Button = $Shell.Popup("Click OK to continue.", 0, "Script Paused", 0) # Using Pop-up in case script is running in ISE

We set the automatic sync schedule to once per day at midnight, then start the first full synchronisation.

# Configure Synchronizations
write-host 'Enabling WSUS Automatic Synchronisation'
$subscription.SynchronizeAutomatically=$true

# Set synchronization scheduled for midnight each night
$subscription.SynchronizeAutomaticallyTimeOfDay= (New-TimeSpan -Hours 0)
$subscription.NumberOfSynchronizationsPerDay=1
$subscription.Save()

# Kick off a synchronization
$subscription.StartSynchronization()

We monitor the progress of the sync in the Powershell console as it can take some time.

# Monitor Progress of Synchronisation

write-host 'Starting WSUS Sync, will take some time' -ForegroundColor Magenta
Start-Sleep -Seconds 60 # Wait for sync to start before monitoring
while ($subscription.GetSynchronizationProgress().ProcessedItems -ne $subscription.GetSynchronizationProgress().TotalItems) {
    Write-Progress -PercentComplete (
    $subscription.GetSynchronizationProgress().ProcessedItems*100/($subscription.GetSynchronizationProgress().TotalItems)
    ) -Activity "WSUS Sync Progress"
    Start-Sleep -Seconds 5 # pause between progress checks
}
Write-Host "Sync is done." -ForegroundColor Green

After the sync is complete, we decline some updates that we don’t want. In my example, we are declining IE10 and the Microsoft Browser Choice EU updates, which we don’t want (I used the KB article number in the ‘TextIncludes’ parameter to find them), then we decline all ‘itanium’ updates because we don’t have any itanium servers. Do you?

# Decline Unwanted Updates

if ($DeclineUpdates -eq $True)
{
write-host 'Declining Unwanted Updates'
$approveState = 'Microsoft.UpdateServices.Administration.ApprovedStates' -as [type]

# Declining All Internet Explorer 10
$updateScope = New-Object Microsoft.UpdateServices.Administration.UpdateScope -Property @{
    TextIncludes = '2718695'
    ApprovedStates = $approveState::Any
}
$wsus.GetUpdates($updateScope) | ForEach {
    Write-Verbose ("Declining {0}" -f $_.Title) -Verbose
    $_.Decline()
}

# Declining Microsoft Browser Choice EU
$updateScope = New-Object Microsoft.UpdateServices.Administration.UpdateScope -Property @{
    TextIncludes = '976002'
    ApprovedStates = $approveState::Any
}
$wsus.GetUpdates($updateScope) | ForEach {
    Write-Verbose ("Declining {0}" -f $_.Title) -Verbose
    $_.Decline()
}

# Declining all Itanium Update
$updateScope = New-Object Microsoft.UpdateServices.Administration.UpdateScope -Property @{
    TextIncludes = 'itanium'
    ApprovedStates = $approveState::Any
}
$wsus.GetUpdates($updateScope) | ForEach {
    Write-Verbose ("Declining {0}" -f $_.Title) -Verbose
    $_.Decline()
}
}

Then we enable the Default Automatic Approval Rule and configure it with the classifications we want.

# Configure Default Approval Rule

if ($DefaultApproval -eq $True)
{
write-host 'Configuring default automatic approval rule'
[void][reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration")
$rule = $wsus.GetInstallApprovalRules() | Where {
    $_.Name -eq "Default Automatic Approval Rule"}
$class = $wsus.GetUpdateClassifications() | ? {$_.Title -In (
    'Critical Updates',
    'Definition Updates',
    'Feature Packs',
    'Security Updates',
    'Service Packs',
    'Update Rollups',
    'Updates')}
$class_coll = New-Object Microsoft.UpdateServices.Administration.UpdateClassificationCollection
$class_coll.AddRange($class)
$rule.SetUpdateClassifications($class_coll)
$rule.Enabled = $True
$rule.Save()
}

Finally we run the rule which will approve the updates and begin the file downloads. However, in my testing this always errors with a timeout when activated through Powershell, so I put it in a try-catch-finally block to finish the script successfully. Even if it errors, the rule is actually run as you will be able to see from the WSUS console.


# Run Default Approval Rule

if ($RunDefaultRule -eq $True)
{
write-host 'Running Default Approval Rule'
write-host ' >This step may timeout, but the rule will be applied and the script will continue' -ForegroundColor Yellow
try {
$Apply = $rule.ApplyRule()
}
catch {
write-warning $_
}
Finally {
# Cleaning Up TempDir

write-host 'Cleaning temp directory'
if (Test-Path $TempDir\ReportViewer.exe)
{Remove-Item $TempDir\ReportViewer.exe -Force}
if (Test-Path $TempDir\SQLEXPRWT_x64_ENU.exe)
{Remove-Item $TempDir\SQLEXPRWT_x64_ENU.exe -Force}
If ($Tempfolder -eq "No")
{Remove-Item $TempDir -Force}

write-host 'WSUS log files can be found here: %ProgramFiles%\Update Services\LogFiles'
write-host 'Done!' -foregroundcolor Green
}
}

Monitoring the Update File Downloads

After the Default Approval Rule has been run, you can monitor the ‘Download Status’ of the update files in the WSUS console.  But since it can take a long time, I wrote a little script that will monitor the downloads and email me once they have finished.  It must be run as administrator on the WSUS server.


$Computername = $env:COMPUTERNAME
$ToEmail = "myemailaddress@mydomain.com"
$FromEmail = "WSUS.on.$Computername@mydomain.com"
$smtpServer = "mysmtpServer"
# Polling frequency in seconds
$Seconds = "320"

cls
Write-host 'Monitoring WSUS Update File Downloads...'
write-host 'Will send email when completed.'
[reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration") | out-null
$wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer();
$updateScope = new-object Microsoft.UpdateServices.Administration.UpdateScope;
$updateScope.updateApprovalActions = [Microsoft.UpdateServices.Administration.UpdateApprovalActions]::Install
while (($wsus.GetUpdates($updateScope) | Where {$_.State -eq "NotReady"}).Count -ne 0) {
Start-Sleep -Seconds $Seconds
}
send-mailmessage -To $ToEmail -From $FromEmail -Subject "WSUS Update File Download Completed on $ComputerName" -body "Download of Update Files on $Computername has completed!" -smtpServer $smtpServer

Email Notifications

Finally, if you configure E-mail Notifications in WSUS, you may hit the lovely 5.7.1 error from Exchange:

Mailbox unavailable. The server response was: 5.7.1 Client does not have permissions to send as this sender 

This is because it tries to authenticate with its computer account.  So you have to create a new Receive Connector in Exchange to allow relaying from anonymous users with TLS-authentication to work around the problem.

You can run a command like the following to create it:


New-ReceiveConnector -Name "WSUS Relay" -Bindings 0.0.0.0:25 -RemoteIPRanges 10.x.x.1,10.x.x.2 -AuthMechanism Tls -Enabled $true -PermissionGroups AnonymousUsers -Server MyEdgeServer

Incidentally, you can’t really configure E-mail Notifications with Powershell as you must set the recipient email address for it to work, and this is a read-only property that Powershell can’t change, so better to do it manually.

That’s it!  Feel free to suggest some improvements, or take the code and make something better yourself!

Most of the WSUS code I learned from these great resources, especially the work of Boe Prox

http://blogs.technet.com/b/heyscriptingguy/archive/2013/04/15/installing-wsus-on-windows-server-2012.aspx
http://learn-powershell.net/2010/11/14/wsus-administrator-module/
http://p0w3rsh3ll.wordpress.com/2013/02/05/wsus-on-windows-server-2012-core-from-scratch/
http://community.spiceworks.com/attachments/post/0006/1234/powershell_wsus.ps1
http://learn-powershell.net/2013/11/12/automatically-declining-itanium-updates-in-wsus-using-powershell/
http://poshwsus.codeplex.com/