In the 2002 release of Endpoint Configuration Manager, Microsoft added the ability to collect log files from a client and send them to the site server. While that's a handy capability, you might not be on 2002 yet, or you might prefer to send logs to a storage account in Azure rather than to the site server. You can do that quite easily using the Run Script feature, and it works whether the client is connected on the corporate network or through a Cloud Management Gateway.
To do this you need a storage account in Azure, a container in the account, and a shared access signature (SAS).
I'll assume you have the first two in place, so let's create the SAS. In the storage account in the Azure portal, click Shared access signature under Settings.
- Under Allowed services, check Blob.
- Under Allowed resource types, check Object.
- Under Allowed permissions, check Create.
Set an expiry date then click Generate SAS and connection string. Copy the SAS token and keep it safe somewhere.
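If you prefer the command line to the portal, a SAS with the same services, resource types, and permissions can also be generated with the Az PowerShell module. This is just a sketch: it assumes the Az.Storage module is installed, you are signed in with Connect-AzAccount, and the resource group and storage account names shown are placeholders for your own.

```powershell
# Get a context for the storage account (names are placeholders)
$ctx = (Get-AzStorageAccount -ResourceGroupName "rg-logs" -Name "mystorageaccount").Context

# Blob service, Object resource type, Create permission - matching the portal steps above
New-AzStorageAccountSASToken -Context $ctx `
    -Service Blob `
    -ResourceType Object `
    -Permission "c" `
    -ExpiryTime (Get-Date).AddMonths(6)
```

The cmdlet outputs the SAS token string, which you can then copy into the script below in the same way as a portal-generated token.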
Below is a PowerShell script that will upload client log files to Azure storage.
Update the following parameters for your environment:
- ContainerURL. This is the URL to the container in your storage account. You can find it by clicking on the container, then Properties > URL.
- SASToken. This is the SAS token string you created earlier.
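In case it's useful, here is a minimal sketch of what such an upload script can look like. The container URL and SAS token values are placeholders, the log path assumes the default ConfigMgr client location, and the per-computer blob prefix is my own convention to keep logs from different clients separate; adapt all of these to your environment.

```powershell
# Placeholders - replace with your container URL and SAS token
$ContainerURL = "https://mystorageaccount.blob.core.windows.net/clientlogs"
$SASToken     = "?sv=..."

# Default ConfigMgr client log location
$LogPath   = "$env:SystemRoot\CCM\Logs"
$Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
$LogFiles  = Get-ChildItem -Path $LogPath -Filter *.log -File
$Uploaded  = 0

foreach ($File in $LogFiles) {
    # Prefix each blob with the computer name so logs from multiple clients don't collide
    $BlobURL = "$ContainerURL/$($env:COMPUTERNAME)/$($File.Name)$SASToken"
    $Headers = @{ 'x-ms-blob-type' = 'BlockBlob' }
    try {
        # PUT the file to blob storage; the SAS token authorises the request
        Invoke-WebRequest -Uri $BlobURL -Method Put -Headers $Headers `
            -InFile $File.FullName -UseBasicParsing | Out-Null
        $Uploaded++
    }
    catch {
        Write-Warning "Failed to upload $($File.Name): $_"
    }
}

$Stopwatch.Stop()
Write-Output "Uploaded $Uploaded log files in $([math]::Round($Stopwatch.Elapsed.TotalSeconds,1)) seconds"
```

Because the SAS only has Create permission on blob objects, a client can write new blobs but cannot read, list, or delete anything in the container, which keeps the exposure of the embedded token low.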
Create and approve a new Script in ConfigMgr with this code. You can then run it against any online machine or collection. When it completes, it outputs how many log files were uploaded and how long the upload took.
To view the log files, you can either browse the container directly in the storage account in the Azure portal, or use Storage Explorer. My preferred method is the standalone Microsoft Azure Storage Explorer app, where you can simply double-click a log file to open it, or easily download the folder containing a client's log files to your local machine.
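If you'd rather check from a console which clients have uploaded logs, the Az PowerShell module can list the blobs too. This sketch assumes you authenticate from your admin workstation with the storage account key (the account, container, key, and computer-name prefix shown are placeholders); the upload SAS from earlier won't work here, as it has no list permission.

```powershell
# Build a context with the account key (placeholders - use your own values)
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# List the log files uploaded for a specific computer
Get-AzStorageBlob -Container "clientlogs" -Context $ctx -Prefix "PC001/" |
    Select-Object Name, Length, LastModified
```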