ConfigMgr MaxExecutionTime Guesses for Updates

There is a situation which MIGHT happen for you.  The default Max Execution Timeout for Cumulative Updates is, I believe, 60 minutes now, but many updates still default to 10 minutes.  I don't personally think that default should change; however, occasionally there are large updates (think Microsoft Office updates) which might be several hundred MB in size and might take more than 10 minutes to install.  In your reporting, and when looking at local logs, the CM client says the install "Failed", but all you do is re-scan for updates, and CM says it's installed.  So what gives, you wonder?  Well, this could be a possible reason.  It's not that the install 'failed' per se; after 10 minutes, the CM client simply stopped watching for the successful install and timed out.  Since I noticed a pattern of "it's usually the ginormous updates that take longer to install", below is a POSSIBLE sql query to perhaps help you find and adjust the "Max Execution Timeout" on individual updates.

A couple of pre-requisites.  Naturally, the content has to be downloaded, so if you run this 5 minutes after a "hotfix Tuesday" sync, it might not have much to say, because the content hasn't been downloaded yet to calculate how big any particular update is.  You do have to wait until your content is downloaded to track these down.

Also note that I haven't created any kind of PowerShell script to automatically adjust the Max Execution Timeout.  This is just a report; the admin would either posh-script changing each individual update, or use the console, find each update, right-click it, and in the properties for that update adjust the Max Execution Timeout upward to fit.
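
If you do end up scripting it, here's a minimal sketch of that approach, assuming the console's ConfigurationManager module and that Set-CMSoftwareUpdate's -MaximumExecutionMins parameter behaves as described in your build; the site code and article ID are placeholders:

# Possible sketch only: load the ConfigMgr module from an admin console install,
# then raise one update's Max Execution Timeout.  'PS1' and KB 4493509 are placeholders.
Import-Module "$($env:SMS_ADMIN_UI_PATH)\..\ConfigurationManager.psd1"
Set-Location 'PS1:'

# -Fast skips lazy properties, which is plenty for a lookup like this
Get-CMSoftwareUpdate -ArticleId '4493509' -Fast |
    ForEach-Object {
        # 60 minutes is just an example; match it to whatever the report suggests
        Set-CMSoftwareUpdate -InputObject $_ -MaximumExecutionMins 60
    }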

Also note these "suggestions" are just that: suggestions.  There is no right or wrong answer for how long Max Execution Timeout should be, and leaving it all alone as-is will still work just fine.  In fact, here's a scenario, one that might discourage you from touching this at all, where following these suggestions would be a big bad horrible idea.  Let's say you allow your devices a service window of 4 hours every night.  Then you follow these suggestions, and for whatever reason there were 8 different Office updates, and you changed them all from 10 minutes to 60 minutes each... for a total of 8 hours of estimated install time.  When a client gets the Software Update deployment, it used to think "ok, these 8 will take me 80 minutes, I can do that in my 4 hour window, let's start!"  It would start installing, and maybe it only gets 3 done... but it does get 3 done.  If you set them to 60 minutes each, the client might instead decide "wow, 8 hours... I can't do that in my service window... I'll just wait until I have 8+ hours to get this done"... and of course, it may never install any of them.  So be careful in deciding whether or not this is a potentially BAD idea for your environment, or at least be aware of the potential repercussions, so you know what to un-do.

What this sql does is list updates released in the last 30 days where content has been downloaded, and compare the MaxExecutionTime that is set vs. how big the content is.  If, for example, the content size is between 50 and 100 MB but its MaxExecutionTime isn't 20 minutes or more, then maybe you the admin might want to think about setting MaxExecutionTime on that specific update to 20 minutes--so you don't get false "I failed to install" reports which a re-scan will address.

Again... this isn't perfect.  It's just a possible suggestion, if you maybe have seen this behavior in your Software Updates deployments, and were wondering if there was a way to be pro-active about increasing the MaxExecutionTime without waiting for your reports to tell you the next day.

DECLARE @StartDate datetime = DateADD(Day, -30, GETDATE())
DECLARE @EndDate datetime = GetDate()

;with cte as (select ui.MaxExecutionTime/60 [Max ExecutionTime in Minutes], ui.articleid, ui.title, ui.DateLastModified, ui.DatePosted
,ui.IsSuperseded, ui.IsExpired
,SUM(files.FileSize)/1024 as [Size in KB]
,SUM(files.FileSize)/1024/1024 as [Size in MB]
from v_updateinfo ui
join v_UpdateContents content on content.CI_ID=ui.CI_ID
join vCI_ContentFiles files on files.Content_ID=content.Content_ID
where severity is not null
and content.ContentProvisioned = 1
and ui.dateposted between @StartDate and @EndDate
and ui.IsExpired = 0
group by ui.MaxExecutionTime, ui.articleid, ui.title, ui.DateLastModified, ui.dateposted, ui.IsSuperseded, ui.IsExpired
)

select
Case when cte.[Size in MB] < 50 and cte.[Max ExecutionTime in Minutes] >= 10 then 0
when cte.[Size in MB] BETWEEN 50 and 100 and cte.[Max ExecutionTime in Minutes] >= 20 then 0
when cte.[Size in MB] between 100 and 150 and cte.[Max ExecutionTime in Minutes] >= 30 then 0
when cte.[Size in MB] between 150 and 200 and cte.[Max ExecutionTime in Minutes] >= 40 then 0
when cte.[Size in MB] between 200 and 250 and cte.[Max ExecutionTime in Minutes] >= 50 then 0
when cte.[Size in MB] between 250 and 300 and cte.[Max ExecutionTime in Minutes] >= 60 then 0
when cte.[Size in MB] > 300 and cte.[Max ExecutionTime in Minutes] >=90 then 0
else 1
End as [Could use MaxExecutionTime Adjustment],
case when cte.[Size in MB] < 50 then '10 minutes'
when cte.[Size in MB] BETWEEN 50 and 100 then '20 minutes'
when cte.[Size in MB] between 100 and 150 then '30 minutes'
when cte.[Size in MB] between 150 and 200 then '40 minutes'
when cte.[Size in MB] between 200 and 250 then '50 minutes'
when cte.[Size in MB] between 250 and 300 then '60 minutes'
when cte.[Size in MB] > 300 then '90 minutes'
end as 'time to set'
, cte.*

from cte
order by [Could use MaxExecutionTime Adjustment] desc, [Time to set] desc

CMCB, SQL


Create ConfigMgr PowerShell Configuration Items using PowerShell

As part of a presentation for the 2019 Midwest Management Summit in Minneapolis, one of the sessions I'm presenting with Jeff Bolduan is on Configuration Items.  As part of that session, we'll be demoing using a PowerShell script to create a PowerShell-based Configuration Item.
 
If you want to see how that works (at least, it works in my lab), --> Here <-- is the script for creating a Configuration Item with multiple tests inside, where the CI rules are posh-based detection, applicability, and remediation scripts.  For demo purposes, I grabbed the scripts from the blog --> about WSUS Administration/WSUSPool <-- settings enforcement via Configuration Items, and got them all working as one Configuration Item with multiple rules.
 
Hopefully, for those of you who are looking to create your own reproducible PowerShell code for creating posh-based CIs, the attached example posh will give you an idea of how you might want to get that done.

CMCB, PowerShell


ConfigMgr Truncate History Tables

Thanks very much to Umair Khan, Twitter @TheFrankUK, for the assist!  One of the hiccups recently was making sure to exclude "globaldata" type HIST tables, so that DRS replication doesn't want to go into MAINTENANCE_MODE and re-initialize global data.

Read more: ConfigMgr Truncate History Tables


Inventory Per User Installed Applications, For Example, Click-Once

This routine has only had a limited life in a lab environment with only 3 clients.  Use at your own risk, etc. etc.  No promises or guarantees, and it might be the Worst Thing Ever.  Test, test, and test some more. 

What this routine is for is a custom PowerShell script which tries to read which per-user applications are installed for the currently logged-on user.  I tried in the lab to run it as a Baseline/Compliance Item... but one of the problems is that although it runs as 'SYSTEM', it wants to look at whatever user is currently logged in, and as a Baseline it won't 'wait for a user to logon' to run.  So depending upon when it runs, it might run and create an empty custom class with nothing to say, simply because the user happened to not be logged on at that moment--even though they are logged on 8 hours a day, it just ran within the other 16 hours of that day.

So you, Super CM Admin that you are, might want to forget about doing this as a baseline.  Instead, make the PowerShell script the only thing in the source folder for a package.  Then make an old-school/traditional Package and Program.  The Program would run the script "only when a user is logged on", but "with system rights", and you'd deploy the Program to a collection.  If it were me... I'd set the advertisement to run on a schedule, like every 4 days or something.  Note I didn't test this at all in my lab.  I'm just offering this out into the ether for (hopefully) someone else to take and make awesome and bulletproof.

What the script does is create, and populate, a custom class. 
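
To give you the general shape of it, here's a minimal sketch of creating and populating a custom WMI class from PowerShell; the class name, namespace, and properties here are illustrative only--the attached script defines its own:

# Illustrative only: define a custom WMI class and write one instance into it.
$namespace = 'root\cimv2'
$className = 'Custom_PerUserApps'   # hypothetical name; use whatever the attached script/mof uses

# Remove any stale copy of the class so repeated runs start clean
Get-WmiObject -Namespace $namespace -List $className -ErrorAction SilentlyContinue |
    ForEach-Object { $_.Delete() }

# Define the class and its properties
$class = New-Object System.Management.ManagementClass($namespace, [string]::Empty, $null)
$class['__CLASS'] = $className
$class.Qualifiers.Add('Static', $true)
$class.Properties.Add('DisplayName', [System.Management.CimType]::String, $false)
$class.Properties.Add('Version',     [System.Management.CimType]::String, $false)
$class.Properties.Add('UserName',    [System.Management.CimType]::String, $false)
$class.Properties['DisplayName'].Qualifiers.Add('Key', $true)
$null = $class.Put()

# Populate it; the real script would loop through the logged-on user's per-user Uninstall registry keys
Set-WmiInstance -Namespace $namespace -Class $className -Arguments @{
    DisplayName = 'Example ClickOnce App'
    Version     = '1.0.0.0'
    UserName    = 'CONTOSO\SomeUser'
} | Out-Null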

In the --> attached <-- is also a mof file.  You'd want to go to your console, Administration, Client Settings, Default Client Settings, Hardware Inventory, Set Classes, and Import that mof file.  Once that is done, clients will be able to start reporting back this information.

 

CMCB


Use CM Console scripts node to gather log files from CM Clients

To assist in answering a question in this forum post:
https://social.technet.microsoft.com/Forums/en-us/9017aca5-06aa-4a79-a034-a646b19b89fe/collecting-log-files-from-the-client?forum=configmgrcbgeneral

I'm blogging on behalf of Srikant Yadav; he gave me permission to do so.  Thanks Srikant! 

How to make this work:

Step 1:
Make a location on a server you manage/control--which has lots of space.

create a folder called (for example):

E:\ClientLogs
Share that out as ClientLogs$
At a minimum, you need these permissions (if you have multiple domains, or support non-domain joined computers, you'll have to figure out what other permissions might be necessary).

 For share permissions, because what will be 'copying' the logs to that share is a computer, add the group:
  <your domain where your computers live>\Domain Computers, with Change and Read.
 For NTFS permissions on the E:\ClientLogs folder, add Modify, Read & Execute, List folder contents, Read, Write (aka, everything but Full Control) to:
  that same group you just used for the share permissions, i.e., <your domain>\Domain Computers.
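
If you'd rather script that part, a rough sketch of Step 1 might look like this (CONTOSO is a placeholder for the domain where your computers live):

# Rough sketch of Step 1: folder, share permissions, NTFS permissions.
New-Item -Path 'E:\ClientLogs' -ItemType Directory -Force | Out-Null

# Share permissions: Change (which includes Read) for Domain Computers
New-SmbShare -Name 'ClientLogs$' -Path 'E:\ClientLogs' -ChangeAccess 'CONTOSO\Domain Computers'

# NTFS permissions: Modify, inherited by subfolders and files
icacls 'E:\ClientLogs' /grant 'CONTOSO\Domain Computers:(OI)(CI)M'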

Step 2:
In the --> attached <-- is a script.  Modify the parameter within that script which is currently...
$Destination = "\\<DummyShare>\ClientLogs$"

To be  \\YourServer\ClientLogs$

Save that modified script as <some location you'll remember>\ImportThisIntoCM.ps1

Step 3:
In your CM Console, go to Software Library, Scripts, Create Script.
Script Name = Retrieve Client Logs
Script Language = PowerShell
Import... and go to <some location you just said you'd never forget> and import that ImportThisIntoCM.ps1 script.
Next.
Review the Script Parameters.  You can, if you wish, modify the defaults of the parameters here.  For example, maybe you ALWAYS want to get any ccmsetup logs, or you know you only want log files from within the last 5 days and nothing older.
Double-check that the Destination is the right server name and share name.
Next, Next, Close.

Step 4:
Approve the script in the Scripts node.  You may need a peer to do the approval.  In smaller environments, if you are the only admin, you can self-approve scripts in the Scripts node if you've configured that in Administration, Site Configuration, Sites, Hierarchy Settings, by unchecking "Do not allow script authors to approve their own scripts".  This is a safety feature that you SHOULD leave checked--because scripts can be powerful.  Some disgruntled admin COULD make a "format c:" type of script, self-approve it, and send it out as they walk out the door.  Just saying... you might not want to allow self-approval.  Peer review of scripts is GOOD.
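
If you'd rather do the approval from PowerShell, something like this might work, assuming the Approve-CMScript and Get-CMScript cmdlets (and a -ScriptName lookup) exist in your console build:

# Hedged sketch: approve the imported script from the ConfigMgr PowerShell module.
$script = Get-CMScript -ScriptName 'Retrieve Client Logs'
Approve-CMScript -InputObject $script -Comment 'Reviewed; copies client logs to the ClientLogs$ share'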

Step 5:
Use it!
As an example, in Assets and Compliance, Devices, pick an online device (obviously this only works if the target is online/available), right-click, Run Script.  Pick "Retrieve Client Logs".  At this point, you can change parameters as needed.  Next/Next.  You'll see a progress bar.

When it's done, in the \\yourserver\ClientLogs$ will be Subfolders; CMClientLogs$ for cmclientlogs, WindowsUpdateLogs$ for WindowsUpdateLogs, etc.  Inside those subfolders will be the zipped-up log files, named for the device name targeted.
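
The console-less equivalent of Step 5 might look something like this, assuming Invoke-CMScript accepts a device object and a script GUID (the GUID and device name below are placeholders):

# Hedged sketch: run the approved script against one online device.
$device = Get-CMDevice -Name 'PC001'   # placeholder device name
Invoke-CMScript -ScriptGuid '00000000-0000-0000-0000-000000000000' -Device $device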

Step 6:
Have a Cleanup Routine.  The \\YourServer\ClientLogs$ share doesn't have any automatic cleanup routine around it.  If, say, you were to gather log files all the time, that location might eventually fill up the drive.  You'll want to remember to clear it out manually occasionally, or set up some kind of maintenance routine on that server to "delete xx when older than yy days" or something.
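
For example, a nightly scheduled task on that server could run something like this, assuming a 30-day retention suits you:

# Possible cleanup sketch: delete gathered logs older than 30 days.
$root = 'E:\ClientLogs'   # wherever your ClientLogs$ share actually lives
Get-ChildItem -Path $root -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Force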

Possible updates... If you read through the script, you'll see that you can make this extensible yourself.  Perhaps you have an <App Specific to your type of business> which has log files that are always stored in c:\programdata\Widgets\Logs.  You can easily add a new section to the script, with another parameter, to grab those log files as needed.
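
As a hedged illustration of what such an add-on section might look like (the Widgets path is from the example above, the $GetWidgetsLogs parameter is made up, and $Destination mirrors the script's existing parameter):

# Hypothetical add-on: zip an app-specific log folder to the share when asked to.
param(
    [string]$Destination = '\\YourServer\ClientLogs$',   # mirrors the existing parameter
    [bool]$GetWidgetsLogs = $false                       # new, made-up parameter
)
if ($GetWidgetsLogs -and (Test-Path 'C:\ProgramData\Widgets\Logs')) {
    $zip = Join-Path $Destination ("WidgetsLogs\{0}.zip" -f $env:COMPUTERNAME)
    New-Item -Path (Split-Path -Path $zip -Parent) -ItemType Directory -Force | Out-Null
    Compress-Archive -Path 'C:\ProgramData\Widgets\Logs\*' -DestinationPath $zip -Force
}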

CMCB, PowerShell

Copyright © 2019 - The Twin Cities Systems Management User Group