Although virtualization provides some of the most stunning examples, nearly all aspects of IT work benefit from more and better use of automation. I submit, in fact, that it is difficult to scale IT up to enterprise or infrastructure levels without it. What do I mean by “automation” here? I’m talking about programmatic use of tools, primarily through scripts or other programs, that take tasks once performed on a one-machine-at-a-time basis and run them in batches, or remotely on lots of machines in parallel. I guess that means the “machines” doing the automating are intangible (software, mostly) but no less valuable for that.
The Script’s The Thing…PowerShell, That Is!
Given that it’s desirable to capture tasks for repackaging and reuse, how might one go about doing that? Personally, I’ve found that PowerShell offers a terrific toolbox for all kinds of typical Windows administrative tasks, from image creation, manipulation, and deployment to handling backups, performing system updates and cleanups, and much, much more. FYI, prepackaged, ready-to-run PowerShell commands organized in verb-noun form are called “cmdlets”: for example, Get-NetConnectionProfile, which lists the network profiles defined for Windows, or Set-NetConnectionProfile, which lets you set attributes for specific network profiles, to cite a couple that served me well just today. A PowerShell script is a text file that strings together one or more cmdlets to perform specific tasks of many kinds.
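To make that concrete, here’s a minimal sketch of a script built from the two cmdlets just mentioned. The chore it performs (re-marking mis-categorized Public networks as Private) is just an illustrative example; adapt the filter to your own environment before running anything like it.

```powershell
# List the network connection profiles currently defined on this Windows machine.
Get-NetConnectionProfile

# Illustrative chore: find every profile Windows has categorized as Public
# and re-mark it as Private (handy when a trusted network gets mis-labeled).
Get-NetConnectionProfile |
    Where-Object { $_.NetworkCategory -eq 'Public' } |
    Set-NetConnectionProfile -NetworkCategory Private
```

Saved as a .ps1 file, that same handful of lines can be run on demand, scheduled, or pushed out to many machines at once, which is the whole point of scripting one-at-a-time tasks.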
One good place to go looking for PowerShell information and inspiration is Microsoft’s PowerShell Gallery, where you’ll find all manner of cmdlets ready for everyday (and occasional) use. The actual cmdlets come in categories, available through the Reference materials at PowerShell/Scripting. Be sure to check out the large collection of sample scripts available at Microsoft Docs, too.
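You don’t even need a browser to shop the Gallery: the standard PowerShellGet cmdlets can search it and install from it directly. In the sketch below, the tag and module name are just illustrative placeholders, not recommendations.

```powershell
# Search the PowerShell Gallery for modules carrying a given tag
# (the 'Backup' tag here is just an example search).
Find-Module -Tag 'Backup' | Select-Object Name, Version, Description

# Install a Gallery module for the current user only (no admin rights needed).
# PSWindowsUpdate is one well-known example; substitute the module you actually want.
Install-Module -Name PSWindowsUpdate -Scope CurrentUser
```

Find-Module is also a handy way to vet a module before installing it, since it surfaces the version and description right in the console.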
You can team PowerShell up with other toolsets, too. Most notably these days, that means Puppet Enterprise. (Here’s a link to an eBook entitled Managing Windows with Puppet Enterprise that’s worth checking out.) Looking further afield, you can explore a veritable cornucopia of enterprise automation tools for environments built around (or that include) Windows desktops and servers. A quick look at this Google search will not only confirm this assertion, but also bedazzle and bewilder you with too, too many options.
Remember Always: Automation Is Supposed to Save Time and Effort
Digging into automation is a big job. It’s time-consuming, endlessly interesting, and can serve as a sinkhole for time, effort, money, and other resources. That’s why it’s important to approach automation with a specific mission to accomplish. Used properly, it will indeed save time and effort; it may even save money. But keep one eye on the time and effort spent developing automated solutions, and the other on how long it would take to do the work straight up, without automating anything. In general, only tasks or activities that run on lots of machines, or that repeat at regular intervals (where increased frequency also increases the impetus to automate), should be candidates for automation.
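That two-eyed bookkeeping boils down to simple arithmetic. Here’s a back-of-the-envelope sketch in PowerShell; every number in it is an invented placeholder, so plug in your own figures.

```powershell
# All figures below are illustrative placeholders, not real measurements.
$manualMinutesPerMachine = 5     # hand-run time for the task, per machine
$machineCount            = 200   # machines the task touches
$runsPerMonth            = 4     # how often the task repeats
$devHours                = 40    # estimated cost to build and test the automation

# Hours of manual labor the task costs each month.
$manualHoursPerMonth = [math]::Round(($manualMinutesPerMachine * $machineCount * $runsPerMonth) / 60, 1)

# Months before the development investment pays for itself.
$breakEvenMonths = [math]::Round($devHours / $manualHoursPerMonth, 1)

"Manual effort: $manualHoursPerMonth hours/month; break-even in roughly $breakEvenMonths month(s)"
```

If the break-even horizon stretches well past the task’s useful life, that’s your cue to keep doing it by hand.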
Also, automation works best in an environment where DevOps principles and practices are followed, including rigorous change management, so that you can schedule, implement, and maintain your automation just like the rest of your infrastructure. Ad-hoc automation usually forgoes what might be called “adult supervision” (tongue in cheek) and is often a recipe for disaster. But put to work properly, and focused on the right kinds of tasks and activities, automation is what makes large-scale, just-in-time, on-demand virtual infrastructures (think “Infrastructure as a Service,” or IaaS) possible. If it works for those kinds of operations, it can work for your operations, too. Give it a try, but be careful, and always keep in mind that you’re trying to SAVE time and effort, not waste it.
Author: Ed Tittel
Ed Tittel is a 30-plus-year computer industry veteran. He’s a Princeton and multiple University of Texas graduate who’s worked in IT since 1981 when he started his first programming job. Over the past three decades he’s also worked as a manager, technical evangelist, consultant, trainer, and an expert witness. See his professional bio for all the details.