Monthly Archives: July 2014

Once you go splat…

Splatting is one of those things you can get used to real quickly!

Who hasn’t used the New-ADUser cmdlet? Just have a look at the syntax. Now imagine going through all those parameters on one line…

New-ADUser -Name "irwins" -GivenName "Irwin" -Surname "Strachan" -SamAccountName "irwins" -DisplayName "Irwin Strachan"

or separate lines using the tick ` character…

 New-ADUser `
    -Name "irwins" `
    -GivenName "Irwin" `
    -Surname "Strachan" `
    -SamAccountName "irwins" `
    -DisplayName "Irwin Strachan"

Granted, this is readable… until the tick looks like a dead pixel on your screen and you start scratching away at it (why? First instinct, I guess…) only to realize it was indeed the tick character. Yes, I've been there…

Splatting makes for better (readable) code.

$prmNewADUser = @{
    Name           = "irwins"
    GivenName      = "Irwin"
    Surname        = "Strachan"
    SamAccountName = "irwins"
    DisplayName    = "Irwin Strachan"
}

New-ADUser @prmNewADUser

Splatting also works with your functions. Go ahead try it! Neat huh?

Because it’s a hash table the add and remove methods are available which can make for some interesting tricks… I’ll discuss that one another time.
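For a quick taste (a made-up sketch, no Active Directory required), here's Add and Remove at work on a splatting table:

```powershell
# A splatting table is just a hash table, so Add and Remove work as usual.
$prmNewADUser = @{
    Name           = "irwins"
    GivenName      = "Irwin"
    SamAccountName = "irwins"
}

# Add a parameter after the fact...
$prmNewADUser.Add("DisplayName", "Irwin Strachan")

# ...or drop one you no longer need.
$prmNewADUser.Remove("GivenName")

$prmNewADUser.Keys   # Name, SamAccountName, DisplayName
```

Handy when one parameter set almost fits several different cmdlet calls.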

One thing though… When splatting the keys/parameters need to be known. For example:

$prmNewADUser = @{
    Name           = "irwins"
    GivenName      = "Irwin"
    Surname        = "Strachan"
    SamAccountName = "irwins"
    NickName       = "Urv"
    DisplayName    = "Irwin Strachan"
}

New-ADUser @prmNewADUser

This will fail because NickName isn't a valid parameter. Suppose you're using a csv file that contains NickName as a field. You could remove the NickName key from the hash table:

$prmNewADUser.Remove('NickName')

right before running

New-ADUser @prmNewADUser

Ok so I got into it just a little… 🙂

Be sure to check out some of the free PowerShell ebooks out there; I enjoyed the gotchas ebook. If you're serious about your PowerShell scripting skills, there's no shortage of good internet resources.

Hope it’s worth something to you…




CMTrace LogFile style

There will come a time you’ll need detailed and extensive logging information.

I got this tip from a colleague scripter of mine, Eelco Labordus.

I haven't done much with SCCM, so I wasn't familiar with the CMTrace.exe tool.
The pro is that it has color coding, so an error stands out (the default color of an error is red).

I found the original link to the script. It’s by Ephing Admin. Gotta give him his proper props!!! 😉

Ok, it was a lively discussion as to the pros and cons of using this style. I'm always cautious about using third-party tools. In this case I shouldn't worry too much, still… I've had tools in the past that were abandoned, leaving me hanging out to dry.

So this logging style, if I understood correctly, will help you pinpoint where a script left off if it suddenly stopped executing. Ok, the script failed, isn't it safer to restart? Don't get me wrong, I'm all for proactive actions, but it should have added value. I've seen transcripts being used as logfiles. Good luck trying to decipher those!!! And if you depend on ISE, then you're out of luck!

I mainly use verbose for verification. Mind you, the more you log, the more bogged down your script becomes. Running a script in verbose mode will slow things down a bit (not as much as Write-Host; I don't know why, it's just something I've noticed…)

When I don’t use verbose then Write-Progress is my next best friend. It’s always good practice to have a general idea what your script is up to…

So having an extra step along the way… Hmmm… Let’s roll with it!


#region: CMTraceLog Function formats logging in CMTrace style
function CMTraceLog {
    Param (
        $Message,
        $Component,
        $ErrorMessage,
        $Type,
        $LogFile
    )

    $Time = Get-Date -Format "HH:mm:ss.ffffff"
    $Date = Get-Date -Format "MM-dd-yyyy"

    if ($ErrorMessage -ne "") {$Type = 3}
    if ($Component -eq $null) {$Component = " "}
    if ($Type -eq $null) {$Type = 1}

    $LogMessage = "<![LOG[$Message $ErrorMessage" + "]LOG]!><time=`"$Time`" date=`"$Date`" component=`"$Component`" context=`"`" type=`"$Type`" thread=`"`" file=`"`">"
    $LogMessage | Out-File -Append -Encoding UTF8 -FilePath $LogFile
}
#endregion

#region: verify that the logFile exists
if (!(Test-Path "$pwd\log\$logFile")) {
    New-Item "$pwd\log\$logFile" -ItemType File
}
#endregion

#region: Create hash CMTrace for splatting
$hshCMTrace = @{
    Message      = ""
    Component    = $MyInvocation.MyCommand.Name
    ErrorMessage = ""
    Type         = 1
    LogFile      = "$pwd\log\$logFile"
}
#endregion

So I created a dsa.log for anything DSA related. Adding the script name to Component lets me know which script was used to generate the Message; so far so good. This will come in handy gathering all scripts running dsa-related stuff! 🙂

Oh and I created a hash table for splatting! Splatting is one of those techniques that you get used to real fast!

If I need to write a line to the logfile, updating the input is quite easy:

First I'll update the key I need:
$hshCMTrace.Message = "Importing csv file $csvFile"

Next the message type:
$hshCMTrace.Type = 1

And then splat!!!
CMTraceLog @hshCMTrace

Notice that the $ becomes a @ when splatting. Splatting has a lot of value, one of which I’ll get into another time.

Well, that's my tip for logging CMTrace style with a lil' help from splat! The difficult part is formulating the right Message and ErrorMessage… Remember, added value…
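For example, an error entry might look like this (the message, component and file values here are made up):

```powershell
# Sketch: reuse the splatting table for an error entry; Type 3 shows up red in CMTrace.
$hshCMTrace = @{
    Message      = ""
    Component    = "New-Users.ps1"   # example script name
    ErrorMessage = ""
    Type         = 1
    LogFile      = "$pwd\log\dsa.log"
}

$hshCMTrace.Message      = "Importing csv file users.csv failed"
$hshCMTrace.ErrorMessage = "File not found"
$hshCMTrace.Type         = 3

# CMTraceLog @hshCMTrace
```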

Hope it’s worth something to you…



Generating random passwords

I recently did an Active Directory migration and one of the requirements was that the passwords are complex and random.

Ok… So the first thing I did was google "PowerShell Generate Random Password", because imitation is the sincerest form of flattery… 😉 Ah, a hit!

Function random-password ($length = 8) {
    $punc = 46..46
    $digits = 48..57
    $letters = 65..90 + 97..122

    # Thanks to
    $password = get-random -count $length `
        -input ($punc + $digits + $letters) |
            % -begin { $aa = $null } `
            -process {$aa += [char]$_} `
            -end {$aa}

    return $password
}

The script is pretty straightforward: the function random-password will generate a password of said length. Nice!

Ok, so far so good. I also needed to set the complex password as the user's default password for logging in the first time. I just smiled and asked: are you certain you want to do that? (I'll explain later…)

Quick sidestep: the function is great at generating random passwords! I did notice, however, that when setting the new password I got an error at times (randomly… go figure…)

Turns out it wasn't an error; it was just that the password wasn't totally compliant.
Then I remembered complex passwords have some requirements. So I fired up GPMC just to see what they were again…
Password complexity dictates that the password must:

Contain characters from three of the following four categories:

1: English uppercase characters (A through Z)
2: English lowercase characters (a through z)
3: Base 10 digits (0 through 9)
4: Non-alphabetic characters (for example, !, $, #, %)

Aaaah… So that's why it failed randomly! Some of the passwords only had two of the four categories. Ok, so I just need to make sure the password contains characters from at least three of the four categories.

Ok, I knew then and there I had to do something with regular expressions. I'll admit regular expressions aren't my forte. Lucky for me I have an "Arco" 🙂. Arco is a colleague of mine who is the personification of the UberGeek. His idea of "fun" is reprogramming HP SmartArray controllers. He can also program in assembly… Oh, and he's a whizz at regular expressions!!!
Ok, truth be told, I did try figuring it out for like 3 minutes, then I went straight to Arco! Don't judge me…

Turns out that Google is truly your friend! (Yeah… I know… in hindsight…) So Arco sent me this link.
Hey, why reinvent the wheel, eh?

function random-password ($length = 8) {
    $punc = 46..46
    $digits = 48..57
    $lowercase = 97..122
    $uppercase = 65..90
    $symbols = 35..37

    # Thanks to
    do {
        $password = get-random -count $length `
            -input ($punc + $digits + $uppercase + $symbols + $lowercase) |
                % -begin { $aa = $null } `
                -process {$aa += [char]$_} `
                -end {$aa}
    }
    until ($password -match "^(?=.*[A-Z])(?=.*[0-9])(?=.*[a-z]).{$length}$")

    return $password
}

So I tweaked the regular expression string to require at least one uppercase character, one lowercase character and one number; symbols are just a bonus. If the password meets the requirements, it gets returned.
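If you want to see the pattern in action, a few sample matches (sample passwords made up) behave like this:

```powershell
# The lookaheads each require one category; .{$length} enforces the length.
$length  = 8
$pattern = "^(?=.*[A-Z])(?=.*[0-9])(?=.*[a-z]).{$length}$"

"aB3def.h" -match $pattern   # upper, lower and digit present -> True
"abcdefgh" -match $pattern   # lowercase only -> False
"AB3DEFGH" -match $pattern   # no lowercase -> False
```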

The main part of the script looks like this:

foreach ($user in $csvUsers) {
    $newUserPassWord = random-password -length $PasswordLength

    $objRandomPW = new-object psobject -Property @{
        SamAccountName = $user.SamAccountName
        Name = $user.Name
        Password = $newUserPassWord
    }

    $arrExportRandomPW += $objRandomPW
}

$arrExportRandomPW | out-gridview

I edited it somewhat, but you get the general idea. Remember csv is my friend!

Ok, I did what was asked. Password random? Check! Complex and compliant? Check!


Now I want to get into something different…

As scripters we're eager to script! If I have to repeat anything twice, I'm already thinking about how to script it.

Passwords are sacred. So the first thing that popped into my head is: how are we going to distribute these passwords without compromising them? If I export them to a csv file and they fall into the wrong hands, then what?

“We’ll email them their passwords…” Ok… Problem was that we’re also migrating their mail system to Exchange… So yeah… there’s that…

“We’ll generate Word documents containing their password…” Still tricky, but ok. Who's generating the Word documents, I asked with a smirk on my face? “We’ll take care of that…” Nice!
BTW, if you're gonna print passwords, please use the Courier font! There's nothing more frustrating than trying to figure out whether it's the letter I (Irwin) or l (live); case in point: Il.

“I could also save the passwords encrypted in a csv file, but then you'd need the key to decrypt them.” Hmmm… is it worth the hassle? Who'll do the decrypting? If the key and code fall into the wrong hands, then we're back at square one!

I've been in a situation where I needed to report asap which accounts were enabled and hadn't logged in yet. Someone got fired (the name wasn't disclosed) and they needed to take some legal action. So in this case, having a default generic password was kind of a big issue… At times like that, an ADSI filter is your friend!

Here's my take on it all. I'd like to hear from you what your thoughts are about this.

The safest bet is to:

  • Enable accounts JIT (Just in time). Yes it might get a little busy at the beginning but you won’t have any rogue accounts activated.
  • Use a generic password agreed upon. The key is not to enable the account too soon, or a colleague may be able to log in with mal intent. Using a generic password just helps with the logistics. I've seen first hand what happens when complex passwords aren't typed in correctly: accounts get locked out, etc. etc…  It's a lot easier to enable JIT and use the time AD needs to sync to walk the user through their new configuration.
  • Make sure the user is required to change their password at logon. Just make sure to set the password ahead of time. Depending on your group policy settings, you might have to wait a day.

Ok, I think that covers it. It wasn't my intention to go on and on. Scripting is awesome, but… “With scripting comes responsibility…” Make sure your scripts do no harm, as far as it's up to you.

Hope it’s worth something to you…



Exporting in PowerShell

I’ll admit it, csv is my preferred input and output format. And why not? Most of the time data has been gathered using Excel.

My favorite output cmdlet in PowerShell has to be out-gridview. Three guesses why? 🙂

Sometimes I might want to export results and at other times, having the results in out-gridview is enough.

My default modus operandi is having results in out-gridview. I like to use the [switch] parameter to explicitly enable exporting to a file.

Most of my scripts end with the following piece of code:

if ($export) {
    $arrExportStatus | Export-CSV -NoTypeInformation "$pwd\export\$exportCSVSetOwnershipOnFiles" -delimiter $delimiter
}
else {
    if (!($PSCmdlet.MyInvocation.BoundParameters["Verbose"].IsPresent)) {
        $arrExportStatus | Out-GridView -Title "User Homedirectories SetOwnership"
    }
}
If I specified the -export parameter, then $export will be $true. Because of my script repository and my foolproof LogFileName, I'm ready to export my data with confidence! 😉 (There is method to my madness…) If I didn't specify -export, then I'll get the results in out-gridview.

You may have noticed that extra line of code; what's that for? Well, I'm glad you asked! There's this option [CmdletBinding()], which gives you access to some cool features like verbose output in PowerShell. I won't get into details right now, but if I specified export AND verbose, then I don't really need the results in out-gridview. So the export parameter will give me the results in a logfile. Without the export option, I'll get the results in out-gridview. If I use the -verbose option, then I'll just receive all the verbose information.
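To make that concrete, here's a minimal sketch (the parameter name is just an example) of a script header with an explicit -export switch, wrapped in a script block so you can try it directly:

```powershell
# Minimal sketch: [CmdletBinding()] plus a [switch] parameter to opt in to exporting.
$script = {
    [CmdletBinding()]
    param (
        [switch]$export
    )

    if ($export) { "Export-Csv" } else { "Out-GridView" }
}

& $script            # -> Out-GridView
& $script -export    # -> Export-Csv
```

The switch defaults to $false, so out-gridview stays the default behaviour and exporting is always a deliberate choice.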

Hope it’s worth something to you…



Foolproof logFileName

So your PowerShell skills are coming along nicely and you started generating output!

One of the things you’ll run into sooner or later (more sooner than you think) is the vast amount of output data and how to distinguish which script generated what output when and at times even with which input source.

So before I tell you how, I'll give you a little nugget to ponder on.

“Data + Meta Data = Information”

This little gem I picked up a while back reading about data security. There was an interesting chapter about external metadata and internal metadata. My takeaway from that chapter was this: there is value in adding some metadata to give you that edge. When saving data, save it in such a way that you can quickly discern what it's all about!

I've had periods where I'd quickly glance at the size of a file to decide whether to open it or not. My rule of thumb: if the file is less than 6 KB, it's probably a test. If it's bigger than, say, 100 KB, then it may have the info I'm looking for. But what happens if you want or need to revisit it some time later on? Will you still remember how you produced said data?

Enter the foolproof logFileName.

Whenever I need to export data I’ll use the following format: “date”_”InputFile”_”ScriptName”

Using this format tells me:

  • When I produced the file
  • Which input file I used to generate the data. That comes in handy if you ever need to reproduce something in the future.
  • Which script I used to generate the data for this file. I’ve had times I couldn’t remember which script produced this output

So that’s the why, here’s the how:

$LogDate = get-date -uformat "%Y-%m-%d"
$csvFile = "Temp.csv"
$ScriptName = $MyInvocation.MyCommand.Name

# -replace with an anchored pattern strips the extension; TrimEnd('.csv') would trim a set of characters, not the suffix
$LogFile = "$($LogDate)_$($csvFile -replace '\.csv$')_$($ScriptName -replace '\.ps1$').csv"

If you copy & paste and save the script (I named it Create-LogFileName.ps1… How original is that eh?) you’ll get a string like this: 2014-07-20_Temp_Create-LogFileName.csv

I generally use underscore as a separator in FileNames. Like I said, csv is my goto format so I preformatted it this way.

Well, that's my tip for generating foolproof log file names, so you can quickly discern what the content is all about.

Hope it’s worth something to you…



Script Repository – Part II

Ok, so I figured out how to post code in WordPress.

FYI, I've seen this many times; I just didn't know it was that easy…

Here's the link to get you started posting your source code. Pretty cool, huh?

$csvRepository = @"
Folder
ps1
sources
log
export
temp
"@ | ConvertFrom-Csv

function create-scriptrepository {
    param (
        $parentFolder,
        $folder
    )

    if(!(Test-Path "$parentFolder\$folder")) {
        New-Item -Path "$parentFolder\$folder" -ItemType Directory | Out-Null
    }
    Else {
        Write-Warning "Folder $parentFolder\$folder already exists."
    }
}

foreach($item in $csvRepository) {
    #Create script repository in parent folder
    create-scriptrepository -parentFolder C:\Temp -folder $item.Folder
}

Ok, here’s a quick rundown of what’s happening here…

ConvertFrom-Csv comes in handy when you don't want to go through all the hassle of creating and importing a csv file. The @"…"@ thing is a here-string (a multi-line literal string).
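A quick illustration (folder names made up) of the here-string + ConvertFrom-Csv combo:

```powershell
# A here-string piped to ConvertFrom-Csv gives you objects without a csv file on disk.
$csv = @"
Folder
ps1
log
"@ | ConvertFrom-Csv

$csv[0].Folder   # -> ps1
$csv.Count       # -> 2
```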

I used a parentFolder parameter just in case you want to create your repository elsewhere.
I'm a big fan of foreach. I know there are one-liners out there, but foreach just keeps things readable. It helps if someone else has to read your code.

So there you have it, your own script repository. Feel free to use it and add your own folders. I’m sure you’ll think of something… 😉

Tip: Keep the folder names to a bare minimum. If it's about SQL Server stuff, then sql is a great folder name (Really, Urv? I would have never guessed! You sir, have a vast grasp of the obvious…) Hehe…

Hope it's worth something to you. This helps me keep things tidy and organized. Up next: some tips on script naming that keep me sane.



Your script repository

Why have a script repository you say?

Well, over the years I've had scripts in so many folders, only to forget what lies where, using which file… you get the picture.

I'll let you in on a lil' secret… It all started when I categorized my digital music library by year – genre – artist. Well, it would seem it's a tick in the family. My little sister has her library sorted by genre and then alphabetized! So yeah, I guess I got off easy :p

But having your own repository is helpful. PowerShell ISE is my goto editor. I've set my default location to c:\scripts. That's the starting point of anything and everything. Ok, so what's the advantage? Well, let's say I need to open up a csv file; I'll just head to my subfolder $pwd\sources\csv. $pwd is one of those automatic variables that's readily available. So what else is in sources? Well, there are xml, txt and xlsx folders as well. So if I need an xml file, I'll just head on to $pwd\sources\xml

Did I generate a logFile? No problem! $pwd\log it is (Incidentally, I’ll be doing a blog on logFile names soon…) Is it temporary? $pwd\temp

Last but not least, the ps1 folder. Once you get to scripting, you're gonna end up with tons of stuff. Some scripts were just to try things out; others you'll keep around just in case (be sure to give each a name from which you can quickly identify its main purpose; I have some pointers on that as well). So if there's a certain category you're scripting for, why not gather all those scripts together?

I have a subfolder dsa in $pwd\ps1 for all my Active Directory User & Computer scripts. dssite for Sites & Services, gpmc for group policies, you get the idea. It may seem like a lot of hassle but it’s well worth it.

Oh, btw, I also have vbs, cmd, export, log and screenshots subfolders in c:\scripts.

If I need to change or copy to another folder, all I need to do is change the parent folder and I’m good to go!

I haven't quite figured out posting scripts yet. But once I've figured it out, I'll be sure to post it.

So what are your thoughts on keeping a script repository? You're welcome to share!

No way around it…

This just in…

PowerShell Mandatory

This is great news for all PowerShell enthusiasts!!! When it comes to managing Microsoft's newest server platform, Windows Server 2012 R2, you're gonna run into PowerShell somewhere along the way. Don't fight it, just embrace it.

What about managing Office 365? Yep, PowerShell! Windows Azure? PowerShell! Windows Server 2012 R2 is really geared for the cloud. Be honest: when did you last sit behind a server console? Exactly! Once you go PowerShell… well, you know what I'm saying 😉

“Resistance is futile you will be assimilated…”

Have a look at TechNet – Scripting with Windows PowerShell

Think of all the possibilities… Are you as excited as I am? Good!!! Now go and spread the gospel of PowerShell!!!