Monthly Archives: June 2015

Embrace the pipeline

I’ve been doing some reflection on my PowerShell scripting skills; let me explain. When I started scripting, KiXtart was the norm, and VBScript followed quickly. One of the things I struggled with while learning PowerShell was thinking in objects. I got that one down; the pipeline… well, that’s another story…

Jeffrey Hicks, along with other MVPs, did a PowerShell Blogging Week, which was great! One takeaway from that week was “Don’t Learn PowerShell, Use it!” The scripts that the MVPs made all looked and felt like cmdlets and took advantage of the pipeline!

So I reviewed my most frequently used scripts and sure enough, no pipeline support! Sure, the scripts do what they’re supposed to do, and in some cases they’re a bit over-engineered… OK, a lot!!!

I’m about to shame myself to make a point… Here goes…

Have a look at this:

# Enumerate Groups in CSV file
[CmdletBinding()]
param (
    [string]$csvFile,

    [ValidateSet(",", ";", "`t")]
    [string]$delimiter = "`t",

    [string]$logFile,

    [switch]$export
)

#region: CMTraceLog Function formats logging in CMTrace style
function CMTraceLog {
    Param (
        $Message,
        $Component,
        $ErrorMessage,
        $Type,
        $LogFile
    )
    $Time = Get-Date -Format "HH:mm:ss.ffffff"
    $Date = Get-Date -Format "MM-dd-yyyy"

    if ($ErrorMessage -ne "") {$Type = 3}
    if ($Component -eq $null) {$Component = " "}
    if ($Type -eq $null) {$Type = 1}

    $LogMessage = "<![LOG[$Message $ErrorMessage" + "]LOG]!><time=`"$Time`" date=`"$Date`" component=`"$Component`" context=`"`" type=`"$Type`" thread=`"`" file=`"`">"
    $LogMessage | Out-File -Append -Encoding UTF8 -FilePath $LogFile
}

#region: ADSI function Get-ADObjectDN
function Get-ADObjectDN {
    param (
        $type,
        $DNSDomainName,
        $CNObject
    )

    if ("group","user","printqueue","computer" -NotContains $type) {
        Throw "$($type) is not valid! Please use 'group','user','printqueue','computer'"
    }

    $root = [ADSI]"LDAP://$DNSDomainName"
    $searcher = new-object System.DirectoryServices.DirectorySearcher($root)

    if ($type -eq "printqueue") {
        $searcher.filter = "(&(objectCategory=$type)(printerName=$CNObject))"
    }
    elseif ($type -eq "computer") {
        $searcher.filter = "(&(objectCategory=$type)(sAMAccountName=$CNObject$))"
    }
    else {
        $searcher.filter = "(&(objectCategory=$type)(sAMAccountName=$CNObject))"
    }

    $ADObject = $searcher.findall()

    if ($ADObject.count -gt 1) {
        $count = 0

        foreach ($objFound in $ADObject) {
            write-host $count ": " $objFound.path
            $count = $count + 1
        }

        $selection = Read-Host "Please select item: "
        return $ADObject[$selection].path
    }
    else {
        return $ADObject[0].path
    }
}
#region: verify that the logFile exists
if (!(test-path "$pwd\log\$logFile")) {
    New-Item "$pwd\log\$logFile" -ItemType File
}

#region: Create hash CMTrace for splatting
$hshCMTrace = @{
    Message = ""
    Component = $(($MyInvocation.MyCommand.Name).TrimEnd('.ps1'))
    ErrorMessage = ""
    Type = 1 
    LogFile = "$pwd\log\$logFile"
}

#region: Reading CSV file. Stop if file isn't found
Write-Verbose "Script started : $(Get-Date)`n"
Write-Verbose "Reading CSV File $csvFile"

if (test-path "$pwd\source\csv\$csvFile") {
    Write-Verbose "Importing csv file: $pwd\source\csv\$csvFile`n"
    $hshCMTrace.Message = "Importing csv file $csvFile"
    $hshCMTrace.Type = 1
    CMTraceLog @hshCMTrace
    $csvUsers = Import-CSV "$pwd\source\csv\$csvFile" -delimiter $delimiter -Encoding UTF8
    $LogDate = get-date -uformat "%d-%m-%Y"
    $exportFile = "$($LogDate)_$(($csvFile).TrimEnd('.csv'))_$(($MyInvocation.MyCommand.Name).TrimEnd('.ps1')).csv"
    $arrExportUsers = @()
    $usersCount = $(@($csvUsers).count)
    $indexUsers = 0
    $indexMissing = 0
    $indexFound = 0
}
else {
    Write-Error -Message "File $csvFile not found...`n" -RecommendedAction "Please verify and try again...`n"

    $hshCMTrace.Message = "File $csvFile not found"
    $hshCMTrace.Type = 2
    CMTraceLog @hshCMTrace

    # Stop if file isn't found
    exit
}


$hshCMTrace.Message = "Users count in $csvFile : $usersCount"
$hshCMTrace.Type = 1 
CMTraceLog @hshCMTrace


#region: hashtable Progressbar
$progFindUsers = @{
    Activity = "Processing users in $($csvFile)"
    CurrentOperation = ""
    PercentComplete = 0
}

#region: hashtable UserAccountControl
#Have a look @site
$hshAccountControl = @{
    512 = "NORMAL_ACCOUNT"
}

Write-Verbose -Message "Script Started on $(get-date)"

$ADDomainInfo = Get-ADDomain
Write-Verbose -Message "Forest Distinguished Name: $($ADDomainInfo.DistinguishedName)"

foreach ($user in $csvUsers) {
    $progFindUsers.Status = "Processing user $($user.SamAccountName)"
    $progFindUsers.PercentComplete = ($indexUsers/$usersCount) * 100

    Write-Progress @progFindUsers
    #Get User DistinguishedName
    [ADSI]$ldapUser = Get-ADObjectDN -type "user" -DNSDomainName $ADDomainInfo.DistinguishedName -CNObject $user.SamAccountName

    if ($ldapUser -ne $null) {

        $value = $ldapUser.userAccountControl
        $ouIndex = $($ldapUser.DistinguishedName).IndexOf("OU=")
        $OU = ($ldapUser.DistinguishedName).Substring($ouIndex)

        $objUser = new-object psobject -Property @{
            Name = $($ldapUser.Name)
            DistinguishedName = $($ldapUser.DistinguishedName)
            WhenCreated = $($ldapUser.WhenCreated)
            sAMAccountName = $($ldapUser.sAMAccountName)
            AccountControl = $hshAccountControl.Item($($value))
            HomeDir = $($ldapUser.HomeDirectory)
            OU = $($OU)
        }
        $indexFound++
    }
    else {
        $objUser = new-object psobject -Property @{
            Name = $null
            DistinguishedName = $null
            WhenCreated = $null
            sAMAccountName = $($user.sAMAccountName)
            AccountControl = $null
            HomeDir = $null
            OU = $null
        }
        $indexMissing++

        $hshCMTrace.Message = "Missing: $($user.SamAccountName)"
        $hshCMTrace.Type = 2
        CMTraceLog @hshCMTrace
    }
    $arrExportUsers += $objUser
    $indexUsers++
}

$hshCMTrace.Message = "Users found in $csvFile : $indexFound"
$hshCMTrace.Type = 1 
CMTraceLog @hshCMTrace

$hshCMTrace.Message = "Users missing in $csvFile : $indexMissing"
$hshCMTrace.Type = 1 
CMTraceLog @hshCMTrace

if ($export) {
    Write-Verbose "Exporting results to $pwd\export\dsa\$exportFile"
    $arrExportUsers | select Name,SamAccountName,DistinguishedName,WhenCreated,AccountControl,HomeDir,OU | Export-CSV -NoTypeInformation "$pwd\export\dsa\$exportFile" -delimiter $delimiter -Encoding UTF8
}
else {
    if (!($PSCmdlet.MyInvocation.BoundParameters["Verbose"].IsPresent)) {
        $arrExportUsers | select Name,SamAccountName,DistinguishedName,WhenCreated,AccountControl,HomeDir,OU | Out-GridView -Title "Found Users $($csvFile) - $(Get-Date)"
    }
}
Write-Verbose "End script : $(Get-date)"	

I’m all over the place! There’s a bit of everything!!! Trust me, it works… Looking at it now, I’ve got to ask myself: what was I thinking??? This was a script from way back in the days when cmdlet performance was questionable… Rule of thumb: if there’s a cmdlet, use it! No need to reinvent the wheel…

So here’s a better (readable) version:

$Prop = @('canonicalname','homedirectory','mail','homedrive','scriptpath','initials','profilepath','userprincipalname')
Import-Csv -Path .\source\csv\moas-users.csv -Delimiter "`t" -Encoding UTF8 |
ForEach-Object {
    Get-ADUser -Filter "SamAccountName -eq '$($_.SamAccountName)'" -Properties $Prop |
    Select-Object Name,GivenName,SurName,Initials,mail,SamAccountName,Enabled,DistinguishedName,canonicalname,
        homedirectory,homedrive,scriptpath,profilepath,userprincipalname
}

This will get me the same information with waaay less code… No need for complex verification, progress bars, etc.… Clean and simple, processing each object as it goes through the pipeline…

It’s important to know beforehand what your objective is: is it a script or a tool? Don Jones sums it up nicely, stating:

“A script is something you usually make for yourself. Scripts are often quick and dirty, and although they might be long and complicated, they’re just a way for you to automate something that only you will ever do…”

Just because a script seems complicated doesn’t make it a tool; case in point: the former script is over-engineered, plain and simple. I got a bit carried away with all the possibilities in PowerShell.

If I had to start learning PowerShell today, I’d advise myself to better understand and use the pipeline…
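To make that concrete, pipeline support mostly boils down to a process block plus a parameter that binds from the pipeline. A minimal sketch (the function name is mine, purely for illustration):

```powershell
function Get-CsvUser {
    [CmdletBinding()]
    param (
        # Binds from each incoming object's SamAccountName property
        [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)]
        [string]$SamAccountName
    )
    process {
        # The process block runs once per object streaming through the pipeline
        Get-ADUser -Filter "SamAccountName -eq '$SamAccountName'"
    }
}
```

Now the CSV import can be piped straight into it (Import-Csv … | Get-CsvUser), and each row is processed as it arrives: no counters, no arrays to fill.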


Hope it’s worth something to you!




Robocopy is a DataMigration specialist’s best friend!

‘Sup PSHomies? 😛

I’ve been doing migrations for as long as I can remember now. Data migrations can be a challenge! If there’s one tool I can rely on to always come through, it has to be RoboCopy!!! The more I use it, the more my appreciation grows… Here are some of the crazy things you can do with RoboCopy.

Delete Folder content

Ever been in need of clearing a folder of content, but those pesky pop-ups keep appearing in Explorer? Oh, and don’t even get me started on the time it takes… Enter RoboCopy!

Just use an empty folder as source and do a /MIR! No longpath errors or anything like that. Just make sure you point it to the right folder you want to clear… Yeah… learned that the hard way…
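The whole trick is basically one line; the folder names here are placeholders, so double-check them before pressing Enter:

```powershell
# Create an empty folder and mirror it over the one you want to clear.
# /MIR makes the destination identical to the (empty) source.
New-Item C:\Temp\Empty -ItemType Directory -Force | Out-Null
robocopy C:\Temp\Empty D:\FolderToClear /MIR
```

Since the source is empty, mirroring it deletes everything in the destination, long paths included.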

Get FolderSize

This one is courtesy of Joakim Svendsen aka SvendsenTech! His PowerShell script (is there any other kind… 😛 ) will get you a folder size, as promised… fast! I’ve made use of it quite a few times!


RoboCopy’s logging capability is simply awesome and easy to parse using… I’ll let you fill it in… Hehe… I’ve spoiled many managers with RoboCopy log summaries in Excel, so much so that they wouldn’t have it any other way now! What I like about having all the RoboCopy jobs in Excel is that it’s easy to filter what went wrong, which job took the longest; even the speed is available! Now all I need to figure out is how pivot tables work…

Blog RCLogSummary

More on Logging

I recently needed to make a report on just how many PSTs were part of the data migration. PSTs are the worst! Once opened in Outlook, it’s a full sync all over again. Sure, I used RoboCopy to list the PSTs. Looking at the log output, I thought: “Wouldn’t it be nice to just have the files?” Guess what? You can!

Using /NDL (No Directory List – don’t log directory names), you’ll only have the files as log output. I added /FP (Include Full Pathname of files in the output) for good measure. To keep the logfile size to a minimum, just add /NP (No Progress – don’t display % copied).
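Put together, a list-only run looks something like this (the paths are placeholders; note /L, so nothing is actually copied):

```powershell
# List all PSTs with full paths, no directory lines, no progress clutter
robocopy \\SERVER\Share C:\Dummy *.pst /S /L /NDL /FP /NP /LOG:C:\Logs\pst-report.log
```

/S recurses into subfolders and /L makes it a dry run, so the log is the only output.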


Now I had all the necessary log information without any clutter! Nice!


Depending on your migration strategy, you might need to migrate data with security intact; that’s where /COPY:DATS can help. Omit the S and security isn’t processed. And if the security didn’t process at first, you can always do a /SECFIX. We usually use a Synology NAS to transport data. Synology is great for capacity; ACLs… well, I’m sure with some tinkering… probably. I can’t really say I know how… Glad that RoboCopy could help. Incidentally, Ralf Kerverzee figured out how /SECFIX worked! Go Ralfie!!!
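In RoboCopy terms (share names are placeholders):

```powershell
# Data, Attributes, Timestamps and Security in one go
robocopy \\SOURCE\Share \\TARGET\Share /MIR /COPY:DATS

# Security didn't make it the first time? Fix it up afterwards,
# even on files that are otherwise skipped as unchanged
robocopy \\SOURCE\Share \\TARGET\Share /MIR /COPY:DATS /SECFIX
```

Without /SECFIX, files that already match on size and timestamp are skipped, ACL changes and all; that’s exactly the gap it closes.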

Exclude Files/Folders

Depending on your window, you might need to exclude some files during the initial sync. And if you’re merging folders to one target, then there’s no way around that. I’ve used /XF and /XD to reduce data sync time from 20 hours to 3! There was one migration where they were effectively doing a full sync every time; remember, /MIR is unforgiving! It took some figuring out, but it was well worth the time. I usually create an Excel worksheet with all the necessary RoboCopy parameters, which I then use in CSV form as input for… Yep… you guessed it! 😉
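The flags themselves are straightforward; knowing what to exclude is the hard-won part (these patterns are just examples):

```powershell
# Skip temp files and a couple of folders that don't need to move
robocopy \\SOURCE\Share \\TARGET\Share /MIR /XF *.tmp ~*.* /XD 'System Volume Information' '$RECYCLE.BIN'
```

Fold those columns into your Excel/CSV parameter sheet and every job carries its own exclusions.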

Press repeat

Data migration is usually done in low-peak hours. You can have RoboCopy start during a certain time frame with /RH:hhmm-hhmm (Run Hours – times when new copies can be started). No need to create a scheduled job.
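For example, to keep a mirror job inside the nightly window (the times are, of course, yours to pick):

```powershell
# Only start new copies between 22:00 and 06:00
robocopy \\SOURCE\Share \\TARGET\Share /MIR /RH:2200-0600
```

RoboCopy simply waits until the window opens, so you can launch it whenever.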

But wait! There’s more!

Here’s a link to all the possible parameters you can use with RoboCopy. Definitely worth reading! I haven’t encountered a situation (yet) that RoboCopy couldn’t handle.

Well, that’s my tribute to RoboCopy!


Hope it’s worth something to you



Revisiting NTFS Longpaths issue


‘Sup PSHomies! It will be a thing… trust me…

Before the NTFSSecurity module there was a little known trick I used to get the NTFS rights on longpath folders: \\?\

There’s just one catch… It only works with Windows command-line utilities. ICACLS, MD, and ATTRIB can all make use of this prefix.

Let’s say I wanted to read the NTFS rights on a longpath folder using icacls. The command for this would be:

icacls "\\?\<DriveLetter:>\LongPath"

Notice the <DriveLetter:> syntax? Well, that’s because \\?\ can only be used as a prefix in front of a drive letter. It did mean that I had to make a drive mapping, an inconvenience I didn’t mind one bit!

Ok I’ll admit that I tried using a UNCPath, but that didn’t work… can’t blame a brother for trying eh? 😉

While I was using the NTFSSecurity module (did I mention how much I love this module?), I got an access denied on a folder (no, that’s not the exciting part). What I did notice was a familiar prefix of sorts: \\?\UNC\

So you can use the UNC path as well! All I had to do was omit the ‘\\’ from the path. Say I wanted to read the NTFS rights on ‘\\SERVERX\SHARE\Longpath’; the syntax would be:

icacls "\\?\UNC\SERVERX\SHARE\Longpath"

Awesome, right?!!! Now you might be asking yourself: “Well, that’s all gravy Urv, but why bother? Why not just use the NTFSSecurity module?”

There’s an old saying that goes: “A poor workman always blames his tools…”

There may be times when PowerShell isn’t readily available to you (I know, perish the thought!). Sometimes you have to make do with what’s available! When it comes to manipulating NTFS rights, you should be proficient in using different tools/utilities. I’ll admit, while I love PowerShell, Set-Acl has to be my least favorite cmdlet!

So what’s next?

Well, imagine enumerating a longpath folder without errors and retrieving the NTFS rights.

example ICACLS
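Something along these lines (the paths are placeholders; /T walks the subfolders, /C carries on past access-denied errors):

```powershell
icacls "\\?\D:\LongPathShare" /T /C > C:\Logs\longpath-acl.txt
```

The output lists each path followed by its ACEs, which is exactly what you’d feed to those regular expressions.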

Looking at the output, you should be able to do some neat formatting using regular expressions! Hey! I didn’t say you weren’t allowed to use PowerShell at all… Regular expressions… not my forte… Any takers in the PowerShell community? 🙂

Having alternative methods for retrieving NTFS rights is always a good thing. You can always use an alternative method for verification. Get-Acl (Get-Acl2 if using the NTFSSecurity module), ICACLS, and the security tab (always use Advanced mode) are my methods of choice for verification.

Hope it’s worth something to you



PS. has some great posts how to handle NTFS the PowerShell way! 😉