Category Archives: Miscellaneous

Pester script to validate GPOs Scope of Management

So here’s another spin on using Pester to validate operational readiness… 😉

Group policies can be pretty tricky! Troubleshooting can be a challenge. There might even be times that you start doubting yourself. Depending on the link order of your policies, you might not get what you expected…

Operations is dynamic: things get moved around, enabled/disabled, blocked; you name it and it's bound to happen.

How about… some way to validate your GPOs' Scope of Management! Once everything is working as it should, create a validation set you can verify later on. Trust me, I've been there… Using Pester will definitely give you that edge…

So I improvised a little on Ashley McGlone's GPO Report and made a function, Get-GPOsSoM. Just be sure to save it in the same folder as Domain-GPOSoM.Tests.ps1.

Now for the fun part! 🙂
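Here's a minimal sketch of what the Tests file could look like. Mind you, this assumes Get-GPOsSoM returns one object per GPO link with Target, DisplayName and LinkOrder properties; those property names are just my assumption here, so tweak them to match the actual output.

#Minimal sketch, assuming Get-GPOsSoM.ps1 sits in the same folder and returns
#objects with Target, DisplayName and LinkOrder properties
. "$PSScriptRoot\Get-GPOsSoM.ps1"

#Baseline: export this once when everything works as it should, e.g.
#Get-GPOsSoM | Export-Clixml "$PSScriptRoot\GPOSoM-Baseline.xml"
$baseline = Import-Clixml "$PSScriptRoot\GPOSoM-Baseline.xml"
$current  = Get-GPOsSoM

Describe 'GPO Scope of Management' {
    foreach ($gpo in $baseline) {
        It "GPO '$($gpo.DisplayName)' linked to '$($gpo.Target)' has LinkOrder $($gpo.LinkOrder)" {
            $actual = $current |
                Where-Object { $_.Target -eq $gpo.Target -and $_.DisplayName -eq $gpo.DisplayName }
            $actual.LinkOrder | Should Be $gpo.LinkOrder
        }
    }
}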

So here’s the result:

Pester Test GPO SoM

Now imagine someone changed your GPO link order:

Pester Test GPO Change Link Order

Run Pester test script again:

Pester Test GPO Change Link Order -Detected

No more doubt! The link order has been tampered with! This is definitely a game changer for Operations!

My new motto: "If you can automate it, you should test it" 😛

Pester for everyone!

Hope it’s worth something to you

Ttyl,

Urv

Pester script to Test DNS Configuration

So I recently blogged about Configuring DNS zones and resource records. While going through my twitter feed, I stumbled upon this little gem by Kevin Marquette. He recently did a session on Pester and uploaded his demo. If you're interested in Pester (as you should be), you should definitely check it out!

So one of the demos was a eureka moment for me: The Active.Directory.System.DC.tests.ps1!

Wait, you can do that? I thought Pester was a unit testing framework, not something for validating script output. So I can test if my script did what I expected it to do? (Pause to let that sink in.) Well alrighty then!!! 😛

So I decided to give it a go for the DNS Configuration.
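To give you an idea, here's a stripped-down sketch of the kind of tests I mean, using the autodiscover zone and IP addresses from my previous post (adjust the names to your own environment):

#Stripped-down sketch; zone name and IPs taken from the previous post
Describe 'DNS Configuration' {
    Context 'Zone autodiscover.domain.local' {
        $zone = Get-DnsServerZone -Name 'autodiscover.domain.local' -ErrorAction SilentlyContinue

        It 'exists' {
            $zone | Should Not BeNullOrEmpty
        }

        It 'is a primary zone' {
            $zone.ZoneType | Should Be 'Primary'
        }
    }

    Context 'A records' {
        $records = Get-DnsServerResourceRecord -ZoneName 'autodiscover.domain.local' -RRType A

        foreach ($ip in '192.168.1.4', '192.168.1.5') {
            It "has an A record for $ip" {
                $records.RecordData.IPv4Address.IPAddressToString -contains $ip |
                    Should Be $true
            }
        }
    }
}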

And here’s a screenshot of the results:

Pester-DNS

So there's more to Pester than meets the eye… Imagine the possibilities… No wonder Microsoft is shipping Pester with Windows Server 2016…

Hope it’s worth something to you

Ttyl,

Urv

Configuring DNS zones and resource records

“Hey Irwin! Do you have a script to configure DNS autodiscover?” Eh, no not really… I’m assuming you’re doing something with PowerShell DNS cmdlets? 😉 And so begins another PowerShell journey…

My colleague Ofir is our Exchange guy.

"Ok, so what exactly do you want to accomplish, Ofir?" "I'm trying to automate registering some resource records in DNS, but it isn't working. I'd like to add some IP addresses to a specific zone…" "Ok, let's see what you've got!"

Add-DnsServerResourceRecordA -Name "autodiscover" -ZoneName "domain.local" -AllowUpdateAny -IPv4Address "IP address 1","IP address 2","IP address 3" -TimeToLive 01:00:00

"So when I use the cmdlet directly it works. When I use variables it doesn't… Ideally the code should be re-usable…" Ofir's words, not mine… I'll admit, I teared up a bit… Kids, they grow up so fast… Hehe…

I think we can do this…

So Ofir was using Read-Host to get the ZoneName and IPv4Address. Ah! What a lovely opportunity to demonstrate params!

"Ok Ofir, instead of using Read-Host, we're better off using parameters. Using [CmdletBinding()] gives you the possibility to use Write-Verbose, at no extra charge!"

[CmdletBinding()]
Param(
   [string]$fqdn ='domain.local',
   [string[]]$ServerIPAddress
)

Now, because the resource record could be multi-valued, we'll go ahead and define a string array variable: [string[]]$ServerIPAddress.

"To make your code more readable we'll just go ahead and create a hashtable we can use to splat your parameters."

$dnsSplat = @{
   Name = 'AutoDiscover'
   Zonename = $fqdn
   AllowUpdateAny =  $true
   IPv4Address = $ServerIPAddress
   TimeToLive = '01:00:00'
}

"Now all we need to do is run the appropriate cmdlet and we're good!"

Add-DNSServerResourceRecordA @dnsSplat

Ok, so this got Ofir started… Mind you, there's no error handling or anything of that sort…
We did some tinkering on the fly and this was the end result:

<#

Author: I.C.A. Strachan
Version:
Version History:

Purpose: Create AutoDiscover Zone and add ResourceRecord

#>
[CmdletBinding()]
Param(
   [string]$fqdn ='domain.local',
   [string[]]$ServerIPAddress= @('192.168.1.4', '192.168.1.5')
)

BEGIN{
    $dnsRRA = @{
       Name = 'AutoDiscover'
       Zonename = "autodiscover.$($fqdn)"
       AllowUpdateAny =  $true
       TimeToLive = '01:00:00'
    }

    $dnsPZ = @{
        Name = "autodiscover.$($fqdn)"
        ReplicationScope = 'Forest'
        DynamicUpdate = 'Secure'
    }

    Import-Module DNSServer -Verbose:$false
}

PROCESS{
    #Only add the zone if the count is zero (it doesn't exist yet)
    If (@(Get-DnsServerZone $dnsPZ.name -ErrorAction SilentlyContinue ).Count -eq 0 ){
        Write-Verbose "Creating DNS Zone: $($dnsPZ.name)"
        Add-DnsServerPrimaryZone @dnsPZ
    }

    #Get string equivalent of all A records
    $RR = Get-DnsServerResourceRecord -ZoneName $dnsPZ.Name -RRType A |
    Out-String

    $ServerIPAddress | ForEach-Object {
        If (!$RR.Contains($_)){
            Write-Verbose "Adding resource record $_ to $($dnsPZ.name)"
            Add-DNSServerResourceRecordA @dnsRRA -IPv4Address $_
        }
    }
}

END{}

Ofir was quite happy! Nice! Another satisfied customer. So the other day I asked him to send me the code for future reference…

This is what he sent me:

What??? This wasn’t the code I expected! Turns out Ofir had a lot more he needed to configure. Just pointing him in the right direction was sufficient to generate this! Awesome! Give a man a fish… 😉

Go Ofir! It’s fun to see colleagues get excited about PowerShell. Sometimes all that’s needed is just a nudge in the right direction…

Ttyl,

Urv

Creating PowerShell GUIs

I’m old school. I’m a big fan of the “Real men don’t click” club!

My PowerShell bestie, Michaja van der Zouwen, is all about the GUI! We would go back and forth about to GUI or not to GUI… Good times…

We got a great introduction to creating PowerShell GUI apps at the recent DuPSUG meeting by none other than June Blender! Just google June Blender… We're being spoiled here!

Back in the day June was also part of the "Real (wo)men don't click" club. So what changed? I can totally relate to June's story about providing a script that could resolve a contractor's problem. Now you would think the manager would be grateful, eh? Nope! They weren't interested in learning or using PowerShell even if it solved their problem. The idea was too daunting for them. Enter the GUI.

The GUI took away that initial fear of learning something new. "Just run this and click this button!" Doesn't get easier than that, eh? So should we all be creating GUIs? Well, you should at least know how to 😉 Hence the workshop!

So I’m not against GUI, but it is a different mindset when creating a GUI app. You really need to think in events. You still need to validate parameters, but you need to anticipate what a user’s next move could be. The user’s move needs to be processed by ‘event-handlers’. With a script I have some parameters, I validate them and I’m good! With a GUI you need to think ahead of what could happen if…

We used Sapien PowerShell Studio to create a small GUI app. June gave us some excellent Gotcha and Aha tips! Sapien PowerShell Studio makes creating the GUI easy! Once the GUI interface was created we added the script logic. The event-handlers are basically script blocks!

Here’s where you need to think ahead:

What do I want to happen if the textbox value is empty? Then you shouldn't be able to click the button. Ok… But what if someone enters spaces only? We should validate that and make sure the value isn't empty or just spaces. But what if there's a lingering space somewhere? Make sure you Trim your textbox value.
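To give you an idea of what such an event-handler boils down to, here's a rough sketch in plain WinForms. The control names are hypothetical, and PowerShell Studio generates this kind of plumbing for you:

#Rough sketch of the validation idea in plain WinForms (hypothetical control names)
Add-Type -AssemblyName System.Windows.Forms

$form    = New-Object System.Windows.Forms.Form
$textbox = New-Object System.Windows.Forms.TextBox
$button  = New-Object System.Windows.Forms.Button
$button.Text    = 'Go'
$button.Top     = 30
$button.Enabled = $false   #Nothing to click until there's real input

#Event-handler: runs on every keystroke, so the button state always matches
#the textbox content. Trim() catches 'spaces only' and lingering spaces.
$textbox.Add_TextChanged({
    $button.Enabled = -not [string]::IsNullOrEmpty($textbox.Text.Trim())
})

$form.Controls.AddRange(@($textbox, $button))
[void]$form.ShowDialog()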

Sapien PowerShell Studio makes it easy to export the complete script. Just go to Deploy -> "Export To Clipboard". Paste it in ISE and run it! Works! Have a look at the script, there's a lot going on under the hood… Imagine creating that by hand…

All the script logic can be found under the 'User Generated Script' comment block. The rest PowerShell Studio took care of…

As an introduction to creating PowerShell GUI scripts, mission accomplished!

Here’s the link to June’s github repository for more information.

Thanks June, it was a pleasure meeting you in person! I'm more open to the idea of creating a GUI around scripts, only after putting up a fight though… Old habits die hard… Hehe…

Ttyl,

Urv


Revisiting Exporting Data

Sup PSHomies!

I’ve been playing with different formats lately. I’d like to share a few thoughts on the subject if I may… For demo purposes I’ll be using the following cmdlets: Export-Csv, Export-Clixml and ConvertTo-Json!

Export-Csv

I've talked about my love for exporting to csv in the past. Here's the thing: exporting to CSV treats everything as a string. For reporting purposes this might not be an issue. When it comes to nested objects… Yeah… Then you're better off exporting to XML. Incidentally, Jeff Hicks has a great blog on this topic, you should definitely check it out! CSV is still my go-to format because of reporting in Excel, although I've been using Doug Finke's ImportExcel module more and more! Doug's module cuts out the middle man and can export to Excel without having to export as a CSV first. It does a whole lot more! Worth looking into!
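A quick demo of the everything-becomes-a-string behavior, if you want to see it for yourself:

#Everything round-tripped through CSV comes back as a string
Get-Process -Id $PID |
    Select-Object Name, Id, WorkingSet64 |
    Export-Csv "$env:TEMP\proc.csv" -NoTypeInformation

$fromCsv = Import-Csv "$env:TEMP\proc.csv"
$fromCsv.Id.GetType().Name   #String, not Int32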

Export-Clixml

Exporting a nested object is pretty straightforward using Export-Clixml. The structure isn't pretty though. That was the main reason I didn't use the cmdlet. Export-Clixml is great when used in combination with Import-Clixml; it restores your nested object without a hitch! You can export your results, send the file and import it elsewhere for further processing if needed. When I think of xml, I immediately conjure up ideas of html reporting. The xml tags are too cryptic for any css style, I wouldn't even know where to begin. I recently discovered PScribo (thanks to the PowerShell Summit in Stockholm), a module by Iain Brighton! This made html reporting a breeze! All I did was import my XML file back into PowerShell to retrieve my nested object and I did the rest in PowerShell! That was awesome!

ConvertTo-Json

The ConvertTo-Json cmdlet was introduced in PowerShell version 3.0. Back then I was a stickler for XML so I briefly looked at it and forgot all about it… That is, until Azure Resource Manager came along. If you're doing anything with Azure Resource Manager then Json should be on your radar. If you're not convinced, just look at the ARM templates out there. Json is a lot easier on the eyes for sure. Still not convinced? Just google Json vs XML.

Ok, here's some code you can play with to get a general idea of the possibilities when exporting to different formats. Have a look at the Json and Xml; which would you prefer? That was rhetorical… 😉
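Something along these lines will do the trick; a stand-in sketch, pick any nested object you like:

#A nested object exported three ways; compare the files side by side
$object = [PSCustomObject]@{
    Name     = 'Demo'
    Enabled  = $false
    Settings = [PSCustomObject]@{
        Path  = 'C:\Temp'
        Count = 42
    }
}

$object | Export-Csv "$env:TEMP\demo.csv" -NoTypeInformation   #Nested object flattened to a type name
$object | Export-Clixml "$env:TEMP\demo.xml"                   #Full fidelity, cryptic tags
$object | ConvertTo-Json | Out-File "$env:TEMP\demo.json"      #Readable, and false stays Boolean

#Round-trip to see what survives
(Import-Clixml "$env:TEMP\demo.xml").Settings.Count                  #42, still an Int32
(Get-Content "$env:TEMP\demo.json" -Raw | ConvertFrom-Json).Enabled  #False, a Boolean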

Bottom line

Export-Csv is best when you need to report anything in Excel workbooks and you're not worried about type. Everyone gets Excel.

Export-Clixml isn't pretty, but it's excellent when it comes to keeping the data's metadata intact. You can always import it, preserving the metadata, and process further in PowerShell.

Use Json if you want a structured data set à la XML. Json is a lot friendlier than XML. I was also surprised that the cmdlet interpreted values as best it could: False became a Boolean, as you would expect. Json is growing on me…

Hope it’s worth something to you…

Ttyl,

Urv

See what I did there? 😉

I've talked about creating your own script repository in the past. I like having some sort of structure. GitHub goes further: it also helps with version control and so much more! My idea of version control is adding -v# after any script, don't judge me…

These days everything seems to be in GitHub! The PowerShell DSC Resource Modules are there. Azure PowerShell Module & Templates are there. You can even deploy Azure Resources from GitHub! You can also use Git for continuous deployment for website content/apps.

Visual Studio supports GitHub as well, for continuous delivery.

I just discovered Gist! Doug Finke created a GUI to help create a template to extract Data from text files. I noticed that the code was straight out of GitHub. So I did a little bit of googling and sure enough, I found out how! Gist lets you share snippets of code with others. All you need to do is copy & paste the url and you’re good to go!

Best part of Gist is having your code "As-Is". Using WordPress plugins sometimes gets the "<#", "#>" or "&" wrong, replacing them with their HTML equivalents. This way you'll never have to worry about your code being mutilated anymore! Sweet!!!

Here’s an article to help you on your way getting started with GitHub.


GitHub is (going to be) one of those things that you need to master…

Hope it’s worth something to you…

Ttyl,

Urv


Sup PSHomies?

So this regular expression thing has its advantages. Like many things, practice makes perfect…

In the past whenever I saw regex I'd flatline… There wasn't really a need to use it; Select-String with a dash of -like, -contains, -split here and there got me pretty far, so why bother, eh? Well for one thing regex is for the big boys, much like LDAPFilter on Active Directory cmdlets… Guess it's time to put on my big boy pants and step my game up! 😛

Jeffrey Hicks recently blogged on converting Text to PSCustomObjects using regex. Anything Jeffrey publishes is golden…

Jeffrey’s pattern is easy enough to follow…

[regex]$pattern = '(?<ID>\w+)\s+(?<Chassis>\d)\s+(?<Slot>\d)\s+(?<RAIDID>\w+)\s+(?<Status>\w+)\s+(?<Type>\w+)\s+(?<Media>\w+)'

Here’s a quick breakdown:

  • \w+ : Matches any word character, as much as possible (that's what the + is for)
  • \s+ : Matches any whitespace character, as much as possible
  • \d : Matches any decimal digit

Here's a link to a regex cheat sheet explaining what the character classes actually do…

Fun fact: I’ve been the proud owner of the Windows PowerShell Pocket reference (first edition, for quite some time I may add) and there is a whole chapter dedicated to regular expressions! Go figure…

The trick is to use capture names in your pattern to store the results. I saw this in the robocopy script by Joakim Svendsen. What I didn't realize was that you can retrieve the capture names using the GetGroupNames() method, nice! Just one thing: the first group name is always 0 it seems, so just skip that one.

$pattern.GetGroupNames() | select -skip 1

This makes enumerating the names easy! Just head on over to Jeffrey’s blog to see how it’s done… 😉

In the meantime here's a lil' something to help you verify that the pattern works… 😉

@'
ID Chassis Slot RAIDID Status Type Media Spare SizeGB
=====================================================
c0d0  0     0    c0r0    Ok     sas  HDD   -    150
c0d0  0     0    c0r1    Ok     sas  HDD   -    150
c0d1  0     1    c0r0    FAILED sas  SDD   -    150
c0d1  0     1    c0r1    FAILED sas  SDD   -    150
'@ | Out-File "$env:TEMP\PatternFile.txt"
 
[regex]$pattern = '(?<ID>\w+)\s+(?<Chassis>\d)\s+(?<Slot>\d)\s+(?<RAIDID>\w+)\s+(?<Status>\w+)\s+(?<Type>\w+)\s+(?<Media>\w+)'
 
$captureNames = $pattern.GetGroupNames() | Select-Object -Skip 1

get-content "$env:TEMP\PatternFile.txt" |
ForEach-Object {
  if($_ -match $pattern){
    $captureResults = @{}
    foreach($captureName in $captureNames){
      $captureResults.Add($captureName,$Matches.$captureName)
    }
    [PSCustomObject]@{
      String = $_
      MatchFound = ($_ -match $pattern)
      regExResults = [PSCustomObject]$captureResults
    }
  }
  Else {
    [PSCustomObject]@{
      String = $_
      MatchFound = ($_ -match $pattern)
      regExResults = ''
    }
  }
} 


Hehe… Hope it’s worth something to you

Ttyl,

Urv

‘Sup PSHomies?

Got a lil’ somethin’ for ya… Get-RCLogSummary! As you know I’m a big fan of RoboCopy! I thought I’d share one of the perks of using RoboCopy: the LogFile.

Here's a list of RoboCopy logging options, courtesy of ss64.com:

   Logging options
                /L : List only - don’t copy, timestamp or delete any files.
               /NP : No Progress - don’t display % copied.
          /unicode : Display the status output as Unicode text.  ##
         /LOG:file : Output status to LOG file (overwrite existing log).
      /UNILOG:file : Output status to Unicode Log file (overwrite)
        /LOG+:file : Output status to LOG file (append to existing log).
     /UNILOG+:file : Output status to Unicode Log file (append)
               /TS : Include Source file Time Stamps in the output.
               /FP : Include Full Pathname of files in the output.
               /NS : No Size - don’t log file sizes.
               /NC : No Class - don’t log file classes.
              /NFL : No File List - don’t log file names.
              /NDL : No Directory List - don’t log directory names.
              /TEE : Output to console window, as well as the log file.
              /NJH : No Job Header.
              /NJS : No Job Summary.

My preference when it comes to logging is to have separate logfiles instead of appending to one big file. The /NP option is a no-brainer: displaying '%' copied will give you an indication of how long each file/folder took, but who wants that, right? It will only increase your logfile size, taking more time to parse it down the line. I recently used /NDL and I must say this will keep your logfile footprint small. I did include /FP to still have an idea where each file is being copied from. I'd go with /NDL in combination with /FP when doing a delta-sync. A delta-sync is a robocopy job that will copy the differences once a full-sync has taken place. If a file hasn't changed robocopy will skip it. Only new and newer files will be copied… Ok, enough background, let's get scripting shall we? 😛
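To give you an idea, a delta-sync job with my preferred logging switches could look something like this (paths and retry values made up, adjust to taste):

#Hypothetical delta-sync: copy subfolders, skip unchanged files, no progress,
#no directory list, full pathnames, a separate timestamped logfile per run
$stamp = Get-Date -Format 'dd-MM-yyyy_HHmm'
robocopy \\source\share\home \\target\share\home /E /COPY:DATSO /R:1 /W:1 /NP /NDL /FP /LOG:".\log\rc\home\$stamp.log"

Now, on to parsing those logfiles: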

Function Get-RCLogSummary{
  param(
    [String]$LogFileName,
    [String[]]$LogSummary
  )

  $objLogSummary = @{
    rcLogFile = $LogFileName
    Speed = ''
  }

  Foreach($line in $logSummary) {
    switch ($line){
      #Header
      {$_ | select-string '   Source :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Source',$_.Substring(11).Trim())
        }
      {$_ | select-string '     Dest :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Target',$_.Substring(11).Trim())
        }
      {$_ | select-string '  Started :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Start',$($_.Substring(11).Trim()))
        }
      #Footer
      {$_ | select-string '    Dirs :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('TotalDirs',$_.Substring(11,10).Trim())
          $objLogSummary.Add('FailedDirs',$_.Substring(51,10).Trim())
          $objLogSummary.Add('CopiedDirs',$_.Substring(21,10).Trim())
        }
      {$_ | select-string '   Files :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('TotalFiles',$_.Substring(11,10).Trim())
          $objLogSummary.Add('FailedFiles',$_.Substring(51,10).Trim())
          $objLogSummary.Add('CopiedFiles',$_.Substring(21,10).Trim())
        }
      {$_ | select-string '   Bytes :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('TotalBytes',$_.Substring(11,10).Trim())
          $objLogSummary.Add('FailedBytes',$_.Substring(51,10).Trim())
          $objLogSummary.Add('CopiedBytes',$_.Substring(21,10).Trim())
        }
      {$_ | select-string '   Ended :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('End',$($_.Substring(11).Trim()))
        }
      {$_ | select-string '   Speed :'}
        {
          $_= $_.ToString()
          $objLogSummary.Speed = $($_.Substring(11).Trim())
        }
      {$_ | select-string '   Times :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Time Total',$($_.Substring(11,10).Trim()))
        }
    }
  }

  #return $objLogSummary
  [PSCustomObject]$objLogSummary
}

#region:array with all LogSummary Object Properties
$arrRCProperties = @(
  'rcLogFile',
  'Source',
  'Target',
  'TotalDirs',
  'TotalFiles',
  'TotalBytes',
  'FailedDirs',
  'FailedFiles',
  'FailedBytes',
  'CopiedDirs',
  'CopiedFiles',
  'CopiedBytes',
  'Start',
  'End',
  'Time Total',
  'Speed'
)
#endregion

#region: Get all robocopy LogFiles in specified folder and get Summary
get-childitem '.\log\rc\home\22-06-2015' -File |
ForEach-Object {
  #region: Get File Header & Footer
  $arrSummary  = (Get-Content $_.FullName)[5..8] #Header
  $arrSummary += (Get-Content $_.FullName)[-11..-1] #Footer
  #endregion

  Get-RCLogSummary -LogFileName $_.Name -LogSummary $arrSummary
}|
Select-Object $arrRCProperties |
Out-GridView
#endregion

First I get a list of logfiles and retrieve lines 6 through 9 (the header) and the last 11 lines (the footer) of each file for processing. The LogFileName & summary array are then passed as parameters to Get-RCLogSummary. I did a select to get the properties in a certain order. It was a toss-up between using an [Ordered] hashtable or defining a [PSCustomObject] beforehand. I figured you could minimize the properties you want by tweaking $arrRCProperties yourself. Last but not least, use Out-GridView or Export-Csv to see the end result.

I'm working on my pipeline skills; trust me, my previous version was more 'elaborate', and by elaborate I mean over-engineered…

So I guess you've noticed that regular expressions are missing? Robocopy's labels are fixed, which is a good thing for me. I'm looking into it…


This regular expression stuff isn't as easy as it seems… This works, just don't include /BYTES in your robocopy parameter list; in that case you'll definitely need regular expressions. Version 2.0 I guess…

Hope it’s worth something to you

Ttyl,

Urv

Embrace the pipeline

I've been doing some reflection on my PowerShell scripting skills, let me explain. When I started scripting, KiXtart was the norm; VBScript followed quickly. One of the things I struggled with while learning PowerShell was to think in objects. I got that one down; the pipeline… well, that's another story…

Jeffrey Hicks, along with other MVPs, did a PowerShell Blogging Week which was great! One takeaway from that week was "Don't Learn PowerShell, Use it!" The scripts that the MVPs made all looked and felt like cmdlets and took advantage of the pipeline!

So I reviewed my most frequently used scripts and sure enough, no pipeline support! Sure, the scripts do what they're supposed to do, and in some cases they're a bit over-engineered… Ok, a lot!!!

I’m about to shame myself to make a point… Here goes…

Have a look at this:

<# 
.SYNOPSIS 
    Enumerate Groups in CSV file
.DESCRIPTION 
    
.NOTES 
    Author: 
.LINK 
    
#>
[CmdletBinding()]
param(
    [string]$csvFile="Users.csv", 
    
    [string]$logFile="udm-dsa.log",
    
    [ValidateSet(",", ";", "`t")]
    [string]$delimiter = "`t",
     
    [switch]$Export
)

#region: CMTraceLog Function formats logging in CMTrace style
function CMTraceLog {
    Param (
        [String]$Message,

        [String]$Component,
        
        [String]$ErrorMessage,

        [ValidateRange(1,3)]
        [Int]$Type,

        [Parameter(Mandatory=$true)]
        $LogFile
    )

    $Time = Get-Date -Format "HH:mm:ss.ffffff"
    $Date = Get-Date -Format "MM-dd-yyyy"

    if ($ErrorMessage -ne "") {$Type = 3}
    if ($Component -eq $null) {$Component = " "}
    if ($Type -eq $null) {$Type = 1}

    $LogMessage = "<![LOG[$Message $ErrorMessage" + "]LOG]!><time=`"$Time`" date=`"$Date`" component=`"$Component`" context=`"`" type=`"$Type`" thread=`"`" file=`"`">"
    $LogMessage | Out-File -Append -Encoding UTF8 -FilePath $LogFile
}
#endregion

#region: ADSI function Get-ADObjectDN
function Get-ADObjectDN {
    param (
        [Parameter(Mandatory=$true)]
        [String]$type,
        [String]$DNSDomainName="",
        [String]$CNObject
    )

    if ("group","user","printqueue","computer" -NotContains $type) {
        Throw "$($type) is not valid! Please use 'group','user','printqueue','computer'"
    }

    $root = [ADSI]"LDAP://$DNSDomainName"
    $searcher = new-object System.DirectoryServices.DirectorySearcher($root)

    if ($type -eq "printqueue") {
        $searcher.filter = "(&(objectCategory=$type)(printerName=$CNObject))"
    }
    Elseif ($type -eq "computer") {
        $searcher.filter = "(&(objectCategory=$type)(sAMAccountName=$CNObject$))"
    }
    Else {
        $searcher.filter = "(&(objectCategory=$type)(sAMAccountName=$CNObject))"
    }

    $ADObject = $searcher.findall()

    if ($ADObject.count -gt 1) {     
        $count = 0

        foreach($objFound in $ADObject) {
            write-host $count ": " $objFound.path 
            $count = $count + 1
        }

        $selection = Read-Host "Please select item: "
        return $ADObject[$selection].path
    }
    else {
        return $ADObject[0].path
    }
}
#endregion

#region: verify that the logFile exists
if(!(test-path "$pwd\log\$logFile")) {
    New-Item "$pwd\log\$logFile" -ItemType File
}
#endregion

#region: Create hash CMTrace for splatting
$hshCMTrace = @{
    Message = ""
    Component = $(($MyInvocation.MyCommand.Name).TrimEnd('.ps1'))
    ErrorMessage = ""
    Type = 1 
    LogFile = "$pwd\log\$logFile"
}
#endregion

#region: Reading CSV file. Stop if file isn't found
Write-Verbose "Script started : $(Get-Date)`n"
Write-Verbose "Reading CSV File $csvFile"

if (test-path "$pwd\source\csv\$csvFile") {
    Write-Verbose "Importing csv file: $pwd\source\csv\$csvFile`n"
    
    $hshCMTrace.Message = "Importing csv file $csvFile"
    $hshCMTrace.Type = 1
    CMTraceLog @hshCMTrace
    
    $csvUsers = Import-CSV "$pwd\source\csv\$csvFile" -delimiter $delimiter -Encoding UTF8
    $LogDate = get-date -uformat "%d-%m-%Y"
    $exportFile = "$($LogDate)_$(($csvFile).TrimEnd('.csv'))_$(($MyInvocation.MyCommand.Name).TrimEnd('.ps1')).csv"
    $arrExportUsers = @()
    $usersCount = $(@($csvUsers).count)
    $indexUsers = 0
    $indexMissing = 0
    $indexFound = 0
} 
else {
    Write-Error -Message "File $csvFile not found...`n" -RecommendedAction "Please verify and try again...`n"

    $hshCMTrace.Message = "File $csvFile not found"
    $hshCMTrace.Type = 2 
    CMTraceLog @hshCMTrace

    exit
}

$hshCMTrace.Message = "Users count in $csvFile : $usersCount"
$hshCMTrace.Type = 1 
CMTraceLog @hshCMTrace

#endregion

#region: hashtable Progressbar
$progFindUsers = @{
    Activity = "Processing users in $($csvFile)"
    Status="Searching"
    CurrentOperation = ""
    PercentComplete = 0
}
#endregion

#region: hashtable UserAcccountControl
#Have a look @site http://maxvit.net/userAccountControl
$hshAccountControl =@{
    66048 = "NORMAL_ACCOUNT - ACCOUNT_ENABLED - DONT_EXPIRE_PASSWORD"
    66080 = "NORMAL_ACCOUNT - ACCOUNT_ENABLED - DONT_EXPIRE_PASSWORD - PASSWD_NOTREQD"
    512 = "NORMAL_ACCOUNT"
    514 = "NORMAL_ACCOUNT - ACCOUNTDISABLE"
    544 = "NORMAL_ACCOUNT - ACCOUNT_ENABLED - PASSWD_NOTREQD"
    546 = "NORMAL_ACCOUNT - ACCOUNTDISABLE - PASSWD_NOTREQD"
}
#endregion

Write-Verbose -Message "Script Started on $(get-date)"

$ADDomainInfo = Get-ADDomain
Write-Verbose -Message "Forest Distinguished Name: $($ADDomainInfo.DistinguishedName)"


foreach ($user in $csvUsers) {
    $progFindUsers.Status = "Processing user $($user.SamAccountName)"
    $progFindUsers.PercentComplete = ($indexUsers/$usersCount) * 100

    Write-Progress @progFindUsers
    
    #Get User DistinguishedName
    [ADSI]$ldapUser = Get-ADObjectDN -type "user" -DNSDomainName $ADDomainInfo.DistinguishedName -CNObject $user.SamAccountName

    if ($ldapUser -ne $null){

        $value = $ldapUser.userAccountControl
        
        $ouIndex = $($ldapUser.DistinguishedName).IndexOf("OU=")
        $OU = ($ldapUser.DistinguishedName).Substring($ouIndex)

        $objUser = new-object psobject -Property @{
            Name = $($ldapUser.Name)
            DistinguishedName =$($ldapUser.DistinguishedName)
            WhenCreated = $($ldapUser.WhenCreated)
            sAMAccountName =$($ldapUser.samACCountName)
            AccountControl = $hshAccountControl.Item($($value))
            HomeDir =$($ldapUser.HomeDirectory)
            OU= $($OU)
        }
        
        $indexFound++
    } 
    else {
        $objUser = new-object psobject -Property @{
            Name = $null
            DistinguishedName =$null
            WhenCreated = $null
            sAMAccountName =$($user.samACCountName)
            AccountControl = $null
            HomeDir =$null
            OU= $null
        }

        $indexMissing++

        $hshCMTrace.Message = "Missing: $($user.SamAccountName)"
        $hshCMTrace.Type = 2 
        CMTraceLog @hshCMTrace
    }

    $arrExportUsers += $objUser
    $indexUsers++
}

$hshCMTrace.Message = "Users found in $csvFile : $indexFound"
$hshCMTrace.Type = 1 
CMTraceLog @hshCMTrace

$hshCMTrace.Message = "Users mssing in $csvFile : $indexMissing"
$hshCMTrace.Type = 1 
CMTraceLog @hshCMTrace

if ($export) {
    "`r"
    Write-Verbose "Exporting results to $pwd\export\dsa\$exportFile"
    $arrExportUsers| select Name,SamAccountName,DistinguishedName,WhenCreated,AccountControl,HomeDir,OU |  Export-CSV -NoTypeInformation "$pwd\export\dsa\$exportFile" -delimiter $delimiter -Encoding UTF8
}
else {
    if (!($PSCmdlet.MyInvocation.BoundParameters["Verbose"].IsPresent)) {
        $arrExportUsers | select Name,SamAccountName,DistinguishedName,WhenCreated,AccountControl,HomeDir,OU  | Out-GridView -Title "Found Users $($csvFile) - $(Get-Date)"
    }
}

"`n"
Write-Verbose "End script : $(Get-date)"	

I'm all over the place! There's a bit of everything!!! Trust me, it works… Looking at it now, I gotta ask myself: what was I thinking??? This was a script from way back in the days when cmdlet performance was questionable… Rule of thumb: if there's a cmdlet, use it! No need to reinvent the wheel…

So here's a better (readable) version:

$Prop =  @('canonicalname','homedirectory','mail','homedrive','scriptpath','initials','profilepath','userprincipalname')
import-csv -Path .\source\csv\moas-users.csv -Delimiter "`t" -Encoding UTF8 |
ForEach-Object {
    get-aduser -Filter "SamAccountName -eq '$($_.SamAccountName)'" -properties $Prop  |
    Select-Object Name,GivenName,SurName,Initials,mail,SamAccountName,Enabled,DistinguishedName,canonicalname,
        @{name='OU';expression={($_.DistinguishedName).SubString($_.DistinguishedName.indexof('OU='))}},
        homedirectory,homedrive,scriptpath,profilepath,userprincipalname
} |
Out-GridView

This will get me the same information with waaay less code… No need for complex verification, progress bars, etc… Clean and simple… Processing each object as it goes through the pipeline…

It is important to know beforehand what your objective is: is it a script or a tool? Don Jones sums it up nicely:

"A script is something you usually make for yourself. Scripts are often quick and dirty, and although they might be long and complicated, they're just a way for you to automate something that only you will ever do…"

Just because a script seems complicated doesn't make it a tool, case in point. The former script is over-engineered, plain and simple. I got a bit carried away with all the possibilities in PowerShell.

If I had to start learning PowerShell today I'd advise myself to better understand and use the pipeline…

Simplicity

Hope it’s worth something to you!

Ttyl,

Urv

SDDL gives more NTFS insight

I've been doing migrations for, oh say, the past 10 years (hmmm, that's long if I do say so myself). Data migrations can be complex depending on what needs to be achieved. I remember using ScriptLogic to map drives depending on which subnet a user was on; that was way before DFS was available… Good times…

I've had my share of headaches when it comes to Data migrations. The biggest challenge is interoperability, when Target resources keep on using Source resources until all Source resources have been migrated. Sometimes it's just not possible to migrate all Source resources at once (what we affectionately call a 'big bang'). If data is being mutated by different departments/projects that aren't migrated at the same time, then interoperability is your only choice… Still tricky though…

Ok so here’s the scenario: Migrate Resources from one AD Forest to another (with a trust in place). I’ll take you through the Data part 🙂

The key component is to use SIDHistory. SIDHistory will help resolve whether or not you have access to a Source resource. My favorite replication tool has to be robocopy! It wasn't love at first sight, but once I figured out all the parameters, there isn't much you can't accomplish with it!

For interoperability we usually redirect Target resources to the Source. This way data mutation can still be achieved without disturbing Production. In the meantime data is being synced to the Target domain with ACLs intact! Why? We'll get to that later… Or might as well get into it now… 🙂

Ok, so the ACL (Access Control List) is that list you get when you open up a file or folder's security tab. The accounts in it are referred to as ACEs (Access Control Entries). That's where you'd grant/remove an account read/write/full/etc. access to said file or folder. When using SIDHistory your access token will resolve correctly, but here's where it gets tricky…

I've copied data with robocopy keeping security intact. When I opened a folder's security tab I noticed the Target account name being displayed. That threw me off, because I hadn't ReACLed the target resource yet.

Quick sidestep: ReACL is a term I came across using Quest Active Directory Manager (now DELL). ReACL can be done by adding the Target account (doubling the amount of ACEs) or doing a cleanup by first adding the Target account and removing the Source account. You can also roll back if needed, but that one is tricky, especially if SIDHistory has more than one entry.

But you wouldn’t know that by looking at the folder Security tab.

If you really want to find out who has access, SDDL will let you know. SDDL uses an object's SID to grant or deny access. Thing is, SDDL is hard to read, hence the Security tab. So the first time I ReACLed a folder adding the Target account, I saw that the ACEs did double, but I only saw the Target account. I expected to see SOURCE\ACCOUNT;TARGET\ACCOUNT, instead I was seeing the TARGET\ACCOUNT twice. Here's where looking at SDDL will give you more insight… Suffice to say we'll be doing this the PowerShell way… Oh come on! Don't act so surprised! 😛

So first let’s get the ACL of the folder you want to inspect (try this on your folder):

$acl = Get-Acl '\\162.198.1.129\g$\GRP\DATA\DEPT-001-XYZ'

To find out who has access, type $acl.Access. This will give you a list of all ACEs in the ACL. This is the list you'd also see in Explorer's security tab (the Advanced view mind you, I noticed that). Now for the fun part, $acl.Sddl… Tada!!!

$acl.Sddl

O:S-1-5-21-103234515-1370883554-928726630-1008G:S-1-5-21-103234515-1370883554-928726630-513D:P(A;OICI;FA;;;SY)(A;OICI;FA;;;BA)(A;OICI;0x1301bf;;;S-1-5-21-103234515-1370883554-928726630-4307)(A;OICI;0x1301bf;;;S-1-5-21-103234515-1370883554-928726630-4308)(A;OICI;0x1200a9;;;S-1-5-21-103234515-1370883554-928726630-4309)

Seems complicated? Well, yes it is; still, it's worth figuring out… Have a look at MSDN for more information.

The tell-tale is the Domain SID; every account SID begins with it. Looking at the Domain SID tells you who actually has access (or not) to said resource and which domain that account belongs to.

The Domain SID for the current domain I’m inspecting is:
DomainSID : S-1-5-21-602145358-1453371165-789345543
You can get the Domain SID using Get-ADDomain cmdlet… 😉
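In case you're wondering, that one-liner looks like this:

#Domain SID of the current domain (use -Server / -Identity for another domain)
(Get-ADDomain).DomainSID.Value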

I picked an ACE from the $acl.access list:

FileSystemRights : Modify, Synchronize
AccessControlType : Allow
IdentityReference : SOURCE\DEPT-001-XYZ-RXWR
IsInherited : False
InheritanceFlags : ContainerInherit, ObjectInherit
PropagationFlags : None

Let's get some AD properties from this account:

Get-ADGroup -Identity DEPT-001-XYZ-RXWR -Server source.nl -Properties SID,SIDHistory
..
SamAccountName : DEPT-001-XYZ-RXWR
SID : S-1-5-21-602145358-1453371165-789345543-35829
SIDHistory : S-1-5-21-103234515-1370883554-928726630-4307

Here’s the sddl string once more:

O:S-1-5-21-103234515-1370883554-928726630-1008G:S-1-5-21-103234515-1370883554-928726630-513D:P(A;OICI;FA;;;SY)(A;OICI;FA;;;BA)(A;OICI;0x1301bf;;;S-1-5-21-103234515-1370883554-928726630-4307)(A;OICI;0x1301bf;;;S-1-5-21-103234515-1370883554-928726630-4308)(A;OICI;0x1200a9;;;S-1-5-21-103234515-1370883554-928726630-4309)

This group has access using SIDHistory!!!

Ok, now what? Well, in an ideal situation the data would have been ReACLed using the current SID instead of the SIDHistory. The reason for that is to clean up your SIDHistory to avoid token bloat. Here's an excellent blog by the dirteam discussing the perils of token bloat.
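By the way, if you're wondering where SIDHistory is still lingering in your domain, a quick inventory is just one filter away (a sketch using stock AD cmdlets, adjust to taste):

#Quick inventory of accounts (users, groups, computers) still carrying SIDHistory
Get-ADObject -Filter 'SIDHistory -like "*"' -Properties SamAccountName, SIDHistory |
    Select-Object SamAccountName, ObjectClass,
        @{Name = 'SIDHistory'; Expression = { $_.SIDHistory -join ';' }}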

This only scratched the surface of what you could investigate! There aren't many (free) tools that can help. Ashley McGlone has an excellent series on the matter, definitely worth reading.

I'm currently doing a Data migration (surprise!), so I'll be adding more tips/tricks/gotchas as it progresses. Stay tuned!

Hope this will steer you in the right direction when it comes to figuring out who has access…
The rabbit hole goes deep…

Ttyl,

Urv