Category Archives: Uncategorized

Custom Intellisense for AD cmdlets with SearchBase parameter

'Sup PSHomies!

You gotta love the PowerShell community! Found this little gem in my twitter feed (again) :-). Trevor Sullivan demonstrates how we can create custom intellisense for cmdlets where it hasn't been provided yet. Great video! Trevor really does a great job explaining this.

The first thing that came to mind was Active Directory! I can't tell you how often I needed the DistinguishedName of an OU. Now imagine having a dynamic list generated for you! No more errors, just select and you're good to go! Excited??? I sure am!

Sometimes you need to limit your searchbase, depending on your AD size. Let's say I want to retrieve all users starting from a specific point:

Get-ADUser -Filter * -SearchBase 'OU=Users,OU=IT,DC=pshirwin,DC=local'

A simple typo will generate an error. Distinguished names are notorious for being lengthy…

Now the obvious AD cmdlets would be Get-ADUser, Get-ADGroup & Get-ADComputer. So that got me thinking: just how many AD cmdlets have SearchBase as a parameter?

Get-Command -Module ActiveDirectory |
Where-Object { $_.ParameterSets.Parameters.Name -eq 'SearchBase' } |
Select-Object -ExpandProperty Name

Turns out there are quite a few using SearchBase:

  • Get-ADComputer
  • Get-ADFineGrainedPasswordPolicy
  • Get-ADGroup
  • Get-ADObject
  • Get-ADOptionalFeature
  • Get-ADOrganizationalUnit
  • Get-ADServiceAccount
  • Get-ADUser
  • Search-ADAccount

So I can have Intellisense on all these cmdlets? Awesome!!!
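
Trevor covers the how in his video; as a minimal sketch, here's the general idea using the built-in Register-ArgumentCompleter cmdlet (PowerShell 5.0+), offering every OU's DistinguishedName as a completion for -SearchBase:

#Minimal sketch: offer OU DistinguishedNames as completions for -SearchBase
$searchBaseCmdlets = @(
   'Get-ADComputer', 'Get-ADFineGrainedPasswordPolicy', 'Get-ADGroup',
   'Get-ADObject', 'Get-ADOptionalFeature', 'Get-ADOrganizationalUnit',
   'Get-ADServiceAccount', 'Get-ADUser', 'Search-ADAccount'
)

Register-ArgumentCompleter -CommandName $searchBaseCmdlets -ParameterName SearchBase -ScriptBlock {
   param($commandName, $parameterName, $wordToComplete, $commandAst, $fakeBoundParameters)

   Get-ADOrganizationalUnit -Filter * |
   Where-Object { $_.DistinguishedName -like "*$wordToComplete*" } |
   ForEach-Object {
      #Quote the completion text; DistinguishedNames are riddled with commas
      [System.Management.Automation.CompletionResult]::new(
         "'$($_.DistinguishedName)'",
         $_.DistinguishedName,
         'ParameterValue',
         $_.DistinguishedName
      )
   }
}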

Intellisense completed the DistinguishedName on -SearchBase for me. No need to type it in, no errors, just select and go!

[Screenshot: TabExpansion on -SearchBase]

Here’s the result:

[Screenshot: TabExpansion on -SearchBase, the result]

I'm sure you guys will find your own use for this… Thanks again, Trevor, for bringing this to our attention! Good looking out for the community! Be sure to watch Trevor's video for an in-depth explanation.

Hope it’s worth something to you…

Ttyl,

Urv

Configuring DNS zones and resource records

“Hey Irwin! Do you have a script to configure DNS autodiscover?” Eh, no not really… I’m assuming you’re doing something with PowerShell DNS cmdlets? 😉 And so begins another PowerShell journey…

My colleague Ofir is our Exchange guy.

"Ok, so what exactly do you want to accomplish, Ofir?" "I'm trying to automate registering some resource records in DNS, but it isn't working. I'd like to add some IP addresses to a specific zone…" "Ok, let's see what you've got!"

Add-DnsServerResourceRecordA -Name "autodiscover" -ZoneName "domain.local" -AllowUpdateAny -IPv4Address "IP address 1","IP address 2","IP address 3" -TimeToLive 01:00:00

"So when I use the cmdlet directly it works. When I use variables it doesn't… Ideally the code should be re-usable…" Ofir's words, not mine… I'll admit, I teared up a bit… Kids, they grow up so fast… Hehe…

I think we can do this…

So Ofir was using Read-Host to get the ZoneName and IPv4Address values. Ah! What a lovely opportunity to demonstrate params!

"Ok Ofir, instead of using Read-Host, we're better off using parameters. Using [CmdletBinding()] gives you the possibility to use Write-Verbose, at no extra charge!"

[CmdletBinding()]
Param(
   [string]$fqdn = 'domain.local',
   [string[]]$ServerIPAddress
)

Now, because the resource record could be multi-valued, we'll define a string array variable: [string[]]$ServerIPAddress.

"To make your code more readable, we'll just go ahead and create a hashtable we can use to splat your parameters."

$dnsSplat = @{
   Name = 'AutoDiscover'
   Zonename = $fqdn
   AllowUpdateAny =  $true
   IPv4Address = $ServerIPAddress
   TimeToLive = '01:00:00'
}

"Now all we need to do is run the appropriate cmdlet and we're good!"

Add-DNSServerResourceRecordA @dnsSplat

Ok, so this got Ofir started… Mind you there’s no error handling or anything of that sort…
We did some tinkering on the fly and this was the end result:

<#

Author: I.C.A. Strachan
Version:
Version History:

Purpose: Create AutoDiscover Zone and add ResourceRecord

#>
[CmdletBinding()]
Param(
   [string]$fqdn = 'domain.local',
   [string[]]$ServerIPAddress = @('192.168.1.4', '192.168.1.5')
)

BEGIN{
    $dnsRRA = @{
       Name = 'AutoDiscover'
       Zonename = "autodiscover.$($fqdn)"
       AllowUpdateAny =  $true
       TimeToLive = '01:00:00'
    }

    $dnsPZ = @{
        Name = "autodiscover.$($fqdn)"
        ReplicationScope = 'Forest'
        DynamicUpdate = 'Secure'
    }

    Import-Module DNSServer -Verbose:$false
}

PROCESS{
    #Only add the zone if the count is zero (it doesn't exist yet)
    If (@(Get-DnsServerZone $dnsPZ.name -ErrorAction SilentlyContinue ).Count -eq 0 ){
        Write-Verbose "Creating DNS Zone: $($dnsPZ.name)"
        Add-DnsServerPrimaryZone @dnsPZ
    }

    #Get string equivalent of all A records
    $RR = Get-DnsServerResourceRecord -ZoneName $dnsPZ.Name -RRType A |
    Out-String

    $ServerIPAddress | ForEach-Object {
        If (!$RR.Contains($_)){
            Write-Verbose "Adding resource record $_ to $($dnsPZ.name)"
            Add-DNSServerResourceRecordA @dnsRRA -IPv4Address $_
        }
    }
}

END{}
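
Calling it looks something like this (the script name here is made up):

#Hypothetical script name; -Verbose shows the zone and record messages
.\Add-AutoDiscover.ps1 -fqdn 'domain.local' -ServerIPAddress '192.168.1.4','192.168.1.5' -Verbose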

Ofir was quite happy! Nice! Another satisfied customer. So the other day I asked him to send me the code for future reference…

This is what he sent me:

What??? This wasn’t the code I expected! Turns out Ofir had a lot more he needed to configure. Just pointing him in the right direction was sufficient to generate this! Awesome! Give a man a fish… 😉

Go Ofir! It’s fun to see colleagues get excited about PowerShell. Sometimes all that’s needed is just a nudge in the right direction…

Ttyl,

Urv

Creating PowerShell GUIs

I’m old school. I’m a big fan of the “Real men don’t click” club!

My PowerShell bestie, Michaja van der Zouwen, is all about the GUI! We would go back and forth about to GUI or not to GUI… Good times…

We got a great introduction to creating a PowerShell GUI app at the recent DuPSUG meeting by none other than June Blender! Just google June Blender… We're being spoiled here!

Back in the day, June was also part of the "Real (wo)men don't click" club. So what changed? I can totally relate to June's story about providing a script that could resolve a contractor's problem. Now you would think the manager would be grateful, eh? Nope! They weren't interested in learning or using PowerShell even if it solved their problem. The idea was too daunting for them. Enter GUI.

GUI took away that initial fear of learning something new. “Just run this and click this button!” Doesn’t get easier than that eh? So should we all be creating GUIs? Well you should at least know how to 😉 Hence the workshop!

So I’m not against GUI, but it is a different mindset when creating a GUI app. You really need to think in events. You still need to validate parameters, but you need to anticipate what a user’s next move could be. The user’s move needs to be processed by ‘event-handlers’. With a script I have some parameters, I validate them and I’m good! With a GUI you need to think ahead of what could happen if…

We used Sapien PowerShell Studio to create a small GUI app. June gave us some excellent Gotcha and Aha tips! Sapien PowerShell Studio makes creating the GUI easy! Once the GUI interface was created we added the script logic. The event-handlers are basically script blocks!

Here’s where you need to think ahead:

What do I want to happen if the textbox value is empty? Then you shouldn't be able to click the button. Ok… But what if someone enters spaces only? We should validate that and make sure the value isn't empty or just spaces. But what if there's a lingering space somewhere? Make sure you Trim your textbox value.
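
Here's a minimal sketch of that thinking as event-handler script blocks (the control names $textboxName and $buttonOK are made up; PowerShell Studio generates and wires up the real ones):

$textboxName_TextChanged = {
   #Disable the button while the textbox is empty or whitespace-only
   $buttonOK.Enabled = -not [string]::IsNullOrWhiteSpace($textboxName.Text)
}

$buttonOK_Click = {
   #Trim lingering spaces before using the value
   $name = $textboxName.Text.Trim()
   [System.Windows.Forms.MessageBox]::Show("Processing $name") | Out-Null
}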

Sapien PowerShell Studio makes it easy to export the complete script. Just go to Deploy -> "Export To Clipboard". Paste in ISE and run it! Works! Have a look at the script, there's a lot going on under the hood… Imagine creating that by hand…

All the script logic can be found under the 'User Generated Script' comment block. The rest PowerShell Studio took care of…

As an introduction to creating PowerShell GUI scripts, mission accomplished!

Here’s the link to June’s github repository for more information.

Thanks June it was a pleasure meeting you in person! I’m more open to the idea of creating a GUI around scripts, only after putting up a fight though… Old habits die hard… Hehe…

Ttyl,

Urv


Revisiting Exporting Data

Sup PSHomies!

I’ve been playing with different formats lately. I’d like to share a few thoughts on the subject if I may… For demo purposes I’ll be using the following cmdlets: Export-Csv, Export-Clixml and ConvertTo-Json!

Export-Csv

I've talked about my love for exporting to CSV in the past. Here's the thing: exporting to CSV treats everything as a string. For reporting purposes this might not be an issue. When it comes to nested objects… Yeah… Then you're better off exporting to XML. Incidentally, Jeff Hicks has a great blog on this topic, you should definitely check it out! CSV is still my go-to format because of reporting in Excel, although I've been using Doug Finke's ImportExcel module more and more! Doug's module cuts out the middle man and can export to Excel without having to export as a CSV first. It does a whole lot more! Worth looking into!

Export-Clixml

Exporting a nested object is pretty straightforward using Export-Clixml. The structure isn't pretty though. That was the main reason I didn't use the cmdlet. Export-Clixml is great when used in combination with Import-Clixml; it restores your nested object without a hitch! You can export your results, send the file and import it elsewhere for further processing if needed. When I think of XML, I immediately conjure up ideas of HTML reporting. The XML tags are too cryptic for any CSS style, I wouldn't even know where to begin. I recently discovered PScribo (thanks to the PowerShell Summit in Stockholm), a module by Iain Brighton! This made HTML reporting a breeze! All I did was import my XML file back into PowerShell to retrieve my nested object and I did the rest in PowerShell! That was awesome!

ConvertTo-Json

The ConvertTo-Json cmdlet was introduced in PowerShell 3.0. Back then I was a stickler for XML, so I briefly looked at it and forgot all about it… That is, until Azure Resource Manager came along. If you're doing anything with Azure Resource Manager, then Json should be on your radar. If you're not convinced, just look at the ARM templates out there. Json is a lot easier on the eyes for sure. Still not convinced? Just google Json vs XML.

Ok here’s some code you can play with to get a general idea of what the possibilities are when exporting to different formats. Have a look at the Json and Xml, which would you prefer? That was rhetorical… 😉
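
Something along these lines (a minimal sketch, file names made up): one nested object, exported three ways.

$server = [PSCustomObject]@{
   Name    = 'Server01'
   Online  = $false
   Network = [PSCustomObject]@{
      IPAddress = '192.168.1.4'
      Gateway   = '192.168.1.1'
   }
}

#CSV: everything becomes a string; the nested Network object flattens to its type name
$server | Export-Csv -Path .\server.csv -NoTypeInformation

#Clixml: not pretty, but Import-Clixml restores the nested object intact
$server | Export-Clixml -Path .\server.xml

#Json: readable, and $false stays a Boolean
$server | ConvertTo-Json | Set-Content -Path .\server.json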

Bottom line

Export-Csv is best when you need to report anything in Excel workbooks and you're not worried about type. Everyone gets Excel.

Export-Clixml isn't pretty, but it's excellent when it comes to keeping the data's metadata intact. You can always import it, metadata preserved, and process it further in PowerShell.

Use Json if you want to have a structured data set à la XML. Json is a lot friendlier than XML. I was also surprised that the cmdlet interpreted values as best it could: False became a Boolean, as you would expect. Json is growing on me…

Hope it’s worth something to you…

Ttyl,

Urv

What I picked up at DevOpsNL

Just got back from the DevOpsNL Hackathon days. It was a very humbling experience, I mean that in a good way! Developers are definitely a different breed! I remember Lee Holmes asking at the PowerShell Summit: "Are you a developer or a scripter? Guess what? You're more of a developer than you think!" Well, after these days, I realize I'm no developer by far!

The DevOpsNL hackathon days' objective was to introduce the participants to DevOps practices/tooling, in which they succeeded! Microsoft did an excellent job providing nourishment and accommodation for all interested participants! Our DevOps guides were Thiago Almeida and Rasmus Hald. They recently did a DevOps Hackathon in Belgium.

DevOps is definitely a way of working. It's all about the alignment of People, Processes & Products.

Here's a quick rundown of the primary DevOps practices:

  • Infrastructure as Code (IaC)
  • Continuous Integration
  • Automated Testing
  • Continuous Deployment
  • Release Management
  • App Performance Monitoring
  • Load Testing & Auto-Scale

After a quick DevOps presentation, teams were created to work on some creative ideas, using DevOps practices and tools. It's one thing to know the practices, it's another thing to work that way.

We used Microsoft's ecosystem to better understand DevOps practices. Each team got a $100 Azure pass for the DevOpsNL days.

My team was predominantly operations guys, with one developer. Operations guys are quick to click and ask questions later. One of the DevOps practices is spending more time working the idea out; once the idea is crystal clear, use the Kanban method to further implement and manage the project. Ideally you would have a project manager to fill this position. Developers are definitely a (bit) more disciplined bunch. They get automated testing, continuous deployment, release management, etc. In hindsight, we should have spent more time here.

Infrastructure as Code (IaC)

Now we're talking! Infrastructure I get. Here's a lil' telltale that will let you know you're not quite a developer: Infrastructure still excites you! The developers were like… "meh". What got the developers going? Azure Application Insights! Having all that information and metrics about their application to better understand it. There were lively discussions about having dashboards to represent the proper functioning of an app. It was fun to witness their excitement.

Suffice to say, IaC caught my attention. Now I have some affinity with Azure, and I've dabbled in automating deployments to Azure, but using Azure Resource Manager was new; specifically, using Visual Studio to deploy to Azure was new. Want another telltale to let you know you're not a developer? You start Visual Studio and you're trying to figure out where everything is. I enjoy using PowerShell ISE with ISESteroids, but when it comes to ARM, you need Visual Studio, if only to better understand Azure Json templates.

Deploying a complete Infrastructure in Azure to support Release management

The IaC we needed was a website (with SQL at the back end) with development, quality assurance and production environments. Operations guys love deploying stuff! Ask an Ops guy for a website and he'll probably deploy a VM, install the OS & updates and, last but not least, install (and if you're lucky, configure) IIS before releasing it for dev/qa/prod. Azure really makes the DevOps practices of Continuous Integration, Continuous Deployment and Release Management a lot easier to implement. Deploying an Azure website is easy; using ARM was a bit more challenging.

For the different environments we were advised to use slots. Thiago gave me a quick rundown of the general idea. You have several options to implement your dev/qa/prod environment:

  1. Keep it all separate.
  2. Use slots to distinguish your environments.
  3. Use slots as a way to speed up release of your webapp.

Deploying the slots was a little bit more than I bargained for, something I'll definitely look into in the near future. One of the DevOps practices is knowing what to prioritize; deploying the slots automatically didn't make the cut.

My take away from the DevOpsNL days

Here’s what I picked up at the DevOpsNL days:

  • Develop a DevOps mindset. You may think you get it, but really spend some more time better understanding the DevOps mindset. There's always room for improvement.
  • Spend more time on collaboration. It's easy to dive head first into implementing solutions.
  • Make sure everyone knows what has priority and what doesn’t.
  • Don’t forget to enjoy the journey. I got caught up in IaC and missed out on interacting with the other teams.

For Infrastructure guys:

  • There's more to an Infrastructure than just deploying servers/VMs.
  • You need to understand the basics of websites even if you're not a webdeveloper.
  • Learn how to deploy to Azure using ARM. At a certain point I was tempted to switch back to ASM. Rasmus said it best: "That will only be a temporary fix, ARM is the way of the future."
  • Json, learn it! Better still, learn by looking at the templates out there. Trying to figure out Azure Json templates from scratch is quite an undertaking! The templates will get you there quicker.
  • Get proficient at using Visual Studio.
  • Realize that you're just one of the cogs in this big wheel we call DevOps!

DevOps is more than just a soundbite, it’s a mindset. I think I can better understand a developer’s dilemma in the future…  Hope it’s worth something to you…

Ttyl,

Urv

See what I did there? 😉

I've talked about creating your own script repository in the past. I like having some sort of structure. GitHub goes further: it also helps with version control and so much more! My idea of version control is adding -v# after any script, don't judge me…

These days everything seems to be on GitHub! The PowerShell DSC resource modules are there. The Azure PowerShell module & templates are there. You can even deploy Azure resources from GitHub! You can also use Git for continuous deployment of website content/apps.

Visual Studio supports GitHub as well for continuous delivery.

I just discovered Gist! Doug Finke created a GUI to help create a template to extract data from text files. I noticed that the code was straight out of GitHub. So I did a little bit of googling and sure enough, I found out how! Gist lets you share snippets of code with others. All you need to do is copy & paste the URL and you're good to go!

The best part of Gist is having your code "as-is". WordPress plugins sometimes get the "<#", "#>" or "&" wrong, replacing them with their HTML equivalents. This way you'll never have to worry about your code being mutilated again! Sweet!!!

Here’s an article to help you on your way getting started with GitHub.


GitHub is (going to be) one of those things that you need to master…

Hope it’s worth something to you…

Ttyl,

Urv

‘Sup PSHomies

So APP-V has this file, AppxManifest.xml, where the Version, VersionID and PackageID are stored amongst other things. My last interaction with software virtualization was when APP-V still went by the name SoftGrid… Yeah… So I was kinda reluctant, that was until I heard that the APP-V client is all about PowerShell! This pleases me… 😉 I guess that's why my colleague thought of me in the first place… I do love me some PowerShell… Hehe…

So he explained that the .appv file is just a zip file… Wait, what? So I can extract the content just like a zip file? Good to know!!! I remember seeing this tactic used once by Bartek Bielawski in his Excel module. It would seem that a .xlsx file is also just a zip file.

I've been using PowerShell version 5 for quite some time. There are two cmdlets available for zipping and unzipping respectively: Compress-Archive and Expand-Archive. This will get you all the files. My first attempt was to extract all the files and then select the AppxManifest.xml file, but that seemed like a lot of unnecessary work for just one file… Hmmm… Guess it's .NET then…

Before the archive cmdlets, the only way to deal with zip files was .NET. Just load the assembly:

Add-Type -assembly 'system.io.compression.filesystem'

Rule of thumb is to use a cmdlet if available, but in this case I only wanted that one file… I checked whether Expand-Archive supported extracting a single file; it doesn't, so using .NET is justified (for the moment).

My first attempt was to copy the .appv file and rename it to .zip. But then I would need to clean up files/folders afterwards… Why not just read the file, eh? Just one thing: you need a handle on the file you opened in order to dispose of it later on… The function will extract the AppxManifest file and prepend the .appv filename to the AppxManifest.xml file, using an underscore as a separator, in the subfolder Manifest… Whew! Still following me?

Ok… enough chit-chat here’s the code…


<#
   Author: I. Strachan
   Version: 1.0
   Version History:

   Purpose: Extract The AppxManifest.xml file from the .appv file.
            Get the Version,PackageID and VersionID from AppxManifest.xml
#>
function Get-AppxManifest {
   [CmdletBinding()]
   param (
      [Parameter(Mandatory = $true)]
      [ValidateScript({ Test-Path -Path $_ -PathType Container })]
      [string]$Source
   )

   begin{
      #Load Assembly for Zip
      Add-Type -assembly 'system.io.compression.filesystem'

      #get AppV Files from parent Folder
      $appVFiles = Get-ChildItem -Path $Source -Recurse -Filter ('*.appv')
      $manifestFolder = "$($PSScriptRoot)\Manifest"
  
      if(!(Test-Path -Path $manifestFolder)) {
         Write-Verbose 'Manifest folder does not exist. Creating folder'
         New-Item -Path $manifestFolder -Type Directory | Out-Null
      }
   }

   process{
      
      #get AppxManifest.xml content
      $appVFiles | 
      ForEach-Object {
         $FileName = $_.BaseName
         $FilePath = $_.FullName

         #Create handle in order to dispose of the file later on
         $hndZipFile = [System.IO.Compression.ZipFile]::OpenRead($_.FullName)

         $hndZipFile.Entries | 
         Where-Object { $_.FullName -like 'AppxManifest.xml' } |
         ForEach-Object {
            $File = Join-Path $manifestFolder "$($FileName)_$($_.FullName)"
            [IO.Compression.ZipFileExtensions]::ExtractToFile($_, $File, $true)

            #Typecast to Get XML 
            [XML]$xml = Get-Content $File

            [PSCustomObject]@{
               Version = $xml.Package.Identity.Version
               PackageId = $xml.Package.Identity.PackageId
               VersionId = $xml.Package.Identity.VersionId
               DisplayName = $xml.Package.Properties.DisplayName
               AppVPath = $FilePath
            }
         }

         #Close handle on File
         $hndZipFile.Dispose()
      }
   }

   end{}
}

Just point the function to the folder hosting the .appv files and you'll get an overview of the Version, VersionID, PackageID, DisplayName and AppVPath.
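
Something like this (the script name and folder are made up; dot-source first so the Manifest subfolder lands next to the script):

. .\Get-AppxManifest.ps1
Get-AppxManifest -Source 'D:\AppV\Packages' | Format-Table -AutoSize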

[Screenshot: Get-AppxManifest output]

The fun part for me was using .NET to extract that single file. I'd forgotten that you need to keep a handle on things yourself when using .NET.

Hope it’s worth something to you…

Ttyl,

Urv

‘Sup PSHomies?

Got a lil’ somethin’ for ya… Get-RCLogSummary! As you know I’m a big fan of RoboCopy! I thought I’d share one of the perks of using RoboCopy: the LogFile.

Here’s a list of RoboCopy Logging options, courtesy of ss64.com

   Logging options
                /L : List only - don’t copy, timestamp or delete any files.
               /NP : No Progress - don’t display % copied.
          /unicode : Display the status output as Unicode text.  ##
         /LOG:file : Output status to LOG file (overwrite existing log).
      /UNILOG:file : Output status to Unicode Log file (overwrite)
        /LOG+:file : Output status to LOG file (append to existing log).
     /UNILOG+:file : Output status to Unicode Log file (append)
               /TS : Include Source file Time Stamps in the output.
               /FP : Include Full Pathname of files in the output.
               /NS : No Size - don’t log file sizes.
               /NC : No Class - don’t log file classes.
              /NFL : No File List - don’t log file names.
              /NDL : No Directory List - don’t log directory names.
              /TEE : Output to console window, as well as the log file.
              /NJH : No Job Header.
              /NJS : No Job Summary.

My preference when it comes to logging is to have separate logfiles instead of appending to one big file. The option /NP is a no-brainer: displaying '%' would give you an indication of how long a specific file/folder took, but who wants that, right? It only increases your logfile size, taking more time to parse down the line. I recently used /NDL and I must say this will keep your logfile footprint small. I did include /FP to still have an idea where each file is being copied from. I'd go with /NDL in combination with /FP when doing a delta-sync. A delta-sync is a robocopy job that copies the differences once a full-sync has taken place: if a file hasn't changed, robocopy will skip it; only new and newer files will be copied… Ok, enough background, let's get scripting shall we? 😛
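
Before we get to the parsing, here's roughly what such a delta-sync job looks like (server, share and log path are made up):

robocopy \\SERVER01\Home \\SERVER02\Home /MIR /NP /NDL /FP /LOG:.\log\rc\home\delta-sync.log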

Function Get-RCLogSummary{
  param(
    [String]$LogFileName,
    [String[]]$LogSummary
  )

  $objLogSummary = @{
    rcLogFile = $LogFileName
    Speed = ''
  }

  Foreach($line in $logSummary) {
    switch ($line){
      #Header
      {$_ | select-string '   Source :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Source',$_.Substring(11).Trim())
        }
      {$_ | select-string '     Dest :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Target',$_.Substring(11).Trim())
        }
      {$_ | select-string '  Started :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Start',$($_.Substring(11).Trim()))
        }
      #Footer
      {$_ | select-string '    Dirs :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('TotalDirs',$_.Substring(11,10).Trim())
          $objLogSummary.Add('FailedDirs',$_.Substring(51,10).Trim())
          $objLogSummary.Add('CopiedDirs',$_.Substring(21,10).Trim())
        }
      {$_ | select-string '   Files :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('TotalFiles',$_.Substring(11,10).Trim())
          $objLogSummary.Add('FailedFiles',$_.Substring(51,10).Trim())
          $objLogSummary.Add('CopiedFiles',$_.Substring(21,10).Trim())
        }
      {$_ | select-string '   Bytes :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('TotalBytes',$_.Substring(11,10).Trim())
          $objLogSummary.Add('FailedBytes',$_.Substring(51,10).Trim())
          $objLogSummary.Add('CopiedBytes',$_.Substring(21,10).Trim())
        }
      {$_ | select-string '   Ended :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('End',$($_.Substring(11).Trim()))
        }
      {$_ | select-string '   Speed :'}
        {
          $_= $_.ToString()
          $objLogSummary.Speed = $($_.Substring(11).Trim())
        }
      {$_ | select-string '   Times :'}
        {
          $_= $_.ToString()
          $objLogSummary.Add('Time Total',$($_.Substring(11,10).Trim()))
        }
      }
    }

  #return $objLogSummary
  [PSCustomObject]$objLogSummary
}

#region:array with all LogSummary Object Properties
$arrRCProperties = @(
  'rcLogFile',
  'Source',
  'Target',
  'TotalDirs',
  'TotalFiles',
  'TotalBytes',
  'FailedDirs',
  'FailedFiles',
  'FailedBytes',
  'CopiedDirs',
  'CopiedFiles',
  'CopiedBytes',
  'Start',
  'End',
  'Time Total',
  'Speed'
)
#endregion

#region: Get all robocopy LogFiles in specified folder and get Summary
get-childitem '.\log\rc\home\22-06-2015' -File |
ForEach-Object {
  #region: Get File Header & Footer
  $arrSummary  = (Get-Content $_.FullName)[5..8] #Header
  $arrSummary += (Get-Content $_.FullName)[-11..-1] #Footer
  #endregion

  Get-RCLogSummary -LogFileName $_.Name -LogSummary $arrSummary
}|
Select-Object $arrRCProperties |
Out-GridView
#endregion

First I get a list of logfiles and, for each file, retrieve the header lines ([5..8]) and the last 11 lines ([-11..-1], the job summary) for processing. The logfile name & summary array are then passed as parameters to Get-RCLogSummary. I did a select to get the properties in a certain order. It was a toss-up between using an [Ordered] hash or defining a [PSCustomObject] beforehand. I figured you can minimize the properties you want by tweaking $arrRCProperties yourself. Last but not least, use Out-GridView or Export-Csv to see the end result.

I'm working on my pipeline skills; trust me, my previous version was more 'elaborate', and by elaborate I mean over-engineered…

So I guess you've noticed that regular expressions are missing? Robocopy labels are at fixed positions, which is a good thing for me. I'm looking into it…


The regular expression isn't as easy as it seems… This approach works, just don't include /BYTES in your robocopy parameter list; in that case you'll definitely need regular expressions. Version 2.0 I guess…

Hope it’s worth something to you

Ttyl,

Urv

Robocopy is a DataMigration specialist’s best friend!

‘Sup PSHomies? 😛

I’ve been doing migrations for as long as I can remember now. Data migrations can be a challenge! If there is one tool I can rely on to always come through it has to be RoboCopy!!! The more I use it the more my appreciation grows… Here are some of the crazy things you can do with RoboCopy.

Delete Folder content

Ever been in need of clearing a folder of its content, but those pesky pop-ups keep appearing in Explorer? Oh, and don't even get me started on the time it takes… Enter RoboCopy!

Just use an empty folder as the source and do a /MIR! No longpath errors or anything like that. Just make sure you point it at the right folder you want to clear… Yeah… learned that the hard way…
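
Something like this (paths made up; again, triple-check that target!):

#Mirror an empty folder over the folder you want cleared
New-Item -Path C:\Empty -ItemType Directory -Force | Out-Null
robocopy C:\Empty D:\FolderToClear /MIR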

Get FolderSize

This one is courtesy of Joakim Svendsen aka SvendsenTech! His PowerShell script (is there any other kind… 😛) will get you a folder's size, as promised… fast! I've made use of it quite a few times!

Logging

RoboCopy's logging capability is simply awesome and easy to parse using… I'll let you fill it in… Hehe… I've spoiled many managers with RoboCopy log summaries in Excel, so much so that they wouldn't have it any other way now! What I like about having all the RoboCopy jobs in Excel is that it's easy to filter what went wrong and which job took the longest; even the speed is available! Now all I need to figure out is how pivot tables work…

See the RCLogSummary post above for the details.

More on Logging

I recently needed to make a report on just how many PSTs were part of the data migration. PSTs are the worst! Once opened in Outlook, it's a full sync all over again. Sure enough, I used RoboCopy to list the PSTs. Looking at the log output I thought: "Wouldn't it be nice to have just the files?" Guess what? You can!

Using /NDL (No Directory List – don't log directory names) you'll only have the files as log output. I added /FP (Include Full Pathname of files in the output) for good measure. To keep the logfile size to a minimum, just add /NP (No Progress – don't display % copied).
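
The listing job looked something like this (paths made up; /L lists without copying anything):

robocopy \\SERVER01\Home C:\Dummy *.pst /S /L /NDL /FP /NP /LOG:.\pst-inventory.log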

[Screenshot: log output with /NDL]

Now I had all the necessary log information without any clutter! Nice!

Security

Depending on your migration strategy you might need to migrate data with security intact; that's where /COPY:DATS can help. Omit the S and security isn't processed. And if the security didn't process at first, you can always do a /SECFIX. We usually use a Synology NAS to transport data. Synology is great for capacity; ACLs… well, I'm sure with some tinkering… probably, I can't really say I know how… Glad that RoboCopy could help. Incidentally, Ralf Kerverzee figured out how /SECFIX worked! Go Ralfie!!!

Exclude Files/Folders

Depending on your window, you might need to exclude some files during the initial sync. And if you're merging folders into one target, then there's no way around that. I've used /XF and /XD to reduce data sync time from 20 hours to 3! There was one migration where they were effectively doing a full sync every time; remember, /MIR is unforgiving! It took some figuring out, but it was well worth the time. I usually create an Excel worksheet with all the necessary RoboCopy parameters, which I then use in csv form as input for… Yep… you guessed it! 😉
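
For the record, exclusions go something like this (file masks and folder names made up):

robocopy \\SERVER01\Data \\SERVER02\Data /MIR /XF *.tmp *.bak /XD Temp Cache /NP /LOG:.\delta-sync.log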

Press repeat

Data migration is usually done in off-peak hours. You can have RoboCopy start new copies only during a certain time frame with /RH:hhmm-hhmm (Run Hours – times when new copies can be started). No need to create a scheduled job.
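
A sketch (paths made up): only start new copies between 22:00 and 06:00.

robocopy \\SERVER01\Data \\SERVER02\Data /MIR /NP /RH:2200-0600 /LOG:.\nightly-sync.log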

But wait! There’s more!

Here's a link to all the possible parameters you can use with RoboCopy. Definitely worth reading! I haven't encountered a situation (yet) that RoboCopy couldn't handle.

Well, that’s my tribute to RoboCopy!


Hope it’s worth something to you

Ttyl,

Urv

Revisiting NTFS Longpaths issue


‘Sup PSHomies! It will be a thing… trust me…

Before the NTFSSecurity module there was a little known trick I used to get the NTFS rights on longpath folders: \\?\

There's just one catch… It only works with Windows command-line utilities. ICACLS, MD and ATTRIB can all make use of this prefix.

Let’s say I wanted to read the NTFS rights on a longpath folder using icacls. The command for this would be:

icacls "\\?\<DriveLetter:>\LongPath"

Notice the <DriveLetter:> syntax? Well, that's because \\?\ can only be used as a prefix in front of a drive letter. It did mean that I had to make a drive mapping, an inconvenience I didn't mind one bit!

Ok I’ll admit that I tried using a UNCPath, but that didn’t work… can’t blame a brother for trying eh? 😉

While I was using the NTFSSecurity module (did I mention how much I love this module?) I got an access denied on a folder (no, that's not the exciting part). What I did notice was a familiar prefix of sorts: \\?\UNC\

So you can use a UNC path as well! All I had to do was omit the '\\' from the path. Say I wanted to read the NTFS rights on '\\SERVERX\SHARE\LongPath'; the syntax would be:

‘\\?\UNC\SERVERX\SHARE\LongPath’

Awesome, right?!!! Now you might be asking yourself: "Well that's all gravy, Urv, but why bother? Why not just use the NTFSSecurity module?"

There’s an old saying that goes: “A poor workman always blames his tools…”

There may be times when PowerShell isn't readily available to you (I know, perish the thought!). Sometimes you have to make do with what's available! When it comes to manipulating NTFS rights you should be proficient in using different tools/utilities. I'll admit, while I love PowerShell, Set-Acl has to be my least favorite cmdlet!

So what’s next?

Well imagine enumerating a longpath folder without errors and retrieving the NTFS rights.
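
For instance (drive, server and share names made up; /T recurses, /C keeps going on errors):

icacls "\\?\D:\Very\Long\Path" /T /C
icacls "\\?\UNC\SERVERX\SHARE\LongPath" /T /C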

[Screenshot: ICACLS output]

Looking at the output, you should be able to do some neat formatting using regular expressions! Hey! I didn't say you weren't allowed to use PowerShell at all… Regular expressions… Not my forte… Any takers in the PowerShell community? 🙂

Having alternative methods for retrieving NTFS rights is always a good thing; you can always use an alternative method for verification. Get-Acl (Get-Acl2 if using NTFSSecurity), ICACLS and the security tab (always use Advanced mode) are my methods of choice for verification.

Hope it’s worth something to you

Ttyl,

Urv

PS. powershell.com has some great posts on how to handle NTFS the PowerShell way! 😉