Using PowerShell jobs to run parallel operations

Posted: March 11, 2011 in File Operations, Scripts

Here is some sample code to run a bunch of jobs in parallel. It’s not a particularly good use of parallel computing, since the per-job overhead actually makes it slower than a plain loop, but it gives me a skeleton to reuse for future jobs.

The key thing to remember is that a script block runs in its own scope, so variables from the calling session are not visible inside it. That is why each job receives its values via -ArgumentList and re-assigns them to new variables inside the block.
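To see the scoping issue in isolation, here is a minimal sketch (the variable name $message is just illustrative): a session variable comes through empty inside a job’s script block unless it is passed in explicitly.

```powershell
$message = "hello"

# The job runs in a separate process, so $message is undefined there
# and expands to an empty string.
Start-Job -ScriptBlock { "Inside the job: '$message'" } |
    Wait-Job | Receive-Job

# Passing the value through -ArgumentList and reading it back from
# $args makes it available inside the script block.
Start-Job -ArgumentList $message -ScriptBlock {
    $msg = $args[0]
    "Inside the job: '$msg'"
} | Wait-Job | Receive-Job
```

The first job prints an empty value; the second prints "hello".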
This example lists each file and folder in the root of C:\ (non-recursively) and reports back read-only folders; an attribute value of 17 means Directory (16) plus ReadOnly (1).

$list = Get-ChildItem "C:\"
foreach ($file in $list){
    Start-Job -ArgumentList @($file.FullName, $file.Attributes) -ScriptBlock {
        # Read the arguments back into named variables inside the job's scope
        $TargetFile = $args[0]
        $Attr = $args[1]
        if ($Attr -eq 17){
            Write-Host $TargetFile
        }
    }
}

Wait-Job *
Receive-Job *
Remove-Job *
