Running Batch SSRS (SQL Server Reporting Services) Reports using PowerShell

We recently needed to run a one-page report 10,000 times. Our previous approach of nesting reports with Report Builder and SSRS didn't work: we hit timeouts and memory issues on the server. We could have spent time raising timeout values and working around this within SSRS, but we found an easier way…

Previously I had written a small C# application using the Microsoft Report Viewer Runtime (a WinForms component) to run a batch of reports, create a folder structure and output PDFs into the respective folders. I decided to re-use this solution, but tailor it to the new reports we needed.

I was having an issue with my Visual Studio installation and needed to get this batch report sorted quickly, so we looked into using PowerShell. Below is an anonymised version of the script we used.

It uses the Microsoft Report Viewer Runtime 2012 (the DLL is v10) to call the report and export it to PDF. Using PowerShell to interact with .NET is easy, and lets you combine the shorthand coding style of PowerShell with the wide array of easy-to-use libraries that .NET provides.

## Reference ReportViewer Library
[void] [System.Reflection.Assembly]::Load("Microsoft.ReportViewer.WinForms, `
Version=10.0.0.0, Culture=neutral, `
PublicKeyToken=b03f5f7f11d50a3a")

## Report viewer object
$rv = New-Object Microsoft.Reporting.WinForms.ReportViewer

## Set Report Processing to server mode
$rv.ProcessingMode = "Remote"

## Set the report server base url
$rv.ServerReport.ReportServerUrl = "http://servername/reportserver"

## Set the report path
$rv.ServerReport.ReportPath = "/base folder/folder/ReportName"

## Create an array with all the required ids
$ids = (0..10000)

## Loop through all the ids
foreach ($id in $ids)
{
    # Parameters array
    $params = New-Object 'Microsoft.Reporting.WinForms.ReportParameter[]' 1
    $params[0] = New-Object Microsoft.Reporting.WinForms.ReportParameter("ParameterName", $id, $false)

    # Set Report Parameters
    $rv.ServerReport.SetParameters($params)

    # Refresh / Run Report
    $rv.RefreshReport()

    # Out vars
    $mimeType = $null
    $encoding = $null
    $extension = $null
    $streamids = $null
    $warnings = $null

    # PDF file generation
    $bytes = $rv.ServerReport.Render("PDF", $null,
        [ref] $mimeType,
        [ref] $encoding,
        [ref] $extension,
        [ref] $streamids,
        [ref] $warnings);

    ## Output to file
    $file = "C:\FolderName\$id.pdf";

    $fileStream = New-Object System.IO.FileStream($file, [System.IO.FileMode]::Create);
    $fileStream.Write($bytes, 0, $bytes.Length);
    $fileStream.Close();
}


Remote Access (VNC) to OSD Build During WinPE

We have been seeing more and more issues with OSD on the latest model Dell computers. I work for an organisation with quite a few small sites, and there are no IT specialists at each site.

Trying to explain to someone on the phone how to copy files via the command line to a USB stick (when the log could be in one of about seven places! – we use MDT) is difficult and time-consuming.

We needed a better way….

At first I looked into a tool I had heard about a few years back called DaRT.

There was some confusion in my organisation about whether we would or wouldn't be licensed for the Microsoft Desktop Optimization Pack (which DaRT is part of). It also requires changing the boot image used during OSD, which is not ideal: we use one task sequence (TS) globally, and trying to schedule the distribution of a new boot image is difficult.

So I put it on the back-burner. Then today we had more issues with OSD, and this requirement became more urgent. I suddenly realised that VNC server can run in a user context, so maybe it could be run during WinPE.

A quick Google search turned up an excellent article on doing exactly this.

Within 30 minutes I had VNC server running during our OSD build and I could connect to it successfully. We had already been using BGInfo to stamp the hostname and IP details on the background, so connecting was easy!

I have found one issue with this method: if the hard disk is not formatted when you start the TS, the VNC content doesn't get copied from the package and the command-line step fails. I have tried a few things to fix this with no success. The next step is to not bother putting the VNC content into a package at all, and instead copy it directly from a network share.
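A minimal sketch of that network-share approach, suitable for a "Run Command Line" or PowerShell step in the TS. Note the share path and the winvnc.exe binary name are placeholders – the exact executable and switches depend on which VNC distribution you use:

```powershell
## Hypothetical share holding the VNC binaries - replace with a real path
$share = "\\server\WinPETools$\VNC"

## Copy VNC to the WinPE RAM drive (X:\), so no formatted hard disk is needed
$dest = "X:\VNC"
New-Item -Path $dest -ItemType Directory -Force | Out-Null
Copy-Item -Path "$share\*" -Destination $dest -Recurse

## Start the VNC server - binary name and switches depend on your VNC distribution
Start-Process -FilePath "$dest\winvnc.exe"
```

Copying to X:\ rather than C:\ is the point: the RAM drive always exists in WinPE, so the step no longer depends on the disk having been formatted.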

Status Messages For an OSD Deployment

Troubleshooting OSD remotely can be difficult if the person initiating the build does not know how to get the logs from a failed machine.

This is a useful status message query to show all messages from the OSD process. It requires a Deployment ID, taken from the deployment of the task sequence you are having a problem with.

select SMS_StatusMessage.*, SMS_StatMsgInsStrings.*, SMS_StatMsgAttributes.*
from SMS_StatusMessage
left join SMS_StatMsgInsStrings
on SMS_StatMsgInsStrings.RecordID = SMS_StatusMessage.RecordID
left join SMS_StatMsgAttributes
on SMS_StatMsgAttributes.RecordID = SMS_StatusMessage.RecordID
where SMS_StatMsgAttributes.AttributeID = 401 and SMS_StatMsgAttributes.AttributeValue = "!!DEPLOYMENTID!!"
and SMS_StatMsgAttributes.AttributeTime >= ##PRM:SMS_StatMsgAttributes.AttributeTime##
order by SMS_StatMsgAttributes.AttributeTime DESC

Replace !!DEPLOYMENTID!! with the deployment ID of the TS.
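If you are not sure what the Deployment ID is, and you have the ConfigMgr PowerShell module loaded (e.g. a session started from the console's "Connect via Windows PowerShell" option), something like this should surface it – the task sequence name here is a placeholder, and cmdlet availability depends on your ConfigMgr version:

```powershell
## List deployments with their IDs, filtered on the TS name
Get-CMDeployment |
    Where-Object { $_.SoftwareName -like "*Task Sequence Name*" } |
    Select-Object SoftwareName, CollectionName, DeploymentID
```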

Deploy Adobe Reader 11.0.10 with SCCM 2012 R2

How to install Adobe Reader 11.0.10 using System Center Configuration Manager (SCCM) 2012 R2.

Run the following to extract the MSI:

AdbeRdr11010_en_US.exe -nos_ne

This will extract the contents to: C:\ProgramData\Adobe\Setup

Create an admin install :

msiexec /a AcroRead.msi

When prompted enter a location to save the files e.g. : C:\AdobeReader\

Apply the update patch to the admin install :

msiexec /p C:\ProgramData\Adobe\Setup\AdbeRdrUpd11010.msp /a C:\AdobeReader\AcroRead.msi

Open the Adobe Customization Tool, load the AcroRead.msi file in C:\AdobeReader\ and make some changes, e.g.:

  • Disable Auto Update
  • Automatically accept the EULA
  • Remove Shortcuts

Use the transform menu to generate a transform file and save it e.g : C:\AdobeReader\Install.mst

Copy the files to your SCCM package source, create an application, and install using the transform!
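For the application's install command line, something along these lines is typical – the file and transform names match the examples above, and /qn is the standard msiexec silent switch:

```powershell
msiexec /i AcroRead.msi TRANSFORMS=Install.mst /qn
```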

Grouping Data in Powershell

I needed to create a report based on data from two different databases on two different servers. My options were to create a SQL data warehouse that pulls in the data from the different databases, or to use PowerShell.

I decided to use PowerShell, as I found it easier to estimate how long it would take me to produce the report. My data is in this format:

| FullName | MachineName | IPAddress | PackageName | Cost | State | Requested | Completed | Installed |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Fake Fred | PCFGH78BH | | Adobe Acrobat | 90 | Approved | 15/04/2015 22:28 | 15/04/2015 22:28 | 15/04/2015 22:36 |

I wanted to use PowerShell to create a summary table grouping the costs per application, with a count. I thought there might be a built-in cmdlet to do it, but unfortunately I could not find one.

Google led me to a helpful Gist which I used as inspiration for the grouping. I understand this may not be the most computationally efficient way of doing it, but my data will never be more than a few hundred rows.

This method involves creating a new object with the grouped data, then adding two properties to that object: one for Cost and one for Count.

$dataGrouped = $data | Group-Object -Property PackageName

$test = @()
$test += foreach ($item in $dataGrouped)
{
    $item.Group | select -Unique PackageName,
    @{Name='Count';Expression = {(($item.Group) | measure -Property PackageName).Count}},
    @{Name='Cost';Expression = {(($item.Group) | measure -Property Cost -sum).Sum}}
}
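An equivalent shape that reads a little more directly is to emit one [pscustomobject] per group. The sample data here is made up to mirror the table above:

```powershell
## Made-up sample data mirroring the table above
$data = @(
    [pscustomobject]@{ PackageName = 'Adobe Acrobat'; Cost = 90 },
    [pscustomobject]@{ PackageName = 'Adobe Acrobat'; Cost = 90 },
    [pscustomobject]@{ PackageName = 'Visio';         Cost = 120 }
)

## One summary object per package: name, row count, total cost
$summary = $data | Group-Object -Property PackageName | ForEach-Object {
    [pscustomobject]@{
        PackageName = $_.Name
        Count       = $_.Count
        Cost        = ($_.Group | Measure-Object -Property Cost -Sum).Sum
    }
}

$summary | Format-Table
```

Group-Object already carries the count and the grouped rows, so there is no need to re-measure PackageName inside each group.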


Collect a Log File From Multiple Machines

Today I needed to investigate a single log file on multiple machines at different sites. To achieve this, I used the following PowerShell…

It reads a list of hostnames, then copies the log file from each machine into a central, organised folder.

## Read a list of hostnames and attempt to gather a specific log file from each

## Path of the log file we want to gather - !!REPLACE!! will be replaced by the computername
$remoteLogFilePath = "\\!!REPLACE!!\c$\log.txt"

## Local working folder
$localFolderPath = "C:\Working\"

## List of Computers to Gather Log File From
$computers = get-content -Path "C:\computers.txt"

## Create a new directory in the working folder named with today's date
$date = Get-Date -Format "yyyy-MM-dd"
$path = Join-Path $localFolderPath $date
New-Item $path -ItemType Directory | Out-Null

foreach ($computer in $computers)
{
    $remotePath = $remoteLogFilePath -replace "!!REPLACE!!", $computer
    $localPath = Join-Path $path "$computer.log"

    Copy-Item -Path $remotePath -Destination $localPath
}
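If some machines are likely to be offline, a reachability check keeps the loop moving rather than erroring on every dead host. A sketch using Test-Connection, reusing the same variables as the script above:

```powershell
foreach ($computer in $computers)
{
    ## Skip machines that don't answer a single ping
    if (-not (Test-Connection -ComputerName $computer -Count 1 -Quiet))
    {
        Write-Warning "$computer is unreachable - skipping"
        continue
    }

    $remotePath = $remoteLogFilePath -replace "!!REPLACE!!", $computer
    $localPath = Join-Path $path "$computer.log"
    Copy-Item -Path $remotePath -Destination $localPath
}
```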