OneGet – Powershell Package Management

I have been writing a script to gather some permission information from our main Data drives. We have a lot of paths longer than 255 characters, and PowerShell's default Get-Acl cmdlet cannot handle long file paths – it just errors.

Whilst looking for a solution, I came across this .NET library, which is supposed to help with issues related to long file names.

Microsoft.Experimental.IO

Considering I found it on nuget.org and I'm running Windows 10 (Windows Management Framework v5), I thought I'd use the new built-in package management tool, previously called OneGet. The idea is simple – copy the Linux package managers. Great, it's just what Windows has always needed! I am not going to fully explain the concept behind a package management tool in this post; this post has a great explanation.

So how do we install this Experimental library from Nuget.org? First, I tried the following command :

Install-Package Microsoft.Experimental.IO
Install-Package : No match was found for the specified search criteria and package name 'Microsoft.Experimental.IO'.
At line:1 char:1
+ Install-Package Microsoft.Experimental.IO
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (Microsoft.Power....InstallPackage:InstallPackage) [Install-Package], Exception
    + FullyQualifiedErrorId : NoMatchFoundForCriteria,Microsoft.PowerShell.PackageManagement.Cmdlets.InstallPackage

Hmmm, I guess the NuGet repository isn't loaded by default – so how do we load a repo? Register-PackageSource (https://technet.microsoft.com/en-us/library/dn890701.aspx). I got a bit lucky here: the example at the bottom of the TechNet documentation for this cmdlet shows how to load the NuGet repo. I made one change – only a fool would use http instead of https.

Register-PackageSource -Name Nuget.Org -Location https://www.nuget.org/api/v2 -ProviderName nuget

Name                             ProviderName     IsTrusted  IsRegistered IsValidated  Location
----                             ------------     ---------  ------------ -----------  --------
Nuget.Org                        NuGet            False      True         True         https://www.nuget.org/api/v2

Now that the repo is registered we can install the Experimental.IO library. I ran the following command to list the packages containing "Experimental":

find-package -contains Experimental | out-gridview

(Screenshot: Find-Package results for 'Experimental')
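
Assuming the package shows up in the grid view, installing it from the newly registered source should just be one more command. A minimal example (the source is marked as untrusted, so you may be prompted to confirm):

## Assumes the Nuget.Org source registered above
Install-Package -Name Microsoft.Experimental.IO -ProviderName nuget -Source Nuget.Org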

Converting VMware Machine to Hyper-V

As described in a previous post, I have installed MediaWiki on Server 2012 R2 Core. Now I need to convert the machine I built from ESXi to Hyper-V.

I needed to keep the original VM, with its snapshots, and create a copy of it for a lab environment. My setup:

  • ESXi (Free) 6.0.0 – Source
  • Server 2012 TP4 with Hyper-V – Destination

Steps I used to copy the VM:

  • Take a Snapshot of the Virtual Machine as a backup
  • We need a machine without snapshots to import into Hyper-V
  • Take a snapshot of the functioning machine
  • Run “sysprep /generalize /oobe /shutdown” to generalize the machine and shut it down

  • Use VMware Converter to convert the machine from ESXi format to Workstation format (this will consolidate the snapshots, but leave the original VM untouched)
  • Download and install Microsoft Virtual Machine Converter (I used chocolatey to get MVMC)
  • Once conversion is finished you should have a vmdk and a vmx file in a folder.
  • Using the GUI of Microsoft Virtual Machine Converter it’s not possible to convert this ‘offline’ machine
  • We need to use PowerShell:
  • Import-Module 'C:\Program Files\Microsoft Virtual Machine Converter\MvmcCmdlet.psd1'
    ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath "C:\PathToVMDK\disk.vmdk" -DestinationLiteralPath "C:\DestinationFolder\" -VhdType DynamicHardDisk -VhdFormat Vhdx
    Disable-MvmcSourceVMTools -DestinationLiteralPath "C:\DestinationFolder\disk.vmdk"
    
  • Now, create a new VM in Hyper-V Manager and attach the converted VHDX. I used Gen 1 as my ESXi machine was using legacy/BIOS firmware; you may need to use Gen 2 if the original VM was running in UEFI mode. (A PowerShell equivalent is sketched below.)
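
For reference, if you would rather script that last step, a minimal sketch using the Hyper-V New-VM cmdlet might look like this (the VM name, memory, switch and VHDX path are placeholders, not my actual values):

## Placeholders throughout - adjust the name, memory, switch and path to suit
New-VM -Name "ConvertedVM" -Generation 1 -MemoryStartupBytes 2GB -VHDPath "C:\DestinationFolder\disk.vhdx" -SwitchName "LabSwitch"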

I was pleasantly surprised by how quick and easy this was – whenever I have tried to do VM conversions in the past it has never worked correctly!

Enable .NET 3.5 In Windows 10 With Powershell

Depending on how you installed Windows 10, in order to enable .NET 3.5 features you may need to use an alternative source location. (The ‘sources’ folder from a W10 ISO file)

Run PowerShell as Administrator, then run the following command (replacing the path with the path to your extracted or mounted ISO):

Enable-WindowsOptionalFeature -Path "D:\ISO\Windows\Desktop\Windows 10 Ent\SW_DVD5_WIN_ENT_10_64BIT_English_MLF_X20-26061\sources" -FeatureName NetFX3

(Screenshots: the command running and the result showing .NET 3.5 enabled)
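
A variant worth knowing: if you are enabling the feature on the running installation rather than servicing an image at a path, the commonly documented form uses -Online and points -Source at the sxs folder on the media (the path below is a placeholder, adjust it to your ISO):

## Placeholder path - point -Source at the sources\sxs folder of your mounted or extracted ISO
Enable-WindowsOptionalFeature -Online -FeatureName NetFX3 -All -Source "D:\sources\sxs" -LimitAccess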

Collecting Custom Properties / Fields Using the Registry and Hardware Inventory

I recently implemented a mechanism for collecting warranty end dates: a script writes the date to a registry key, then hardware inventory collects the key into the SCCM SQL database.

This is useful if there is information about a client machine which is not stored in WMI.

It worked great and proved a good method of collecting data. In this post I will cover the whole process in detail:

Step 1 – The Script

The first stage is creating a PowerShell, VBScript or batch file to grab the custom property you are looking for and write it to a registry key. I won't go into much detail on that in this post. I am using a script which reads the setupact.log file in C:\Windows\Panther and returns whether the machine firmware is BIOS or EFI (UEFI), then writes the info to a registry key. I couldn't find a field in the registry or in WMI on Windows 7 which stores this information (hence the script and this post!).
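
Roughly speaking, the script does something like the sketch below. The log string being matched and the registry path are assumptions for illustration, not my exact script:

## Look for the boot environment line that Windows setup writes to setupact.log
$logLine = Select-String -Path 'C:\Windows\Panther\setupact.log' -Pattern 'Detected boot environment' | Select-Object -First 1

## Work out the firmware type, defaulting to 'Unknown' if the line wasn't found
$firmware = 'Unknown'
if ($logLine -and $logLine.Line -match 'BIOS') { $firmware = 'BIOS' }
elseif ($logLine -and $logLine.Line -match 'EFI') { $firmware = 'EFI' }

## Write the result to a registry value for hardware inventory to pick up later (key path is hypothetical)
$regPath = 'HKLM:\SOFTWARE\CompanyName\YourKey'
if (-not (Test-Path $regPath)) { New-Item -Path $regPath -Force | Out-Null }
Set-ItemProperty -Path $regPath -Name 'FirmwareType' -Value $firmware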

Step 2 – The Deployment

The script needs to be deployed and run on every machine we want to collect information from. I find it easiest to create a source folder, place the PowerShell script in there, and create a bat file 'wrapper' to call the PowerShell script and temporarily change the execution policy.

powershell.exe -ExecutionPolicy Bypass -NoLogo -NonInteractive -NoProfile -WindowStyle Hidden -File "script.ps1" >> "%temp%\script.log"

For the detection method, use the following:

  • Setting Type: Registry
  • Hive: HKEY_LOCAL_MACHINE (or other if you are using different hive)
  • Key: SOFTWARE\CompanyName\YourKey (this is the key you have written your info to)
  • Value: NameOfKey
  • Data Type: String (or other if you are using different data type)
  • I used “This registry setting must satisfy the following rule to indicate the presence of this application”
  • Operator “Not equal to”
  • Value: <blank>

Sometimes my script would fail to gather the required info but would still create the key with a null value, and I didn't want my detection rule to be satisfied by this. (I know I was lazy and should have added more error checking to the script! But this had the same outcome…)

Step 3 – The Mof File

Great! We have the info in the registry, but now we want to add it to the SCCM DB – for ease of querying and reporting etc…

This method was inspired by this brilliant post – https://sccmguru.wordpress.com/2014/04/24/hp-and-lenovo-warranty-information-in-configuration-manager-2012-r2/

I suggest using the methods described in the above post to create your MOF file and import the changes into ConfigMgr.

Step 4 – The Result

Once you have deployed the hardware inventory custom settings, the script deployment has run on some machines, and hardware inventory has run, you should have some results! If not, something is wrong – retrace your steps and check for mistakes…
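
A couple of quick client-side sanity checks can help narrow down where it went wrong (the key path matches the detection method above; the class name is hypothetical, use whatever your configuration.mof defines):

## Check the value the script wrote
Get-ItemProperty -Path 'HKLM:\SOFTWARE\CompanyName\YourKey'

## Check the inventory class added via configuration.mof is returning data
Get-WmiObject -Namespace 'root\cimv2' -Class 'CompanyName_FirmwareInfo'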

ConfigMgr Distribution Point Priority

Using System Center Configuration Manager 2012 R2, we recently had some issues with routing, iSCSI interfaces and content distribution. We needed to speed up distribution of some corrupt packages to a specific distribution point in order to get software installed at an office in the Middle East.

So we needed the content to ‘jump’ the queue. There are no options in the SCCM Console to change DP priority order (that I have found), but luckily Microsoft have provided a way of doing it.

Note: As noticed by flatfour67 in the comments, this may not work for DPs assigned to secondary sites.

The following PoSH (PowerShell) will list your distribution points and the priority assigned to them.

$dpinfo = Get-WmiObject -Query "SELECT NALPath, Priority, SiteCode, TransferRate, Description FROM SMS_DistributionPointInfo" -Namespace "ROOT\SMS\site_SITECODE" -ComputerName "MP_ServerName" | select NALPath, Priority, SiteCode, TransferRate
$dpinfo | Out-GridView
## Yes it could be done on one line, but I like doing it this way. After I have closed the GridView window I am still able to access the data without having to query WMI on the server again...

The default priority is 200; anything lower has a higher priority (content will be sent there first) and vice versa. Available values are 1 to 300.

So How Do I Change the Priority?

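A rough sketch of one way that has been documented: write the Priority value back on SMS_DistributionPointInfo through WMI. Treat the filter, site code and server name below as placeholders, and test carefully before changing production priorities:

## Sketch only - lower Priority means content is sent to that DP sooner
$dp = Get-WmiObject -Query "SELECT * FROM SMS_DistributionPointInfo WHERE NALPath LIKE '%DPServerName%'" -Namespace "ROOT\SMS\site_SITECODE" -ComputerName "MP_ServerName"
$dp.Priority = 10
$dp.Put()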

Running Batch SSRS (SQL Server Reporting Services) Reports using Powershell

We recently needed to run a one-page report 10,000 times. Our previous method of nesting reports using Report Builder and SSRS didn't work – we encountered timeouts and memory issues on the server. We could have spent time changing timeout values and figuring out how to overcome this within SSRS, but we found an easier way…

Previously I had written a small C# application using the Microsoft Report Viewer Runtime (a winforms component) to run a batch of reports, create a folder structure and output to PDFs in the respective folders. I decided to re-use this solution, but tailor it to the new reports that we needed.

I encountered an issue with my Visual Studio install and needed to get this batch of reports sorted, so we looked into using PowerShell. Below is an anonymised version of the script we used.

It uses the Microsoft Report Viewer Runtime 2012 (the DLL is v10) to call the report and export it to PDF. Using PowerShell to interact with .NET is easy and lets you combine the shorthand coding style of PowerShell with the wide array of easy-to-use libraries that .NET provides.

## Reference ReportViewer Library
[void] [System.Reflection.Assembly]::Load("Microsoft.ReportViewer.WinForms, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a")

## Report viewer object
$rv = New-Object Microsoft.Reporting.WinForms.ReportViewer

## Set Report Processing to server mode
$rv.ProcessingMode = "Remote"

## Set the report server base url
$rv.ServerReport.ReportServerUrl = "http://servername/reportserver"

## Set the report path
$rv.ServerReport.ReportPath = "/base folder/folder/ReportName"

## Create an array with all the required ids
$ids = (0..10000)

## Loop through all the ids
foreach ($id in $ids)
{
    # Parameters array
    $params = New-Object 'Microsoft.Reporting.WinForms.ReportParameter[]' 1
    $params[0] = New-Object Microsoft.Reporting.WinForms.ReportParameter("ParameterName", $id, $false)

    # Set Report Parameters
    $rv.ServerReport.SetParameters($params)

    # Refresh / Run Report
    $rv.RefreshReport()

    # Out vars
    $mimeType = $null
    $encoding = $null
    $extension = $null
    $streamids = $null
    $warnings = $null

    # PDF file generation
    $bytes = $rv.ServerReport.Render("PDF", $null,
        [ref] $mimeType,
        [ref] $encoding,
        [ref] $extension,
        [ref] $streamids,
        [ref] $warnings);

    ## Output to file
    $file = "C:\FolderName\$id.pdf";

    $fileStream = New-Object System.IO.FileStream($file,
        [System.IO.FileMode]::OpenOrCreate);
    $fileStream.Write($bytes, 0, $bytes.Length);
    $fileStream.Close();
}

Grouping Data in Powershell

I needed to create a report based on data from two different databases and servers. My options were to create a SQL data warehouse which pulls in the data from the different databases, or to utilize PowerShell.

I decided to use PowerShell, as I found it easier to estimate how long it would take me to produce the report. My data is in this format:

FullName   MachineName  IPAddress  PackageName    Cost  State     Requested         Completed         Installed
Fake Fred  PCFGH78BH    10.0.0.1   Adobe Acrobat  90    Approved  15/04/2015 22:28  15/04/2015 22:28  15/04/2015 22:36

I wanted to use PowerShell to create a summary table grouping the costs per application, with a count. I thought there might be a built-in cmdlet to do it, but unfortunately I could not find one.

Google led me to this helpful Gist, which I used as inspiration for the grouping. I understand that this may not be the most computationally efficient way of doing it, but my data will never be more than a few hundred rows.

This method involves creating a new object with the grouped data, and then adding 2 properties to that object, one for Cost and one for Count.

## Group the rows by package name
$dataGrouped = $data | Group-Object -Property PackageName

## Build a summary object per package with a Count and a total Cost
$test = @()
$test += foreach($item in $dataGrouped)
{
    $item.Group | select -Unique PackageName,
    @{Name='Count';Expression = {(($item.Group) | measure -Property PackageName).Count}},
    @{Name='Cost';Expression = {(($item.Group) | measure -Property Cost -sum).Sum}}
}