PowerCLI – Disabling ESXi OpenSLP service for VMSA-2021-0002

OpenSLP has cropped up again as an ESXi vulnerability, and the KB article only gives details for disabling the service via the ESXi command line.

If you have many hosts, it's far easier to use PowerCLI, and while it's relatively simple, I thought I'd share this to help anyone else wanting to do the same.

Disabling the service
Connect to the environment with 'Connect-VIServer' and then run:

Get-VMHost | %{
	# Disable the CIM SLP firewall rule
	$_ | Get-VMHostFirewallException -Name "CIM SLP" | Set-VMHostFirewallException -Enabled:$false
	# Stop the slpd service
	Stop-VMHostService -HostService ($_ | Get-VMHostService | ?{$_.Key -eq "slpd"}) -Confirm:$false
	# Stop the service starting automatically with the host
	$_ | Get-VMHostService | ?{$_.key -match "slpd"} | Set-VMHostService -Policy "off"
}

Checking the status
Connect to the environment with 'Connect-VIServer' and then run:

Get-VMHost | %{
	$rule = $_ | Get-VMHostFirewallException -Name "CIM SLP"
	$serv = $_ | Get-VMHostService | ?{$_.Key -eq "slpd"}
	$_ | select Name,@{N="Rule";E={$rule.enabled}},@{N="ServiceRunning";E={$serv.Running}},@{N="ServiceEnabled";E={$serv.Policy}}
}

Edit: As per the comment from Zeev, I'd missed disabling the service; I've updated the Disabling and Checking scripts above to include the correct information.
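
If you later need to re-enable the service (once hosts are patched, for example), reversing the steps above should do it. A quick sketch using the same cmdlets, which I haven't tested:

Get-VMHost | %{
	# Allow the service to start with the host again
	$_ | Get-VMHostService | ?{$_.Key -eq "slpd"} | Set-VMHostService -Policy "on"
	# Start the slpd service
	Start-VMHostService -HostService ($_ | Get-VMHostService | ?{$_.Key -eq "slpd"}) -Confirm:$false
	# Re-enable the CIM SLP firewall rule
	$_ | Get-VMHostFirewallException -Name "CIM SLP" | Set-VMHostFirewallException -Enabled:$true
}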

vRealize Orchestrator Name/IP lookups

I've started looking at upgrading our standalone VRO instances from 7.x to 8.x, and one thing that has changed significantly is that we can no longer use the appliance Linux environment to run dig or nslookup.

There are a couple of System calls:

System.resolveHostName(hostname);
System.resolveIpAddress(ip);

These allow the usual forward and reverse lookups, but have significant limitations.

System.resolveHostName:

  • Only returns one record at a time, so if there are multiple records you would have to write a loop to collect them all
  • Only returns the IP address, with no ability to return the record type, TTL, or SOA record

System.resolveIpAddress:

  • Only returns one record at a time, so if there are multiple records you would have to write a loop to collect them all
  • Only returns the host name, with no ability to return the record type, TTL, or SOA record
  • ONLY WORKS IF A FORWARD RECORD EXISTS THAT MATCHES

This final point took some significant figuring out, and combined with the other limitations it meant changing some of the workflows to SSH to a Linux server and run normal dig and nslookup commands, rather than using the System calls.

Windows: Converting PEM and PKey to PFX file

I've just been working on our Chef recipe that installs a CA-signed cert for the Windows RDP service. Originally I wrote the recipe to interact with Windows PKI, but it was a bit clunky and I was never really happy with it.

With Hashicorp Vault now being available in our environment, I started looking at migrating over to using this, as the integration with Chef was far superior. However, the one stumbling block I came across was that Vault would export the certificate as a PEM file and a Private Key file, whereas Windows could only install and use this pair as a PFX file. A couple of utilities have been available for a while for doing the conversion, either openssl or Pvk2Pfx, but I've always shied away from installing new software if at all possible, to simplify maintenance.

Fortunately, I've discovered that certutil now has an option to do this conversion: '-MergePFX'.

Simply put the certificate and key in the same folder with the same name (but different extensions), such as rdpcert.cer and rdpcert.key, and run:
certutil -MergePFX rdpcert.cer rdpcert.pfx
and this will combine the files into a PFX file for import.
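
As a rough example, assuming Vault dropped the pair somewhere like C:\vault (all paths here are hypothetical):

# Hypothetical paths - copy the PEM and key to matching basenames
Copy-Item C:\vault\rdp-cert.pem C:\temp\rdpcert.cer
Copy-Item C:\vault\rdp-key.pem C:\temp\rdpcert.key
Set-Location C:\temp
# certutil picks up rdpcert.key automatically because the basenames match
certutil -MergePFX rdpcert.cer rdpcert.pfx
# Optional sanity check of the merged file
certutil -dump rdpcert.pfx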

VRO – Listing out items from Content Library

Shortly after starting a series of posts on using VRO to upload templates to content library, I was asked how you would retrieve a list of content library items.

It isn't quite as straightforward as you might think, as the list() function only provides a list of IDs rather than full details. However, armed with this knowledge, you can get the list of IDs and use it to iterate through and produce a full list of items.


Overview

The steps we need to go through here are:

  • Find and connect to an endpoint
  • Find the content library(s) on the endpoint and connect
  • Get a list of items in the content library
  • For each item in the list, get its details
  • Return the array of items with their full details

Find and connect to an endpoint

The code for this uses the VAPI plugin with the getAllEndpoints() method, which returns an array of all configured endpoints on the VRO instance. We do a quick check to ensure we have something in the array before using it with a 'for each' loop.

var endpoints = VAPIManager.getAllEndpoints();
var endpoint = endpoints[0];
if (endpoint == null) {
    throw "Unable to locate a VAPI endpoint";
}
var ovfLibraryItem = new Array();
for each(var endpoint in endpoints){
    System.log("Searching endpoint " + endpoint);
    var client = endpoint.client();
    ...

You can see there that I've also initialised the array in which the list of library items will be stored. The connection to the endpoint is handled by the client() method.

Find the content library(s) on the endpoint and connect

First we get a list of the content libraries on the endpoint, using the com_vmware_content_library class. The list() method is used to return an array of their identifiers:

var clib = new com_vmware_content_library(client);
System.log("The number of libraries on this endpoint is: " + clib.list().length);
System.log(clib.list());
if(clib.list().length >= 1){

Get a list of items in the content library

To query items in the library, we need to create an object with the com_vmware_content_library_item class:

var itemSvc = new com_vmware_content_library_item(client);

Once we've created that, we can iterate through the list of content libraries and get a list of items within each library. The list() method is used again here to get an array of items.
for each(var clibrary in clib.list()){
    var items = itemSvc.list(clibrary);
    System.log(items);

For each item in the list, get its details

This just iterates through the array of items, and uses the get() method on com_vmware_content_library_item to get the detail for each item in turn. The results are pushed into the ovfLibraryItem array that we created near the start.
for each(item in items) {
    var results = itemSvc.get(item);
    System.log(results.name);
    ovfLibraryItem.push(results);
}

Return the array of items with their full details

This is as simple as making the ovfLibraryItem array an output from the scriptable task and the workflow, remembering to close down the endpoint client at the end of the script.

Putting it all together

The full code for this comes together like this:

// Set the VAPI endpoint to the first endpoint returned
var endpoints = VAPIManager.getAllEndpoints();  
var endpoint = endpoints[0];

if (endpoint == null) {  
  throw "Unable to locate a VAPI endpoint";
}
var ovfLibraryItem = new Array();

for each(var endpoint in endpoints){
  System.log("Searching endpoint " + endpoint);
  var client = endpoint.client();  
  var clib = new com_vmware_content_library(client);  
  System.log("The number of libraries on this endpoint is: " + clib.list().length);
  System.log(clib.list());

  if(clib.list().length >= 1){
    var itemSvc = new com_vmware_content_library_item(client);

    for each(var clibrary in clib.list()){
      var items = itemSvc.list(clibrary); 
      System.log(items);

      for each(item in items) {
        var results = itemSvc.get(item); 
        System.log(results.name);
        ovfLibraryItem.push(results);
      }
    }
  }
  client.close();
}

No inputs are required, and the only output is ovfLibraryItem, which is an array of VAPI:com_vmware_content_library_item__model.


VFRC Cache Stats with PowerCLI

I've recently set up VMware Flash Read Cache on a couple of ESXi servers. They were bought with the SSDs to do this, as they only had internal disk rather than an external array, but for some reason the configuration never happened.

I've written a script to perform the configuration, but it's not quite ready for release. When monitoring the effectiveness of the cache I'd been using esxcli to check the stats, but enabling SSH and logging on to the hosts was tiresome, so I whipped up a few lines of PowerCLI to do the job:

$esxcli = Get-VMHost <hostname> | Get-EsxCli
$caches = $esxcli.storage.vflash.cache.list()
foreach ($cache in $caches) {
    # Pull the stats for each cache and show usage and hit rate percentages
    $stats = $esxcli.storage.vflash.cache.stats.get($cache.Name,"vfc")
    $cache | select Name, @{N="CacheUsage%";E={$stats.Cacheusagerateasapercentage}},@{N="HitRate";E={$stats.Read.Cachehitrateasapercentage}}
}

The output looks something like:

Name                          CacheUsage% HitRate
----                          ----------- -------
vfc-2915015888-VM1            99          8
vfc-2910392434-VM2            99          12
vfc-2910723509-VM3            99          11
vfc-2914146967-VM4            99          11
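
Until the fuller configuration script is ready, here's a rough sketch of how the same stats could be collected across a whole cluster and shown as a table. This is untested, and the cluster name is hypothetical:

# Rough sketch: gather VFRC stats for every host in a cluster
$results = foreach ($vmhost in Get-Cluster "MyCluster" | Get-VMHost) {
    $esxcli = Get-EsxCli -VMHost $vmhost
    foreach ($cache in $esxcli.storage.vflash.cache.list()) {
        $stats = $esxcli.storage.vflash.cache.stats.get($cache.Name,"vfc")
        # Emit one row per cache, with the host name included
        [pscustomobject]@{
            Host          = $vmhost.Name
            Name          = $cache.Name
            "CacheUsage%" = $stats.Cacheusagerateasapercentage
            HitRate       = $stats.Read.Cachehitrateasapercentage
        }
    }
}
$results | Format-Table -AutoSize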

Extracting Dell Original Configuration with PowerShell

The Dell Support web page allows you to submit a server tag and view the original configuration for the server as it was delivered. This is displayed in a series of expandable sections, but with an option to export to csv.

I had a list of servers that were available for re-use, with serial numbers but no hardware specs, so I decided to download the specifications to work out whether any of them would be suitable for my requirement.

This would probably have taken less time if I'd just input each tag to the website and done "Export to csv", but I hate repetitive tasks and thought I'd be able to reuse some existing PowerShell scripting that does HTML mechanisation.

The first hurdle was that I could only run the PowerShell from my Mac, which meant using PowerShell Core. In turn, I quickly discovered that I couldn't use the more useful HTML parsing methods, as they rely on the Internet Explorer engine.

Next I found that with the UserAgent field left at the default, the request was being intercepted by the outbound proxy. Fortunately, Invoke-WebRequest allows you to spoof the UserAgent, enabling the page to be retrieved.

After a few abortive attempts, I discovered that there is a field used by the "Export to csv" function in the web page which appears to contain the hardware inventory in an HTML-encoded JSON string.

First I had to pull this specific line out of the raw HTML, then strip the data out of the HTML markup. Next I had to fix a corruption caused by the entry relating to the 2.5″ drives (the escaping of the double quote wasn't working), and then HTML-decode the text and convert from JSON to a PowerShell object.

The PowerShell object had an entry for each different type of component, but some lines contained an embedded parts list (for example, a hard drive might have parts for Carrier, Label, Drive and Screws), so this list then had to be parsed into a readable text line.

The resultant code is here:

# Hypothetical service tag
$tag = "<dell_tag>"
$url = "http://www.dell.com/support/home/us/en/04/product-support/servicetag/$tag/configuration"
# Spoofed UserAgent so the outbound proxy doesn't intercept the request
$useragent = "Mozilla/5.0 (iPad; U; CPU OS 3_2_1 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Mobile/7B405"
$html = Invoke-WebRequest -Uri $url -UserAgent $useragent
# Pull out the 'hdnParts' line that backs the "Export to csv" button
$str = $html.Content.Split( [Environment]::NewLine ) | Select-String 'hdnParts'
# Strip the surrounding HTML and fix the broken escaping on the 2.5" drive entries
$json = ([string]$str).Split("=")[8].split("/>")[0] -replace "2.5\\&quot","2.5inch" | ConvertFrom-Json
# HTML-decode and convert the JSON into a PowerShell object
$table = [System.Web.HttpUtility]::HtmlDecode($json) | ConvertFrom-Json
# Flatten each embedded parts list into a readable column and export
$table | Select SkuNumber,SkuDescription,@{N="Qty";E={$_.Parts | %{"$($_.Qty) $($_.Description)"}}} | Export-Csv -Path "$($Tag).csv" -IncludeTypeInformation:$false
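
With the above wrapped in a script taking the tag as a parameter, a whole list of tags can be processed in one go. A sketch, where the script and file names are hypothetical:

# Hypothetical wrapper - produces one CSV per service tag listed in tags.txt
Get-Content .\tags.txt | ForEach-Object { .\Get-DellConfig.ps1 -Tag $_ }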

A lot of work for a few lines of code, but re-usable (until Dell change their website), and more interesting than a bunch of copy/paste/click.


Auto-install of .Net 3.5 on Windows 2012 R2

The "standard" way of installing .Net 3.5 onto a Windows 2012 R2 server is to mount the install DVD (or ISO image) and use Add Features to install it. Obviously this is a massive pain if you've got a lot to do, as you either need to copy 4.5GB of image around or use some out-of-band method of mounting the image, neither of which is ideal. The only sane option would be to extract the ISO to a CIFS share and make that available to all servers, but this wasn't an option here.

For automation, we would normally use the PowerShell command:

Install-WindowsFeature Net-Framework-Core -source \\image-path\sources\sxs

… so already, it looks like we don't need the whole image to do the install, just the "sources\sxs" directory.

A quick check shows that the "sources\sxs" directory is 289MB, which is much more manageable, but surely we can do better, as it includes a lot of other features.

Running a filter with procmon during the feature install allows you to capture all the file accesses to the sources\sxs directory, which can be exported as a CSV file:
"Time of Day","Process Name","PID","Operation","Path","Result","Detail"
"1:59:25.0591131 PM","TiWorker.exe","2896","ReadFile","D:","SUCCESS","Offset: 0, Length: 4,096, I/O Flags: Non-cached, Paging I/O, Synchronous Paging I/O, Priority: Normal"
"1:59:25.1140710 PM","TiWorker.exe","2896","ReadFile","D:","SUCCESS","Offset: 4,096, Length: 4,096, I/O Flags: Non-cached, Paging I/O, Synchronous Paging I/O, Priority: Normal"
"1:59:25.1183118 PM","TiWorker.exe","2896","ReadFile","D:","SUCCESS","Offset: 8,192, Length: 4,096, I/O Flags: Non-cached, Paging I/O, Synchronous Paging I/O, Priority: Normal"
"1:59:25.1197002 PM","TiWorker.exe","2896","CreateFile","D:\sources\sxs","SUCCESS","Desired Access: Read Attributes, Disposition: Open, Options: Open Reparse Point, Attributes: n/a, ShareMode: Read, Write, Delete, AllocationSize: n/a, Impersonating: NT AUTHORITY\SYSTEM, OpenResult: Opened"
"1:59:25.1215631 PM","TiWorker.exe","2896","QueryBasicInformationFile","D:\sources\sxs","SUCCESS","CreationTime: 3/21/2014 2:27:47 PM, LastAccessTime: 3/21/2014 2:27:47 PM, LastWriteTime: 3/21/2014 2:27:47 PM, ChangeTime: 3/21/2014 2:27:47 PM, FileAttributes: RD"
"1:59:25.1215789 PM","TiWorker.exe","2896","CloseFile","D:\sources\sxs","SUCCESS",""
"1:59:30.8039209 PM","TiWorker.exe","2896","ReadFile","D:","SUCCESS","Offset: 0, Length: 4,096, I/O Flags: Non-cached, Paging I/O, Synchronous Paging I/O, Priority: Normal"
...etc....

This can then be condensed with (I’m using a *nix command line to do this as it’s more familiar to me):

cat Logfile.CSV | cut -d"," -f5 | cut -d"\\" -f4 | sort -u | grep -v -e "^$" -e "^\""

This gives a list of the required subfolders within the "sources\sxs" directory, which can then be used to copy the relevant source files into a zip archive:
cat Logfile.CSV | cut -d"," -f5 | cut -d"\\" -f4 | sort -u | grep -v -e "^$" -e "^\"" | while read folder
do
zip -r net35.zip /Volumes/Win2012R2ISO/sources/sxs/$folder
done

…which generates an (approximately) 88MB zip file, much more suitable for installing via automation.

It's then a fairly straightforward task to use your automation framework (Chef, Puppet, etc.) to copy the zip file down to the server, extract it, and run the PowerShell command to install, as sketched below.
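
As a rough illustration, the final step might look like this. The paths are hypothetical, and Expand-Archive needs PowerShell 5+ (older hosts could use the config tool's own archive support instead):

# Hypothetical paths - extract the trimmed sxs payload and install from it
Expand-Archive -Path C:\Temp\net35.zip -DestinationPath C:\Temp\net35
# zip stored the folders under their original path, so point -Source at the extracted sxs folder
Install-WindowsFeature Net-Framework-Core -Source C:\Temp\net35\Volumes\Win2012R2ISO\sources\sxs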

PowerCLI code snippet to get storage driver details

This is just a brief post to share a code snippet that I built to display the storage driver in use.

The driver and its version are critical for VMware VSAN, and I needed a quick and easy way of checking them. I might revise the code at a later date to run across multiple hosts in a cluster and output the results in a table, but for now, here are the basics.

Connect-VIServer <vcname>
$esxcli = Get-EsxCli -VMHost <esxihostname>
# Find the adapter of interest (vmhba0 here) and normalise the driver name
$adapter = $esxcli.storage.core.adapter.list() |
    select Description,Driver,HBAName | where {$_.HBAName -match "vmhba0"}
$driver = $adapter.Driver -replace "_", "-"
# List installed VIBs and match the one whose name ends with the driver name
$esxcli.software.vib.list() |
    Select Name,Version,Vendor,ID,AcceptanceLevel,InstallDate,ReleaseDate,Status |
    Where {$_.Name -match ($driver + "$")}

This displays output such as:

Name            : scsi-megaraid-sas
Version         : 6.603.55.00-1OEM.550.0.0.1331820
Vendor          : LSI
ID              : LSI_bootbank_scsi-megaraid-sas_6.603.55.00-1OEM.550.0.0.1331820
AcceptanceLevel : VMwareCertified
InstallDate     : 2016-05-03
ReleaseDate     :
Status          :

This works for the servers I’ve tried it on (Dell) but as usual YMMV…

Automating NSX from PowerCLI

I've been working on an NSX-based project recently, and was given the task of automating the addition of new DLR Logical Switches and Edge devices.

After a discussion around the alternatives with colleagues, we decided the best way forward (for now) was to do it in PowerShell/PowerCLI, and a quick Google found Chris Wahl's post here.

This was a great basis to work from (Thanks Chris!), but lacked a number of things I needed: Creating DHCP Pools, attaching a Logical Switch to an existing Edge device, and some relatively minor amendments to DLR/Edge configurations.

Of these the attachment of a new LS to an existing Edge proved the most intellectually taxing, as Chris’ scripts work with building new raw XML to PUT/POST with the REST API, and I soon discovered that the only way to amend an Edge configuration through the REST API is to pull the existing config as an XML, amend it, and PUT it back.

On top of this, the XML retrieved through the Invoke-WebRequest PowerShell cmdlet is of type System.Xml.XmlElement, whereas to do things like CreateElement (which we need in order to add new entries into the configuration) it needs to be of type System.Xml.XmlDocument.

After a number of failed workarounds, I found that dumping the XML to a file, and reimporting, gave me the XML in the correct object type – this is a little ugly though, and while the automation is for something that would only be used occasionally, I don’t like ugly hacks in my code!

A little more effort, and I had a suitable alternative – casting the XML to a string object and back to an XML object yielded the result I was looking for.

$edge = Invoke-WebRequest -Uri "$uri/api/4.0/edges/$routerid" -Headers $head -ContentType "application/xml" -ErrorAction:Stop
[xml]$edgexml = $edge.Content
$textxml = $edgexml.innerxml
[xml]$body = $textxml

I could then work with $body as a normal XmlDocument object in PowerShell.

The next issue I had was making the amendments to the XML.

First – make sure the new Logical Switch is not already attached, and then find the first unused interface:

foreach ($vnic in $body.edge.vnics.vnic) {
     if ($vnic.name -match $config.newLS.name) {
          $attached = "true"
          Write-Host -BackgroundColor:Black -ForegroundColor:Red "Warning: $($config.newLS.name) already attached. Skipping."
          break
     }
     if ($vnic.isConnected -match "false") {
          # First unused interface found - the configuration below is applied to this $vnic

Second – setting values for XML entities that were already in the XML. Easy:

$vnic.name = $config.newLS.name
$vnic.isConnected = "true"

Third – adding a new XML entity that wasn’t already there:

$elem = $body.CreateElement("portgroupId")
$vnic.AppendChild($elem)
# $switchvwire maps Logical Switch names to their portgroup IDs (built elsewhere in the script)
$vnic.portgroupId = $switchvwire.get_Item($config.newLS.name)

Finally – adding entities to an empty node “<addressGroups />”. Not so easy! This took some considerable time, including many false starts! In the end I discovered that to “find” the empty node using SelectSingleNode I had to set up a namespace. Then I could find it and remove it (this seemed easier than trying to attach entries to the empty node). Then I could create some raw XML and attach it into the Edge configuration XML using ImportNode and AppendChild.

$ns = New-Object -TypeName System.Xml.XmlNamespaceManager -ArgumentList $body.NameTable
$ns.AddNamespace("ns",$body.DocumentElement.NamespaceURI)
$oldAddressGroups = $vnic.SelectSingleNode("//vnic[index=$inf]/addressGroups")
$vnic.RemoveChild($oldAddressGroups)
[xml] $addr = "<addressGroups>
  <addressGroup>
    <primaryAddress>$($config.newLS.edgeip)</primaryAddress>
    <subnetMask>$($config.newLS.mask)</subnetMask>
  </addressGroup>
</addressGroups>"
$vnic.AppendChild($body.ImportNode($addr.addressGroups, $true))

Once that was done, all I had to do was send the XML back using Invoke-WebRequest:

# Attach new logical switch to existing Edge
try {
    $r = Invoke-WebRequest -Uri "$uri/api/4.0/edges/$routerid" -Body $body -Method:Put -Headers $head -ContentType "application/xml" -ErrorAction:Stop -TimeoutSec 30
} catch {
    Failure   # error-handling function defined elsewhere in the script
}
if ($r.StatusCode -match "204") {
    Write-Host -BackgroundColor:Black -ForegroundColor:Green "Status: Successfully attached new Logical Switch to $($config.edge.name)."
} else {
    $body
    throw "Was not able to add new Logical Switch to existing Edge. API status code was not 204."
}
break   # exit the enclosing loop

I've no doubt there are better ways of achieving some of what I've done here, but I thought I'd post it up in case anyone is looking to do something similar.