How To

How to install and boot a Dell PowerEdge from a PCIe NVMe drive

October 5, 2020 by Paulie

PCIe NVMe storage can provide an incredible speed boost to any server but booting from it is not natively supported on some older Dell PowerEdge servers.

11th and 12th generation servers like the Dell PowerEdge R710 and R720 are very popular amongst the home lab community and could benefit from a fast boot device.

This procedure should work on any Dell PowerEdge Server that can boot from a USB device.

Booting from NVMe storage is simple to do. In this post I am going to explain how it’s done and show the benchmarks from both a Dell PowerEdge R310 and T320.

Hardware you will need:

  • Two USB Flash drives:
    • One to run the Clover bootloader. I used this tiny SanDisk Ultra Fit flash drive.
    • One for your bootable Windows ISO.
  • A PCIe NVMe adapter and an NVMe drive:
    • I used this cheap NVMe to PCIe adapter from Amazon.
    • With a Samsung 970 Evo Plus also from Amazon

I also tested the process on a 1.2TB Intel DC P3520 PCIe card, which also worked fine.

Software you will need:

  • A Windows Server Installation ISO
  • Rufus to create the bootable Windows Installation.
  • Boot Disk Utility

PCIe NVMe Boot Process

When this procedure is complete, the PowerEdge server will boot from the internal USB storage and run the Clover EFI Bootloader. Clover will contain the NVMe boot driver and boot the installed operating system from the NVMe storage.

If your server has internal SD card storage, you could boot from that instead.

Install the NVMe Adapter and Drive

First, install the NVMe adapter and drive into your Dell PowerEdge server. I used this cheap adapter from Amazon and a 500GB Samsung 970 Evo Plus.

Here is the unit, without the heatsink applied, before I installed it into the server. It comes with both regular and low-profile PCIe brackets:

Samsung 970 Evo Plus NVMe SSD Installed into a PCIe adapter.

And here is the unit installed in the PowerEdge R310 with the heatsink and thermal pad applied:

PCI NVMe Adapter installed into an 11th Generation Dell PowerEdge Server

Create your bootable Windows Server Installation

The first step is to create your Windows Server Installation USB Stick. There are lots of guides on how to do this but I will show how I did it.

  • Download and Install Rufus.
  • Point Rufus to your Windows Server ISO.
  • Configure Rufus with the following options:
    • Partition Scheme: GPT
    • Target System: UEFI (non CSM)
      Image showing configuration of Rufus to make a Windows Server Bootable ISO

Install Windows in the normal way

Windows Server 2012 R2 and newer have a Microsoft NVMe driver built in, so setup will see the NVMe storage and offer to install to that location.

When Windows setup is complete it will reboot. It will be unable to boot because the Dell UEFI does not have any NVMe support. But don’t worry about that!

Setup the Clover EFI USB Boot Stick

Now set up the Clover USB boot stick or SD card.

  • Download and run Boot Disk Utility.
  • Insert the USB Stick that you are going to boot from into your PC.
  • Select your USB Stick and click format:
  • Open your newly formatted drive and copy \EFI\CLOVER\drivers\off\NvmExpressDxe.efi to:
    • \EFI\CLOVER\drivers\BIOS
    • \EFI\CLOVER\drivers\UEFI

Copying NvmExpressDxe.efi to the driver folders adds NVMe support to Clover, which enables booting the Windows installation that has just been completed.
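
If you prefer to do the copy from PowerShell, something like this works (a minimal sketch, assuming the Clover stick is mounted as E:; adjust the drive letter to suit):

$drivers = "E:\EFI\CLOVER\drivers"
#Copy the NVMe driver into both the BIOS and UEFI driver folders
Copy-Item "$drivers\off\NvmExpressDxe.efi" -Destination "$drivers\BIOS"
Copy-Item "$drivers\off\NvmExpressDxe.efi" -Destination "$drivers\UEFI"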

My \EFI\CLOVER\drivers\UEFI looks like this:

Insert the Clover USB Flash Drive or SD Card into your server

Next simply insert the USB flash drive or SD Card into your server and set the UEFI boot order on the server to boot from it:

Sandisk USB Flash Drive installed into internal USB Port of a Dell PowerEdge Server to support booting of an NVMe drive.

Ensure your UEFI Boot order is set correctly and pointing to your Clover USB Stick or SD Card:

Image of Dell UEFI Boot Settings set to an Internal USB Device containing clover boot loader.

When the server boots from the internal Clover USB stick, it will briefly display a boot screen:

Screenshot of the Clover Boot Manager.

The Clover defaults worked right away for me and I didn’t have to configure anything.

You can modify the config.plist file (which is in the root of the USB Stick) to reduce the timeout if you want to speed things up a little bit:

<key>Boot</key>
<dict>
	<key>#Arguments</key>
	<string>slide=0 darkwake=0</string>
	<key>#DefaultLoader</key>
	<string>boot.efi</string>
	<key>#LegacyBiosDefaultEntry</key>
	<integer>0</integer>
	<key>#XMPDetection</key>
	<string>-1</string>
	<key>CustomLogo</key>
	<false/>
	<key>Debug</key>
	<false/>
	<key>DefaultVolume</key>
	<string>LastBootedVolume</string>
	<key>DisableCloverHotkeys</key>
	<false/>
	<key>Fast</key>
	<false/>
	<key>Legacy</key>
	<string>PBR</string>
	<key>NeverDoRecovery</key>
	<true/>
	<key>NeverHibernate</key>
	<false/>
	<key>RtcHibernateAware</key>
	<false/>
	<key>SignatureFixup</key>
	<false/>
	<key>SkipHibernateTimeout</key>
	<false/>
	<key>StrictHibernate</key>
	<false/>
	<key>Timeout</key>
	<integer>5</integer>
</dict>

Modify the <integer> value under the Timeout key at the end of the Boot dict to reduce the boot delay.

Windows should now proceed to boot normally directly from the NVMe drive.

Performance Results

I was really impressed with how much faster both PowerEdge machines are when booting from the NVMe drive. For clarity, the configuration of these systems is:

Dell PowerEdge R310
Intel Xeon X3470 2.93GHz
16GB RAM
Dell PERC H700 (512MB)

Dell PowerEdge T320
Intel Xeon
32GB RAM
Dell PERC H710 (1GB)

Performance of the Samsung 970 Evo Plus NVMe drive was excellent in both machines, but it is constrained in the R310, which only has a PCIe Gen 2 x4 slot, whereas the T320 has PCIe Gen 3.

Disabling C States in the BIOS increases performance in both machines.

Here are the CrystalDiskMark results from the R310 with C States disabled:

Image of CrystalDiskMark showing performance results from an NVMe drive installed in a Dell PowerEdge Server with C States Disabled

Here are all the results from both machines, with and without C States enabled:

Test Type     C State   Machine  Read Result (MB/s)  Write Result (MB/s)
SEQ1M Q8T1    Enabled   R310     1670.13             1636.13
SEQ1M Q8T1    Disabled  R310     1811.27             1760.90
SEQ1M Q1T1    Enabled   R310     1359.94             1346.70
SEQ1M Q1T1    Disabled  R310     1529.41             1498.86
RND4K Q32T16  Enabled   R310     1147.30             1351.01
RND4K Q32T16  Disabled  R310     1149.57             1346.97
RND4K Q1T1    Enabled   R310     35.95               85.65
RND4K Q1T1    Disabled  R310     37.96               93.31
SEQ1M Q8T1    Enabled   T320     3339.91             3124.31
SEQ1M Q8T1    Disabled  T320     3576.68             3265.89
SEQ1M Q1T1    Enabled   T320     2303.26             2510.44
SEQ1M Q1T1    Disabled  T320     2421.82             2793.11
RND4K Q32T16  Enabled   T320     1150.97             1557.42
RND4K Q32T16  Disabled  T320     1145.25             1558.19
RND4K Q1T1    Enabled   T320     33.23               101.27
RND4K Q1T1    Disabled  T320     43.98               111.05

As a crude comparison here is the performance of a RAID 0 Array in the R310 comprising 4 x 7,200 RPM SATA Drives:

Image showing performance benchmark of a RAID 0 array on a Dell PowerEdge R310 with PERC H700 controller.

This R310 server also has a Samsung 860 EVO SSD in the DVD Drive bay, which is connected via a SATA 2 port on the motherboard:

Image showing performance of a Samsung SSD Installed in the optical drive bay of a Dell PowerEdge R310.

You can see the performance of the drive being constrained by the SATA2 port, but it still gives good random performance.

If you are using VMware and booting from a different storage device, such as an SD card or USB stick, you can simply access the NVMe drive in the normal way.

Conclusion – is it worth adding NVMe storage to an old Dell PowerEdge?

Given the low cost of both the adapter and Samsung SSD and the huge resulting performance boost, it is certainly worth experimenting.

I can’t say if I would use this setup in production yet, but so far, it seems to work fine. Here is an image of Samsung Magician Drive information:

Filed Under: How To Tagged With: Dell PowerEdge

Log RDS Sessions with a Power BI Streaming Dataset

November 14, 2019 by Paulie

After a conversation with a customer today, I needed to create an easy way to monitor RDP authentications, because we suspected that some accounts had been compromised. This turned out to be more difficult than I expected.

I decided to combine a few things to enrich the data from the event log and make analysis simple:

  • A custom event log trigger to execute a PowerShell Script.
  • IPStack to enrich the client IP Information.
  • A streaming PowerBI Dataset to record and visualise the data.
    This video on Streaming datasets from Patrick Leblanc was the inspiration.

It worked out pretty well. Here is a screenshot of the Power BI report:

Power BI report showing details of Successful Remote Desktop Logons
The customer does not have any users outside of the UK!

Step one: Triggering an event when a user successfully authenticates an RDS Session

When a user logs on to a terminal server, a number of events are recorded in the event log. I found that event 4624 with a logon type of 10 is the easiest to attach to and provides a good source of data.

Image showing Event 4624, Logon Type 10 in the Windows Event Viewer, which indicates a successful user logon via terminal services.
The event data includes the username, client IP address, date & time etc.
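
Before creating the trigger, you can check for matching events interactively with PowerShell (a sketch; it uses the same XPath filter shown below):

Get-WinEvent -LogName 'Security' -MaxEvents 5 -FilterXPath "*[System[(EventID=4624)]] and *[EventData[Data[@Name='LogonType'] and (Data='10')]]"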

There is a good post on Technet which describes how to trigger a PowerShell script when a Windows Event is logged. Follow those instructions but make the following changes to the XML export routine.

Trigger the event to fire on Event ID 4624 and Logon Type 10

Because multiple 4624 events occur whenever an RDS session is logged on, you need to create a custom XML Filter in the event log to further narrow down the results to show only those that have Logon Type set to 10, as per the following:

<QueryList>
  <Query Id="0">
    <Select Path="Security">
      *[System[(EventID=4624)]]
      and
      *[EventData[Data[@Name='LogonType'] and (Data='10')]]
    </Select>
  </Query>
</QueryList>
Custom XPath filter to narrow down the 4624 events to Logon Type 10

Although this filter works fine in the event viewer, if you create a trigger from it, it will only filter on Event ID 4624, which causes the PowerShell script to execute too many times.

We also need to extract the Event Record ID from the trigger so it can be passed to the script. I’ve put the example XML onto pastebin:

Click here to see the XML Code changes required

Once you have modified the XML scheduled task file, recreate the task as per the Technet article.

Step Two: Tweak the scheduled task

After you have created the trigger on event ID 4624, you need to modify the task slightly to allow parallel execution: if multiple people log on at the same time, you still want the script to execute for each of them.
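
You can make the same change from PowerShell instead of the Task Scheduler GUI (a sketch; "RDS Logon to PowerBI" is a hypothetical task name, substitute your own):

#Allow multiple instances of the task to run in parallel
$settings = New-ScheduledTaskSettingsSet -MultipleInstances Parallel
Set-ScheduledTask -TaskName "RDS Logon to PowerBI" -Settings $settings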

Here is my scheduled task setup:

Step Three: Powershell Script to Post Successful Terminal Server Authentications to PowerBI

Next up is the actual PowerShell Script that posts event log data to Power BI. The basic flow of the script is like this:

  1. Receive the Event Record ID from the Scheduled Task and query the event log for full details of the event.
  2. Check the Logon Type was “10”. This is the logon type associated with a Terminal Server session.
  3. Extract the information from the XML of the event log.
  4. Connect to IPStack and query it for information regarding the IP address.
  5. Create a JSON payload and send it to PowerBI for further analysis.

Here is the script:

param([string]$eventRecordID = "none")

$DebugPreference = "Continue"
#The PowerBI Streaming Data Set Endpoint and IP Stack API Key
$PBIendpoint = "Replace_With_PBI_Endpoint"
$IPStackAPIKey="Replace_With_IP_Stack_API_Key"

Write-Debug "Event Record ID is $eventRecordID"

if ($eventRecordID -ne "none") {
    #Get the event detail for event ID 4624 (Successful Logon)
    $evtPath = @"
        <QueryList>
          <Query Id='0' Path='Security'>
            <Select Path='Security'>
              *[System[(EventRecordID=$eventRecordID)]]
            </Select>
          </Query>
        </QueryList>
"@

    $event = Get-WinEvent -LogName 'Security' -FilterXPath "$evtPath"
    $eventXML = [xml]$event.ToXML()
    $logonType = $eventXML.Event.EventData.Data[8].'#text'

    #Check the logon type is 10 (Remote Desktop)
    if ($logonType -eq "10")
    {
        Write-Debug "Logon Type is 10, sending data to Power BI"

        $eventTime = $event.TimeCreated.ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
        $eventUser = $eventXML.Event.EventData.Data[5].'#text'.ToString()
        $eventClientIP = $eventXML.Event.EventData.Data[18].'#text'.ToString()

        #Query IPStack for geolocation details of the client IP address
        $ipInfo = Invoke-WebRequest `
            "http://api.ipstack.com/$eventClientIP`?access_key=$IPStackAPIKey" -Method POST |
            ConvertFrom-JSON

        #Build the JSON payload and post it to the Power BI streaming dataset
        $payload = @{
            "Time" = "$eventTime"
            "User" = "$eventUser"
            "Client IP" = "$($eventClientIP)"
            "Continent" = "$($ipInfo.continent_code)"
            "Country" = "$($ipInfo.country_code)"
            "Region" = "$($ipInfo.region_name)"
            "Zip" = "$($ipInfo.zip)"
            "latitude" = "$($ipInfo.latitude)"
            "longitude" = "$($ipInfo.longitude)"
        }
        Invoke-RestMethod -Method Post -Uri "$PBIendpoint" -Body (ConvertTo-Json @($payload))
    }
    else
    {
        Write-Debug "Logon Type is $logonType, not logging"
    }
}
else
{
    Write-Debug "No event information passed to script"
}

Once the data is in PowerBI, it is simple to create a report to see the following information about the users of your terminal server environment:

  • Who is logging on
  • When they are logging on
  • The country they are in
  • The location within that country

IPStack also provides a threat level based on the reputation of the IP, but I was using a free account, so that information was not available to me.

Create your PowerBI Streaming Dataset

Follow the video that Patrick Leblanc from Guy in a Cube made on creating a streaming dataset in Power BI:

This is how to configure the Streaming dataset for use with the PowerShell script above:
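
The dataset fields need to match the keys the script sends: Time, User, Client IP, Continent, Country, Region, Zip, latitude and longitude. A sensible mapping is DateTime for Time, Number for latitude and longitude, and Text for the rest; the field names come straight from the script’s payload above, but the exact types are my assumption.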

Step Four: Build the Power BI Report

Building the report is pretty simple, and once it was running I could immediately identify three separate accounts that had been compromised:

Image showing Power BI Report
This customer does not have any users based outside of the UK!

Out of interest I shadowed the compromised accounts on the terminal server before securing them, just to see what they were getting up to. It was mainly a lot of mass emailing via gmail and activity on “tagged.com” communicating with guys looking to buy a “pet”:

I’m pretty pleased with how it turned out. It is now dead easy to monitor terminal server sessions via Power BI, and it is so much more usable than the event log. The data comes through virtually instantly, so it’s a really good use case.

Filed Under: How To Tagged With: Power BI, Remote Desktop

Monitor Disk Usage on Linux and Unix with Microsoft Flow

November 5, 2019 by Paulie

I look after quite a number of servers, many of which are different flavours of Unix and Linux. Checking disk usage is easy with the df command, but I wanted to set up something automatic, where I could get a view of all the systems at once. I wanted the disk usage from every system, regardless of operating system, to end up in an Excel spreadsheet for review, something like this:

Use awk to produce JSON from the df command

The df command has a “-P” option which produces POSIX-style output. This ensures the output format is the same regardless of the operating system generating it. I also used the -k option so that sizes are always reported in 1024-byte blocks. So the df command I used is:

df -P -k | grep -v "/snap/core"

Which produces output like this:

Filesystem     1024-blocks     Used Available Capacity Mounted on
udev               1988700        0   1988700       0% /dev
tmpfs               403972     1092    402880       1% /run
/dev/sda2         32893712 17614104  13585656      57% /
tmpfs              2019852        0   2019852       0% /dev/shm
tmpfs                 5120        0      5120       0% /run/lock
tmpfs              2019852        0   2019852       0% /sys/fs/cgroup
/dev/sdb1        263174148    61536 249690892       1% /u
tmpfs               403968        0    403968       0% /run/user/1000

I wanted to remove some of the file systems from the report, so they were filtered out with grep.

Convert the output of df to JSON

Next I converted the output of the df command to JSON using awk, so that it could be consumed by Microsoft Flow. I passed some variables into the awk script to make it easy to identify the machine later and to generate a unique key (for Microsoft Flow):

BEGIN {
  printf "{"
  printf "\"Server\": \"" machine "\","
  printf "\"Location\": \"" location "\","
  printf "\"FileSystems\":["
}

{
  if ($1 != "Filesystem") {
    if (i) {
      printf ","
    }
    printf "{\"mount\":\"" $6 "\",\"size\":\"" $2 "\",\"used\":\"" $3 \
           "\",\"avail\":\"" $4 "\",\"use%\":\"" $5 "\",\"key\":\"" (machine " " location " " $6) "\"}"
    i++
  }
}

END {
  print "]}"
}

This converts the df input into JSON:

{
	"Server": "Ubuntu 18.04 Test",
	"Location": "4D-DC",
	"FileSystems": [{
		"mount": "/dev",
		"size": "1988700",
		"used": "0",
		"avail": "1988700",
		"use%": "0%",
		"key": "Ubuntu 18.04 Test 4D-DC /dev"
	}, {
		"mount": "/run",
		"size": "403972",
		"used": "1092",
		"avail": "402880",
		"use%": "1%",
		"key": "Ubuntu 18.04 Test 4D-DC /run"
	}, {
		"mount": "/",
		"size": "32893712",
		"used": "17614104",
		"avail": "13585656",
		"use%": "57%",
		"key": "Ubuntu 18.04 Test 4D-DC /"
	}, {
		"mount": "/dev/shm",
		"size": "2019852",
		"used": "0",
		"avail": "2019852",
		"use%": "0%",
		"key": "Ubuntu 18.04 Test 4D-DC /dev/shm"
	}, {
		"mount": "/run/lock",
		"size": "5120",
		"used": "0",
		"avail": "5120",
		"use%": "0%",
		"key": "Ubuntu 18.04 Test 4D-DC /run/lock"
	}, {
		"mount": "/sys/fs/cgroup",
		"size": "2019852",
		"used": "0",
		"avail": "2019852",
		"use%": "0%",
		"key": "Ubuntu 18.04 Test 4D-DC /sys/fs/cgroup"
	}, {
		"mount": "/u",
		"size": "263174148",
		"used": "61536",
		"avail": "249690892",
		"use%": "1%",
		"key": "Ubuntu 18.04 Test 4D-DC /u"
	}, {
		"mount": "/run/user/1000",
		"size": "403968",
		"used": "0",
		"avail": "403968",
		"use%": "0%",
		"key": "Ubuntu 18.04 Test 4D-DC /run/user/1000"
	}]
}

Now that the df output was in JSON format, I wanted to put all the data into an Excel spreadsheet, which I could easily keep up to date with a cron job.
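
For example, an hourly crontab entry along these lines would keep the spreadsheet current (a sketch; the path is hypothetical, and the machine name and location match the JSON example above):

0 * * * * /usr/local/bin/freeSpaceFlow.sh "Ubuntu 18.04 Test" "4D-DC"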

Consume the JSON with an HTTP request in Microsoft Flow

The next step was to consume the data into Microsoft Flow. I created a simple four step flow:

  1. Consume the JSON with a simple HTTP Post:
    Image of HTTP Request in Microsoft Flow
  2. Update Existing Rows in the Spreadsheet:
    Image showing import of JSON data into Excel Spreadsheet with Flow
  3. If step two failed (i.e. no matching row existed), then add new rows:
    Image of JSON Data being added to a spreadsheet with Microsoft Flow
  4. Terminate the flow:
    Image of Terminate Action from Microsoft Flow

This all works really well. I tested it on IBM AIX, SCO Openserver 5 and various versions of Ubuntu; all worked without a problem.

Putting it all together

There is some detail missing from the above, so I am going to paste all of the code here so that if you want to replicate the functionality you can do so easily:

freeSpaceFlow.sh

This shell script is what puts everything together. Run it like this:

./freeSpaceFlow.sh machineName Location

This is the code:

#!/bin/sh
df -P -k | \
        grep -v "/snap/core" | \
        /usr/bin/awk -v machine="$1" -v location="$2" -f ./freespace.awk | \
        curl -X POST \
        "https://prod-82.westeurope.logic.azure.com:443/workflows/...." \
        -H "accept: application/json" \
        -H "Content-Type: application/json" \
        -d @-

It’s fairly simple to see what it does, but I will quickly explain:

  • Runs the df command and pipes the output to grep.
  • Uses grep to remove unwanted filesystems and pipes to awk.
  • awk runs “freespace.awk” to create the JSON and pipes it to curl.
  • curl reads the awk output from stdin as the payload and posts the data to Microsoft Flow.

Let’s see the flow in action:

The Excel document is just a simple table with some formulas to convert the values to Gigabytes.
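
For example, since df -P -k reports sizes in 1024-byte blocks, a formula along these lines converts a size column to gigabytes (a sketch; the [@size] structured reference assumes the JSON field names were kept as the table headers):

=[@size]/1024^2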

It feels great to bring data from Unix into Excel with curl and Flow, and it seems to be really reliable too.

Filed Under: How To Tagged With: Linux, Microsoft Flow, Shell Scripting

Use PowerShell to generate push notifications to your SmartPhone

October 21, 2019 by Paulie

This post covers how to use PowerShell to send push notifications to your smartphone.

I recently discovered a web service called Notify17. It’s a really nifty little service which allows you to easily send push notifications to iOS/Android devices and browsers.

It really is very easy to use. I started using it on Unix, but I wanted the same functionality in one of my PowerShell scripts. It couldn’t be easier:

$notifyParams = @{title="PowerShell Alert";content="Script has completed!"}
Invoke-WebRequest -Uri https://hook.notify17.net/api/raw/RAW_API_KEY -Method POST -Body $notifyParams

Most of the supplied examples use curl, which older versions of Windows do not include, but Invoke-WebRequest works perfectly.

This simple bit of code immediately generates a notification on my phone and my computer:

Image of iPhone receiving push notification from a PowerShell Script using Invoke-Webrequest
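
If you call Notify17 from several scripts, it is tidy to wrap the request in a small helper (a sketch of my own; Send-Notify17 is a hypothetical name and RAW_API_KEY is the placeholder from the snippet above):

function Send-Notify17 {
    param (
        [string]$Title,
        [string]$Content,
        [string]$ApiKey = "RAW_API_KEY" #Replace with your raw API key
    )
    #POST the title and content to the Notify17 raw hook
    $body = @{title = $Title; content = $Content}
    Invoke-WebRequest -Uri "https://hook.notify17.net/api/raw/$ApiKey" -Method POST -Body $body | Out-Null
}

Send-Notify17 -Title "PowerShell Alert" -Content "Script has completed!"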

There are so many cool ways to use this, for example:

  • Backup Failures
  • Hardware Errors
  • Website Down Notifications
  • Notifications from Sensors etc
  • Call it from Zapier to generate an alert

What I really like about this service:

  • So easy to use!
  • E-Mails often get missed and it’s really useful to be able to get an alert for the really important stuff.

I modified my own Dell Hardware Alert script and tested it with Notify17 and it worked great, much better than getting an email which you might not see for a while.

Thanks to Alberto Marchetti for building Notify17.

Filed Under: How To Tagged With: Powershell

Install an SSD in the Optical Bay of a 13th Generation PowerEdge

October 14, 2019 by Paulie

Installing an SSD in the optical drive bay of a 13th generation Dell PowerEdge is really easy. The only thing you really need to know is that the bay is 9.5mm high. I used this caddy from Amazon.

I installed the SSD into a PowerEdge R330 and it works really well. Here is the SSD installed in the optical drive bay:

Image of a Dell PowerEdge server with an SSD Installed in the DVD Drive bay.
The SATA SSD Installed in the DVD/CD Drive bay.
Front image of an SSD Installed into the optical bay of a Dell PowerEdge Server
From the front it looks just like a Normal Optical Drive

ESXi is installed on 2 x 32GB SD cards, so the SSD is free to be used as a datastore:

Image showing ESXi datastore with an SSD installed into the Optical bay of a Dell PowerEdge Server.
Image of Install of Internal SD Card Into a Dell PowerEdge R330
SD Cards being installed by a certified PowerEdge expert.

There is a second SATA connector on the motherboard and space at the front where you could install another SSD. But I haven’t tried that.

I haven’t used the DVD drive on a Dell server in years; an SSD makes better use of the space. You could use it as a normal datastore or perhaps a flash-based read cache in ESXi.

Filed Under: How To Tagged With: Dell PowerEdge

