To NUC or not to NUC

You've probably heard of the Intel Next Unit of Computing, or NUC. Essentially it is a very small form factor computer that Intel produces and sells as a reference design to show that regular PCs can be small and relatively affordable. NUCs come with a range of processors (Atom, Celeron, i3, i5) and a range of specs. All they need is RAM and a hard drive to get them going.

Recently my wife was in need of her own computer, so we decided to go with the Intel NUC as it would be small and tidy. The monitor had a VESA mount, which allowed the NUC to be mounted on the back, and with a wireless keyboard and mouse the end result was extremely tidy, making everyone happy.

The spec of the NUC was as follows:

  • NUC: Intel DN2820FYKH Barebone Desktop (Celeron N2820 2.39GHz, HD Graphics, WLAN, Bluetooth 4.0)
  • Hard drive: Intel 530 Series 120GB SSD
  • RAM: Kingston ValueRAM 8 GB DDR3L 1600 MHz SODIMM CL11 Memory Module
  • Monitor: BenQ GL2250HM 21.5-inch Widescreen LED Multimedia Monitor

A couple of gotchas about Intel NUCs generally, and this model specifically:

  1. The RAM must be low-voltage (1.35 V) SODIMM modules. Normal-voltage RAM will not work in an Intel NUC. Intel publish a supported memory list which gives the full specs of the compatible RAM.
  2. For this specific model of NUC you have to use a 2.5 inch hard drive as opposed to an mSATA drive. However, to fit the drive bay of the NUC the drive must be no thicker than 9.5 mm.

The installation of Windows 8.1

Honestly, this was the most painful part of the process. The Windows installer loaded fine; all the pain came from the NUC itself, in the form of two issues:


  1. SSD not detected
  2. BIOS screen not displaying on monitor

To install an operating system on the Intel NUC DN2820FYKH you need to upgrade the BIOS from the factory-shipped version 0015 to a minimum of 0025, along with making some fairly minor changes to the BIOS settings. As you can probably imagine, the inability to see the BIOS screen made flashing the BIOS less straightforward, but not impossible. Intel supply the BIOS updates in four packages:

  • Recovery BIOS update: used for flashing directly from the BIOS screen
  • iFlash BIOS update: DOS-based utility for updating the BIOS
  • Self-extracting Windows-based update file
  • Self-extracting Windows PE update file

In my case the simplest fix was to make a Windows PE boot USB (instructions here), then copy the appropriate BIOS version to the drive. I then booted into Windows PE and ran the .exe, which updated the BIOS, and all proceeded smoothly from there: the SSD was detected and the OS installed.
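For reference, the Windows PE media itself can be staged with the `copype` and `MakeWinPEMedia` tools from the Windows ADK. A minimal sketch, assuming the ADK Deployment Tools environment is loaded and the USB stick sits at F:; the BIOS update filename is a placeholder, so substitute the actual .exe you downloaded from Intel:

```powershell
# Sketch only: run from the Windows ADK "Deployment and Imaging Tools" environment.
# F: (the USB stick) and the BIOS update filename are assumptions.
copype amd64 C:\WinPE_amd64            # stage the 64-bit Windows PE files locally
MakeWinPEMedia /UFD C:\WinPE_amd64 F:  # write a bootable Windows PE image to the USB drive
Copy-Item .\BIOS-update.exe F:\        # copy the Intel BIOS update onto the stick
```

Booting the NUC from this stick drops you into a Windows PE command prompt, from which the self-extracting update can be run directly.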


Server 2012 R2 KMS error STATUS_SUCCESS

I found an interesting one when activating my Server 2012 R2 KMS host with the KMS key.

All was going well until I got as far as committing the changes, when the following rather strange error popped up:


Despite the error text, the commit was not successful and the configuration changes had not been saved.

It turns out to be a rather simple fix. On the commit page, for some reason the wizard defaults the KMS TCP listening port to 0. For KMS this should be 1688; changing the port number to 1688 resolves the error and allows the configuration to be saved.
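If you prefer to skip the wizard, the listening port can also be set with slmgr.vbs from an elevated prompt. A sketch, on the assumption that you are running it directly on the KMS host:

```powershell
# Set the KMS host TCP listening port to the standard 1688 and verify it.
cscript //nologo slmgr.vbs /sprt 1688   # set the KMS TCP listening port
Restart-Service sppsvc                  # restart Software Protection so the change takes effect
cscript //nologo slmgr.vbs /dlv         # detailed licence info; check the listening port line
```

The /dlv output on a KMS host includes the listening port, so it is a quick way to confirm the fix stuck.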


System state backups failing

I'm currently working on a script to automate system state backups, and in my testing I encountered an issue: system state backups fail on my 2008 domain controller with the following error message

ERROR – Volume Shadow Copy Service operation error (0x800423f4). The writer experienced a non-transient error. If the backup process is retried, the error is likely to reoccur.

Where to start with this one…

Well, the hex error code indicates that the problem is with VSS failing to complete the read of data, so the next port of call is to check VSS. This can be done with the following command, run from an elevated command prompt or PowerShell window:

vssadmin List Writers

which produces the following output


As you can see, this confirmed that the NTDS VSS writer had failed, which would be expected as we were backing up the system state. The first step in troubleshooting VSS failures is basic enough: restart the affected services and test, and if that doesn't help, restart the server. This had no effect on the problem, so it was time to dig a little deeper.
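On a server with many writers, scanning the full vssadmin output by eye gets tedious. A quick sketch that trims it down to the interesting fields, assuming an elevated PowerShell session:

```powershell
# Show just the name, state and last error for each VSS writer,
# so a failed writer (e.g. NTDS) stands out at a glance.
vssadmin list writers | Select-String -Pattern "Writer name|State:|Last error"
```

A healthy writer reports "State: [1] Stable" and "Last error: No error"; anything else is worth chasing.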

As always, the best place to start is the event logs; Microsoft have really increased the level of logging on their servers and it is far more useful than in 2003. A quick perusal of the event logs showed that the backup ran until it tried to use the Extensible Storage Engine API (ESENT) to read the shadow copy headers of the Active Directory database, at which point it logged the following error:

Log Name: Application
Source: ESENT
Date: <date & time>
Event ID: 412
Task Category: Logging/Recovery
Level: Error
Keywords: Classic
User: N/A
Computer: <computer name>
Lsass(640) Unable to read the header of logfile \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\Windows\NTDS\edb.log  Error -546.

This error points to a known issue with Windows Server 2008 (which my domain controller runs) and applications that use ESENT. Microsoft have released a hotfix for this issue.

Once this hotfix was applied there were no further ESENT errors logged, and the VSS portion of the backup completed successfully.
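To retest after the hotfix, a one-off system state backup can be kicked off manually with wbadmin. A sketch, assuming an elevated prompt and E: as the backup target (substitute your own volume):

```powershell
# Re-run a system state backup to confirm the VSS writers now succeed.
# E: is an assumed backup target; -quiet suppresses the confirmation prompt.
wbadmin start systemstatebackup -backupTarget:E: -quiet

# Afterwards, re-check the writers: NTDS should be back to
# "State: [1] Stable" with "Last error: No error".
vssadmin list writers
```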


Configuring application crash dumps with PowerShell

In a Windows environment applications crash for many reasons, and the best way to troubleshoot them is to collect an application crash dump: a snapshot of what the application was doing when it crashed.

From Windows Vista and Windows Server 2008 onwards, Microsoft introduced Windows Error Reporting, or WER. This allows the server to be configured to automatically generate and capture application crash dumps. The configuration of this is discussed here. The main problem with the default configuration is that the dump files are created and stored in the %LOCALAPPDATA%\CrashDumps folder of the account running the process, which can make it awkward to collect dumps as they are spread all over the server. There are other problems with this too, but the main one I always had was that configuring it properly is a simple task that is very repetitive yet easy to do incorrectly. That makes it a perfect task to automate.

I wrote this little script in PowerShell.


This script does three things:

  1. Creates a folder to put the crash dumps in
  2. Gives the appropriate accounts access to this folder
  3. Configures the registry with the appropriate settings

Part 1: Creating the folder

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic') | Out-Null
$Folder=[Microsoft.VisualBasic.Interaction]::InputBox("Specify where to store crashdumps (not network location)", "Path", "c:\Crashdump")

New-Item $Folder -Type Directory -ErrorAction SilentlyContinue

### Verify the folder the user specified is valid; otherwise fall back to C:\Crashdump

$validatepath = Test-Path $Folder
if ($validatepath -eq $false) {
    New-Item C:\Crashdump -Type Directory
    Set-Variable -Name Folder -Value C:\Crashdump -Scope Script
}

This piece of code asks the user where to put the folder and then creates it. If it cannot create the folder the user specified, it falls back to the default path of C:\Crashdump.

Part 2: Specifying the permissions

$Acl = Get-Acl $Folder
$machinename = hostname
$querydomain = [System.DirectoryServices.ActiveDirectory.Domain]::GetCurrentDomain()
$domain = $querydomain.Name

#Setting ACLs

$Acl.SetAccessRuleProtection($true, $false)
$acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule("Network","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")))
$acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule("Network Service","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")))
$acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule("Local Service","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")))
$acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule("System","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")))
$acl.AddAccessRule((New-Object System.Security.AccessControl.FileSystemAccessRule("Everyone","FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")))

Set-Acl $folder $Acl

This code defines some variables and then grants the following accounts permission to write to the folder: Network, Network Service, Local Service, System, and the built-in group Everyone. It then writes the ACL back to the folder.

Part 3: Actually configuring WER

$verifydumpkey = Test-Path "HKLM:\Software\Microsoft\Windows\Windows Error Reporting\LocalDumps"

if ($verifydumpkey -eq $false) {
    New-Item -Path "HKLM:\Software\Microsoft\Windows\Windows Error Reporting\" -Name LocalDumps
}

##### adding the values

$dumpkey = "HKLM:\Software\Microsoft\Windows\Windows Error Reporting\LocalDumps"

New-ItemProperty $dumpkey -Name "DumpFolder" -Value $Folder -PropertyType "ExpandString" -Force
New-ItemProperty $dumpkey -Name "DumpCount" -Value 10 -PropertyType "Dword" -Force
New-ItemProperty $dumpkey -Name "DumpType" -Value 2 -PropertyType "Dword" -Force

This part of the script checks whether the LocalDumps registry key exists, creates it if it doesn't, and then adds the necessary values. You have probably noticed a potential gotcha with PowerShell and registry entries: PowerShell treats registry values as properties of the key that they are in, as discussed here.
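To confirm the values actually landed, the key can be read straight back, using the same path the script writes to:

```powershell
# Read back the WER LocalDumps settings the script just configured
Get-ItemProperty "HKLM:\Software\Microsoft\Windows\Windows Error Reporting\LocalDumps" |
    Select-Object DumpFolder, DumpCount, DumpType
```

If DumpFolder shows your chosen path, DumpCount is 10 and DumpType is 2, the configuration is in place.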

The full script can be downloaded from here. To run this script you need to allow unsigned scripts by running the following command from an administrative PowerShell window:

Set-ExecutionPolicy -ExecutionPolicy Unrestricted

Alternatively you can sign the script, which isn't very difficult but has a number of steps. The TechNet Scripting Guy blog has a very good guide here, and part 2 here.

I use this script in a couple of ways, but mainly to simplify the task of helping users and admins collect application crash dumps for analysis.

Hope this helps