
Let’s Encrypt on Windows with ACMESharp and letsencrypt-win-simple

The march of freely available TLS certificates for domain validation continues in the form of the Let’s Encrypt project, and I’m very pleased that it does.

I’m very happy with the Certbot client on most systems where I need to deploy Let’s Encrypt, but on Windows-based hosts facing the big wide world, Certbot is obviously not an option!

Fortunately, I’ve had success with the ACMESharp library for PowerShell. What’s cool about the library is that it breaks the process down into individual commands, meaning you can automate, script and report on your certificate status with a great deal of flexibility.
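
To give a flavour, a typical issuance looks something like the sketch below. This is an outline only: the aliases, contact address and export path are placeholders of mine, and exact parameter names may vary between versions of the module.

# Sketch of the individual ACMESharp steps (aliases and paths are placeholders)
Import-Module ACMESharp

# One-time setup: create the local vault and register an account
Initialize-ACMEVault
New-ACMERegistration -Contacts mailto:admin@example.com -AcceptTos

# Ask for validation of a domain via the HTTP challenge
New-ACMEIdentifier -Dns www.example.com -Alias dns1
Complete-ACMEChallenge dns1 -ChallengeType http-01 -Handler manual
Submit-ACMEChallenge dns1 -ChallengeType http-01

# Once validated, generate and submit the certificate request, then export for IIS
Update-ACMEIdentifier dns1
New-ACMECertificate dns1 -Generate -Alias cert1
Submit-ACMECertificate cert1
Update-ACMECertificate cert1
Get-ACMECertificate cert1 -ExportPkcs12 'C:\certs\www.example.com.pfx'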

For simpler scenarios, though, the letsencrypt-win-simple client offers a friendly command line interface to the ACMESharp library and is an easy way to quickly retrieve and install a Let’s Encrypt certificate on a public-facing IIS instance. Automating the renewal process is easy too: just create a Task Scheduler task.
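
Something along these lines would register a daily renewal check. Treat it as a sketch: the install path and the --renew switch are my assumptions, so check what your version of the client actually expects.

# Sketch: daily renewal task (executable path and '--renew' switch are
# assumptions; verify against your letsencrypt-win-simple version)
$action  = New-ScheduledTaskAction -Execute 'C:\letsencrypt\letsencrypt.exe' -Argument '--renew'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Lets Encrypt Renewal' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest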

Yes, it’s a command line client, and there are Windows folks who may not be comfortable with that, but it walks you through every part of the process. No memorising of switches and flags is needed!

There really is no excuse — now is the perfect time to get everything on HTTPS!

Appeasement is not Acceptable

I have avoided overt political statements on this blog, unless they fell within the sphere of technology and I felt very strongly. 

But this is beyond political. 

I am appalled at my government’s appeasement of Mr Trump. I am appalled at their willingness to do deals with this new US administration.

Through the factually verifiable acts of this new administration (for example, the dismissal of the Attorney General after she opposed the president), there is a clear attempt to dismantle checks and balances that are an integral and essential part of a democratic state. There is an obvious contempt for the rule of law.

Functioning democracies do not behave in this way.

We must not wait for this US administration to start ‘disappearing’ people who lawfully oppose it before we act to say, loudly and clearly, that enough is enough.

Safeguarding British values (as those values are defined by my government) demands that we condemn and oppose this behaviour.

I demand that my government condemn the Trump administration’s rejection of democratic norms and utilise any and all diplomatic pressure to make this clear.

History will judge us very poorly if we sit around waiting for it to get ‘bad enough’ before we take a stand.

Hopes for 2017

I hope for a world where we are able to actually keep calm and carry on in the face of significant challenges, rather than just displaying that sentiment in poster form.

I hope for a world where those with all different political persuasions will have the courage to stand up for what is right, even when it is hard.

I hope for a world where we always remember to treat each other like human beings.

Happy New Year everyone.

The Investigatory Powers Act

I sincerely hope the UK Government plans to actually debate the “Repeal the new Surveillance laws (Investigatory Powers Act)” petition in Parliament now that it has reached 100,000 signatories, including myself.

Of course, the commitment they made is carefully worded such that attracting that number of signatures merely means it will be “considered” for debate.

Recent events in the United States and elsewhere demonstrate that maintaining the right balance of power between the state and the individual is more important than ever. I would not normally get political here, but the circumstances are anything but normal — the frightening jolt the western world seems to be making towards extreme right-wing authoritarianism means that maintaining that balance is nothing short of absolutely critical.

The list of organisations who can access internet connection records is enormously wide and includes bodies as mundane as the Food Standards Agency! This is way beyond something that could be argued as essential to maintaining the UK’s operational intelligence capabilities for preventing domestic acts of mass violence.

This law would be deeply, deeply troubling at any time, but is even more so as the US election shows us the threat of home-grown extremism that rises through established political bodies and gains the powers of high office.

Personally, I urge everyone to support efforts to mount legal challenges to this legislation.

Please consider supporting organisations like Open Rights Group.

Running SpinRite 6.0 on a Mac

SpinRite logo

SpinRite is a fantastic tool for repairing and maintaining hard drives, and I am proud to say that its purchase price has been more than recouped by drives it has brought back into service that would otherwise have needed replacing!

Running it on an Intel Mac hasn’t been possible with version 6.0. It actually boots fine, but there is no way to give keyboard input, and thus there is no way to kick off a scan.

Reports that people had succeeded in getting SpinRite to work indirectly on various weird and wonderful platforms, using VirtualBox and its raw disk access mode, led me to experiment with this approach to run SpinRite on a Mac. This is particularly useful on iMacs, where pulling the hard drive out of the case is… undesirable(!)

This is an advanced, technical process.

Performing the wrong operations when you have raw access to the disk, a technique this process uses, can cause you to lose data. You must have a backup.

Obviously, I do not accept any responsibility and cannot help if you break things by using these notes. Hard hats must be worn beyond this point. All contractors must report to the site office.

Boot from another disk

You’ll need a working MacOS install on another disk that you can boot from, as we need to unmount all the volumes on the disk to be scanned in order to gain raw access to the disk. I use SuperDuper to make bootable backups, and these work great for this purpose too.

Prepare the Environment

Make sure you have VirtualBox installed, with the optional Command Line Tools.

Turn off screen savers, sleep timers and screen lock, just in case the VM has taken keyboard input away from you and you are unable to unlock the Mac to check on SpinRite’s progress. It’s certainly not an ideal situation to have to pull the plug on the computer while that VM has raw access to your target disk!

Identify the Target Disk

It is critical that you identify the BSD device name for the whole disk that you want to operate on. In my case, I’d booted from disk1 and the SpinRite target disk was disk0.

Determine the correct disk identifiers with:

diskutil list


Morning

QuickArchiver on Thunderbird — Archiving Messages to the Right Folder with One Click

QuickArchiver icon

Despite the dominance of webmail, I have long used a traditional desktop email client. I like having a local mail archive should “the cloud” have trouble, as well as the ability to exert control over the user interface and user experience. (That might be partly a euphemism for not having to see ads!)

Apple’s Mail.app built into macOS (going to have to get used to not calling it OS X!) has served me pretty well for quite some time now, alongside Thunderbird when I’m on Linux, and while Mail.app offered the smoothest interface for the platform, it didn’t always have all the features I wanted.

For example, the ability to run mail rules in Mail.app was more limited than I wanted. I could have rules run automatically as messages arrived in my inbox, or disable them entirely. But how I actually wanted to use rules was to cast my eye over my inbox, and then bulk archive (to a specific folder) all emails of a certain type if I’d decided none needed my fuller attention.

Recently, I moved to Thunderbird on my Mac for managing email and discovered QuickArchiver.

As well as letting you write rules yourself, QuickArchiver offers the clever feature of learning which emails go where, and then suggesting the right folder to which that message can be archived with a single click.

It’s still early days, but I am enjoying this. Without spending time writing rules, I’m managing email as before, and QuickArchiver is learning in the background what rules should be offered. The extra column I’ve added to my Inbox is now starting to populate with that one-click link to archive the message to the correct folder!

It’s just a nice little add-on if, like me, you (still??) like to operate in this way with your email.

WordPress, Custom Field Suite and the WP REST API as a Middleware Platform

WordPress logo

Over the last five years or so, I’ve worked a lot with WordPress — developing custom plugins as well as piecing together pre-existing components to build (hopefully) really great websites.

But WordPress is more than just a blogging tool, and can be more than just a tool for websites.

My most recent WordPress-related endeavour has been in my day job.

I have been looking at taking various bits of information about business processes that have thus far been disparate and disconnected, and structuring and centralising that information so it can be more useful.

I’ve been using custom post types in WordPress for different types of information. Custom Field Suite makes describing the metadata we want to store a breeze, and effortlessly provides a beautiful and usable interface for “mere mortals” to input and manipulate the data later in the WP-Admin interface.

I work in an education environment; a simple example of one of these entities is the lunch menu (formerly just a Word document with no meaningful machine-readable structure at all). This was a nice, easy and public entity to start with.

So, we have:

  • A custom post type for the lunch menu
  • A Custom Field Suite field group attached to the custom post type
  • The Members plugin to control read and write access to that custom post type

The final piece of the puzzle is using the WP REST API to be able to expose this data to other systems.

With a very small amount of code, the REST API can be convinced to enable access to these custom entities — and of course we still retain WordPress’ access control (with a little help from the Members plugin) to ensure we’re not too free with our data!

Now we have somewhere where non-technical users can go to input data and the ability then to export that data through the REST API into any other application. Because we’ve formalised the structure of the information, we have the flexibility to display it in all sorts of different ways that are appropriate for the medium.

So our lunch menu can be:

  • Exposed via the web
  • Displayed on a screen in public areas
  • And more!
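
To give an idea of how simple consuming that data can be, here is a rough sketch from the PowerShell side. The site URL and the lunch_menu post type slug are hypothetical; substitute whatever your REST API actually exposes.

# Sketch: read lunch menu entries from the WP REST API
# (site URL and 'lunch_menu' slug are hypothetical)
$menus = Invoke-RestMethod -Uri 'https://intranet.example.com/wp-json/wp/v2/lunch_menu'
$menus | ForEach-Object { $_.title.rendered }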

The lunch menu design was an exciting proof of concept of the idea. I’m now moving on to slightly more ambitious projects which involve using a little bit of custom ‘glue’ in PowerShell (but whichever programming language is appropriate could be used!) to write data from other external systems into WordPress for later use.
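
That glue might look roughly like the following. Again, a sketch only: the endpoint, slug and credentials are hypothetical, and out of the box the REST API needs an authentication mechanism (a Basic Auth plugin or similar) before it will accept writes.

# Sketch: push a record into WordPress from PowerShell glue
# (endpoint, slug and Basic authentication are assumptions; the REST API
# needs an authentication plugin or similar before it will accept writes)
$credential = 'apiuser:apipassword'
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($credential)) }
$body = @{ title = 'Week commencing 6 February'; status = 'publish' } | ConvertTo-Json
Invoke-RestMethod -Uri 'https://intranet.example.com/wp-json/wp/v2/lunch_menu' -Method Post -Headers $headers -Body $body -ContentType 'application/json'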

Getting information out of big proprietary information systems using their provided tools that require… shall we say patience… has been a challenge. But, once liberated, this information is now stored, structured, and can be queried simply and securely for all sorts of uses.

Back in 2011 when I started developing for WordPress with Chris from Van Patten Media, I remember thinking to myself, “yeah, I can probably figure this out”. It perhaps wouldn’t have been so obvious then that building a skill set with a ‘blogging tool’ would prove useful five years later in a quite different context, but this is testament to the versatility of the WordPress platform and what it has become!

The Windows 10 Experience

New Windows logo

I haven’t said much about Windows 10 here on this blog, but my day job brings me into contact with it quite extensively.

There is a huge amount about the Windows experience that this release improves, but there are also elements of Microsoft’s new approach to developing and releasing it that are problematic.

The Good

Installing Windows 10 across a variety of devices, it is striking just how much less effort is required to source and install drivers. In fact, in most cases no effort is required at all! Aside from the occasional minor frustration of bloated drivers that are desperate to add startup applications, this makes such a positive difference. Unlike in the past, you can typically just install Windows, connect to a network, and everything will work.

This is particularly notable in any environment where you have a large number of devices with anything more than a little bit of hardware diversity. Previously in an enterprise environment, hunting for drivers, extracting the actual driver files, removing unwanted ‘helper application’ bits and building clean driver packages for deployment was tedious and wasteful of time. Now, much of the time, you let Windows Update take care of the drivers for you over the network, all running in parallel to the actual provisioning process that you have configured!

There are numerous other pockets of the operating system where it really feels like there has been a commitment to improve the user experience, but from my “world of work” experience of the OS, this is the most significant. It’s true as well that many of the criticisms you could make about past versions of Windows no longer apply.

The Bad

I guess that the coalescing of monthly Windows Updates into a single cumulative update helps significantly with the ‘236 updates’ problem (and atrocious performance) of Windows Update in Windows 7. However, Microsoft’s recent history of updates causing issues (the recent issues with KB3163622 and Group Policy, for example), combined with the inability to apply updates piecemeal, leaves some IT departments reluctant to apply the monthly patch. The result, if Microsoft continues to experience these kinds of issues, or doesn’t communicate clearly about backwards-incompatible changes, is more insecure systems, which hurts everybody.

This leads me to my other main complaint. There have been reports about the new approach Microsoft is taking with software testing. An army of ‘Insiders’ seems to be providing the bulk of the telemetry and feedback now, but my concern is that this testing feedback doesn’t necessarily end up being representative of all of the very diverse groups of Windows users. Particularly when deploying Windows 10 in an Enterprise environment, it has felt at times like we are the beta testers! When one update is a problem, you then have to put people at risk by rejecting them all. 🙁

(Yes, there is LTSB, but it hangs back a very long way on features!)

The Ugly

Windows 10 'Hero' image

At least you can turn it off on the login screen officially now. 🙂

Reverse Proxying ADFS with Nginx

In my recent trials and tribulations with ADFS 3.0, I came up against an issue where we were unable to host ADFS 3.0 with Nginx as one of the layers of reverse proxy (the closest layer to ADFS).

When a direct connection, or a cURL request, was made to the ADFS 3.0 endpoints from the machine running Nginx, all seemed well, but as soon as you actually tried to ferry requests through a proxy_pass statement, users were greeted with HTTP 502 or 503 errors.

The machine running ADFS was offering up no other web services — there was no IIS instance running, or anything like that. It had been configured correctly with a valid TLS certificate for the domain that was trusted by the certificate store on the Nginx machine.

It turns out that despite being the only HTTPS service offered on that machine through HTTP.sys, you need to explicitly configure which certificate to present by default. Apparently, requests that come via Nginx proxy_pass are missing something (the SNI negotiation?) that allows HTTP.sys to choose the correct certificate to present.

So, if and only if you are sure that ADFS is the only HTTPS service you are serving up on the inner machine, you can force the correct certificate to be presented by default, which resolves this issue and allows the Nginx reverse proxied requests to get through.

With that warning given, let’s jump into what we need to do:

Retrieve the correct certificate hash and Application ID

netsh http show sslcert

You’ll need to note the appid and the certificate hash for your ADFS 3.0 service.

Set the certificate as the default for HTTP.sys

We’ll use netsh’s interactive mode, as I wasn’t in the mood to figure out how to escape curly brackets on Windows’ command line!

You want the curly brackets literally around the appid, but not around the certhash.

netsh
netsh> http
netsh http> add sslcert ipport=0.0.0.0:443 appid={appid-from-earlier} certhash=certhash-from-earlier
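
If you’d rather script this, then from PowerShell at least, quoting the argument appears to be enough to pass the curly brackets through to netsh literally. A sketch only, using the same placeholder values as above:

# Sketch: the non-interactive equivalent, run from PowerShell; quoting the
# appid argument passes the curly brackets through literally
netsh http add sslcert ipport=0.0.0.0:443 "appid={appid-from-earlier}" certhash=certhash-from-earlier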

Verify the proxy_pass settings

Among other configuration parameters, we have the following in our Nginx server stanza for this service:

proxy_redirect off;
proxy_http_version 1.1;
proxy_request_buffering off;
proxy_set_header X-MS-Proxy the-nginx-machine;
proxy_set_header Host the-hostname-in-question;

And, with that, we were successfully reverse proxying ADFS 3.0 with Nginx. 🙂