Running XSL Transforms Whenever You Change Your XSL File

I don't often have to write XSL transforms these days; it just doesn't come up as often as it did 10-15 years ago. On the rare occasion I do have to tinker with XSLT, I'd like instant feedback. I want to see the results of my changes right away, without having to manually run the transform (either from the command line or a menu item).

There are tools which can do this, but I don't do it often enough to justify paying for another software license. So I've cobbled together a couple of PowerShell scripts and a Chrome extension which give me instant feedback. Even if you don't have my specific problem (converting an XML report into a nice HTML version), you might find at least one part of this low-rent XSL studio useful.

I do my XSLT editing in Notepad++, but any editor will do for this setup.

The first thing we need is a way to run the XSL transform. Luckily, that's dead simple to do in PowerShell because we have access to the .NET System.Xml namespace. Loading an XSL file from disk, compiling the transform, running it against an XML document, and sending the output to another file can be done in a few lines of code. Here's the entire xsl.ps1 script:

[CmdletBinding()]
Param(
  [Parameter(Mandatory=$True)]
  [string]$xmlFile,
  [Parameter(Mandatory=$True)]
  [string]$xsltFile,
  [Parameter(Mandatory=$False)]
  [string]$outputFile = "output.xml"
)

$xmlFile = [System.IO.Path]::GetFullPath([System.IO.Path]::Combine((Get-Location), $xmlFile))
$xsltFile = [System.IO.Path]::GetFullPath([System.IO.Path]::Combine((Get-Location), $xsltFile))
$outputFile = [System.IO.Path]::GetFullPath([System.IO.Path]::Combine((Get-Location), $outputFile))

# Create the transform
$xslt = New-Object System.Xml.Xsl.XslCompiledTransform( $false )

# Create a couple of other argument objects we'll need
$arglist = New-Object System.Xml.Xsl.XsltArgumentList
$xsltSettings = New-Object System.Xml.Xsl.XsltSettings($false,$true)

# Load the XSL file
$xslt.Load($xsltFile, $xsltSettings, (New-Object System.Xml.XmlUrlResolver))

# Open a file for output
$outFile = New-Object System.IO.FileStream($outputFile, [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write)

# Run the transform
$xslt.Transform($xmlFile, $arglist, $outFile)

# Close the output file
$outFile.Close()

I should add a disclaimer that this script supports my specific scenario; a more idiomatically PowerShell-ish script would probably allow you to pipe the XML content in and return the result instead of just writing it to a file (there's a sketch of that approach after the usage example below). But I'm lazy and this gets the job done. To use it, you simply specify the source XML file, the XSLT file, and optionally the path to the output file:

xsl .\example.xml .\example.xslt .\example.html
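
If you do want the pipeline-friendly version, here's a minimal sketch of what it might look like. It's untested, and the function name Invoke-XslTransform is my own invention rather than anything built in; it loads the stylesheet and transforms whatever XML text comes in on the pipeline:

# A hypothetical pipeline-friendly variant (untested sketch).
# The name Invoke-XslTransform is mine, not part of any module.
function Invoke-XslTransform {
  [CmdletBinding()]
  Param(
    [Parameter(Mandatory=$True, ValueFromPipeline=$True)]
    [string]$xmlContent,
    [Parameter(Mandatory=$True)]
    [string]$xsltFile
  )
  Process {
    # Compile the stylesheet
    $xslt = New-Object System.Xml.Xsl.XslCompiledTransform
    $xslt.Load($xsltFile)

    # Transform the piped-in XML string and emit the result as a string
    $reader = [System.Xml.XmlReader]::Create((New-Object System.IO.StringReader($xmlContent)))
    $writer = New-Object System.IO.StringWriter
    $xslt.Transform($reader, $null, $writer)
    $writer.ToString()
  }
}

Which would let you do something like:

Get-Content -Raw .\example.xml | Invoke-XslTransform -xsltFile .\example.xslt | Set-Content .\example.html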

Next, we need a way to watch the XSL file so we can re-run the previous script every time it changes. The .NET Framework provides the handy FileSystemWatcher class for exactly this kind of thing. You can point it at a path and it will raise events when that path changes. And PowerShell has a command called Register-ObjectEvent which allows us to subscribe to framework object events and run PowerShell script blocks when they occur. Here's my watch.ps1 script, which watches a target path and runs the specified script block whenever the file at that path changes:

[CmdletBinding()]
Param(
  [Parameter(Mandatory=$True)]
  [string]$target,
  [Parameter(Mandatory=$True)]
  [string]$action
)

if(-not [System.IO.Path]::IsPathRooted($target)) {
    $target = [System.IO.Path]::GetFullPath([System.IO.Path]::Combine((Get-Location), $target))
}

if($action -eq "stop") { 
    Write-Host "Stopping watch on $target"
    Unregister-Event FileChanged 
    Write-Host "Stopped"
} else {
    $fsw = New-Object IO.FileSystemWatcher ([System.IO.Path]::GetDirectoryName($target)), ([System.IO.Path]::GetFileName($target)) -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'} 

    Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -MessageData $action -Action {
        try {
            # Run the command we were handed; report any error instead of killing the subscription
            Invoke-Expression $Event.MessageData
        }
        catch [Exception] {
            Write-Host $_.Exception.ToString()
        }
    }

    $fsw.EnableRaisingEvents = $True 
}

This is pretty straightforward. The script creates a FileSystemWatcher and then registers for its Changed event; when that event fires, it runs whatever command was passed in the script's $action parameter. Note that we're not passing our $action parameter directly as the -Action script block for Register-ObjectEvent; instead, the registered script block uses Invoke-Expression to call it, wrapped in a try/catch. This way, we can output any errors we get from the $action and still keep responding to later events. That matters for the XSLT process, because I make a lot of mistakes when trying to get XSL transforms working correctly. If I make a mistake in my XSL document, the exception information will pop up in my terminal window as soon as I hit "Save".

Firing up the watch script with the XSL transform is simple:

watch.ps1 .\example.xslt {xsl .\example.xml .\example.xslt example.html}

The watcher will continue to run in the background until we manually stop it:

watch.ps1 .\example.xslt stop

The final piece in my bootleg XSL studio is the LivePage extension for Chrome. It can watch an HTML document and automatically reload it whenever it changes. Just point it at the output HTML file (and make sure to check "Allow access to file URLs" for LivePage in the Chrome Extensions settings), and as you make your XSL changes you'll see the result in the browser.

Just a side note - it is technically possible to do most of this in Chrome without the PowerShell scripts; you just have to link your XML document to your stylesheet and run Chrome with the --allow-file-access-from-files switch. But it's kind of a pain to have to add the stylesheet reference to your XML document and manually refresh Chrome all the time. So I prefer my (admittedly) convoluted setup. Your mileage may vary.
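
For reference, linking the stylesheet is just the standard xml-stylesheet processing instruction at the top of the XML document (the file names and the report element here are placeholders, matching the examples above):

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="example.xslt"?>
<report>
  ...
</report>

Then launch Chrome with the switch (assuming chrome is on your PATH):

chrome --allow-file-access-from-files example.xml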

Upload to Dropbox from the command line in Windows

I've got an instance of TeamCity running on an Azure VM that I use for build and deployment automation, and a few weeks ago I finally got around to setting up the build, tests, and packaging of a Windows desktop application I've been working on for a while. Up to this point I'd been building the installers manually and shipping them to my partners for testing via a shared Dropbox folder. Once I had the project building on check-ins and running all the automated tests, I really wanted to get to the point where I could release a new version with a button push. My partners are already comfortable with receiving installers via Dropbox and I didn't feel like setting up and maintaining an FTP server (in addition to setting them up with FTP clients). So I needed a way to get TeamCity build artifacts into Dropbox easily.

The obvious option is to install the Dropbox client on the build server and simply copy the installer artifact to the shared folder. The problem with that scenario is that the Dropbox user needs to be logged into the machine for the sync to take place. I'm not willing to leave my account logged in all the time, so that's a non-starter. (Plus, what happens when the build server needs to reboot for maintenance/updates?)

Some googling revealed a few articles about installing the Dropbox client to run as a Windows Service, but in addition to being kludgy, I'd have to run the service with my credentials. It seemed like there had to be a better way.

After some more googling I found Dropbox-Uploader, a command-line project; it's basically a shell script for handling the upload through Dropbox's REST API. Which is a good start, but it would require me to set up a Dropbox API application entry and, since I want this to run on Windows, would also require a Cygwin install. Cygwin is nice and all, but these days, when most of what Cygwin offers is available in PowerShell anyway, I hate installing it on a machine just to handle one small requirement like this.

In the end, I decided to write my own dead-simple Windows command-line Dropbox uploader: PneumaticTube. PneumaticTube[1] literally does just that one thing - once you've authorized it with your account, you can upload a file to any path in your Dropbox folder. Using it with TeamCity is a snap - just set up a command line step in your build to call it with the artifact you want to upload.
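
That build step just shells out to the uploader with the file and destination. The switches below are illustrative - I'm assuming a file/path flag pair, so check the PneumaticTube readme for the actual syntax:

# Illustrative only: -f (file) and -p (destination path) are assumed flags;
# consult the PneumaticTube readme for the real command-line syntax
pneumatictube -f .\MyApp.Installer.msi -p /Builds/MyApp.Installer.msi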

The only catch is that you have to do the authorization[2] as the account your build agent runs under, which means you actually have to run your agent under a user account (as opposed to running the agent under System, for example)[3].
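
If your agent is already installed as a Windows service, switching it to run under a user account is a couple of sc.exe calls from an elevated prompt. A sketch, assuming the default TeamCity agent service name of TCBuildAgent and a hypothetical local account named builduser (verify both in services.msc):

# Assumes the service is named TCBuildAgent (the TeamCity default) and that
# .\builduser is your local build account - both are placeholders here.
# Note the space after each '=' - sc.exe requires it.
sc.exe config TCBuildAgent obj= ".\builduser" password= "yourpassword"
sc.exe stop TCBuildAgent
sc.exe start TCBuildAgent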

If you want to use it, there's a Chocolatey package available to make installation easy. Suggestions, bug reports, questions, and pull requests are welcome - just hit up the issues page and let me know. I hope this helps someone else whose deployment pipeline includes Dropbox.


  1. Dropbox app naming guidelines basically make you pick a name that doesn't include "Dropbox", so "Dropbox Uploader For Windows" was out. The first thing that came to mind when I thought about "put something in and it goes to a destination" was those pneumatic tube systems at the banks. I love those things. 

  2. It uses the standard pattern of popping open a browser window so you can enter your Dropbox credentials and authorize the app. 

  3. I'll probably fix this in future versions by allowing you to specify the user secret and user token on the command line (instead of getting them from the user's settings), allowing you to store the secret and token in the build settings. Or maybe someone else will take care of it (pull requests are always welcome).