Tuesday, 11 October 2016

Docker on Windows - Don't use TP5


I'm new to the world of Docker and, being from a Windows background, I've not really had the opportunity to have a look; that is, until Windows Server 2016, which has native support for Docker.  Wahoo!

As Windows Server 2016 is now RTM I thought I'd go and have a play on Azure.

Unfortunately this is where I found that Azure doesn't (yet) have an RTM version of Server 2016:


As a result I decided to go for the Technical Preview 5 image and enable Docker myself.

Adding the Windows Feature was straight forward:
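The feature in question is Containers, which can be added from an elevated PowerShell prompt:

Install-WindowsFeature Containers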


Then obviously as it is Windows it required a reboot.

I then ran:
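The quick-start guide has you download the Docker engine zip and extract it into C:\Program Files; roughly like this (the download URL below is only a placeholder, take the real one from the guide):

Invoke-WebRequest "https://<docker-engine-zip-url>" -OutFile "$env:TEMP\docker.zip" -UseBasicParsing    # placeholder URL - use the link from the quick-start guide
Expand-Archive -Path "$env:TEMP\docker.zip" -DestinationPath $env:ProgramFiles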

I also added C:\Program Files\Docker to my Path environment variable.
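Appending it to the machine-level Path from PowerShell looks something like this:

[Environment]::SetEnvironmentVariable("Path", $env:Path + ";$env:ProgramFiles\Docker", [EnvironmentVariableTarget]::Machine)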

I then registered the service:
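With the 1.12 engine the daemon binary is dockerd.exe, so registering the service is along these lines:

& "$env:ProgramFiles\Docker\dockerd.exe" --register-service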


Then I started the service:
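Start-Service docker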

At this point I got an error:
Start-Service : Failed to start service 'Docker Engine (docker)'.
At line:1 char:1
+ Start-Service docker
+ ~~~~~~~~~~~~~~~~~~~~



After Googling I found that you can't use Windows Server 2016 TP5; you need to use the fully fledged version of Windows Server 2016 (which isn't available in Azure).
This is stated on this page (under Pre-Reqs):  https://msdn.microsoft.com/en-gb/virtualization/windowscontainers/quick_start/quick_start_windows_server

For fun I thought I would try to upgrade the Azure VM I was running, but this didn't seem to work, leaving me with a dead VM.

Hopefully an RTM version of Windows Server 2016 will be on Azure soon.

Thursday, 25 August 2016

Upgrading Octopus Deploy from v2.6 - Give it some memory!

I've been tasked with upgrading Octopus Deploy to the latest version.  This is for many reasons, but mainly to look at channels, the replacement for snapshotting.

To test the upgrade before it is applied to live I have been using a test environment which has the same version of Octopus that we have in live, 2.6.

The first thing to say about Octopus 3.x is that it no longer uses a NoSQL database (RavenDB); it now uses SQL Server.  This has been widely blogged about, but from what I've seen the SQL data structure it uses is still similar to a NoSQL database, with an NVarChar(Max) column filled with JSON.

The installation of Octopus 3.3.24 is straightforward and not really noteworthy; the wizard that runs after installation will create the database and provide an empty installation of Octopus Deploy.
After this has been installed the next step is to migrate your existing database using a backup (with the master key).

Clicking on the "Import Data" brings up a wizard which allows you to select the Octopus backup file and enter the Master key.

The preview option will simulate the process but unfortunately the "Task logs" option does not work with the preview mode.

Our production backup file is 750MB and we use Octopus for all our deployments in our CD pipeline so we have a fair amount of deployment data.

The process to import the task logs takes a long time.  I got the memory on our test server increased to 16GB and ran the process, and it had not completed after 17 hours.  It had consumed all of the memory but not much of the processor.  It is the step of upgrading the documents that appears to take the time.

Upon Googling I found that there is a parameter that can be used when executing the upgrade process to limit the history that is brought over: -maxage=

This made the command line:
"C:\Program Files\Octopus Deploy\Octopus\Octopus.Migrator.exe" migrate --instance "OctopusServer" --file "C:\Octopus\20160729-140413.octobak" --master-key "abcdefghijklmnopqrstuvwxyz" --include-tasklogs -maxage=60

After checking the Migration log file I found the last entry was:
2016-08-13 20:57:07.7216      1  INFO  Step took 00:00:00s
2016-08-13 20:57:07.7216      1  INFO
2016-08-13 20:57:07.7216      1  INFO  Convert documents
This didn't get updated and all of the memory on the machine (16GB) was pretty quickly consumed.
As this was running on a VM I left the machine running for a few days and the log file didn't get updated.

After raising a call with Octopus Support we found that the process requires quite a lot of memory.
Giving my VM 32GB RAM seemed to allow the migration to complete (in 20 minutes) although it was still very close to maxing out the memory.

In short, if you have a large Octopus Raven database (ours was approximately 3GB, going by the size Windows reported for the Raven database files) you'll need a lot of memory to upgrade, maybe more than 32GB!

Thanks to Vanessa Love (@fly401) for all the help!

Tuesday, 28 June 2016

Why is git capitalising my branches?

I'm quite new to git and still getting to grips with it, and I've been a little confused that when I create a branch git seems to add uppercase letters, even though I specifically created it in lowercase.

For example:
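Creating the branch, all in lowercase:

git checkout -b grahamr/test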

This responds with:
Switched to a new branch 'grahamr/test'

But when I check the branches it has a different casing:
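git branch
* GrahamR/test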

This caused me some problems when I was trying to push the branch, as the branch name is case-sensitive there.

Upon doing some digging, this is due to the first branch I ever created beginning with 'GrahamR', and the way git creates and stores branches.  Git stores each branch in a single file which contains the hash of the commit object that the branch points to.
In the example above, creating a branch 'grahamr/test' will create a folder called 'grahamr' and store the branch in a file called 'test'.
As I originally created a branch that began with 'GrahamR', the folder was created with that casing, and even though the branch was deleted the folder remained.  Because the Windows file system is case-insensitive, new branches reuse the existing folder and its casing.

So to resolve this issue, browse to the '.git/refs/heads' folder and delete the offending folder (ensure that you have deleted the branches first and the folder is empty):
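From PowerShell that is simply (assuming the folder is GrahamR, as in my case):

Remove-Item .git\refs\heads\GrahamR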
So now create a new branch:
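git checkout -b grahamr/test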

Then when I check the branch status I see this:
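git branch
* grahamr/test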

For more info look here:  http://stackoverflow.com/questions/15371866/why-is-git-capitalizing-my-branch-name-prefix

Thursday, 21 April 2016

Extracting perfmon stats using PowerShell


I recently got asked to extract certain performance metrics from multiple servers (at the same time) and put them in an Excel graph.  As I didn't have any software to do this I decided to extract the results from PerfMon using PowerShell and collate them into a CSV.

The first step was creating the PowerShell command.  As we wanted to be able to easily change the servers as well as the counters being used, we created a generic one-line command:
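It was something along these lines, reading the counters and servers from text files and writing each sample straight out to a CSV (the sample interval, number of samples and output file name are illustrative):

Get-Counter -Counter (Get-Content .\Counters.txt) -ComputerName (Get-Content .\Servers.txt) -SampleInterval 15 -MaxSamples 240 | Export-Csv .\PerfmonResults.csv -NoTypeInformation

Get-Counter accepts a list of computer names, so a single one-liner covers all of the servers at once.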


The Counters.txt file contained:
\Processor Information(_Total)\% Processor Time
\Memory\% Committed Bytes In Use
\network interface(*)\bytes total/sec
\logicaldisk(_total)\current disk queue length

The Servers.txt file contained a list of all the servers that needed to be monitored:
Server1
Server2
Server3
Running the PowerShell produced a CSV file, but when loaded into Excel the results clearly needed some amending before we could create the graphs we needed:


The values for each server were stored in a single cell for each timestamp (as shown above).

So, staying in Excel, I created one of the most horrible Excel formulas I've ever written.  It does the job but, like all formulas of this sort, it is not pretty!


This can be broken down to make it a bit more readable:
The first part:

This reformats the string by removing the carriage returns and slashes, replacing them with commas where necessary.
The final part of the formula inserts a carriage return for each server: it replaces the server names we are extracting from, ensures that there is a return code first, and then inserts the time (from the B1 cell in this case):
Placing the formula into the D1 cell and copying it down to every row creates a spreadsheet like this:

Copy the contents of column D into a new sheet and Paste it into column A (Paste using values) and then save that worksheet as a CSV file.
Accept the messages stating that formatting won't be saved and that only the selected workbook will be saved.
Also save the entire workbook as an Excel file, just in case.

I found that the file seemed to have quotes around each cell, making the file look empty when Excel loaded it, so I opened the file in Notepad++ and did a search and replace, simply replacing the " with nothing:

This provides a file that can be loaded into Excel:
I then sorted the data by Column B (ServerName) and then by Column C (Setting) and finally by Column A (Time).  I also changed the formatting of Column A to a Time.
It now looks like this:
Lastly I highlighted the Time (Column A) and the Value (Column E) for a given metric (in the example below, Memory), clicked Insert and chose a Scatter graph (I picked the one with straight lines), and you've got a graph!
You can cut and paste it onto a different worksheet if required.
Not the most straightforward process, but it could be useful?

Tuesday, 8 March 2016

SQL 2016 - Performance of Temporal Tables

One of the new features in SQL 2016 is Temporal Tables; whilst they confuse Google when you search for them (it occasionally seems to want to point you at temporary tables instead), they are very useful.

Suppose you had a table (called Person):
PersonID | Firstname | Lastname | Notes
1        | Test      | Person   |
2        | Test      | Person   |
3        | Test      | Person   |
Temporal Tables will create another table that stores all of the changes over time, using slowly changing dimensions so you can track back to what a row looked like at a certain point in time.
Slowly changing dimensions are date/time fields that let you see when that version of the information was current; normally a From and a To date are used.
Considering the table above, the dates would need to be appended so they would appear in the history table.
PersonID | Firstname | Lastname | Notes | ValidFrom           | ValidTo
1        | Test      | Person   |       | 2016-02-29 15:26:21 | 2016-02-29 17:19:11
2        | Test      | Person   |       | 2016-02-29 15:26:21 | 2016-02-29 17:19:11
3        | Test      | Person   |       | 2016-02-29 15:26:21 | 2016-02-29 17:19:11

The history table is maintained automatically: you update the main table and SQL Server deals with the history table for you.  Nice, huh?

So how do I create this?

When creating your original table there are a couple of new additions:
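A minimal version of the table definition, with assumed column sizes and history table name (the GENERATED ALWAYS, PERIOD and SYSTEM_VERSIONING parts are the new additions):

CREATE TABLE dbo.Person
(
    PersonID  INT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
    Firstname NVARCHAR(50),
    Lastname  NVARCHAR(50),
    Notes     NVARCHAR(MAX),
    ValidFrom DATETIME2(0) GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
    ValidTo   DATETIME2(0) GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.PersonHistory));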


For the ValidFrom and ValidTo columns I've used DATETIME2; it is the bit after the data type that is interesting:
ValidFrom DATETIME2(0) GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
GENERATED ALWAYS AS ROW START (and ROW END) allows SQL to keep track of when the row first became current and when it was changed.  The start time is the time of the transaction itself, so if a single query were to update every row in the table they would all have the same time.
HIDDEN allows the fields to be added to the main table without being returned by a SELECT * query, although adding the columns to the SELECT clause explicitly will return them:
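For example (using the Person table above):

SELECT * FROM dbo.Person;                                                   -- ValidFrom and ValidTo are not returned
SELECT PersonID, Firstname, Lastname, ValidFrom, ValidTo FROM dbo.Person;   -- naming the columns explicitly returns them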





So what is the performance like?

I wasn't sure either, so I thought I'd find out.  Firstly I created a new Azure VM (Standard D2: 2 cores, 7GB RAM and an SSD) with the SQL Server 2016 CTP3.3 image applied.
I then created a new database and inserted some data into it (1 million rows).  Then I created a loop and did 1 million updates to random rows in the table.
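The update loop was along these lines (a sketch rather than the exact script; it assumes PersonIDs run contiguously from 1 to 1,000,000), and the same loop was pointed at the non-temporal copy of the table for the comparison:

DECLARE @i INT = 1;
DECLARE @id INT;
WHILE @i <= 1000000
BEGIN
    SET @id = CAST(RAND() * 1000000 AS INT) + 1;    -- pick a random PersonID
    UPDATE dbo.Person SET Notes = 'Updated on pass ' + CAST(@i AS VARCHAR(10)) WHERE PersonID = @id;
    SET @i = @i + 1;
END;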
I did the same thing to a non-temporal table and compared the results:
Temporal Query:

Non-Temporal Query:

 The results!

            | Non-Temporal Table | Temporal Table
Insert Data | 01:43:54           | 01:39:46
Update Data | 01:46:39           | 01:52:50

This surprised me greatly, as I expected the temporal table updates to be significantly slower (as there is twice the work required); however, the results seem to imply that they are almost identical.  Impressive, huh!

Friday, 4 March 2016

Upgrading SonarQube from 5.0 to 5.3 (Updated)

Overview

We are running SonarQube version 5.0 and have encountered a problem with some of our builds failing.  The build log states:

[10:12:42][Exec] 10:12:42.873 DEBUG - Loaded 2058 properties from l10n bundles
[10:12:43][Exec] 10:12:43.642 INFO - Load project referentials...
[10:12:43][Exec] 10:12:43.648 DEBUG - Download: http://localhost:9000/batch/project?key=xxxx.xxx&preview=false (no proxy)
[10:13:03][Exec] 10:13:03.637 INFO - Load project referentials done: 19995 ms
[10:13:03][Exec] INFO: ------------------------------------------------------------------------
[10:13:03][Exec] INFO: EXECUTION FAILURE
[10:13:03][Exec] INFO: ------------------------------------------------------------------------
[10:13:03][Exec] Total time: 27.272s
[10:13:03][Exec] Final Memory: 20M/316M
[10:13:03][Exec] INFO: ------------------------------------------------------------------------
[10:13:03][Exec] EXEC Error during Sonar runner execution
[10:13:03][Exec] org.sonar.runner.impl.RunnerException: Unable to execute Sonar
[10:13:03][Exec] at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
[10:13:03][Exec] at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
[10:13:03][Exec] at java.security.AccessController.doPrivileged(Native Method)
[10:13:03][Exec] at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
[10:13:03][Exec] at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
[10:13:03][Exec] at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
[10:13:03][Exec] at org.sonar.runner.api.Runner.execute(Runner.java:100)
[10:13:03][Exec] at org.sonar.runner.Main.executeTask(Main.java:70)
[10:13:03][Exec] at org.sonar.runner.Main.execute(Main.java:59)
[10:13:03][Exec] at org.sonar.runner.Main.main(Main.java:53)
[10:13:03][Exec] Caused by: java.lang.IllegalStateException: Unable to request: /batch/project?key=xxxx.xxx&preview=false

After Googling we found the following page on Stack Overflow:  http://stackoverflow.com/questions/26419286/read-timed-out-on-batch-project-sonarqube-4-5
Eventually we came across two Sonar defects, both of which state that they are fixed in version 5.1:

So we needed to upgrade Sonar from version 5.0 to (at least) 5.1.

Given that the latest version of Sonar is 5.3, it made sense to upgrade straight to the latest version; it couldn't be that hard, could it?
Given that Sonar is a Java application (and therefore all of the files needed are in a directory rather than a Windows install), the upgrade is fairly straightforward:
  1. Backup the SonarQube database
  2. Snapshot the VM
  3. Extract SonarQube-5.3.zip
  4. Run D:\Sonar\sonarqube-5.0\bin\windows-x86-64\UninstallNTService.bat to uninstall the existing Sonar Service.
  5. Copy the plugins from D:\Sonar\SonarQube-5.0\Extensions\Plugins to D:\Sonar\SonarQube-5.3\Extensions\Plugins
  6. Update the  D:\Sonar\SonarQube-5.3\conf\sonar.properties file with the changes for SQL connectivity, http proxy and LDAP configuration from the SonarQube-5.0 sonar.properties file.
  7. Run D:\Sonar\sonarqube-5.3\bin\windows-x86-64\InstallNTService.bat to install the new Sonar service.
  8. Start the SonarQube service 

Let's do it

After getting the VM and database backed up I was ready to begin, so I extracted the SonarQube-5.3 zip file on the server and ran the batch file to unregister the service that we use to start SonarQube (taking note of the user that it runs under).
The next step was to update the sonar.properties file (found in the conf folder) with the information from the existing version (as the Sonar 5.0 install was in a different folder this was fairly straightforward).
  • Entered the SQL connection string
  • In the Update Center part I had to enter our proxy server details
  • As we use the LDAP plugin (so we can log in using our network IDs) I needed to copy this configuration over (a rough sketch of these settings is shown below).
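The property names below are the standard SonarQube ones; the values are placeholders rather than our real configuration:

# SQL Server connection (Integrated Security, so no sonar.jdbc.username/password)
sonar.jdbc.url=jdbc:sqlserver://xxxxxxx;databaseName=SonarQube;integratedSecurity=true

# Update Center proxy
http.proxyHost=proxy.example.local
http.proxyPort=8080

# LDAP plugin
sonar.security.realm=LDAP
ldap.url=ldap://dc.example.local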
Next I copied the plugins from the Sonar 5.0 folder over.

Starting Sonar

Now that the config was correct I registered the service and set it to run as the same user.  Then I started the service and then I looked at the logs/sonar.log file to see what was happening:
2016.03.02 17:56:28 INFO web[o.s.s.p.ServerImpl] SonarQube Server / 5.3 / 8db783e62b266eeb0d0b10dc050a7ca50e96c5d1
2016.03.02 17:56:28 INFO web[o.sonar.db.Database] Create JDBC data source for jdbc:sqlserver://xxxxxxx;databaseName=SonarQube;integratedSecurity=true
2016.03.02 17:56:30 WARN web[o.s.s.p.DatabaseServerCompatibility] Database must be upgraded. Please backup database and browse /setup
2016.03.02 17:56:30 INFO web[o.s.s.p.DefaultServerFileSystem] SonarQube home: D:\Sonar\sonarqube-5.3
2016.03.02 17:56:30 INFO es[o.e.cluster.metadata] [sonar-1456941379001] [rules] creating index, cause [api], templates [], shards [1]/[0], mappings []
2016.03.02 17:56:31 INFO es[o.e.cluster.metadata] [sonar-1456941379001] [rules] create_mapping [rule]
2016.03.02 17:56:31 INFO es[o.e.cluster.metadata] [sonar-1456941379001] [rules] create_mapping [activeRule]
2016.03.02 17:56:31 ERROR web[o.a.c.c.C.[.[.[/]] Exception sending context initialized event to listener instance of class org.sonar.server.platform.PlatformServletContextListener
org.sonar.api.utils.MessageException: File is not a plugin. Please delete it and restart: D:\Sonar\sonarqube-5.3\extensions\plugins\sonar-ant-task-2.3.jar
2016.03.02 17:56:31 INFO web[jruby.rack] jruby 1.7.9 (ruby-1.8.7p370) 2013-12-06 87b108a on Java HotSpot(TM) 64-Bit Server VM 1.7.0_45-b18 [Windows Server 2008 R2-amd64]
2016.03.02 17:56:31 INFO web[jruby.rack] using a shared (threadsafe!) runtime
2016.03.02 17:56:35 ERROR web[jruby.rack] initialization failed
org.jruby.rack.RackInitializationException: java.lang.NullPointerException
at org.jruby.rack.RackInitializationException.wrap(RackInitializationException.java:31) ~[jruby-rack-1.1.13.2.jar:na]
at org.jruby.rack.RackApplicationFactoryDecorator.init(RackApplicationFactoryDecorator.java:98) ~[jruby-rack-1.1.13.2.jar:na]
at org.jruby.rack.RackServletContextListener.contextInitialized(RackServletContextListener.java:50) ~[jruby-rack-1.1.13.2.jar:na]

Ok, so it doesn't like the ant plugin that we use.  Upon looking, this doesn't appear to be an actual Sonar plugin but something that is required on the local machines, so I moved it out of the directory and started the service again.

Thankfully it now looked to be ok, but it needed the database to be upgraded; the logs state to browse to the URL followed by /setup to do this.
This was a new page where you had to click the upgrade button and state that you have a backup of the database.  Upon clicking the link it quickly stated that the database upgrade had failed.  Oh joy.

Looking at the sonar.properties file once again to be sure that the connection settings were ok I spotted a problem:
#----- Microsoft SQLServer 2008/2012/2014 and SQL Azure
# A database named sonar must exist and its collation must be case-sensitive (CS) and accent-sensitive (AS)
# Use the following connection string if you want to use integrated security with Microsoft Sql Server
# Do not set sonar.jdbc.username or sonar.jdbc.password property if you are using Integrated Security
# For Integrated Security to work, you have to download the Microsoft SQL JDBC driver package from
# http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774
# and copy sqljdbc_auth.dll to your path. You have to copy the 32 bit or 64 bit version of the dll
# depending upon the architecture of your server machine.
# This version of SonarQube has been tested with Microsoft SQL JDBC version 4.1
We already use the JDBC driver so that wasn't an issue; the problem is that our database is called SonarQube and not sonar.
Thankfully the DBAs could resolve this pretty quickly for me.
I amended the config (and also turned on extra logging), started the service and tried to upgrade the database again.  Still no joy.
The log file stated:
2016.03.02 18:46:53 INFO web[o.s.d.v.v.FixMsSqlCollation] Updating columns from table schema_migrations
2016.03.02 18:46:53 ERROR web[o.s.s.d.m.DatabaseMigrator] Fail to execute database migration: org.sonar.db.version.v53.FixMsSqlCollation
java.lang.IllegalStateException: Fail to execute DROP INDEX unique_schema_migrations ON schema_migrations
at org.sonar.db.version.DdlChange$Context.execute(DdlChange.java:70) ~[sonar-db-5.3.jar:na]
at org.sonar.db.version.v53.FixMsSqlCollation$UpdateTableCollation.removeIndexes(FixMsSqlCollation.java:504) ~[sonar-db-5.3.jar:na]
at org.sonar.db.version.v53.FixMsSqlCollation$UpdateTableCollation.execute(FixMsSqlCollation.java:492) ~[sonar-db-5.3.jar:na]

Thankfully someone had the same problem on Stack Overflow and the solution was to manually add the index:
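In our case that meant recreating the missing unique index on the schema_migrations table before re-running the upgrade; something along these lines (the column name, version, is the standard one for that table):

CREATE UNIQUE INDEX unique_schema_migrations ON schema_migrations (version);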


After trying the upgrade again the page stated that a problem had occurred, but the log file stated that the upgrade of the database was successful.
I was able to log in to Sonar and everything appeared to look good, so I ran a build through TeamCity to confirm that it was ok.  I initially tried the Finance build but this failed (due to some NuGet packages being out of date, not related to the upgrade), so I tried a Product build.  The build was previously green so it should go through with no errors.

Unfortunately this wasn't the case and looking at the build log file I was presented with this error:
[19:12:31] : [Exec] INFO: ------------------------------------------------------------------------
[19:12:31] : [Exec] INFO: EXECUTION FAILURE
[19:12:31] : [Exec] INFO: ------------------------------------------------------------------------
[19:12:31] : [Exec] Total time: 6.395s
[19:12:31]E: [Exec] EXEC Error during Sonar runner execution
[19:12:31] : [Exec] Final Memory: 9M/156M
[19:12:31] : [Exec] INFO: ------------------------------------------------------------------------
[19:12:31] : [Exec] org.sonar.runner.impl.RunnerException: Unable to execute Sonar
[19:12:31] : [Exec] at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
[19:12:31] : [Exec] at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
[19:12:31] : [Exec] at java.security.AccessController.doPrivileged(Native Method)
[19:12:31] : [Exec] at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
[19:12:31] : [Exec] at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
[19:12:31] : [Exec] at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
[19:12:31] : [Exec] at org.sonar.runner.api.Runner.execute(Runner.java:100)
[19:12:31] : [Exec] at org.sonar.runner.Main.executeTask(Main.java:70)
[19:12:31] : [Exec] at org.sonar.runner.Main.execute(Main.java:59)
[19:12:31] : [Exec] at org.sonar.runner.Main.main(Main.java:53)
[19:12:31] : [Exec] Caused by: java.lang.IllegalStateException: Unable to register extension org.sonar.plugins.javascript.JavaScriptSquidSensor
Upon checking, our JavaScript plugin was version 2.5 and the latest version was 2.10, so I downloaded the new version, put it into the 'extensions\plugins' folder and tried again.
This time I had the same problem with the Sonar Build Breaker plugin, which needed to be upgraded from version 1.1 to 2.0, and I tried again...
Next it was the C# plugin that needed to be upgraded from version 3.3 to 4.5.  However, upgrading this version had another side effect: it required .Net 4.5.2 to be installed on the build agents.
I then tried the build on a build agent with .Net 4.6 installed and crossed my fingers....
[19:53:58] : [Exec] 19:53:58.454 INFO - Analysis report generated in F:\TeamCity\buildAgent\work\a2e9533bd348fbbc\source\.\.sonar\batch-report
[19:53:58] : [Exec] INFO: ------------------------------------------------------------------------
[19:53:58] : [Exec] INFO: EXECUTION FAILURE
[19:53:58] : [Exec] INFO: ------------------------------------------------------------------------
[19:53:58] : [Exec] Total time: 12:28.997s
[19:53:59] : [Exec] Final Memory: 7M/122M
[19:53:59] : [Exec] INFO: ------------------------------------------------------------------------
[19:53:59]E: [Exec] EXEC Error during Sonar runner execution
[19:53:59] : [Exec] org.sonar.runner.impl.RunnerException: Unable to execute Sonar
[19:53:59] : [Exec] at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
[19:53:59] : [Exec] at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
[19:53:59] : [Exec] at java.security.AccessController.doPrivileged(Native Method)
[19:53:59] : [Exec] at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
[19:53:59] : [Exec] at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
[19:53:59] : [Exec] at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
[19:53:59] : [Exec] at org.sonar.runner.api.Runner.execute(Runner.java:100)
[19:53:59] : [Exec] at org.sonar.runner.Main.executeTask(Main.java:70)
[19:53:59] : [Exec] at org.sonar.runner.Main.execute(Main.java:59)
[19:53:59] : [Exec] at org.sonar.runner.Main.main(Main.java:53)
[19:53:59] : [Exec] Caused by: java.lang.IllegalStateException: Report processing did not complete successfully: FAILED
[19:53:59] : [Exec] at org.sonar.plugins.buildbreaker.QualityGateBreaker.getAnalysisId(QualityGateBreaker.java:138)
[19:53:59] : [Exec] at org.sonar.plugins.buildbreaker.QualityGateBreaker.execute(QualityGateBreaker.java:94)
[19:53:59] : [Exec] at org.sonar.plugins.buildbreaker.QualityGateBreaker.executeOn(QualityGateBreaker.java:81)
This was more of a problem.  The error was clearly with the Build Breaker plugin (which I had already upgraded), which we need so that TeamCity can be made aware of broken builds.
It was at this point (after over 3 hours of trying and the security guard trying to kick me out of the building) that I decided to abort the upgrade and try again another day.

Update

I eventually found that the cause of my problem was due to a defect with the Sonar Build Breaker plug-in.
We had Sonar configured with two checks on Technical Debt.
We had a rule stating that Technical Debt could not be greater than 5 days, and another stating that technical debt could not increase by more than 0.5 days from the previous build.
This is due to be resolved in Sonar 5.6, in the meantime we just removed the second check.
For more details on the defect check out the issue (Sonar-7276)

Friday, 19 February 2016

Configure IIS to enable Directory Browsing using Octopus Deploy


I recently came across a scenario where we wanted to configure a folder in IIS to allow directory browsing, so that accessing the log files was easier for development, as they were on a server in a different domain (a DMZ).
The log files are (currently) being created in a separate Logs folder within the site:
As the deployment is completely controlled by Octopus Deploy we wanted to set the logs folder to be accessible so that the files could be viewed.

PowerShell

As Octopus Deploy allows you to import step templates (PowerShell scripts), and there is a community library of scripts written by others, my first stop was to look there to see if anything already existed for what I wanted to do.
Sadly the script library couldn't help (this time):
So it was time to get my hands dirty with a little bit of PowerShell.

Enable Directory browsing in IIS

So the first step was to enable directory browsing in IIS.  After Googling I found lots of pages that suggested running AppCmd.exe to perform the task but I was hoping for something a bit more elegant than that.
The WebAdministration Cmdlet documentation provided me with enough information to check what the setting was:
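Checking the current value boils down to something like this (the site path here is just an example):

Import-Module WebAdministration
Get-WebConfigurationProperty -Filter /system.webServer/directoryBrowse -PSPath 'IIS:\Sites\MySite\Logs' -Name enabled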

The next step was to add a check (using Test-Path) to be sure that the path existed, then replace the hard-coded path with a variable, and I had my script:
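It ended up along these lines (a sketch rather than the exact script; the parameter names match the ones defined on the step template below):

$iisPath = $OctopusParameters['IisPath']
$enableDirectoryBrowsing = [System.Convert]::ToBoolean($OctopusParameters['EnableDirectoryBrowsing'])

Import-Module WebAdministration

if (Test-Path $iisPath) {
    Set-WebConfigurationProperty -Filter /system.webServer/directoryBrowse -PSPath $iisPath -Name enabled -Value $enableDirectoryBrowsing
}
else {
    Write-Warning "The IIS path '$iisPath' does not exist"
}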

Adding a MIME type

As the NServiceBus log files have a .log extension, a new MIME type needed to be added so that they could be served up by the browser.
Google also helped me here: I came across a blog by Seth Larson which pointed me in the right direction.

It needed a slight tweak to set the directory, and it also returned an error if the type already existed.  I also added some variables (to help me with Octopus) and my script was:
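Again, a sketch rather than the exact script (the FileExtension and MimeType parameter names are illustrative):

$iisPath = $OctopusParameters['IisPath']
$fileExtension = $OctopusParameters['FileExtension']    # e.g. .log
$mimeType = $OctopusParameters['MimeType']              # e.g. text/plain

Import-Module WebAdministration

# Only add the MIME type if it is not already present, otherwise the add errors
$existing = Get-WebConfigurationProperty -Filter //staticContent -PSPath $iisPath -Name collection |
            Where-Object { $_.fileExtension -eq $fileExtension }

if (-not $existing) {
    Add-WebConfigurationProperty -Filter //staticContent -PSPath $iisPath -Name collection -Value @{ fileExtension = $fileExtension; mimeType = $mimeType }
}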

Octopus Deploy

Adding the Step templates

The scripts needed to be added to Octopus so that they could be used.  This is a relatively simple process (but may require admin permissions; I'm not 100% sure):
Login to Octopus, select Library and then Step Templates:
Select Add step template in the top corner:
Then select PowerShell from the list of options.
On the Step tab, paste in the PowerShell script:


The $OctopusParameters['IisPath'] expression gets the parameter value that was set in Octopus.

On the Parameters tab I created two new parameters.  Using parameters means that each project that uses the step can configure them simply in Octopus.
IisPath:
EnableDirectoryBrowsing:
This process needed to be repeated for the 'Adding MIME type' script:

For this I added three parameters:

Now that the steps were added it was time to use them in our projects.

Adding the steps to our projects

Select the project and select Process on the left hand side:
At the bottom of the steps select 'Add step'; Octopus will automatically show the parameters that were configured in the step template.  For the directory browsing step there were two parameters:
  • IIS Path
    • IIS:\Sites\#{ExternalWebsite.Name}\Finance\#{Octopus.Environment.Name}\Logs
  • Enable directory browsing
    • true
The IIS path uses some internal variables that we have configured to ensure that it uses the correct setting as the server changes during the CD pipeline.
I set the environments so that this would not run on our Production environment (and arguably I shouldn't have set it to run on PreProd); this is because we don't want the logs to be visible to everyone in live!
Adding the step for the MIME type was almost the same:

Anything else?

The only step that needed to be added was to create the logs folder as it is not created as part of our deployment:
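That is just a small PowerShell step along these lines (the LogsFolder parameter name is illustrative):

$logsFolder = $OctopusParameters['LogsFolder']

if (-not (Test-Path $logsFolder)) {
    New-Item -Path $logsFolder -ItemType Directory | Out-Null
}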
The order of the steps was:

So what could go wrong?

When I ran this I encountered a couple of problems:
Ensure that the IIS path is correct (especially when copying from one project to another).
I originally left the name Finance in the path when I set it up for Product; it didn't error but it also didn't work!

When adding the MIME type it checks the Web.config file of the project to be sure that it is valid.
TeamCity returned the error when adding the MIME type:
Error: The 'autoDetect' attribute is invalid. Enum must be one of false, true, Unspecified
When I looked at the Web.config file, the setting was:
<proxy autoDetect="True" />
Changing the setting to 'true' with a lowercase T resolved the issue.