Automate changes in SSIS packages

We use SQL Server Integration Services (SSIS) for importing and exporting data. The number of SSIS packages is still growing; at the time of writing we have 35 of them. Changing the database connection string for development used to mean opening and editing every package in Visual Studio, so we decided to automate the changes.

In Visual Studio you have a nice visual editor for SSIS packages with the plugin from Microsoft. But have you noticed that you can view the code of a package from the Solution Explorer context menu? There you see the XML that describes the package, and XML can be edited with a lot of tools. For automation we prefer PowerShell.

We created a PowerShell script that uses XPath to locate the parts to change. The complete script is at the end of this post. During development of this script we ran into a few challenges.

How to read, edit and write XML in PowerShell

We use Select-Xml to read the complete XML file and locate a node with XPath. Editing can be done on the Node property of the result; to set an attribute you can simply assign to the corresponding property of the Node.

To get the XPath to work for SSIS we had to add the DTS namespace. The URI for it can be found in the root node of the SSIS package: DTS = "www.microsoft.com/SqlServer/Dts"

After editing, the Save method of the OwnerDocument is called. The parameter should be the full path of the file; the Path property of the Select-Xml result can be used for this. Just make sure you passed the full path to Select-Xml in the first place, e.g. $fileItem.FullName.
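Put together, the read/edit/save cycle looks roughly like this (a minimal sketch; $xpath, $Namespace and the attribute name SomeAttribute are placeholders for your own values):

```powershell
# read the file and locate the node with a namespace-aware XPath
$result = Select-Xml -Path $fileItem.FullName -XPath $xpath -Namespace $Namespace

# attributes of the node are exposed as properties, so assigning sets them
$result.Node.SomeAttribute = 'new value'

# write the modified document back to the same file
$result.Node.OwnerDocument.Save($result.Path)
```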

Keep formatting to easily view changes

After the script has run you want to review the changes it made. The default XML formatting of the Save method throws away the formatting Microsoft uses when creating and editing packages in Visual Studio, which makes diffing the changes hard.

We found that an XmlWriter can be configured to produce formatting that resembles the Microsoft style. For this we use XmlWriterSettings.

PowerShell script

We added an extra step that increases the version of the package every time the script runs. Operations will love us for this – oh wait – we are DevOps … make sure you have the information you need and automate everything 😉

$xpath_version = "/DTS:Executable"
# we use Package Configurations; this will update the connection that is used to load the configuration from a database
$xpath_config = "//DTS:ConnectionManager[@DTS:ObjectName='ConfigurationConnection']/DTS:ObjectData/DTS:ConnectionManager"
# the new connection string to write into the packages
$config = '[NEW_CONNECTIONSTRING]'
$Namespace = @{
    DTS = "www.microsoft.com/SqlServer/Dts"
}

# 1. Find dtsx files
$files = Get-ChildItem -Recurse -Path '[PATH_TO_SOURCES]' -Filter '*.dtsx'

foreach($fileItem in $files) {
    $file = $fileItem.FullName

    # 2. edit connection configmanager
    $x = Select-Xml -Path $file -Namespace $Namespace -XPath $xpath_config
    $x.Node.ConnectionString = $config
    $x.Node.OwnerDocument.Save($x.Path)

    # 3. increase version
    $x = Select-Xml -Path $file -Namespace $Namespace -XPath $xpath_version
    $x.Node.VersionBuild = (([int]$x.Node.VersionBuild) + 1).ToString()
    $x.Node.OwnerDocument.Save($x.Path)

    # 4. pretty print
    $xml = [xml](Get-Content -Path $file -Encoding UTF8)
    $StringWriter = New-Object System.IO.StringWriter
    $settings = New-Object System.Xml.XmlWriterSettings
    $settings.Indent = $true
    $settings.NewLineOnAttributes = $true
    $XmlWriter = [System.Xml.XmlWriter]::Create($StringWriter, $settings)
    $xml.WriteTo($XmlWriter)
    $XmlWriter.Flush()
    $StringWriter.ToString() | Out-File -FilePath $file -Encoding utf8
}

Parse IIS logs with powershell

With the log4j CVE we're checking our IIS logs in detail. To automate this we created a PowerShell script that parses the logs and provides query access. Below is a script that searches for 502 responses and prints the fields we need for investigation.

$logFolder = 'C:\inetpub\logs\LogFiles\W3SVC1'

# sort with oldest file first
$files = Get-ChildItem -Path $logFolder -Filter '*.log' | Sort-Object Name
foreach($file in $files) {
    # skip the 3 header lines and strip the '#Fields: ' prefix so the first remaining line can be used as headers
    $log = Get-Content $file.FullName | Select-Object -Skip 3 | ForEach-Object { $_ -replace '#Fields: ', '' } | ConvertFrom-Csv -Delimiter ' '
    # now search for 502 status and print the fields we need in a table
    $log | Where-Object sc-status -eq '502' | Select-Object -Property date, time, s-ip, cs-method, cs-uri-stem, cs-username, sc-status | Format-Table
}

No screenshot – because of security 😉


R Shiny dashboard inspection with profvis

We're building an R Shiny dashboard to display big datasets. In our production environment we sometimes experience slow performance. Profiling with the profvis package is our plan to fix this.

The profvis package needs to be installed once. Then it can be used by wrapping the app's start code in a profvis call.

# only once
install.packages("profvis")

# in your start code
profvis::profvis({ shiny::runApp() })
After the R Shiny dashboard started we followed a test scenario and then stopped the app. RStudio then showed us the profile data in two tabs (images below): one tab with a flame graph, so you can see the spikes in (wait) time / memory during the test scenario, and another tab that shows the same data in a table per file / function.

Flame Graph tab with memory and time per code line
Data tab with memory and time per file/function

We now have a better understanding of where our dashboard performs less than optimally. After changing the code we can start another profvis session to see whether the changes fix the problem or add to it.



Developing inside a Container

I'm contributing to Sonarqube-Sql-Plugin and creating pull requests to merge back my changes. Getting my environment set up was easy using Homebrew, but it still required some work. Here is how VSCode makes this much easier with developing inside a container.


With Docker you can start an isolated environment that is created to do one thing: host the software. The container is set up with all the dependencies and settings it needs during the initial build, so we can just use them. Installation is easy with Docker Desktop.

VSCode now has an extension that lets you run the development environment inside a container. I installed the Remote - Containers extension and added a .devcontainer folder. The Dockerfile contents I copied from Microsoft's sample container for Java. Now I can dev – build – repeat in a container.
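For reference, a minimal .devcontainer setup might look like the sketch below (the "name" value is made up for illustration; the extension reads this file to build and start the container):

```json
// .devcontainer/devcontainer.json
{
    "name": "sonarqube-sql-plugin-dev",
    "build": { "dockerfile": "Dockerfile" }
}
```

The referenced Dockerfile sits next to it in the same folder, typically starting from a Java base image such as one of Microsoft's sample dev container images.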

VSCode project

After reopening the project in the container (F1 > Remote-Containers: Rebuild and Reopen in Container), VSCode started loading everything I needed to develop. Being a Microsoft developer, I didn't know that the pom.xml contains the build target information. The Maven build targets are offered with a run button next to them for easy use. I was able to build the jar file and test it within minutes.

The .devcontainer folder can be added to GitHub so everyone can use it. No more "works on my machine" 😉


Developing inside a Container – VS Code documentation


Code Notes snippet manager

Like many developers I have some code snippets. These snippets are dear to me and I use them all the time. Without a real place to store them I would be lost. This is where Code Notes comes in.


Code Notes is a standalone application that stores my snippets. It is simple and intuitive to use and deploy: just put the files in your My Documents folder and run it from there. There is the option of connecting to GitHub for the Gist feature, but I have no internet connection at work :(. Maybe someday …

You never miss this tool until you've used it. Now I can't work without it. Go download it, use it and contribute to it.
