MacBook Air M1

My company provided me with a MacBook Air M1. So the trusty 15-inch MacBook Pro from 2015 has been replaced for everything work-related. Is this a happy day? The good, the bad and the ugly.

Good

Boot time, battery life, build quality, portability – everything good in these departments. The battery life is so good I can work two days without plugging in – that is unparalleled 😎 and a huge step up.

Colleagues who picked the Windows alternative are complaining about drivers. – no comment –

Bad

Moving from 15 to 13 inch is a big step down. Every review online talks about this. In the pictures below you see them side-by-side and stacked to illustrate the difference. You’ll have to experience it to know what this means for your setup.

I notice it most when using Visual Studio. There is just more room to work with on the 15 inch. Now the Solution Explorer, Output window and other panes must be on auto-hide or I’ll lose too much space for coding.

Ugly

The machine is managed by the company. This means some extra software is installed for remote monitoring, security and no-admin-rights-for-me. I’m allowed to install software from the App Store and tweak some user settings – but not all of My 2019 developer and power user tools. I’m still not over this one 😦

The MacBook Air only has two USB-C ports. For everything that isn’t USB-C – network, monitor, external mouse/keyboard – we use a hub. The hub is something I will not bring with me; it stays home, connected to the wires on my desk.

Conclusion

Would I buy this machine for myself? No, my personal life is on iPhone and iPad and the MacBook Pro still works.

Would I buy it to replace my MacBook Pro if it broke? Absolutely. Because of the Apple ecosystem a new device is up and running very quickly, and sometimes a laptop is preferred over a mobile device.

Is the MacBook Air M1 a good work laptop? Yes. When we return to the office we need to bring the laptop, and that is exactly where the MacBook Air M1 outperforms my MacBook Pro – lighter and a two-day battery.


Azure DevOps Server – register an agent for Selenium tests

We are migrating from TFS 2017 to Azure DevOps Server. Everything seemed to work after the test migration – except for running Selenium tests. The old method with the test agent installation has been deprecated. Microsoft expects us to register a (release/build) agent that can run the Selenium tests. Here is how we got it working.

Register an agent

First we need to register an agent for running the Selenium tests.

  • Get a personal access token (PAT) from Azure DevOps Server. You can find this by clicking on your avatar and going into Security. There you can choose Personal access tokens and “New Token”. Make sure to give Full access – you can remove the PAT after you’ve registered the agent.
  • In Azure DevOps Server go to Collection settings > Pipelines > Agent pools. Here you click “Add pool” to create a new agent pool for your Selenium agent. Give it the name of your team.
  • Open the newly created agent pool and go to the Agents tab. Click “New agent” and download the agent (a zip containing the software).
  • Now remote desktop to the machine you plan to use as the agent to run your Selenium tests.
    • Unzip the downloaded agent – we use the D drive and a tfsagent folder
    • Start a PowerShell session with administrative rights and run config.cmd in the D:\tfsagent folder – see the sketch after this list
    • Provide the details for your environment – we started with NETWORK SERVICE as the account to run the agent (and the tests) and changed it after the installation via the Services console in Windows
  • The team agent pool in Azure DevOps Server now contains the machine as an agent. Edit the capabilities and add “vstest” to the user-defined capabilities – this is needed to run the unit tests that drive our Selenium tests
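The interactive config.cmd asks for all of these details, but it can also run unattended. Below a minimal sketch, assuming the agent was unzipped to D:\tfsagent – the server URL, pool name and PAT are placeholders for your environment:

# run from an elevated PowerShell prompt in the agent folder
cd D:\tfsagent

# register the agent unattended: PAT auth, the team pool, installed as a Windows service
.\config.cmd --unattended `
    --url https://YOUR_SERVER/YOUR_COLLECTION `
    --auth pat --token YOUR_PAT `
    --pool YOUR_TEAM_POOL `
    --agent $env:COMPUTERNAME `
    --runAsService --windowsLogonAccount 'NT AUTHORITY\NETWORK SERVICE'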

Edit the release pipeline

In the (old) TFS 2017 we would use the test agent install task and drive the Selenium tests from any release agent. With Azure DevOps Server we must run the Selenium tests on a registered agent. That agent was registered in the previous section; now we can edit the release pipeline.

  • Add another agent job to your release and select the newly created team agent pool
  • Add the Visual Studio Test Platform Installer task – this will install the tools needed to run the unit tests*
  • Add the Visual Studio Test task – this will run the unit tests driving the Selenium tests and publish the results*
    • For Test Platform Version select “Installed by Tools Installer”

* For details see the references.

Save the release pipeline and run it. You should see a prize cup and a hooray message 😉

References

https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/tool/vstest-platform-tool-installer?view=azure-devops


Fixed drag-and-drop on my MacBook

For some time I’ve been having trouble with drag-and-drop on my MacBook. In the Finder a popup would show or the file rename action would trigger. Turns out there is a setting I missed during one of the macOS upgrades.

Turning off “Look up & data detectors” in the trackpad settings fixed drag-and-drop for me. I’m using the defaults – if you’re changing things, disabling the “one finger” action is the key.


Automate changes in SSIS packages

We use SQL Server Integration Services (SSIS) for importing and exporting data. The number of SSIS packages is still growing and at the time of writing we have 35 of them. Changing the database connection string for development meant opening and editing all packages with Visual Studio – until we decided to automate the changes.

In Visual Studio you have a nice visual editor for SSIS packages with the plugin from Microsoft (https://marketplace.visualstudio.com/items?itemName=SSIS.SqlServerIntegrationServicesProjects). But have you noticed that you can view the code of a package from the Solution Explorer context menu? There you see the XML that describes the package. XML can be edited with a lot of tools; for automation we prefer PowerShell.

We created a PowerShell script that uses XPath to locate the parts to change. The complete script is at the end of this post. During development we ran into some challenges.

How to read, edit and write XML in PowerShell

We use Select-Xml to read the complete XML file and locate the node with XPath. Editing can be done on the Node property of the result; to set an attribute you use the corresponding property on the Node.

To get the XPath to work for SSIS we had to add the DTS namespace. The URI for this can be found in the root node of the SSIS package: DTS = “www.microsoft.com/SqlServer/Dts”.

After editing, the Save method of the OwnerDocument is called. The parameter should be the full path of the file; the Path property of the Select-Xml result can be used for this. Just make sure you passed the full path to Select-Xml, e.g. $fileItem.FullName.
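Condensed, the pattern looks like this – a minimal sketch with a hypothetical package path, and the DTS:CreatorName attribute just as an example (the complete script below uses the real XPath expressions):

$ns = @{ DTS = "www.microsoft.com/SqlServer/Dts" }

# locate the root node via XPath - pass the full path so $x.Path is usable for saving
$x = Select-Xml -Path 'D:\sources\sample.dtsx' -Namespace $ns -XPath '/DTS:Executable'

# namespaced attributes are exposed as plain properties on the Node
$x.Node.CreatorName = 'automation'

# persist the change through the owning document
$x.Node.OwnerDocument.Save($x.Path)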

Keep formatting to easily view changes

After the script has run you want to review the changes it made. The default formatting of the XML Save method strips the formatting Microsoft uses when creating and editing in Visual Studio, which makes viewing the changes hard.

We’ve found that XmlWriter can be configured to produce formatting that resembles the Microsoft style. For this we use XmlWriterSettings.
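The two settings that matter are Indent and NewLineOnAttributes – the latter gives the one-attribute-per-line layout that Visual Studio produces. Step 4 of the full script below uses exactly this:

$settings = New-Object System.Xml.XmlWriterSettings
$settings.Indent = $true               # line breaks and indentation between elements
$settings.NewLineOnAttributes = $true  # one attribute per line, like the Visual Studio designer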

PowerShell script

We’ve added an extra step to increase the version of the package every time the script runs. Operations will love us for this – oh wait – we are DevOps … make sure you have the information you need and automate everything 😉

$xpath_version = "/DTS:Executable"
# we use Package Configuration, this will update the connection that is used to load the configuration from a database
$xpath_config = "//DTS:ConnectionManager[@DTS:ObjectName='ConfigurationConnection']/DTS:ObjectData/DTS:ConnectionManager"
$config = '[NEW_CONNECTIONSTRING]' # the new connection string for the ConfigurationConnection
$Namespace = @{
    DTS = "www.microsoft.com/SqlServer/Dts"
}

# 1. find the *.dtsx files
$files = Get-ChildItem -Recurse -Path '[PATH_TO_SOURCES]' -Filter '*.dtsx'

foreach($fileItem in $files) {
    $file = $fileItem.FullName

    # 2. edit connection configmanager
    $x = Select-Xml -Path $file -Namespace $Namespace -XPath $xpath_config
    $x.Node.ConnectionString = $config
    $x.Node.OwnerDocument.Save($x.Path)

    # 3. increase version
    $x = Select-Xml -Path $file -Namespace $Namespace -XPath $xpath_version
    $x.Node.VersionBuild = (([int]$x.Node.VersionBuild) + 1).ToString()
    $x.Node.OwnerDocument.Save($x.Path)

    # 4. pretty print
    $xml = [xml](Get-Content -Path $file -Encoding UTF8)
    $StringWriter = New-Object System.IO.StringWriter
    $settings = New-Object System.Xml.XmlWriterSettings
    $settings.Indent = $true
    $settings.NewLineOnAttributes = $true
    $XmlWriter = [System.Xml.XmlWriter]::Create($StringWriter, $settings)
    $xml.WriteContentTo($XmlWriter)
    $XmlWriter.Flush()
    $StringWriter.Flush()
    $StringWriter.ToString() | Out-File -FilePath $file -Encoding utf8
}

Parse IIS logs with PowerShell

With the log4j CVE we’re checking our IIS logs in detail. To automate this we’ve created a PowerShell script that parses the logs and provides query access. Below is a script that searches for 502 responses and prints the fields we need for investigation.

$logFolder = 'C:\inetpub\logs\LogFiles\W3SVC1'

# sort with oldest file first
$files = Get-ChildItem -Path $logFolder -Filter '*.log' | Sort-Object Name
foreach ($file in $files) {
    # skip the 3 header lines and strip the '#Fields: ' prefix so the fields line becomes the CSV header
    $log = Get-Content $file.FullName | Select-Object -Skip 3 | ForEach-Object { $_ -replace '#Fields: ', '' } | ConvertFrom-Csv -Delimiter ' '
    # now search for 502 status and print the fields we need in a table
    $log | Where-Object sc-status -eq '502' | Select-Object -Property date, time, s-ip, cs-method, cs-uri-stem, cs-username, sc-status | Format-Table
}
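Since log4j was the trigger, the same parsed objects can also be searched for JNDI lookup strings. A hedged sketch that slots into the loop body above, right after $log has been built – the 'jndi' pattern and the cs-uri-query field are assumptions, adapt them to what your servers actually log:

    # flag requests whose query string contains a JNDI lookup pattern
    $log | Where-Object { $_.'cs-uri-query' -match 'jndi' } | Select-Object -Property date, time, c-ip, cs-method, cs-uri-stem, cs-uri-query | Format-Table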

No screenshot – because of security 😉
