Developing inside a Container

I’m contributing to Sonarqube-Sql-Plugin (https://github.com/erictummers/sonar-sql-plugin) and creating pull requests to merge back my changes. Getting my environment set up was easy using Homebrew (https://brew.sh), but it still required some work. Here is how VSCode makes this much easier by developing inside a container.

Container

With Docker you can start an isolated environment that is created to do one thing – host the software. The container is set up with all the dependencies and settings during the initial build, so we can just use them. Installation is easy with Docker Desktop (https://www.docker.com/products/docker-desktop).

VSCode now has an extension that lets you run the development environment inside a container. I installed the Remote - Containers extension (https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) and added a .devcontainer folder. I copied the Dockerfile contents from Microsoft’s sample container for Java (https://github.com/microsoft/vscode-dev-containers/tree/main/containers/java/.devcontainer). Now I can dev – build – repeat in a container.
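For reference, a minimal .devcontainer/devcontainer.json could look something like the sketch below. This is an assumption-laden sketch, not the exact file from Microsoft’s sample: the extension id is just an example of Java tooling to preinstall, and the Dockerfile is the one copied from the sample.

// .devcontainer/devcontainer.json – minimal sketch, not the exact file from Microsoft's sample
{
  "name": "Java",
  // build the container from the Dockerfile copied from the sample
  "build": { "dockerfile": "Dockerfile" },
  // example only: preinstall Java tooling inside the container
  "extensions": [ "vscjava.vscode-java-pack" ]
}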

VSCode project

After reopening the project in the container (F1 > Remote-Containers: Rebuild and Reopen in Container), VSCode started loading everything I needed to develop. Being a Microsoft developer, I didn’t know that the pom.xml contains the build target information. The Maven build targets are offered with a run button next to them for easy use. I was able to build the jar file and test it within minutes.
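Building from the integrated terminal inside the container works too. A standard Maven invocation like the one below produces the plugin jar under target/ (the exact jar name depends on the pom.xml):

# run inside the container's terminal – a standard Maven build
mvn clean package
# the plugin jar ends up under target/ (exact name depends on the pom.xml)
ls target/*.jar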

The .devcontainer folder can be added to GitHub so everyone can use it. No more “works on my machine” 😉

References

Developing inside a Container – Visual Studio Code (https://code.visualstudio.com/docs/remote/containers)


Code Notes snippet manager

Like many developers, I have some code snippets. These snippets are dear to me and I use them all the time. Without a real place to store them I would be lost. This is where Code Notes comes in.

(image from https://lauthieb.github.io/code-notes/)

Code Notes (https://lauthieb.github.io/code-notes/) is a standalone application that stores my snippets. It is simple and intuitive to use and deploy – just put the files in your My Documents folder and use it from there. There is the option of connecting to GitHub for its Gist feature – but I have no internet connection from work :(. Maybe someday …

You never miss this tool until you’ve used it. Now I can’t work without it. Go download it, use it and contribute to it (https://github.com/lauthieb/code-notes).


macOS Big Sur update without issues

My MacBook became my work-from-home device, so I was hesitant to update to Big Sur. But the question today is “when to update”, not “if”. So I made a backup, had my trusty 2009 MacBook ready for emergencies and started the installer.

TL;DR: no issues on my MacBook Pro (Retina, 15-inch, Mid 2015), except for Little Snitch 4, which is incompatible; I had to upgrade to version 5 for €25.

Hardware

MacBook Pro (Retina, 15-inch, Mid 2015)
Processor 2,2 GHz Intel Core i7
Memory 16 GB 1600 MHz DDR3
SSD 256 GB
Graphics Intel Iris Pro 1536 MB

I’ve written about my update to 10.14 (Mojave update without issues) and updated to 10.15 some time ago without any issues.

Installation

After the 10+ GB download, the installation completed in about an hour. I left the scene and don’t have an exact number; it was done when I returned.

Success

After the first login I was presented with some privacy pop-ups and some blocked extensions. Again I had to upgrade Little Snitch, just like with Mojave. I let the App Store update everything automatically when available (like on iOS), so all other apps were up to date.

All other apps just worked; I was most happy that Camtasia 2 and Alfred 3 still run. Check https://roaringapps.com for the compatibility of your favourite apps, but keep an eye on the versions – Little Snitch is listed as supported on Big Sur, yet version 4 is not.

Failure

Unfortunately Parallels Desktop 11 has stopped working. I never really used it, so I won’t be buying the upgrade.

Workaround

Logitech Control Center was a disappointment. I only needed the extra button(s) of my Logitech Performance MX mouse configured to trigger Mission Control, and that was not possible (for me). Turns out that can be set in macOS settings > Mission Control > Mouse Button xxx.

On Monday I can work from home with an updated MacBook. Everything just works.

Why upgrade?

1. I want my MacBook running the latest and greatest macOS (and its patches)
2. The new Control Centre and other improvements – https://www.apple.com//macos/big-sur/


DataGrip TFS plugin setup

We use DataGrip for developing Postgres on Greenplum. To get our sources into TFS we use the Azure DevOps plugin (https://plugins.jetbrains.com/plugin/7981-azure-devops). Since the documentation is sparse and the requests for help are many, we share our setup.

We assume you’ve already got DataGrip installed. In the commands below we use the following values:

Parameter        Value
username         john
password         secret
workspace        greenplum
tfs project      dashboard
tfs server url   https://localtfs/tfs/defaultcollection

(parameter values used in the commands)

Team Explorer Everywhere

The plugin relies on the Team Explorer Everywhere command-line client from https://github.com/Microsoft/team-explorer-everywhere/releases. This requires a Java runtime, which already ships with DataGrip (so install DataGrip first).

We edited the tf.cmd file to make Java use less memory. Maybe you’ll need to do the same. Details on GitHub: https://github.com/microsoft/azure-devops-intellij/issues/45#issuecomment-268790878
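As an illustration only (the exact contents of tf.cmd differ per release): find the line in tf.cmd that launches java and lower its -Xmx maximum heap value, for example:

rem illustrative only – your tf.cmd will look slightly different
rem before:  java -Xmx2048M -classpath ... %*
rem after:   java -Xmx512M -classpath ... %*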

  1. Unzip the latest release of Team Explorer Everywhere in your C:\Users\[USER]\Documents folder
  2. Start a command prompt with administrative rights (run as Administrator)
  3. Go to the folder of step 1
  4. Create a new workspace
    tf -login:john,secret -server:https://localtfs/tfs/defaultcollection workspace -new greenplum
  5. Map a local folder to a TFS project in the workspace (make sure to create the local folder first)
    tf workfold -map -workspace:greenplum $/dashboard c:\dashboard
  6. Get the sources
    tf get -recursive c:\dashboard

DataGrip project

Now comes the “hacking” of a DataGrip project so it can use the TFS plugin.

  1. In DataGrip open menu File > New > Project – supply the name dashboard
  2. File > Close Project
  3. Close DataGrip
  4. Go to C:\Users\[USER]\AppData\Roaming\JetBrains\DataGrip2020.3\projects\dashboard with the Windows Explorer
  5. Move the .idea folder to C:\dashboard
  6. Open DataGrip
  7. File > Open Project
  8. In the Open dialog remove the “old” dashboard project as it points to the old location
  9. Browse to C:\dashboard and open it as a project

TFVC plugin

Finally the installation of the plugin.

  1. In DataGrip open File > Settings
  2. Click the Plugins section, click the wheel > Install plugin from disk and supply the path to the zip file downloaded from https://plugins.jetbrains.com/plugin/7981-azure-devops
(screenshot: DataGrip settings to install the plugin)
  3. Restart DataGrip
  4. File > Settings
  5. Open the Version Control section and click TFVC > supply the location of tf.cmd in your C:\Users\[USER]\Documents\TEE-CLC-14.135.0 folder
  6. Uncheck “Use built-in reactive client if possible”
  7. The Visual Studio TF client was already found, since we have VS2019 installed
  8. Click OK
  9. File > Attach Directory to Project – supply c:\dashboard
  10. Git > Enable Version Control Integration – choose TFVC
  11. Click OK
  12. A dialog prompts for the password to connect to TFS – supply secret
  13. Click OK
  14. Now DataGrip shows the folder structure on the right and options to use TFVC, like Checkout and Update (= get latest).

These are the steps we used to get it working. Hope you’re able to achieve great things with this knowledge.


Speed up table readout in codedUI

We still use codedUI for testing our releases. Migration to Selenium has been added to the backlog, but until that work is picked up in a sprint we keep working with codedUI and have to maintain it. We noticed that the table readout was very slow, and that is something we do a lot. Time to speed things up.

By adding Trace.WriteLine calls and a Stopwatch (both in System.Diagnostics) we pinpointed the bottleneck in the loop that reads the data from the cells. Looking at the code we noticed that the slow part accessed the cells by index, while the foreach that reads the header fields was fast. So we rewrote the indexed loop to a foreach:

public DataTable GetDataTableWaarde() {
   var control = FindHtmlTable(); // search with the id
   var datatable = new DataTable();
 
   // 1. header contains the fields > add columns
   var headerRow = control.GetRow(0);
   foreach(HtmlHeaderCell header in headerRow.GetChildren()) {
      datatable.Columns.Add(header.InnerText);
   }
 
   // 2. rows contain the data > add rows
   foreach(HtmlRow row in control.Rows.Skip(1)) {
      var newRow = datatable.NewRow();
      try {
         // this is slow:
         //for (int i = 0; i < datatable.Columns.Count; i++) {
         //   newRow[i] = row.GetCell(i).InnerText;
         //}
         // this is fast:
         foreach(HtmlCell cell in row.Cells) {
            newRow[cell.ColumnIndex] = cell.InnerText;
         }
         datatable.Rows.Add(newRow);
      } catch (Exception) {
         // handle exception
      }
   }
 
   return datatable;
}
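For completeness, this is roughly how such a bottleneck can be found: wrap the suspect call in a Stopwatch and write the elapsed time to the trace output. The sketch below is illustrative (the wrapper method is hypothetical), not our exact instrumentation:

using System.Diagnostics; // Stopwatch and Trace

private DataTable GetDataTableWaardeTimed() {
   // hypothetical timing wrapper around the method shown above
   var stopwatch = Stopwatch.StartNew();
   var result = GetDataTableWaarde();
   stopwatch.Stop();
   Trace.WriteLine($"GetDataTableWaarde took {stopwatch.ElapsedMilliseconds} ms");
   return result;
}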

After this change the table readout went from 40 seconds down to 4 seconds (including the validation). The total test time went down from 56 minutes to 26 minutes by changing two lines of code and some kaizen blitzen.
