I’m Eric Tummers, Software Architect at Statistics Netherlands, and this is how I work


My job description says Software Architect. I’m guiding my team through the technical hurdles and do some programming/testing/analysing myself. We follow the Scrum Guide and are working towards continuous integration and deployment. We do this in small steps. Every sprint we complete means the retrospective is done in the pub.

“I expect the best and I give the best. Here’s the beer. Here’s the entertainment. Now have fun. That’s an order!” – Rasczak

Location: Sittard, NL
Current Gig: DevOps Teamlead, CBS
Word that best describes how you work: Teamwork
Current mobile device: iPhone SE
Current computer: MacBook 15 inch

What apps/software/tools can’t you live without? Why?

Wunderlist: work tasks, chores, groceries, planning; every task is in there. I’m aware of the acquisition by Microsoft and the planned end-of-life.
Evernote: I’ve been a fan for years now.
Parallels 11: Running Windows on my MacBook is a must. And of course Visual Studio, Team Foundation Server, Build, Release Manager, SQL Server Management Studio, Remote Desktop, PowerShell, and some other tools I need for work.
Alfred: keyboard shortcuts for everything. I bought the Powerpack and advise you to do the same. Still on v2 though.

What’s your workspace setup like?

Work at the office is done on a thin client with a 24 inch screen and a (wired) mouse and keyboard. The desk and chair comply with all regulations. We have a great coffee machine.


My home workspace is still my MacBook 15 inch. I’ve got a new setup with a Logitech keyboard and mouse (MX800) and a BenQ 24 inch monitor (BL2420PT). Nothing fancy, but the extra screen space is very welcome.

What’s your best time-saving shortcut/life hack?

Timebox. Start on a task and spend the time you’ve got to get the best result. Get someone to look at it, get feedback. Then decide if the result is final or whether to spend some more time on it.

Besides your phone and computer, what gadget can’t you live without and why?

I replaced my Magellan Echo with the Garmin FR235. It has smartwatch features and an optical heart rate monitor. My phone is on mute since the Garmin notifies me of everything.
My Apple AirPods. Easy to use, small and good sound. I never leave the house without them.

What everyday thing are you better at than everyone else? What’s your secret?

Learning new things. My current project lets me implement new things (joy). Also, I try to teach the things I know to my team or anyone who listens.
I have a basic understanding of how things work and try to map new things onto that. For the details I have a Pluralsight subscription and black-belt Google skills.

What do you listen to while you work?

My alarm clock plays classical music to wake me up in the morning. The car stereo plays just about everything (grunge, rock, kids’ stories) on the drive to work. When I need some focus I play drum and bass on my headphones. My ringtone is still Run Riot by Camo & Krooked, although it has been muted since I got the Garmin.

What are you currently reading?

The Hard Thing About Hard Things. It gives insight into the problems a CEO faces and how to overcome them. I enjoyed reading it on my last vacation and plan to read it again once I’ve finished it.

How do you recharge? What do you do when you want to forget about work?

Spending quality time with my wife and daughters. Phone on silent, no screens, no work. Mostly piggyback riding and tea parties.
Also sports like running, fitness, climbing and snowboarding to keep me fit and healthy.

Fill in the blank: I’d love to see _________ answer these same questions.

DC Rainmaker (Ray) has been on my reading list for years. His reviews of sports gadgets are amazing. If you don’t know who this is, I urge you to click the link. For the rest: you know why.

What’s the best advice you’ve ever received?

“Make a shit first draft; you cannot edit a blank page.” (via someecards.com)
I believe this is a variant on a Hemingway quote.

Is there anything else you’d like to add that might be interesting to readers?

Learn PowerShell. There is so much possible with PowerShell. If you can learn one thing this year, pick PowerShell.
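
As a taste, here is the kind of one-liner that makes PowerShell worthwhile. A minimal sketch using only standard cmdlets; the task itself is just an example I picked:

# list the ten largest files under the current directory
Get-ChildItem -Recurse -File |
   Sort-Object -Property Length -Descending |
   Select-Object -First 10 -Property FullName, Length

Everything in that pipeline is an object instead of text, which is what makes composing commands like this so pleasant.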

Original idea from Lifehacker.com.


macOS Big Sur update without issues

My MacBook became my work-from-home device and I was hesitant to update to Big Sur. But the question today is “when to update”, not “if”. So I made a backup, had my trusty MacBook 2009 ready for emergencies and started the installer.

TLDR: no issues on my MacBook Pro (Retina, 15-inch, Mid 2015), except for Little Snitch 4, which is incompatible; I had to upgrade to version 5 for €25.

Hardware

MacBook Pro (Retina, 15-inch, Mid 2015)
Processor 2.2 GHz Intel Core i7
Memory 16 GB 1600 MHz DDR3
SSD 256 GB
Graphics Intel Iris Pro 1536 MB

I’ve written about my update to 10.14 (Mojave update without issues) and updated to 10.15 some time ago without any issues.

Installation

After the 10+ GB download the installation completed in about an hour. I left the scene and don’t have an exact number; it was ready when I returned.

Success

After the first login I was presented with some privacy pop-ups and some blocked extensions. Again I had to upgrade Little Snitch, just like with Mojave. I let the App Store update everything automatically when available (like on iOS), and all other apps were up to date.

All other apps just worked. I was most happy that Camtasia 2 and Alfred 3 still work. Check https://roaringapps.com for compatibility of your favourite apps. But keep an eye on the versions: Little Snitch is supported on Big Sur, but not version 4.

Failure

Unfortunately Parallels Desktop 11 stopped working. I never really used it, so I won’t be buying the upgrade.

Workaround

Logitech Control Center was a disappointment. I only needed the extra button(s) on my Logitech Performance MX mouse configured to trigger Mission Control, and that was not possible (for me). Turns out that is part of macOS settings > Mission Control > Mouse Button xxx.

On Monday I can work from home on an updated MacBook. Everything just works.

Why upgrade?

1. I want to have my MacBook running the latest and greatest macOS (patches)
2. New Control Centre and other improvements – https://www.apple.com/macos/big-sur/


DataGrip TFS plugin setup

We use DataGrip for developing Postgres on Greenplum. To get our sources into TFS we’re using the Azure DevOps plugin (https://plugins.jetbrains.com/plugin/7981-azure-devops). Since the documentation is sparse and the requests for help are many, we share our setup.

We assume you’ve already got DataGrip installed. In the commands we use the following:

Parameter        Value
username         john
password         secret
workspace        greenplum
tfs project      dashboard
tfs server url   https://localtfs/tfs/defaultcollection

(parameter values for the commands below)

Team Explorer Everywhere

The plugin relies on the Team Explorer Everywhere software from https://github.com/Microsoft/team-explorer-everywhere/releases. This requires a Java runtime, which already ships with DataGrip (so install DataGrip first).

We edited the tf.cmd file so Java uses less memory; the gist of that edit is sketched below. Maybe you’ll need to do the same. Details on GitHub: https://github.com/microsoft/azure-devops-intellij/issues/45#issuecomment-268790878
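
The gist of that edit, as a sketch (the exact java line differs per TEE release, so check your own tf.cmd and the linked issue before changing anything):

rem in tf.cmd, find the line that launches java and lower its maximum heap
rem with the standard JVM flag, for example:
rem    java -Xmx2048M ...    becomes    java -Xmx512M ...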

  1. Unzip the latest release of Team Explorer Everywhere in your C:\Users\[USER]\Documents folder
  2. Start a command prompt with administrative rights (run as Administrator)
  3. Go to the folder of step 1
  4. Create a new workspace
    tf -login:john,secret -server:https://localtfs/tfs/defaultcollection workspace -new greenplum
  5. Map a local folder with a tfs project in the workspace (make sure to create the local folder)
    tf workfold -map -workspace:greenplum $/dashboard c:\dashboard
  6. Get the sources
    tf get -recursive c:\dashboard

DataGrip project

Now comes the “hacking” of a DataGrip project so it can use the TFS plugin.

  1. In DataGrip open menu File > New > Project – supply the name dashboard
  2. File > Close Project
  3. Close DataGrip
  4. Go to C:\Users\[USER]\AppData\Roaming\JetBrains\DataGrip2020.3\projects\dashboard in Windows Explorer
  5. Move the .idea folder to C:\dashboard
  6. Open DataGrip
  7. File > Open Project
  8. In the Open dialog remove the “old” dashboard project as it points to the old location
  9. Browse to the C:\dashboard and open it as a project

TFVC plugin

Finally, the installation of the plugin.

  1. In DataGrip open File > Settings
  2. Click the Plugins section, click the wheel > Install plugin from disk and supply the path to the zip file downloaded from https://plugins.jetbrains.com/plugin/7981-azure-devops
[screenshot: DataGrip settings to install the plugin]
  3. Restart DataGrip
  4. File > Settings
  5. Open the Version Control section and click TFVC > supply the location of tf.cmd in your C:\Users\[USER]\Documents\TEE-CLC-14.135.0 folder
  6. Uncheck “Use built-in reactive client if possible”
  7. The Visual Studio TF client was already found, since we have VS2019 installed
  8. Click OK
  9. File > Attach Directory to Project – supply c:\dashboard
  10. Git > Enable Version Control Integration – choose TFVC
  11. Click OK
  12. A dialog prompts for the password to connect to TFS – supply secret
  13. Click OK
  14. DataGrip now shows the folder structure on the right and options to use TFVC, like Checkout and Update (= get latest).

These are the steps we used to get it working. We hope you’re able to achieve great things with this knowledge.


Speed up table readout in codedUI

We still use codedUI for testing in our releases. Migration to Selenium is on the backlog; until it makes it into a sprint we are working with codedUI and must maintain it. We noticed that the table readout was very slow, and that is something we do a lot. Time to speed things up.

By adding trace writelines and a stopwatch (both in System.Diagnostics) we pinpointed the bottleneck: the iteration that gets the data from the cells (a sketch of the measuring code follows after the method below). Looking at the code we noticed that the slow part used the index, while the foreach for getting the fields was fast. So we rewrote the index-based loop to a foreach:

public DataTable GetDataTableWaarde() {
   var control = FindHtmlTable(); // search with the id
   var datatable = new DataTable();
 
   // 1. header contains the fields > add columns
   var headerRow = control.GetRow(0);
   foreach(HtmlHeaderCell header in headerRow.GetChildren()) {
      datatable.Columns.Add(header.InnerText);
   }
 
   // 2. rows contain the data > add rows
   foreach(HtmlRow row in control.Rows.Skip(1)) {
      var newRow = datatable.NewRow();
      try {
         // this is slow:
         //for (int i = 0; i < datatable.Columns.Count; i++) {
         //   newRow[i] = row.GetCell(i).InnerText;
         //}
         // this is fast:
         foreach(HtmlCell cell in row.Cells) {
            newRow[cell.ColumnIndex] = cell.InnerText;
         }
         datatable.Rows.Add(newRow);
      } catch (Exception) {
         // handle exception
      }
   }
 
   return datatable;
}
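
For completeness, the measuring itself was nothing fancy. A minimal sketch (Stopwatch and Trace are standard System.Diagnostics types; GetDataTableWaarde is simply our method from above):

var stopwatch = System.Diagnostics.Stopwatch.StartNew();
var datatable = GetDataTableWaarde();
stopwatch.Stop();
// ends up in the test log via the attached trace listeners
System.Diagnostics.Trace.WriteLine($"table readout took {stopwatch.ElapsedMilliseconds} ms");

Moving those few lines around the suspect blocks is how we narrowed the bottleneck down to the cell iteration.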

After this change the table readout went from 40 seconds down to 4 seconds (including the validation). The total test time went down from 56 minutes to 26 minutes, by changing two lines of code and some kaizen blitzen.


Unittesting R scripts

We’re building a solution that uses some R scripts for data analysis and cleanup. The R scripts are tested during the integration phase, when the database is available. We would like to test the scripts when new versions are pushed to source control, without the need for a database. This is where unittests come in.

The scripts all follow the same steps:

  • setup,
  • read data from database,
  • process data,
  • write data to database,
  • report

First we need to split logic, flow and parameters. The easiest way to do this is to implement functions with the names of the steps listed above and call the functions from a new script. Some code will speak a thousand words.

# script with logic and flow functions
library(iterators)
library(foreach)
library(data.table)

setup <- function() {  }
read_data <- function(db, param) {  }
process_data <- function(data) { data.table(mean_price = mean(data$price))  }
write_data <- function(db, data) {  }
report <- function(db, param) {  }
do_work <- function(db, param) {
   setup()
   unprocessed_data <- read_data(db, param)
   iterator <- iter(unprocessed_data$data)
   foreach (row = iterator) %do% { 
      processed_row <- process_data(row)
      write_data(db, processed_row)
   }
   report(db, param)
}

# script with parameters
library(RODBC)   # provides odbcDriverConnect / odbcClose
source('script_with_functions.R')
connection <- odbcDriverConnect('...')
year <- 2016
do_work(connection, year)
odbcClose(connection)

During execution of the script with parameters, the functions are loaded and the outcome is the same as with the initial script. This refactoring enabled us to write unittests for the functions, where mock objects can be used to mimic the external dependencies. Again, some code to explain what we’re talking about:

library(testthat)
library(mockery)
library(data.table)
library(tibble)   # for the list-column test data below

source('script_with_functions.R')
describe('process_data', {
   it('calculates_mean_price', {
      # data table with 4 rows with price 10
      four_rows_price_10 <- data.table(price = rep(10, 4))
      result <- process_data(four_rows_price_10)
      expect_equal(result$mean_price, 10)
   })
})
# other functions only read/write data: no unittest needed, since no logic
describe('do_work', {
   it('calls setup', {
      # create mock object for setup function
      fake_setup = mock()
      # replace setup function with mock for calls to do_work
      stub(do_work, 'setup', fake_setup)
      # call the 'flow' function
      fake_db = mock()
      do_work(fake_db, 2016)
      # verify setup was called
      expect_called(fake_setup, 1)
   })
   it('calls process_data 4 times', {
      # create mock object for the process_data function
      fake_process_data = mock()
      # replace process_data function with mock for calls to do_work
      stub(do_work, 'process_data', fake_process_data)
      # return 4 sets of data to process (with 3 rows each)
      four_sets_of_data <- tibble(g = 1:4, data = list(data.table(price = 1:3)))
      stub(do_work, 'read_data', four_sets_of_data)
      # call the 'flow' function
      fake_db = mock()
      do_work(fake_db, 2016)
      # verify process_data was called 4 times
      expect_called(fake_process_data, 4)
   })
})    

By splitting the logic and flow into functions, we’re able to write unittests that check the logic in the process_data function and the flow in the do_work function. The functions we haven’t unittested all need a database to work (we could use SQLite in-memory, but that is a database too), so we leave those to the integration tests.

Running the unittests above will result in rainbows and a smiley. Complete working code is on my GitHub.
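
If you want to run them yourself from an R session (assuming the tests above are saved as test_script.R; the filename is just my example):

# run a single testthat file and report the results
testthat::test_file('test_script.R')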

Test passed 🌈
Test passed 🌈
Test passed 😀

The Unicorn Project

During lockdown I’ve been reading The Unicorn Project, the follow-up to The Phoenix Project. Again a very entertaining read about the rocky road of getting the IT department to the next level.

I read this book after reading The Phoenix Project for the second time. It is set in the same timespan and sometimes the storylines overlap. This makes clear that not one team’s effort saved the company, but multiple teams with the same end goal.

The story is written with enough technical background to feel true to life, and everything learned can be applied in the now. I can relate to a lot of the problems faced, even though I’m not working at a commercial company. But keep in mind that the book is fictional.

After reading The Unicorn Project I’m more convinced that devops has its advantages, but it might not be easily applied to every company. It requires a lot of agility from the entire company, and a group of enthusiasts willing to go the extra mile.

Read more here: https://itrevolution.com/the-unicorn-project/
