We use SSIS packages to import files. These SSIS packages are run by SQL Server Agent jobs (sqljobs). During deployment the sqljobs are created with a script. That is the plan.
While running the script to create the sqljobs we got an error:
Error 14274: Cannot add, update, or delete a job (or its steps or schedules) that originated from an MSX server
Most solutions found on the internet talk about the name of the server. When you've renamed your SQL Server (yeah right), you need to update the originating_server column in the sysjobs table. This was not the case for us.
Over at MSDN someone suggested looking at the variables. The first create-job call sets the @jobid output variable, and the second create statement then passes that value in as input, where it is intended as output. The suggested SET @jobid = NULL before each call to sp_add_job did the trick.
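The pattern in the deployment script looks roughly like this (a sketch — the job names are hypothetical):

```sql
DECLARE @jobid UNIQUEIDENTIFIER;

-- First job: @jobid receives the new job's id as an OUTPUT value
EXEC msdb.dbo.sp_add_job @job_name = N'ImportFilesJob', @job_id = @jobid OUTPUT;

-- Reset the variable before the next call, otherwise the leftover
-- value is treated as input and the script fails with error 14274
SET @jobid = NULL;

EXEC msdb.dbo.sp_add_job @job_name = N'CleanupJob', @job_id = @jobid OUTPUT;
```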
The SDN – Software Development Network – is a special interest group for Dutch developers. Four times a year they organise an event where people present and talk about their passion.
As a member of the SDN you are aware of the latest developments. You are part of a network of professional developers who assist each other in word and deed. This means there is a technical helpdesk at your fingertips, so you can save considerable time when solving problems.
sdn.nl with Google translate
Here are the talks I attended.
Customer identities in an Enterprise scenario
With slides and demos, Jurgen showed us Microsoft's Azure Active Directory Business to Consumer (AD B2C) solution. It differs from AD in that you don't know in advance how many users you will need, so the costs are mainly based on the number of authentications, whereas AD charges for the number of users.
The features include self-service (password reset, signup, …), use of (social) identity providers, and extensive login page customisation.
Zero-downtime applications with containers
Docker is used for highly available solutions. In the demo a Docker swarm was used to host a .NET Core application. With a visualizer container the setup (one manager and two worker nodes) was visualised on a website.
Upgrades can be performed using blue-green or rolling-update scenarios. Tips for the database schema: support version n-1, and run the schema upgrade only after the last n-1 instance of your application has been replaced.
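A rolling update on a swarm can be sketched with docker service update (the service and image names here are hypothetical):

```
# Replace one task at a time, waiting 10s between replacements;
# the old containers keep serving until their successor is up
docker service update \
  --image myregistry/myapp:2.0 \
  --update-parallelism 1 \
  --update-delay 10s \
  myapp
```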
Fun fact: the presentation had to be done with the webcam of laptop #1 pointed at the screen of laptop #2, because the projector couldn't handle laptop #2's resolution. 😎 slides
Progressive web apps and the future of the web
Progressive web apps (PWA) are defined as Reliable (work offline), Fast (install and usage) and Engaging (native-like capabilities). A PWA uses a Service Worker, which is installed from the website. The Service Worker reacts to events and retrieves data from the network when online, or from the cache when offline.
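That network-or-cache behaviour can be sketched in a minimal service worker — a browser-side script, so this is illustration only, and the cache name and pre-cached paths are assumptions:

```
// sw.js — minimal service worker sketch
const CACHE = 'pwa-cache-v1';

self.addEventListener('install', (event) => {
  // Pre-cache the app shell so the site works offline
  event.waitUntil(caches.open(CACHE).then((c) => c.addAll(['/', '/app.js'])));
});

self.addEventListener('fetch', (event) => {
  // Network first; fall back to the cache when offline
  event.respondWith(
    fetch(event.request)
      .then((resp) => {
        const copy = resp.clone();
        caches.open(CACHE).then((c) => c.put(event.request, copy));
        return resp;
      })
      .catch(() => caches.match(event.request))
  );
});
```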
The presentation was recorded and is available on youtube:
Conclusion
A lot is happening and a single developer can't keep up with all the new things. This SDN event is (again) a great way to see what's happening and get first-hand information from experts.
After some searching I ended up using sslforfree.com, since I could verify my domain by adding a TXT record. Other sites required me to host files, or to open FTP to them with a username and password 😯
To verify my domain using a TXT record I noticed the Google support page was not providing the right solution. The DNS entry should have _acme-challenge for the Host name:

Type         Host              Value          TTL
TXT-Record   _acme-challenge   R4nD0m57R1n9   1 min
The provided certificate is valid for 90 days. Safari trusts this CA and the certificate. I’m happy again. 🙂
Our build server is TFS 2015 and Microsoft promises that we can customise code coverage analysis. But with a runsettings file set up to exclude assemblies from code coverage, exceptions fail the build. Not nice.
Today we figured out how to work around this and exclude an assembly from code coverage. The key was in the MSDN article: the pdb file is needed to get code coverage. Would this mean that when the pdb file is not there, no code coverage is registered?
The assembly we wanted to exclude was Common.Logging. The NuGet package includes the pdb file for debugging purposes. We delete it in our build sequence:
The Delete Files step removes Common.Logging.pdb before the Visual Studio Test step runs. Now code coverage is not calculated for Common.Logging. Mission accomplished.
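As a sketch, the Delete Files step in the build definition could be configured like this (the source folder is an assumption — use wherever your build output lands):

```
Delete Files
  Source Folder: $(Build.SourcesDirectory)
  Contents:      **\Common.Logging.pdb
```

With the pdb gone, the Visual Studio Test step simply has no symbols to instrument for that assembly, so it drops out of the coverage numbers without touching the runsettings.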
We use the data grid from Syncfusion and it is awesome! It serialises the data on the server and puts it in the page as JSON. No more postbacks to get extra data or for sorting, filtering, and other expected user features.
Unfortunately the formatting of DateTime fields was broken in the version we are using. After some digging we found that altering the JSON serialisation would fix the issue. See the code below.
using Newtonsoft.Json;
using Newtonsoft.Json.Converters;
using Syncfusion.JavaScript.Shared.Serializer;
using System;

namespace Utils {
    /// <summary>
    /// Big data serializer: uses Newtonsoft.Json to get over the 4 MB limit
    /// and serializes DateTime fields for easy formatting.
    /// </summary>
    public class BigDataSerializer : IDataSourceSerializer {
        // Format DateTime as milliseconds since 1-1-1970 (the JavaScript epoch)
        public class MyDateTimeFormat : DateTimeConverterBase {
            public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer) {
                var milliseconds = ((DateTime)value).Subtract(new DateTime(1970, 1, 1)).TotalMilliseconds;
                writer.WriteRawValue(string.Format("new Date({0})", milliseconds));
            }

            public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer) {
                throw new NotImplementedException();
            }
        }

        // Serialize with the MyDateTimeFormat converter
        public string Serialize(object obj) {
            var result = JsonConvert.SerializeObject(obj,
                Formatting.None,
                new MyDateTimeFormat());
            return result;
        }
    }
}
In the Razor file that uses the data grid we put the following code at the start. This makes sure the BigDataSerializer is used and DateTime values are formatted correctly.
@{
    Syncfusion.JavaScript.Shared.Serializer.DataManagerConverter.Serializer = new Utils.BigDataSerializer();
}
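The epoch-milliseconds value that WriteJson emits is the same quantity in any language; as a quick sanity check of the arithmetic (the dates here are chosen arbitrarily), in Python:

```python
from datetime import datetime

# Milliseconds since 1-1-1970, mirroring the C# Subtract(...).TotalMilliseconds
def epoch_ms(dt):
    return (dt - datetime(1970, 1, 1)).total_seconds() * 1000

print(int(epoch_ms(datetime(2017, 1, 1))))  # 1483228800000
```

In the browser, `new Date(1483228800000)` then yields 1 January 2017, so the grid can format it with the client's own locale rules.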