Mental Models for Better Collaboration

At Mayflower we pride ourselves on building good software through honest cooperation, agility, and competence. What exactly does that mean?

Honest cooperation in an agile context means that shared mental models exist: a common understanding. We take the Agile Manifesto seriously, speak directly with one another, talk, share experiences … and thereby build up a shared understanding.

Dude … we have to talk about this (in JavaScript)

Recently I stumbled upon this post about The complete elimination and eradication of JavaScript’s this.

To give you a TL;DR: the author states that almost no JavaScript developer understands this, that it is a terrible concept, and introduces a library that allows developers to avoid using this in many circumstances. So we had a Twitter discussion with pretty opposing standpoints: I think this in JavaScript is quite simple, versatile and really useful.

But over the next few days, it got me thinking …
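To make the point of the discussion concrete, here is a minimal sketch (the names are illustrative, not from the post) of the core rule: this is bound by the call site, not by where a function is defined.

```typescript
const counter = {
  count: 0,
  increment() {
    this.count += 1;  // `this` is whatever object the method is called on
    return this.count;
  },
};

counter.increment();  // called on counter, so `this` === counter

// Detaching the method loses the receiver (in strict mode `this`
// would be undefined); bind() pins the receiver explicitly.
const inc = counter.increment.bind(counter);
inc();                // still updates counter

// Arrow functions have no own `this`; they close over the enclosing
// scope's `this`, which is why callbacks usually use them.
```

After the two calls, `counter.count` is 2 — the same method, but only a correct receiver at the call site makes it work.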


TypeScript 3.0 – What’s New

TypeScript 3.0 was released on July 30, 2018. With it, Microsoft’s open-source transpiled language celebrates almost six years of existence, and since I am a big fan of TypeScript myself, I would like to briefly introduce the most important new features of this version in this article.
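Two of the headline additions can be sketched in a few lines: the `unknown` type, a type-safe counterpart to `any` that forces narrowing before use, and tuple types in rest parameters (function names here are illustrative):

```typescript
// `unknown` (new in 3.0): the value must be narrowed before use.
function parseJson(text: string): unknown {
  return JSON.parse(text);
}

const value = parseJson('{"name":"Mayflower"}');
// value.name would be a compile error: 'value' is of type 'unknown'.
let name = "";
if (typeof value === "object" && value !== null && "name" in value) {
  name = (value as { name: string }).name;  // allowed after narrowing
}

// Tuple types in rest parameters (also new in 3.0):
function draw(...args: [number, number, string]): string {
  const [x, y, color] = args;
  return `${color}@${x},${y}`;
}
```

Unlike `any`, `unknown` propagates the obligation to check the value to every consumer, which is exactly what you want at API boundaries.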


Symfony 4: Using Private Recipes with Flex

There may come a situation where you want to benefit from the new Flex plugin for Composer when using Symfony 4, but you have an internal library that should not be published publicly.

Good news: there is a way to run a private recipe server, which is still in beta but already works very well.
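Pointing Flex at such a server comes down to setting the recipe endpoint in the `extra.symfony` section of your project’s composer.json; a sketch with a placeholder URL:

```json
{
    "extra": {
        "symfony": {
            "endpoint": "https://flex.example.com"
        }
    }
}
```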


LSH – an efficient approach to nearest neighbour search

In Image similarity search with LIRE we explained how to compare and find similar images using the Java library LIRE. The idea was to transform the complicated problem of comparing a large bunch of pixels into the simpler problem of comparing vectors representing histograms and other higher-level properties. In other words, if we can compress the information inherent in a bunch of pixels into a point in n-dimensional space (an array with n entries – the so-called feature vector of the image), we can regard the distance between two such points as a similarity measure for the corresponding images. We can then find the images similar to a search image by selecting those images whose feature vectors have a small distance to the feature vector of the search image.

However, a naive approach to the problem – comparing the feature vector of the search image to all feature vectors in the database – is rather slow if our database is large. In this article, we show how to implement a fast similarity search for even very large databases.
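One classic way to speed this up is locality-sensitive hashing with random hyperplanes. This is a minimal sketch of the idea (not the article’s actual implementation; names and dimensions are illustrative):

```typescript
// Each random hyperplane contributes one hash bit: which side of the
// plane the vector lies on. Vectors that are close in cosine distance
// tend to share many bits, so an exact distance computation is only
// needed for vectors in the same bucket.
function randomHyperplanes(numBits: number, dim: number): number[][] {
  return Array.from({ length: numBits }, () =>
    Array.from({ length: dim }, () => Math.random() * 2 - 1)
  );
}

function hashVector(v: number[], planes: number[][]): string {
  return planes
    .map((p) => (p.reduce((s, pi, i) => s + pi * v[i], 0) >= 0 ? "1" : "0"))
    .join("");
}

// Bucket the database once, then probe only the matching bucket.
const planes = randomHyperplanes(16, 3);
const buckets = new Map<string, number[][]>();
for (const v of [[1, 0, 0], [0.9, 0.1, 0], [0, 1, 0]]) {
  const key = hashVector(v, planes);
  const bucket = buckets.get(key);
  if (bucket) bucket.push(v);
  else buckets.set(key, [v]);
}
const candidates = buckets.get(hashVector([1, 0.05, 0], planes)) ?? [];
```

The search then compares the query only against `candidates` instead of the whole database; in practice several independent hash tables are used to keep the probability of missing a true neighbour low.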


Faster programming (with ES6): Magic Getters, Setters and Variable Function Names (Part 3/4)

The aim of this series is to show you how to program faster by writing less code. Part one gives an overview of what I mean by writing less code; part 2 shows how destructuring works and how you can use arrow functions and default params to reduce code. Here in part 3 you’ll learn more about magic getters/setters and variable function names.
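Both features can be sketched in a few lines (the names here are illustrative, not taken from the article):

```typescript
const field = "version";

const pkg = {
  _version: 1,
  get version() {            // getter: read like a plain property
    return this._version;
  },
  set version(v: number) {   // setter: a plain assignment runs this code
    this._version = Math.max(1, v);
  },
  [`print_${field}`]() {     // computed ("variable") function name
    return `v${this.version}`;
  },
};

pkg.version = 0;             // runs the setter, which clamps the value to 1
```

Reading `pkg.version` afterwards yields 1, and the dynamically named method is callable as `pkg["print_version"]()` — validation and naming logic without any extra call-site code.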

Faster Programming (with ES6): Destructuring and Arrow Function (Part 2/4)

This series is about how to program “faster”. It is not, however, about typing code faster, but about writing less code in the first place. That saves time, but you also have to read less code, and the effort to maintain it shrinks in the long run (more about that in the introduction). In part 2 I’ll show you how to use destructuring and arrow functions.
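A compact sketch of both features (data and names are illustrative):

```typescript
// Destructuring pulls values out of objects and arrays in one step:
const user = { name: "Ada", address: { city: "London" } };
const { name, address: { city } } = user;   // nested extraction with one statement

let [a, b] = [1, 2];
[a, b] = [b, a];                            // swap without a temp variable

// Arrow functions shorten function expressions; combined with default
// parameters and destructured parameters they save even more code:
const greet = ({ name, greeting = "Hello" }: { name: string; greeting?: string }) =>
  `${greeting}, ${name}!`;
```

Calling `greet({ name: "Ada" })` yields `"Hello, Ada!"` — parameter extraction, the default value, and the function body all fit in one expression.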

Faster Programming (with ES6/TypeScript): Introduction

Our daily business as JavaScript/TypeScript developers is to work with lots of code. The bigger the project, the more code you need to read instead of writing it. The idea behind this four-part series is that in the long term a project always benefits from writing less. Make the code base smaller and cleaner, so that you and others don’t need to read as much. This allows faster programming, and less code increases maintainability.


FENNEC – the Bachelor’s Thesis Project

As part of my doctoral work at the Chair of Bioinformatics at the University of Würzburg, I developed FENNEC – Functional Exploration of Natural Networks and Ecological Communities – a web application that enables biologists to evaluate large amounts of data automatically.