19.12. Test Driven JavaScript Development

A couple of weeks ago I attended a three-day workshop on agile developer skills. The workshop was split into five topics: Collaboration, Refactoring, Design & Architecture, Continuous Integration and Test Driven Development. The session about Test Driven Development was especially interesting. Although I know the principles of TDD, I was really impressed by the instructors' demonstration of solving a simple exercise (a coding kata) in Java. It was not so much the Java coding that interested me; it was the combination of writing a test, executing it with a shortcut from the IDE, seeing it fail, writing the implementation and re-running the test, all within the IDE. You will say "that's test driven development, it's nothing new!" and you are right! But is there a way to do Test Driven Development in the same way for JavaScript? I mean writing a test, executing the tests with a shortcut from the IDE, seeing the test fail, implementing the method and re-running the test? Yes, there is! So let me show you what I have done to do the same coding kata (called FizzBuzz) in JavaScript.
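The kata itself fits in a few lines. Here is a minimal sketch of one possible FizzBuzz implementation with framework-free checks; the workflow described above would run these through an IDE-integrated test runner instead:

```javascript
// FizzBuzz: multiples of 3 -> "Fizz", multiples of 5 -> "Buzz",
// multiples of both -> "FizzBuzz", everything else -> the number as a string.
function fizzbuzz(n) {
    if (n % 15 === 0) { return 'FizzBuzz'; }
    if (n % 3 === 0)  { return 'Fizz'; }
    if (n % 5 === 0)  { return 'Buzz'; }
    return String(n);
}

// Minimal red/green-style checks without any test framework:
console.log(fizzbuzz(1));  // "1"
console.log(fizzbuzz(3));  // "Fizz"
console.log(fizzbuzz(5));  // "Buzz"
console.log(fizzbuzz(15)); // "FizzBuzz"
```

In a real TDD session each of these expectations would be written first, watched failing, and only then implemented.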


03.12. Setting Up Your Own QA Environment for JavaScript

Nearly every PHP project comes with a great set of tools to assure the quality of its source code; unit tests are no longer a "nice to have" feature, they are a common component of new projects. While unit tests help you provide solid interfaces and prove the functionality of a certain method, there is a long list of tools that check the quality of the source code itself with many different metrics. Each of these tools reports its results into an XML file, which can easily be interpreted by a continuous integration server like Jenkins. The benefit is enormous: after every commit to your version control system, the continuous integration server triggers the execution of the tests and the source code quality analysis and shows the results in meaningful diagrams. As a developer you get direct feedback and can assess the status of the entire project, because if even the smallest units work fine, there is a good chance that the whole system runs stably.

JavaScript Pitfalls for PHP-Developers

A couple of years ago, PHP developers only needed strong knowledge of the language itself and experience with some PHP frameworks and libraries; additionally, skills in MySQL were expected.
If you take a look at our current job advertisements, this knowledge is still important, but skills in JavaScript are now also asked for and strongly desired. If you wonder why JavaScript is so popular these days, my answer is quite simple: the browser is no longer a dumb instrument for viewing static websites; it has turned into a (web) application platform that provides more than plain text. This results in more and more complex sites and therefore much more source code. One of the reasons for this progress is AJAX, the art of exchanging data with a server and updating only parts of a website (with JavaScript). JavaScript is the big winner of this evolution, because it is available in nearly every browser, needs no plug-in or additional software, and is easy to learn. In the early days of the internet we used JavaScript to create some lousy effects or to do simple checks on input fields. Those days are gone; it is not unusual to have more JavaScript code than PHP code in large projects. Coding in JavaScript is not the same as coding in PHP, which is why I will write down some typical pitfalls for a PHP developer who is coding JavaScript.

Data types

There are two major kinds of data types in JavaScript: the primitive ones (Boolean, String, Number, …) and objects (Array, Function, …). To check a data type, the typeof keyword might help you. But be aware! Objects are the fundamental units of JavaScript; virtually everything in JavaScript is an object and takes advantage of that fact. Even the primitive data types have so-called "primitive object wrappers", so there is great potential for wrong usage of the typeof keyword when checking data types. Here is a small overview of what you will get if you use the typeof keyword with several variables:


Variable                          typeof result

{company: 'Mayflower'}            "object"
"a simple String"                 "string"
new String("Mayflower")           "object"
new Employe()                     "object"
As you can see, you will not always get what you expect. Therefore it is not always a good choice to check the data type with typeof. A better approach is to use the constructor property to identify the data type of a variable.

var myArr = [];
if (myArr.constructor === Array) { console.log("yes, I am an Array!"); }

The downside of this approach is that you need a known constructor to compare against. Note that an array instance has no prototype property of its own (that property lives on the constructor function, so myArr.prototype is undefined); to ask whether a value was created by a given constructor, use the instanceof operator instead:

var myArr = new Array();
console.log(myArr instanceof Array); // true
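A check that works regardless of how a value was created is to ask Object.prototype.toString for the internal class name. A small, framework-independent sketch:

```javascript
// Returns "Array", "String", "Number", "Object", ... regardless of whether
// the value was created with a literal or with the new operator.
function typeName(value) {
    // Object.prototype.toString yields e.g. "[object Array]";
    // slice off the "[object " prefix and the trailing "]".
    return Object.prototype.toString.call(value).slice(8, -1);
}

console.log(typeName([]));                      // "Array"
console.log(typeName(new String('Mayflower'))); // "String"
console.log(typeName({company: 'Mayflower'}));  // "Object"
```

Unlike instanceof, this also works for values that cross frame boundaries in the browser, where each frame has its own Array constructor.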

Array !== Array

In the PHP world the array is a widely used data type, and we differentiate between the zero-indexed, comma-separated array and hash tables of key-value pairs, better known as associative arrays. In JavaScript an Array is just a simple zero-indexed list of values, nothing more. There is no associative array in JavaScript. Also note that a trailing comma in an array literal is not supported by IE and causes an error there.

var myArr = ['Mayflower','PHP','JavaScript'];

JavaScript does have something like an associative array, but it has nothing to do with the Array data type. The JavaScript world calls it an object literal: a collection of key-value pairs.

var obLiteral = {company: "Mayflower", offices: ['München', 'Würzburg']};

By the way, you are right if you say "the syntax looks like a JSON string". The syntax of an object literal and of a JSON string are pretty similar, with one main difference: in a JSON string the property names need to be wrapped in (double) quotes to be valid. For object literals the quotes are only required if the property name contains spaces or other invalid identifier characters.
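A quick sketch of that difference, assuming a native JSON object (older browsers need the json2.js shim):

```javascript
// Unquoted property names are fine in an object literal...
var literal = {company: 'Mayflower'};

// ...and in JSON only if the names are double-quoted strings:
var valid = JSON.parse('{"company": "Mayflower"}');
console.log(valid.company); // "Mayflower"

// The same text with an unquoted key is invalid JSON and throws:
var failed = false;
try {
    JSON.parse('{company: "Mayflower"}');
} catch (e) {
    failed = e instanceof SyntaxError;
}
console.log(failed); // true
```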

for (… in … ) !== foreach

When you work with arrays in PHP you surely know the foreach construct to iterate over them. Contrary to common opinion, the for (… in …) construct is not the JavaScript equivalent of foreach, and it is not the best way to iterate over an array in JavaScript.

DO: Use the for (…) loop for iterations over array objects
Every JavaScript array provides a length property, so the best practice for iterating over a JavaScript array is a simple for construct.

DON'T: Use the for (… in …) loop for iterations over array objects
It is not impossible to use the for (… in …) loop with an array (remember, arrays are objects, too), but it can lead to some ugly errors: JavaScript is a powerful language and you can add functionality and properties to nearly every existing object (this is called augmentation). If you iterate over an augmented array object with for (… in …), you will not only get the elements of the array, you will also get the augmented functionality, and this may result in an error.
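The difference is easy to demonstrate. A small sketch with a deliberately augmented Array.prototype:

```javascript
// Augment every array with an extra method (a common, if risky, practice):
Array.prototype.last = function () { return this[this.length - 1]; };

var stack = ['Mayflower', 'PHP', 'JavaScript'];

// DO: the classic for loop only visits the numeric indexes.
var items = [];
for (var i = 0; i < stack.length; i += 1) {
    items.push(stack[i]);
}
console.log(items.length); // 3

// DON'T: for (... in ...) also visits the augmented property "last".
var keys = [];
for (var key in stack) {
    keys.push(key);
}
console.log(keys.indexOf('last') !== -1); // true -- the augmentation leaks in
```

If your code then does stack[key] expecting an array element, it suddenly operates on a function, which is exactly the kind of ugly error mentioned above.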

The thing with the semicolon

Have you ever heard about "line termination with semicolon insertion"? JavaScript provides a mechanism that adds a semicolon to the end of a line if you forget it. At first glance this may seem helpful, but it can be the reason for endless debugging sessions. Imagine the following situation:

function user() {
    return
    {
        company: "Mayflower",
        business: "PHP Development"
    };
}
The user function will run without any error and will return a valid value (OK, it is undefined, but it is a valid return value). It is not the desired object literal, because semicolon insertion terminates the function right after the return.
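The fix is to place the opening brace on the same line as the return, so no semicolon can be inserted. A corrected sketch:

```javascript
function user() {
    return {   // "{" on the same line as return: no semicolon is inserted
        company: 'Mayflower',
        business: 'PHP Development'
    };
}

console.log(user().company); // "Mayflower"
```

This brace style ("K&R style") is the reason most JavaScript style guides insist on opening braces at the end of the line.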

I am sure there are a lot more pitfalls for PHP developers; if you are interested in really crazy JavaScript, I recommend a visit to http://wtfjs.com/.

Minimizing your JavaScript Code

As my last projects were mostly Web 2.0 online applications, I had to do a lot of JavaScript programming. Using JS frameworks like jQuery or Dojo, the size of all necessary JavaScript files had added up to more than one megabyte. Even today with high-speed internet connections one megabyte needs several seconds to be transferred, so I had to find a way to reduce the size of the code. For all my previous projects I used Dean Edwards' Packer. This amazing packer was able to reduce the code to approximately a fourth of its original size. Having got this nice result, I wondered whether the tool delivered with Dojo or other minification tools could do this as well. After some tests, however, I realized that they couldn't.

Searching the web for more information about the best way, I found a cool tool / website, CompressorRater, that shows the transfer sizes of well-known JavaScript libraries after minification / packing. It is also possible to let the tool calculate the results for your own piece of code.

In addition, the tool also calculates the size of a transfer with HTTP compression (gzip). Before I found this website I had never looked at the sizes after minification / packing and compression in combination. Perhaps I should have done this before…

Judging from the results of CompressorRater, it seems that Dean Edwards' Packer might really be the best choice, even with gzip compression. Very interesting is the fact that the results of packing and of minification are nearly the same after gzip compression.

But the size of JavaScript files and their transfer speed is only one aspect affecting the loading delay of websites. The other aspect is the time the browser needs to interpret those scripts. As packed files have to be interpreted twice, first to eliminate the eval and second to interpret the generated code, their loading delay is higher than that of minified files.

When taking a closer look at the loading time of those minimized JavaScript files, it turned out that the Packer is not always the best choice: in the case of bigger JavaScript files, the Packer needs up to several seconds to d'eval the code. This doesn't matter so much in web applications where the JavaScript is only loaded and interpreted once at the beginning, but it will slow down a typical website even more, because the JavaScript has to be interpreted on each page request, even if it is in the browser's cache. Using a minified script avoids this.

Therefore I suggest using the Packer only if it is not possible to achieve an acceptable size with minification and gzip compression alone.

Marc Andreessen analyzes the Facebook platform

Dear API developers and API providers,

I just came across Marc Andreessen's blog posting where he analyzes the Facebook platform and gives his opinion on several aspects. One thing was very remarkable about the way Facebook boosts the usage of the 3rd party applications that have been registered on the platform:

And then, on top of that, Facebook is providing a highly viral distribution engine for applications that plug into its platform. As a user, you get notified when your friends start using an application; you can then start using that same application with one click. At which point, all of your friends become aware that you have started using that application, and the cycle continues. The result is that a successful application on Facebook can grow to a million users or more within a couple of weeks of creation.

This is a really cool viral distribution model for applications based on the API of a "Web 2.0 social foobar application". It presumes that 3rd party applications have to register on the platform (which is usually a good idea for social network platforms, as there are of course data privacy concerns among the platform's users) and that users have to select which applications they want to use. This selection is displayed to your friends/contacts, enabling them to see which 3rd party applications of the platform you are using. The consequence is that they might of course be interested in using this application, too, which is the viral boost for the 3rd party application.

That explains why 3rd party applications like iLike grew like hell, from under 40,000 Facebook users to now more than 3 million Facebook users.


Marc’s analysis is definitively worth a read.

Absurdistan 2.0 – plappadu.com

The following press release just reached me. My first thought: "Oh man…" Folks, where are the innovations? How many Twitter clones are there in Germany alone?


Dear Sir or Madam,

the sparrows are chirping it from the rooftops: the current internet trend is live communication à la Twitter. A counterpart of the popular web service is now also available in German: at plappadu.com, users tell the rest of the online world in SMS style what they are doing right now, where they are and what is on their minds.

"plappadu.com is like blogging, only much shorter and absolutely live," says Marcus Veigel, managing director of Cynapsis GmbH, which implements the portal for F.N. Media GmbH from Münster. While bloggers sit down and write up their thoughts and experiences in more or less detail, the plappadu community tells each other in real time what is going on.

Please refer to the attached press release for all further information.

You can try the new service at www.plappadu.com. Press information and print-quality images can also be found on the plappadu website under the "Press" menu item.


I just don't get it … and that name …

sevenload takes off

Yesterday Johann and I had the pleasure of meeting Thomas Bachem (sevenload) for lunch in Munich. I did a video interview with Tom last year, and we also had an article about sevenload in PHP Magazin. Among other things, the platform uses lighttpd with its Flash streaming support to deliver the videos, which lets it scale extremely well. As we learned yesterday, sevenload now runs on 120 servers. I am already looking forward to the API, which will hopefully arrive soon. :-) That way one could use the infrastructure sevenload has built as a content delivery network (for a fee) and would not have to worry about building one's own video distribution infrastructure. We are very curious about the further development of this platform, especially the drive the new investors will bring to the company. The growth opportunities presented sound very exciting.


sevenload is a very nice example of a successful Web 2.0 project. With illustrious investors like Ströer, Burda, Martin Varsawsky and perhaps one or two more who will follow in the future, sevenload sees itself well equipped for the future. By the looks of it, sevenload will also broaden its revenue streams in the future, which I consider a good thing (though I cannot say more about that). What has already been a good business so far (and is little known) are the B2B white-label solutions that sevenload offers other companies on the basis of its technology. Personally, I must also say that I find the user interface much better and more intuitive than, for example, YouTube's, but that is just my personal impression.


Much like the team at Xing, the guys at sevenload take care to roll out improvements to the architecture and GUI step by step and very carefully, so as not to confuse their users too much. I find that exemplary! For example, for a few months now sevenload has shown small animated GIFs on the video thumbnails: when you hover over a thumbnail while browsing through the videos, the first frames of the video are displayed. A small but fine improvement to the system.


From a community and "schmoozing" perspective, I have to say that Ibo, Thomas & the rest of the guys are very pleasant. Meeting up in a beer garden or for a schnitzel (inside joke, Tom :-) ) is fun. I am happy about this team's success.


Disclaimer: sevenload is one of our customers.

Dojo goes offline

With the advent of Web 2.0 and more and more AJAXian applications popping up, there has been a tremendous trend to move from classical desktop applications to web-based applications. While you can take the approach of offloading the whole application to the desktop (i.e. with Microweb and the like), more and more JavaScript toolkit libraries now offer "offline" functionality. That means there is functionality for offloading the data to the client, with synchronisation back to the server handled later on.


These features are especially useful for applications where you don't always have an internet connection but need to synchronize the data to the server later. Dojo, a very powerful JavaScript toolkit, now gives you handy offline features. According to the website:

Dojo Offline is a free, open source toolkit that makes it easy for web applications to work offline. It consists of two pieces: a JavaScript library bundled with your web page and a small (~300K) cross-platform, cross-browser download that helps to cache your web application’s user-interface for use offline.

Dojo offline with its small cross-platform, cross-browser download is licensed as BSD, so it’s very easy to integrate it into your applications. There’s also a video that shows you the usage of the offline tool. Dojo offline is currently marked as Beta. Overall, great work, guys!

Playing with the Xing API

Xing is Europe’s leading Business Network with more than 2 million members. Recently, Xing announced that they would come up with an API later this year to get access to the network. As far as I know, Xing was developed by ePublica using Perl and MySQL.


Having an API is essential in these mashup days. I was invited to the private alpha test and implemented a reference API client in PHP5 which behaves like SOAPClient (but uses REST as the transport mechanism) and overloads the methods that are available.


Here’s an example of how to call it currently (API is subject to change as it is in an alpha state):

Currently, for security and data privacy reasons (I have to say that Xing has always been an example of how to protect their user’s data/privacy), the API lets me only access my direct business contacts and furthermore scrambles the personal data of the contacts.


In the last few days I had a bit of time while nursing a cold and wrote my own mashup: marking my Xing contacts on a Google Map. Click on continue to see some screenshots and a bit of explanation …


Twitter Clones, Web 2.0 Apps & Co: Secure Enough?

Now that one Twitter clone after another has been stamped out of the ground in the German-speaking world as well, the discussion about all the copycats of so-called Web 2.0 communities/portals/applications should also be looked at from another angle: security.


When things are thrown together "quick, quick", there is always a high risk that the security aspect suffers. Security has its price and, thanks to the internet and its diversity, an ever higher complexity as well. I won't even talk about things like UTF-7 XSS bugs, but a minimum of filtering/validation, to at least avoid the crudest blunders à la

"><script>alert(document.cookie);</script><a href="

should really be expected, especially when it comes to the so-called "Web 2.0 communities", where an army of users maintains their profiles.
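A minimal sketch of such output escaping, shown here in JavaScript (in PHP, htmlspecialchars() does the same job):

```javascript
// Escape the characters that matter for HTML injection before
// echoing user input back into a page. The ampersand must come first,
// otherwise the entities produced by the other rules get double-escaped.
function escapeHtml(input) {
    return String(input)
        .replace(/&/g, '&amp;')
        .replace(/</g, '&lt;')
        .replace(/>/g, '&gt;')
        .replace(/"/g, '&quot;');
}

console.log(escapeHtml('"><script>alert(document.cookie);</script>'));
// &quot;&gt;&lt;script&gt;alert(document.cookie);&lt;/script&gt;
```

Escaping on output like this neutralizes the payload above; proper validation on input is still worthwhile on top of it.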


The problem is quite simply this: