Just an example of how code can look perfectly correct… until it stops working.
Why would a JavaScript application stop working when moving from 2007 to 2008? Why would a PHP script display January through July, skip August and September, and then continue correctly with October through December?

I recently fixed two bugs that had nearly the same cause, one in PHP and one in JavaScript. One application would not display dates correctly starting in 2008; the other displayed the months January through July, but not August and September.
The PHP bug resulted from this snippet:
function monthname($mon) {
    // checks and i18n removed for simplicity
    $months = array(
        01 => "January",
        02 => "February",
        …
        08 => "August",
        09 => "September",
        …
        12 => "December");
    return $months[$mon];
}
The JavaScript one resulted from this:
// the original was more complicated: a function that converts a date
// from the dd-mmm-yy format to a real date in JS
date = "25-Sep-08";
parts = date.split('-');
year = parseInt(parts[2]) + 2000;
// year is now 2000
So what was the cause? Both PHP (in its integer literals) and JavaScript (in parseInt() without a radix) interpret numbers starting with a 0 as octal, and there is no octal 08 or 09; 01 through 07 have the same values as in decimal, which is why no deviation was visible before.
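To see the rule in isolation, here is a minimal PHP sketch (the exact behavior depends on the PHP version: PHP 5 silently ignored everything from the first invalid octal digit on, while PHP 7 and later reject such literals with a parse error):

echo 07;  // prints 7, identical to decimal
echo 010; // prints 8, the leading zero selects octal
echo 08;  // PHP 5: prints 0 (8 is not an octal digit); PHP 7+: parse error

In the array above, the keys 08 and 09 therefore both collapsed to 0, so $months[8] and $months[9] were never set. Pre-ES5 JavaScript engines applied the same rule inside parseInt() when no radix was given, which is why parseInt("08") returned 0 and the computed year came out as 2000.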
I solved the two issues by giving up the pretty formatting in PHP (I could also have changed the keys to strings and used sprintf() to access the array) and by passing the second parameter to parseInt(), the radix.
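For reference, a minimal sketch of what the fixed versions might look like (reconstructed from the description above, not the original code; the sprintf() alternative is noted in a comment):

function monthname($mon) {
    // plain decimal keys: no leading zeros, no octal surprise
    // (alternative: string keys "01"…"12" accessed via sprintf("%02d", $mon))
    $months = array(
        1 => "January",   2 => "February",  3 => "March",
        4 => "April",     5 => "May",       6 => "June",
        7 => "July",      8 => "August",    9 => "September",
        10 => "October", 11 => "November", 12 => "December");
    return $months[$mon];
}

And on the JavaScript side:

year = parseInt(parts[2], 10) + 2000; // radix 10 forces decimal, year is now 2008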