Javascript Date.UTC() function is off by a month?

Tags: Javascript, Datetime

Javascript Problem Overview


I was playing around with Javascript creating a simple countdown clock when I came across this strange behavior:

var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31),
    diff = then - now,
    daysleft = parseInt(diff / (24 * 60 * 60 * 1000));
console.log(daysleft);

The number of days left is off by 30 days.

What is wrong with this code?

Edit: I changed the variable names to make it more clear.

Javascript Solutions


Solution 1 - Javascript

The month is zero-based for JavaScript.

Days and years are one-based.

Go figure.
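For example (a minimal sketch; the question presumably wanted October 31, which is month 9):

// Months are indexed 0-11, so October is 9, not 10.
var oct31 = Date.UTC(2009, 9, 31);    // 2009-10-31T00:00:00Z
var wrong = Date.UTC(2009, 10, 31);   // month 10 is November; "November 31" rolls over to 2009-12-01T00:00:00Z

console.log(new Date(oct31).toUTCString());
console.log(new Date(wrong).toUTCString());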

UPDATE

The reason this is so, from the creator of JavaScript, is

> JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.

http://www.jwz.org/blog/2010/10/every-day-i-learn-something-new-and-stupid/#comment-1021

Solution 2 - Javascript

As Eric said, this is due to months being in the 0-11 range.

This is common behavior - the same is true of Perl's localtime(), and probably many other languages.

It was most likely inherited from Unix's localtime() call (see "man localtime").

The reason is that days and years are their own integers, while a numeric month is an index into an array of month names, and arrays in most languages - especially C, where the underlying Unix call is implemented - start at 0.
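In JavaScript terms, the index-versus-value distinction looks like this (a small illustration; the month-name array is just for demonstration):

var d = new Date(2009, 9, 31);   // local time, October 31, 2009
console.log(d.getFullYear());    // 2009 - the actual year
console.log(d.getDate());        // 31   - the actual day of the month
console.log(d.getMonth());       // 9    - an index, where 0 = January

var monthNames = ['January', 'February', 'March', 'April', 'May', 'June',
                  'July', 'August', 'September', 'October', 'November', 'December'];
console.log(monthNames[d.getMonth()]);   // "October"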

Solution 3 - Javascript

It's an old question, but this is still a problem today (or a feature, as some might say - and they are wrong).

JS months are zero-based. Why? Because.

That means months range from 0 to 11 (only the months; days and years are normal).

How can you fix this? Add one to the month, obviously, BUUUUT:

Don't do this:

let date: Date = new Date();
date.setMonth(date.getMonth() + 1);

Why, you might ask? Because it won't work as expected: setMonth() actually moves the date forward a month (and rolls over at the end of short months) instead of just correcting the displayed month number. Date in JS is terrible.
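For example, taking an arbitrary month-end date as input:

const d = new Date(2021, 0, 31);   // January 31, 2021
d.setMonth(d.getMonth() + 1);      // "February 31" does not exist, so the date rolls over
console.log(d.getMonth());         // 2 - that is March, not February
console.log(d.getDate());          // 3 - March 3, 2021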

You have to write a ... let's call it not-so-beautiful function to translate the JS date into a normal date string:

function formatJsDateToNormalDate(date: Date | null): string | null {
  if (date !== null) {
    const realMonth: number = date.getMonth() + 1;   // shift the 0-11 index back to 1-12
    const month: string = (realMonth < 10) ? '0' + realMonth : String(realMonth);
    const day: string = (date.getDate() < 10) ? '0' + date.getDate() : String(date.getDate());

    return [date.getFullYear(), month, day].join('-');   // e.g. "2009-10-31"
  } else {
    return null;
  }
}
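Usage would look something like this (a quick sketch; the sample date is just an illustration):

const formatted = formatJsDateToNormalDate(new Date(2009, 9, 31));   // local October 31, 2009
console.log(formatted);                        // "2009-10-31"
console.log(formatJsDateToNormalDate(null));   // null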

Again, if you ask me, this is the equivalent of hammering in a screw: it's not the right way, but there is no right way here. It's a bug that has been going on for 27 years, with more to come.

Solution 4 - Javascript

date1 = new Date();
// Date.UTC(year, month, day [, hrs] [, min] [, sec]) returns a millisecond timestamp;
// it is a static method, not a constructor, so it is called without "new".
// getMonth() is already zero-based, which is exactly what Date.UTC() expects,
// so no +1 adjustment is needed here.
date1 = Date.UTC(date1.getFullYear(), date1.getMonth(), date1.getDate(),
                 date1.getHours(), date1.getMinutes(), date1.getSeconds());

date2 = new Date();
date2 = date2.getTime();

// date1 treats the local clock components as UTC, so the two values differ by
// the local timezone offset (and date1 drops the milliseconds).
alert(date1);
alert(date2);

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: picardo
Solution 1 - Javascript: Eric J.
Solution 2 - Javascript: DVK
Solution 3 - Javascript: late1
Solution 4 - Javascript: peter