Why does Javascript getYear() return 108?

Tags: JavaScript, Date

Javascript Problem Overview


Why does this JavaScript return 108 instead of 2008? It gets the day and month correct, but not the year.

myDate = new Date();
year = myDate.getYear();

year = 108?

Javascript Solutions


Solution 1 - Javascript

It's a Y2K thing: only the years since 1900 are counted.

There are potential compatibility issues now that getYear() has been deprecated in favour of getFullYear(). From QuirksMode:

>To make the matter even more complex, date.getYear() is deprecated nowadays and you should use date.getFullYear(), which, in turn, is not supported by the older browsers. If it works, however, it should always give the full year, i.e. 2000 instead of 100.
>
>Your browser gives the following years with these two methods:
>
>* The year according to getYear(): 108
>* The year according to getFullYear(): 2008

There are also implementation differences between Internet Explorer and Firefox, as IE's implementation of getYear() was changed to behave like getFullYear(). From IBM:

>Per the ECMAScript specification, getYear returns the year minus 1900, originally meant to return "98" for 1998. getYear was deprecated in ECMAScript Version 3 and replaced with getFullYear().
>
>Internet Explorer changed getYear() to work like getFullYear() and make it Y2K-compliant, while Mozilla kept the standard behavior.
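
A minimal sketch of the difference between the two methods, assuming a date in 2008 and an engine that follows the standard getYear() behaviour:

var d = new Date(2008, 0, 15);  // 15 January 2008
d.getYear();                    // 108  (years since 1900) in Firefox/WebKit
d.getFullYear();                // 2008 in every browser that supports it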

Solution 2 - Javascript

Since getFullYear doesn't work in older browsers, you can use something like this:

Date.prototype.getRealYear = function () {
    // Prefer the standard getFullYear() where it exists;
    // otherwise fall back to getYear(), which counts years since 1900
    if (this.getFullYear)
        return this.getFullYear();
    else
        return this.getYear() + 1900;
};

JavaScript's prototype mechanism can be used to extend existing objects, much like C# extension methods. Now we can just do this:

var myDate = new Date();
myDate.getRealYear();
// Outputs 2008

Solution 3 - Javascript

Check the docs. It's not a Y2K issue -- it's a lack of a Y2K issue! This decision was made originally in C and was copied into Perl, apparently JavaScript, and probably several other languages. That long ago it was apparently still felt desirable to use two-digit years, but remarkably whoever designed that interface had enough forethought to realize they needed to think about what would happen in the year 2000 and beyond, so instead of just providing the last two digits, they provided the number of years since 1900. You could use the two digits, if you were in a hurry or wanted to be risky. Or if you wanted your program to continue to work, you could add 100 to the result and use full-fledged four-digit years.

I remember the first time I did date manipulation in Perl. Strangely enough I read the docs. Apparently this is not a common thing. A year or two later I got called into the office on December 31, 1999 to fix a bug that had been discovered at the last possible minute in some contract Perl code, stuff I'd never had anything to do with. It was this exact issue: the standard date call returned years since 1900, and the programmers treated it as a two-digit year. (They assumed they'd get "00" in 2000.) As a young inexperienced programmer, it blew my mind that we'd paid so much extra for a "professional" job, and those people hadn't even bothered to read the documentation. It was the beginning of many years of disillusionment; now I'm old and cynical. :)

In the year 2000, the annual YAPC Perl conference was referred to as "YAPC 19100" in honor of this oft-reported non-bug.

Nowadays, in the Perl world at least, it makes more sense to use a standard module for date-handling, one which uses real four-digit years. Not sure what might be available for JavaScript.

Solution 4 - Javascript

It returns the number of years since 1900.

Solution 5 - Javascript

Use date.getFullYear().

This is (as correctly pointed out elsewhere) a Y2K thing. Netscape's implementation (written before 2000) originally returned, for example, 98 from getYear(). Rather than roll over to 00, it instead returned 100 for the year 2000. Then other browsers came along and did it differently, and everyone was unhappy as incompatibility reigned.

Later browsers supported getFullYear as a standard method to return the complete year.

Solution 6 - Javascript

This question is so old that it makes me weep with nostalgia for the dotcom days!

That's right, Date.getYear() returns the number of years since 1900, just like Perl's localtime(). One wonders why a language designed in the 1990s wouldn't account for the century turnover, but what can I say? You had to be there. It sort of made a kind of sense at the time (like pets.com did).

Before 2000, one might have been tempted to fix this bug by prepending "19" to the result of getYear(), resulting in the "year 19100 bug" (http://www.theregister.co.uk/2000/01/04/transmeta_screws_up_on_y2k/). Others have already answered this question sufficiently (add 1900 to the result of getYear()).
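
As a quick, hypothetical illustration (not from the original answer) of how that "19100" bug arises on engines with the standard getYear() behaviour:

var d = new Date(2000, 0, 1);   // 1 January 2000
"19" + d.getYear();             // "19100" - string concatenation, the bug
1900 + d.getYear();             // 2000    - numeric addition, the fix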

Maybe the book you're reading about JavaScript is a little old?

Thanks for the blast from the past!

Solution 7 - Javascript

You should, as pointed out, never use getYear(), but instead use getFullYear().

The story is, however, not as simple as "IE implements getYear() as getFullYear()". These days Opera and IE treat getYear() as it was originally specified for dates before 2000, but treat it like getFullYear() for dates in 2000 and later, while WebKit and Firefox stick with the old behaviour.

This outputs 99 in all browsers:

javascript:alert(new Date(917823600000).getYear());

This outputs 108 in FF/WebKit, and 2008 in Opera/IE:

javascript:alert(new Date().getYear());

Solution 8 - Javascript

It's dumb. It dates to pre-Y2K days, and now just returns the number of years since 1900 for legacy reasons. Use getFullYear() to get the actual year.

Solution 9 - Javascript

I am using date.getUTCFullYear(); it works without problems.
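
For completeness, a small sketch of how getUTCFullYear() relates to getFullYear(): both return the full four-digit year, but the UTC variant ignores the local time zone, so the two can disagree around midnight on New Year's Eve.

var d = new Date();
d.getFullYear();      // four-digit year in the local time zone
d.getUTCFullYear();   // four-digit year in UTC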

Solution 10 - Javascript

The number you get is the number of years since 1900. Don't ask me why...

Solution 11 - Javascript

As others have said, it returns the number of years since 1900. The reason it does that is that when JavaScript was invented in the mid-90s, that behaviour was both convenient and consistent with date-time APIs in other languages, particularly C. And, of course, once the API was established, it couldn't be changed for backwards-compatibility reasons.

Solution 12 - Javascript

BTW, different browsers might return different results, so it's better to skip this function altogether and always use getFullYear().

Solution 13 - Javascript

var date_object = new Date();
var year = date_object.getYear();
if (year < 2000) {
    year = year + 1900;
}
// you will get the full year

Solution 14 - Javascript

It returns the four-digit year minus 1900, which may have made sense 9+ years ago but is pretty baffling now. Java's java.util.Date does the same thing.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

* Question: ctrlShiftBryan (View Question on Stackoverflow)
* Solution 1 - Javascript: ConroyP (View Answer on Stackoverflow)
* Solution 2 - Javascript: FlySwat (View Answer on Stackoverflow)
* Solution 3 - Javascript: skiphoppy (View Answer on Stackoverflow)
* Solution 4 - Javascript: Paige Ruten (View Answer on Stackoverflow)
* Solution 5 - Javascript: Dan (View Answer on Stackoverflow)
* Solution 6 - Javascript: jjohn (View Answer on Stackoverflow)
* Solution 7 - Javascript: Arve (View Answer on Stackoverflow)
* Solution 8 - Javascript: joelhardi (View Answer on Stackoverflow)
* Solution 9 - Javascript: D3vito (View Answer on Stackoverflow)
* Solution 10 - Javascript: Nils Pipenbrinck (View Answer on Stackoverflow)
* Solution 11 - Javascript: user11318 (View Answer on Stackoverflow)
* Solution 12 - Javascript: Milan Babuškov (View Answer on Stackoverflow)
* Solution 13 - Javascript: vinoth (View Answer on Stackoverflow)
* Solution 14 - Javascript: user17163 (View Answer on Stackoverflow)