The first thing I notice about the Date object is that it keeps track of time in milliseconds since midnight on January 1, 1970 (UTC). That's almost the way the Unix operating system keeps track of time. Almost. Unix counts seconds since 1970 rather than milliseconds.
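Since the two epochs are the same and only the unit differs, converting between a Unix timestamp and a JavaScript Date is just a factor of 1000. A quick sketch:

```javascript
// JavaScript counts milliseconds since the epoch; Unix counts seconds.
var ms = Date.now();                     // milliseconds since January 1, 1970
var unixSeconds = Math.floor(ms / 1000); // the equivalent Unix timestamp

// Going the other way: build a Date from a Unix timestamp.
var sameMoment = new Date(unixSeconds * 1000);
```

The round trip loses only the sub-second part, so `sameMoment` lands within a second of the original.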
The next thing I notice is that the Date object has over 15 get methods that let you pull out individual pieces of the date and time. For example, the getFullYear() method calculates the current year from the Date object. Here's how to print the current year using this method:
var dt = new Date();
alert(dt.getFullYear());
You can run the above 2 lines of code in a JavaScript sandbox. Just search for "JavaScript sandbox" using your favorite search engine.
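The other get methods work the same way, each pulling a different field out of the same underlying millisecond count. A small tour, using a fixed date so the results are predictable:

```javascript
// A handful of Date's get methods applied to one fixed moment.
var dt = new Date(2013, 5, 15, 9, 30); // June 15, 2013, 9:30 AM local time
dt.getFullYear(); // 2013
dt.getMonth();    // 5 -- months are numbered 0-11, so June is 5
dt.getDate();     // 15 -- day of the month, 1-31
dt.getHours();    // 9  -- hour of the day, 0-23
dt.getMinutes();  // 30
```

The zero-based month numbering trips up almost everyone at least once: January is 0, December is 11.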
The getTime() method returns a huge number, the number of milliseconds since January 1, 1970:
var dt = new Date();
alert(dt.getTime());
Currently, in 2013, that's over 1.3 trillion milliseconds.
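That figure is easy to sanity-check with back-of-the-envelope arithmetic: about 43 years from 1970 to 2013, each roughly 365.25 days, at 86,400,000 milliseconds per day.

```javascript
// Rough check that the millisecond count really passes a trillion by 2013.
var msPerDay = 24 * 60 * 60 * 1000;    // 86,400,000 ms in a day
var approxMs = 43 * 365.25 * msPerDay; // roughly 1.36e12 -- over a trillion

// Date itself agrees: January 1, 2013 is already past the 1e12 mark.
var startOf2013 = new Date(2013, 0, 1).getTime();
```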
So basically all the get methods are really conversion functions. They convert time from a raw number, the count of milliseconds since 1970, to another number, such as the current year.
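To make the idea concrete, here is a made-up helper (yearsSince1970 is not part of the Date API) that does the crudest version of such a conversion. The real getFullYear() also accounts for leap-day boundaries and time zones; this is just the rough shape of the calculation:

```javascript
// Hypothetical conversion function: raw milliseconds -> whole years elapsed.
function yearsSince1970(dt) {
  var msPerYear = 365.25 * 24 * 60 * 60 * 1000; // average year in milliseconds
  return Math.floor(dt.getTime() / msPerYear);
}
```

Add the result to 1970 and, away from the year boundaries, you get the same answer getFullYear() does.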
I'm curious as to why JavaScript stores time in milliseconds rather than seconds, the way Unix does. I don't have an answer to that question.
Ed Abbott