When I go to my browser's developer tools and type new Date() in the console, it gives me the following:
Mon Dec 18 2017 17:11:29 GMT+0200
I thought it was supposed to return the UTC time.
The issue is that I have a server on AWS which writes UTC time to a DB. It writes it as a string, and on the client side I do the following:
const updatedMilliAgo = new Date() - new Date(timeStrFromDb);
For some reason the diff is two hours, even though I check it right after the write on the server.
What am I doing wrong here?
asked Dec 18, 2017 at 15:18 by Edgar Barber
- Did you look at the documentation for Date and see what the UTC methods are? – epascarello, Dec 18, 2017 at 15:22
- Yes, I couldn't find the answer there... – Edgar Barber, Dec 18, 2017 at 15:27
- Does this answer your question? How do you convert a JavaScript date to UTC? – Michael Freidgeim, Jan 9, 2022 at 4:09
3 Answers
When you use new Date(), you are constructing a Date object using the current value of the system clock where the code is executing. The Date object internally stores only a number of milliseconds since 1970-01-01T00:00:00Z (without consideration of leap seconds). In other words, Date objects always represent the UTC time.
However, there are many functions and parameters on the Date object that work in local time. For example, when you call .toString() on a Date object, the computer's local time zone is applied to the internal UTC-based value in order to generate a string that reflects local time.
In the case of console.log, a standard object like Date cannot be directly logged. Instead, most implementations will log a string value. How that value is created is entirely implementation-specific, and not defined by the ECMAScript specification. Many implementations will return the same local-time-based value that .toString() returns. Some (Firefox, for example) will return the same UTC-based value that .toISOString() returns. It would be reasonable for an implementation to return the actual number of milliseconds stored (.valueOf()), or some other representation. If you need consistency, don't just log the Date object. Instead, log the output of one of its functions that returns a string or a number.
You also asked about subtracting two Date objects. That will implicitly call .valueOf() on each object, subtracting their UTC-based internal values and giving you the number of milliseconds between them. The most likely problem you are encountering is with how you construct the second Date object. You didn't give an example of what timeStrFromDb consists of, but understand that how that string is formatted directly relates to how the Date object is constructed. If you aren't using a standardized format, or you aren't clear on whether the value is based on UTC or a specific offset from UTC, your string may be parsed differently than you expect.
Try using:
var d = new Date();
var n = d.toUTCString();
console.log(n);
I am late, but I couldn't find the answer and I had to fight with the code for hours... finally my solution was as simple as this (Java; setDefault is a static method on TimeZone):
TimeZone.setDefault(TimeZone.getTimeZone(ZoneId.of("UTC")));
When you set the default time zone, all dates you create will use the ZoneId that you specify.
I hope this helps someone.