I added a datepicker with the jQuery UI Datepicker widget and use altFormat: '@' (see http://docs.jquery.com/UI/Datepicker/formatDate):
// Datepicker setup
$("#obsDate").datepicker({
    altField: '#actualDate',
    altFormat: '@', // writes a Unix timestamp (in milliseconds) to the alt field
    dateFormat: "dd-mm-yy",
    showOn: "button",
    buttonImage: $("#datePickerImg").val(),
    buttonImageOnly: true
});
When the user picks a date, a Unix timestamp such as 1312840800000 is written to the alt field. This is in milliseconds, so I divide by 1000. But when I convert the timestamp with this C# function:
private static DateTime ConvertFromUnixTimestamp(double timestamp)
{
    var origin = new DateTime(1970, 1, 1, 0, 0, 0, 0);
    return origin.AddSeconds(timestamp);
}
I always get a date one day earlier. What am I doing wrong?
UPDATE: When I use JavaScript's built-in getTime():
var ts = Math.round(new Date().getTime() / 1000);
I get the correct timestamp.
Example with getTime(): 30-08-2011 --> 1314628036
Example with the datepicker: 29-08-2011 --> 1314568800
The same thing happens when the datepicker emits ticks (altFormat '!') instead.
asked Aug 29, 2011 at 13:42 by amernov (edited Sep 25, 2012 by Eli)

4 Answers
This obviously IS a timezone problem.
getTime()
This function returns milliseconds since the epoch, i.e. the Unix timestamp × 1000, as seen from the local computer.
See If JavaScript "(new Date()).getTime()" is run from 2 different timezones.
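To illustrate (a standalone snippet, not from the original question): getTime() counts milliseconds from the UTC epoch, so a date built with Date.UTC produces the same number on every machine, while the asker's datepicker value was based on local midnight.

```javascript
// getTime() measures milliseconds since 1970-01-01T00:00:00 UTC,
// independent of the machine's timezone.
var utcMidnight = new Date(Date.UTC(2011, 7, 29)); // 29 Aug 2011, 00:00 UTC
console.log(utcMidnight.getTime()); // 1314576000000

// The question's datepicker produced 1314568800 (seconds) for 29-08-2011,
// i.e. 1314576000 - 7200: local midnight in a UTC+2 timezone.
console.log((1314576000000 / 1000) - 1314568800); // 7200
```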
datepicker({altFormat: '@'})
From what I see in the jQuery UI library, the datepicker internally uses the formatDate function, which takes the timezone into account (I started here: jQuery.datepicker.formatDate and timezone offset...).
So on my PC I get a 2-hour difference. I can't think of an easy way to resolve this, but you could try the following: datetimepicker getDate to return Date/Time in UTC format.
It's likely that .NET doesn't know which timezone the value is in; you will have to define this. In relation to your code, you could do the following:
private static DateTime ConvertFromUnixTimestamp(double timestamp)
{
    // Mark the epoch as UTC, then convert the result to local time.
    var origin = new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc);
    return origin.AddSeconds(timestamp).ToLocalTime();
}
This is because local midnight on 1 January 1970 corresponds, in most timezones, to a Unix time different from zero. So, more correctly, you should set (not add) the timestamp on the DateTime object, if you have a setter.
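The offset this answer describes can be seen directly in JavaScript (an illustrative snippet, not part of the original answer): local midnight on 1 Jan 1970 only maps to Unix time 0 in UTC itself.

```javascript
// new Date(1970, 0, 1) is *local* midnight, 1 Jan 1970.
// Its epoch value is minus the local UTC offset at that instant:
// 0 in UTC, -3600000 in UTC+1, +18000000 in UTC-5, and so on.
var localEpoch = new Date(1970, 0, 1);
console.log(localEpoch.getTime());
```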
Round Trip Precision
Unix times (milliseconds since 1 Jan 1970) are less precise than .NET DateTimes, so for a round trip through .NET code, or when comparing to a .NET DateTime, this is the precision to expect; e.g. .NET -> Unix -> .NET can be off by this amount.
var unixEpochPrecision = new TimeSpan(TimeSpan.TicksPerMillisecond);
It might not really matter, but you should at least be aware of it.
Convert to Unix Epoch milliseconds
UTC dates are required from the start, or you will be off by a few hours. This makes sense because Unix times, irrespective of timezone, are always expressed in UTC. If only .NET worked that way.
DateTime UnixEpochBaseDate = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

Int64 ToUnixTime(DateTime utcValue)
{
    if (utcValue.Kind != DateTimeKind.Utc)
        throw new ArgumentException("Value must be in Utc.");
    var result = (utcValue - UnixEpochBaseDate).Ticks / TimeSpan.TicksPerMillisecond;
    return result;
}
Convert from Unix Epoch to CLR DateTime
Just use the milliseconds as-is; this also makes the JavaScript easier to write.
DateTime ToClrDateTime(Int64 unixEpochMilliseconds)
{
    var clrDateTime = UnixEpochBaseDate +
        new TimeSpan(unixEpochMilliseconds * TimeSpan.TicksPerMillisecond);
    return clrDateTime;
}
In JavaScript
var date = new Date(1381367291665);
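A quick round-trip check of the scheme above (illustrative, runnable in Node rather than the browser): because a JavaScript Date is itself stored as epoch milliseconds, Date -> milliseconds -> Date loses nothing on the JavaScript side.

```javascript
// Round trip: epoch ms -> Date -> epoch ms is lossless in JavaScript.
var ms = 1381367291665;
var date = new Date(ms);
console.log(date.getTime() === ms); // true
```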
Comments:
– Daniel A. White (Aug 29, 2011 at 13:48), regarding new DateTime(1970, 1, 1)
– Does new DateTime(1, 1, 1, 0, 0, timestamp) give the right date? – Blazemonger (Aug 29, 2011 at 13:58)