I am currently using NodaTime because of my frustrations dealing with time zones in C#'s DateTime
class. So far, I'm really pleased.
public static string nodaTimeTest(string input)
{
    var defaultValue = new OffsetDateTime(new LocalDateTime(2000, 1, 1, 0, 0), Offset.Zero);
    var pattern = OffsetDateTimePattern.Create("yyyy-MM-dd'T'HH:mm:sso<m>", CultureInfo.InvariantCulture, defaultValue);
    var result = pattern.Parse(input).Value;
    return result.ToString();
}
I have three specific questions. Above is the method I use to parse date/time strings; the format string tells NodaTime how to parse the input. My questions are:
Does it matter what my LocalDateTime(..) is? The method I used is from Matt Johnson's Stack Overflow example, and his came with the date 2000, 1, 1, 0, 0. I thought that was odd, since most date classes I know use the Unix epoch 1970, 1, 1, 0, 0, so I changed my method to use the epoch date, but the outputs were the same.
How do I convert the time to a Unix timestamp? It does not appear there's a built-in method to do so.
Using this method:
public static string nodaTimeTest6(string input, int timeZone)
{
    // var defaultValue = new OffsetDateTime(new LocalDateTime(2000, 1, 1, 0, 0), Offset.Zero);
    var defaultValue = new OffsetDateTime(new LocalDateTime(2000, 1, 1, 0, 0), Offset.FromHours(timeZone));
    var pattern = OffsetDateTimePattern.Create("yyyy-MM-dd'T'HH:mm:sso<m>", CultureInfo.InvariantCulture, defaultValue);
    var result = pattern.Parse(input);
    return result.Value.ToString();
}
I'm testing out the abilities of NodaTime with this method. Specifically, I was wondering whether I can parse a date/time string that already has an offset defined inside it while my timeZone parameter also supplies an offset. Interestingly enough, my timeZone input gets ignored, and the offset in the output of nodaTimeTest6 is the one from the input date string. Is this the desired behavior?
The OffsetDateTimePattern.Create method requires a default value. It's only used if the parsing fails and you didn't check result.Success before using result.Value. The other patterns have an overload that doesn't require a default value (see issue #267). I chose the particular default value of 2000-01-01T00:00:00.0000000+00:00 because it's similar to what the other patterns use when you don't specify a default explicitly. There really isn't any significance, though; you can use any default you wish.
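To illustrate the point about result.Success, here's a sketch (reusing the question's pattern; the method and variable names are mine) showing that if you check Success before touching Value, the default never surfaces:

```csharp
using System.Globalization;
using NodaTime;
using NodaTime.Text;

public static class ParseCheckExample
{
    public static string ParseOrReport(string input)
    {
        // The default value is only ever returned by result.Value when
        // parsing fails, so guarding with result.Success means it is unused.
        var defaultValue = new OffsetDateTime(new LocalDateTime(2000, 1, 1, 0, 0), Offset.Zero);
        var pattern = OffsetDateTimePattern.Create(
            "yyyy-MM-dd'T'HH:mm:sso<m>", CultureInfo.InvariantCulture, defaultValue);

        var result = pattern.Parse(input);
        return result.Success
            ? result.Value.ToString()
            : "unparseable: " + input;
    }
}
```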
The result.Value is an OffsetDateTime. The Instant type uses the Unix epoch, so you can convert the value to an Instant and derive the timestamp from its ticks. Note that Unix timestamps are precise to the nearest second. If you're passing the value to JavaScript, you'd want to use TicksPerMillisecond and return it in a long.
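That Instant-based conversion can be sketched like this (assuming the NodaTime 1.x API, where Instant exposes Ticks; the helper names are mine):

```csharp
using NodaTime;

public static class UnixTimeExample
{
    // Instant counts ticks from the Unix epoch, so dividing by the
    // ticks-per-second constant yields a classic Unix timestamp.
    public static long ToUnixSeconds(OffsetDateTime value)
    {
        Instant instant = value.ToInstant();
        return instant.Ticks / NodaConstants.TicksPerSecond;
    }

    // For JavaScript (milliseconds since the epoch), divide by
    // TicksPerMillisecond instead and keep the result in a long.
    public static long ToUnixMilliseconds(OffsetDateTime value)
    {
        return value.ToInstant().Ticks / NodaConstants.TicksPerMillisecond;
    }
}
```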
Sorry, but I don't fully understand what you're asking here. Can you please clarify?
From the code you provided, it looks like you are confusing the offset for the default value with the offset for the input string. The default value is only used if parsing fails.
If you want to control the offset instead of including it in the input, then use a LocalDateTimePattern instead of an OffsetDateTimePattern to do the parsing. After it's parsed, you can associate it with a particular zone or offset.
Also, watch your naming conventions: int timeZone doesn't make sense (that's an offset, not a time zone). Perhaps int offsetHours, or better yet, Offset timeZoneOffset.
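A minimal sketch of that approach (the format string and method name are mine; it assumes the input carries no offset of its own):

```csharp
using System.Globalization;
using NodaTime;
using NodaTime.Text;

public static class LocalParseExample
{
    // Parse a string that contains no offset, then attach the offset
    // the caller controls, producing an OffsetDateTime.
    public static OffsetDateTime ParseWithOffset(string input, Offset timeZoneOffset)
    {
        var pattern = LocalDateTimePattern.Create(
            "yyyy-MM-dd'T'HH:mm:ss", CultureInfo.InvariantCulture);
        LocalDateTime local = pattern.Parse(input).Value;
        return new OffsetDateTime(local, timeZoneOffset);
    }
}
```

Here the offset genuinely comes from the caller (e.g. Offset.FromHours(-5)) rather than from the text being parsed, which is the behavior the question was after.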