How would I work out the difference between two Date() objects in JavaScript, while only returning the number of months in the difference?
Any help would be great :)
Calculate the difference between two dates, including the fraction of a month (days).
There are two approaches: mathematical and quick, but subject to vagaries of the calendar; or iterative and slow, but able to handle all the oddities (or at least delegate handling them to a well-tested library).
If you iterate through the calendar, you increment the start date by one month and check whether you've passed the end date. This delegates anomaly-handling to the built-in Date() class, but could be slow if you're doing this for a large number of dates. James' answer takes this approach. As much as I dislike the idea, I think this is the "safest" approach, and if you're only doing one calculation, the performance difference really is negligible. We tend to over-optimize tasks which will only be performed once. A sketch of the idea follows below.
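As a minimal sketch of that iterative approach (the function and parameter names here are illustrative, not James' original code):

```javascript
// Step a probe date forward one month at a time until it passes dateTo,
// counting the steps. Date handles month lengths and leap years for us,
// though setMonth() does roll over short months (e.g. Jan 31 -> Mar 2/3).
function monthDiffIterative(dateFrom, dateTo) {
  let months = 0;
  const probe = new Date(dateFrom.getTime());
  probe.setMonth(probe.getMonth() + 1);
  while (probe <= dateTo) {
    months++;
    probe.setMonth(probe.getMonth() + 1);
  }
  return months;
}

// Example: just over twelve months apart.
console.log(monthDiffIterative(new Date(2023, 0, 15), new Date(2024, 0, 20))); // 12
```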
Now, if you're running this calculation over a dataset, you probably don't want to run that function on each row (or, god forbid, multiple times per record). In that case, you can use almost any of the other answers here except the accepted answer, which is just wrong (the difference between new Date() and new Date() is -1).

Here's my stab at a mathematical-and-quick approach, which accounts for differing month lengths and leap years. You really should only use a function like this if you'll be applying it to a dataset (doing this calculation over and over). If you just need to do it once, use James' iterative approach above, as you're delegating the handling of all the (many) exceptions to the Date() object.
```javascript
anyVar = (((DisplayTo.getFullYear() * 12) + DisplayTo.getMonth()) - ((DisplayFrom.getFullYear() * 12) + DisplayFrom.getMonth()));
```
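That one-liner counts only whole calendar months. As a sketch of how the day-based fraction mentioned above could be folded in (monthDiff and daysInMonth are illustrative names, not from the original answer):

```javascript
// Day 0 of the next month is the last day of the current month,
// which accounts for differing month lengths and leap years.
function daysInMonth(year, month) {
  return new Date(year, month + 1, 0).getDate();
}

// Whole-month difference plus an approximate day-based fraction.
function monthDiff(dateFrom, dateTo) {
  const wholeMonths = ((dateTo.getFullYear() * 12) + dateTo.getMonth())
                    - ((dateFrom.getFullYear() * 12) + dateFrom.getMonth());
  const fraction = (dateTo.getDate() - dateFrom.getDate())
                 / daysInMonth(dateTo.getFullYear(), dateTo.getMonth());
  return wholeMonths + fraction;
}

console.log(monthDiff(new Date(2024, 0, 15), new Date(2024, 2, 15))); // 2
console.log(monthDiff(new Date(), new Date()));                       // 0, not -1
```

Note that the last line demonstrates the point above: two identical dates yield 0, where the accepted answer yields -1.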