SQL Server's datetime type is stored as 8 bytes: the first four bytes are the number of days since Jan 1, 1900, and the other four bytes are the number of ticks since midnight, where a tick is 1/300 of a second.
I'm wondering why that is. Where did that 1/300 come from? There must be some historical reason for it.
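To make the layout concrete, here is a minimal sketch of decoding those 8 bytes in Python. The big-endian byte order matches what `CAST(... AS binary(8))` displays, but treat that ordering, and the function name `decode_sql_datetime`, as assumptions for illustration:

```python
import struct
from datetime import datetime, timedelta

def decode_sql_datetime(raw: bytes) -> datetime:
    # First 4 bytes: signed day count since 1900-01-01;
    # next 4 bytes: number of 1/300-second ticks since midnight.
    # Assumes big-endian layout, as shown by CAST(... AS binary(8)).
    days, ticks = struct.unpack(">ii", raw)
    return datetime(1900, 1, 1) + timedelta(days=days, seconds=ticks / 300)

# 2000-01-01 is 36524 days after 1900-01-01; midnight = 0 ticks.
print(decode_sql_datetime(struct.pack(">ii", 36524, 0)))  # 2000-01-01 00:00:00
```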
Yes, there is a historical reason: UNIX !
For details, read this excellent article by Joe Celko.
Here is the detail you're looking for:
Temporal data in T-SQL used to be a prisoner of UNIX system clock ticks and could only go to three decimal seconds with rounding errors. The new ANSI/ISO data types can go to seven decimal seconds, and have true DATE and TIME data types. Since they are new, most programmers are not using them yet.
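The "three decimal seconds with rounding errors" follows directly from the 1/300 tick: one tick is about 3.33 ms, so millisecond values snap to a grid ending in .000, .003, or .007. A minimal sketch of that snapping (`snap_ms` is a hypothetical name, and the rounding at tick boundaries here may not match SQL Server exactly):

```python
def snap_ms(ms: int) -> int:
    # Convert milliseconds to the nearest 1/300-second tick,
    # then back to the milliseconds that tick represents.
    ticks = round(ms * 300 / 1000)
    return round(ticks * 1000 / 300)

print([snap_ms(ms) for ms in range(10)])  # [0, 0, 3, 3, 3, 7, 7, 7, 7, 10]
```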