I need to optimize a query that fetches all the deals in a given country that were last accessed by users before a certain datetime.
My plan is to add the following index:
add_index(:deals, [:country_id, :last_user_access_datetime])
I am doubting the relevance and efficiency of this index, because the column last_user_access_datetime can hold ANY datetime value (e.g. 13/09/2015 3:06pm) and it will change very often (it is updated each time a user accesses the deal). Does that mean an effectively unlimited number of distinct values would have to be indexed?
Should I avoid putting a column with so many possible values, such as a completely unconstrained datetime column, inside an index?
If you have a query like this:
SELECT *
FROM deals
WHERE country_id = ? AND last_user_access_datetime < ?;
then the best index is the one you propose.
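As a quick sanity check — a minimal sketch, assuming a database with an EXPLAIN command (e.g. MySQL or PostgreSQL) and placeholder values — you can ask the planner whether it actually picks that index:
-- Placeholder country id and cutoff datetime, purely for illustration.
EXPLAIN
SELECT *
FROM deals
WHERE country_id = 42
  AND last_user_access_datetime < '2015-09-13 15:06:00';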
If the machine is under a heavy access load (think many accesses per second), then maintaining the index can become a burden. Of course, you are updating the last access datetime anyway, so you are already incurring overhead on every access.
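To make that overhead concrete — a minimal sketch, assuming each user access runs an UPDATE roughly like the one below (the id and the NOW() call are placeholders) — every such statement has to do extra work on the index, not just on the row:
UPDATE deals
SET last_user_access_datetime = NOW()  -- the indexed column changes...
WHERE id = 123;                        -- ...so the row's index entry must be moved as well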
The number of possible values does not have an effect on the value of the index. An index stores one entry per row, not one entry per possible value, and a database cannot store an "infinite" number of values anyway (at least not on any hardware currently available), so I'm not sure what your concern is.
The index will be used, but UPDATE and INSERT statements will take that much longer, because the index must be updated each time as well. For tables with many more UPDATEs/INSERTs than SELECTs, indexing the column may not be fruitful. Alternatively, you may want to build an index that matches the shape of the queries hitting the table: include the IDs and timestamps that appear in the SELECT clause, include the IDs and timestamps that appear in the WHERE clause, and so on, as sketched below.
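For example — a minimal sketch with a hypothetical query that reads only the deal id and the access timestamp — an index that lists the WHERE columns first and then the SELECTed column can "cover" the query, letting the database answer it from the index alone, without touching the table rows:
-- Hypothetical covering index: WHERE columns first, then the SELECTed column.
CREATE INDEX index_deals_on_country_access_id
    ON deals (country_id, last_user_access_datetime, id);

-- Every column this query needs is in the index, so the table itself is never read.
SELECT id, last_user_access_datetime
FROM deals
WHERE country_id = 42
  AND last_user_access_datetime < '2015-09-13 15:06:00';
Whether this pays off depends on the read/write mix: the wider index makes every UPDATE of last_user_access_datetime slightly more expensive, so it is a trade-off, not a free win.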
Also, if a table sees many DELETEs, having many indices can slow those operations down considerably, since every index must be updated for each deleted row.