I have 12 varchar(50) fields in a table of about 90 million rows that I need to increase to varchar(100). Each of these fields has its own single-column index. If I increase the length of the varchar fields, will the indexes need to be rebuilt (or be rebuilt automatically), or will the statistics become out of date?
I don't have a reasonably sized test system (and perhaps don't know how to check whether the indexes were rebuilt or the statistics need to be recomputed).
Thanks
Statistics are based on the data in the columns, not on their declared size, so your statistics won't become out of date.
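If you want to check for yourself, something like the following (using a hypothetical table name dbo.BigTable) lists each statistic on the table, when it was last updated, and how many modifications have accumulated since:

    -- Hypothetical table name; one row per statistic on the table
    SELECT  s.name AS stats_name,
            STATS_DATE(s.object_id, s.stats_id) AS last_updated,
            sp.rows AS rows_in_stats,
            sp.modification_counter
    FROM    sys.stats AS s
    CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
    WHERE   s.object_id = OBJECT_ID('dbo.BigTable');

(sys.dm_db_stats_properties needs a reasonably recent build; STATS_DATE on its own works everywhere.)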
Bear in mind that when you alter a column, SQL Server will sometimes add a new column to the table and drop the existing one behind the scenes, in which case it has to rebuild the index anyway.
Despite searching, I can't say categorically whether you have to rebuild the indexes, but I would rebuild them regardless; this should be occurring as part of your normal maintenance anyway, assuming the table isn't read-only or very low-write.
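For reference, the change itself is just an ALTER COLUMN followed by a rebuild, along these lines (table, column and index names here are made up, and you should keep the column's original NULL/NOT NULL setting, since ALTER COLUMN defaults to nullable if you leave it out):

    -- Hypothetical names: dbo.BigTable, Col1, IX_BigTable_Col1
    -- Widening a varchar is allowed even while the column is indexed
    ALTER TABLE dbo.BigTable
        ALTER COLUMN Col1 varchar(100) NOT NULL;  -- match the existing nullability

    -- Rebuild the single-column index on that column
    ALTER INDEX IX_BigTable_Col1 ON dbo.BigTable REBUILD;

Repeat for each of the 12 columns/indexes; on 90 million rows you'll want a maintenance window (or ONLINE = ON for the rebuild, if your edition supports it).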
BTW, having a separate index on each of those columns might not be optimal. Have you profiled your query workload?
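For example, sys.dm_db_index_usage_stats will show whether each of those indexes has actually been read since the last server restart (hypothetical table name again):

    -- Seeks/scans/lookups vs. updates for every index on the table
    SELECT  i.name AS index_name,
            us.user_seeks, us.user_scans, us.user_lookups, us.user_updates
    FROM    sys.indexes AS i
    LEFT JOIN sys.dm_db_index_usage_stats AS us
           ON  us.object_id   = i.object_id
           AND us.index_id    = i.index_id
           AND us.database_id = DB_ID()
    WHERE   i.object_id = OBJECT_ID('dbo.BigTable')
    ORDER BY i.index_id;

Indexes with lots of user_updates but few or no seeks/scans are costing you write overhead on every insert and update for little benefit.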