Someone recently asked me this question and I thought I'd post it on Stack Overflow to get some input.
Now obviously both of the following scenarios are supposed to fail.
#1:
DECLARE @x BIGINT
SET @x = 100
SELECT CAST(@x AS VARCHAR(2))
Obvious error:
Msg 8115, Level 16, State 2, Line 3
Arithmetic overflow error converting expression to data type varchar.
#2:
DECLARE @x INT
SET @x = 100
SELECT CAST(@x AS VARCHAR(2))
Not obvious: it returns a * (one would expect this to be an arithmetic overflow as well?)
Now my real question is: why? Is this merely by design, or is there some history or something sinister behind it?
I looked at a few sites and couldn't find a satisfactory answer, e.g.:
e.g. http://beyondrelational.com/quiz/sqlserver/tsql/2011/questions/Why-does-CAST-function-return-an-asterik--star.aspx
http://msdn.microsoft.com/en-us/library/aa226054(v=sql.80).aspx
Please note that I know/understand that when an integer is too large to be converted to a string of a given size, it is "converted" to an asterisk. This is the obvious answer, and I wish I could downvote everyone who keeps giving it. What I want to know is why an asterisk is used rather than an exception being thrown, e.g. historical reasons etc.?
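For what it's worth, the legacy STR() function documents the same convention, filling the field with asterisks when the result doesn't fit rather than raising an error, so my guess (and it is only a guess) is that the INT-to-character behaviour is inherited from that older fixed-width formatting style:

```sql
-- STR() documents asterisk fill when the value doesn't fit the requested length
SELECT STR(123456, 4)            -- returns '****'

-- INT to a too-small character type follows the same convention
SELECT CAST(100 AS VARCHAR(2))   -- returns '*'

-- whereas BIGINT (as in scenario #1) raises error 8115 instead
```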