What I am doing is getting the pixel size of a string and converting it to hundredths of an inch (i.e., pixels / DPI = inches, inches * 100 = hundredths of an inch). Here is my code:
private static SizeF TextSize(string text, Font txtFnt)
{
    SizeF txtSize = new SizeF();

    // The size returned is 'Size(int width, int height)' where width and height
    // are the dimensions of the string in pixels
    Size s = System.Windows.Forms.TextRenderer.MeasureText(text, txtFnt);

    // Value based on normal DPI settings of 96
    txtSize.Width = (float)Math.Ceiling((float)s.Width / 96f * 100f);
    txtSize.Height = (float)Math.Ceiling((float)s.Height / 96f * 100f);

    return txtSize;
}
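
For reference, a variant of the same conversion that reads the DPI from a Graphics object instead of hardcoding 96 would look something like this (the CreateGraphics call is just one way of getting hold of a Graphics; in my case it would come from the control I draw to):

private static SizeF TextSizeDpiAware(string text, Font txtFnt, Control ctrl)
{
    // Measure in pixels, exactly as before.
    Size s = System.Windows.Forms.TextRenderer.MeasureText(text, txtFnt);

    using (Graphics g = ctrl.CreateGraphics())
    {
        // DpiX/DpiY report the resolution of the Graphics in dots per inch,
        // so this avoids assuming a fixed 96 DPI.
        float w = (float)Math.Ceiling(s.Width / g.DpiX * 100f);
        float h = (float)Math.Ceiling(s.Height / g.DpiY * 100f);
        return new SizeF(w, h);
    }
}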
Now, using the Arial font, this all works fine for font sizes smaller than 12, but after that characters start getting cut off because the calculated size is smaller than the actual size. I know that my DPI setting is 96. My fonts are all defined the same way, with only the font size varying:
Font myFont = new Font("Arial", <font size>, FontStyle.Regular, GraphicsUnit.Point);
I believe that I have to use GraphicsUnit.Point because of the custom control I am drawing the strings to, but does the GraphicsUnit matter? Is the MeasureText function even working correctly, or is there something else going on?
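
For what it's worth, one comparison I can run is measuring the same string with both TextRenderer.MeasureText (GDI) and Graphics.MeasureString (GDI+) at several point sizes, to see where the two start to disagree. This is only a diagnostic sketch; I have not confirmed it explains the cut-off:

private static void CompareMeasurements(Control ctrl, string text)
{
    using (Graphics g = ctrl.CreateGraphics())
    {
        foreach (int pts in new[] { 8, 10, 12, 14, 18, 24 })
        {
            using (Font f = new Font("Arial", pts, FontStyle.Regular, GraphicsUnit.Point))
            {
                // GDI measurement (what my TextSize method uses) vs. GDI+ measurement.
                Size gdi = TextRenderer.MeasureText(g, text, f);
                SizeF gdiPlus = g.MeasureString(text, f);
                Console.WriteLine("{0}pt  GDI: {1}  GDI+: {2}", pts, gdi, gdiPlus);
            }
        }
    }
}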
EDIT
I am drawing to a custom print preview control. The units in the print preview control are 'Inches/100' (hence the conversion above). I believe the text, images, etc. are drawn with the printer Graphics object, but I'm not entirely sure.
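
If the control really does hand me the printer Graphics object, one thing I may try is skipping the manual pixel conversion and measuring in that Graphics' own page units, since as far as I know Graphics.MeasureString returns its result in whatever PageUnit is set. The PrintPage handler below is just an illustration of the idea, not my actual drawing code:

private void OnPrintPage(object sender, System.Drawing.Printing.PrintPageEventArgs e)
{
    // GraphicsUnit.Display is 1/100 inch on printer devices, which matches the
    // units the print preview control expects.
    e.Graphics.PageUnit = GraphicsUnit.Display;

    using (Font f = new Font("Arial", 12, FontStyle.Regular, GraphicsUnit.Point))
    {
        // MeasureString reports the size in the Graphics' page units
        // (hundredths of an inch here), so no separate DPI math is needed.
        SizeF size = e.Graphics.MeasureString("Sample text", f);
        e.Graphics.DrawRectangle(Pens.Gray, 100f, 100f, size.Width, size.Height);
        e.Graphics.DrawString("Sample text", f, Brushes.Black, 100f, 100f);
    }
}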