I tried this approach without any success.
The code I'm using:
// File name
String filename = String.Format("{0:ddMMyyHHmm}", dtFileCreated);
String filePath = Path.Combine(Server.MapPath("App_Data"), filename + ".txt");
// Process
myObject pbs = new myObject();
pbs.GenerateFile();
// pbs.GeneratedFile is a StringBuilder object
// Save file
Encoding utf8WithoutBom = new UTF8Encoding(true);
TextWriter tw = new StreamWriter(filePath, false, utf8WithoutBom);
foreach (string s in pbs.GeneratedFile.ToArray())
tw.WriteLine(s);
tw.Close();
// Push Generated File into Client
Response.Clear();
Response.ContentType = "application/vnd.text";
Response.AppendHeader("Content-Disposition", "attachment; filename=" + filename + ".txt");
Response.TransmitFile(filePath);
Response.End();
The result:
It writes the BOM no matter what, and special characters (like Æ Ø Å) are not correct :-/
I'm stuck!
My objective is to create a file using UTF-8 as the encoding and ISO 8859-1 as the charset.
Is this so hard to accomplish, or am I just having a bad day?
All help is greatly appreciated, thank you!
Well, it writes the BOM because you are instructing it to, in the line

Encoding utf8WithoutBom = new UTF8Encoding(true);

The true means that the BOM should be emitted; using new UTF8Encoding(false) writes no BOM.
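A minimal sketch of the difference (the file names here are just for illustration, not from the question):

```csharp
using System;
using System.IO;
using System.Text;

class BomDemo
{
    static void Main()
    {
        // UTF8Encoding(true) prepends the 3-byte BOM (EF BB BF);
        // UTF8Encoding(false) does not.
        File.WriteAllText("with-bom.txt", "Æ Ø Å", new UTF8Encoding(true));
        File.WriteAllText("without-bom.txt", "Æ Ø Å", new UTF8Encoding(false));

        byte[] withBom = File.ReadAllBytes("with-bom.txt");
        byte[] withoutBom = File.ReadAllBytes("without-bom.txt");

        // The first file starts with the BOM byte 0xEF; the second
        // starts directly with the encoded text and is 3 bytes shorter.
        Console.WriteLine(withBom[0].ToString("X2"));               // EF
        Console.WriteLine(withBom.Length - withoutBom.Length);      // 3
    }
}
```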
Sadly, this is not possible: either you write UTF-8 or you don't. That is, as long as the characters you are writing are present in ISO Latin-1, the file will look like an ISO 8859-1 file; however, as soon as you output a character that is not covered by ISO 8859-1 (e.g. €), that character will be written as a multi-byte sequence.
To write true ISO-8859-1 use:
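The original snippet did not survive here; presumably it was along these lines, requesting the encoding by name (filePath is the variable from the question's code):

```csharp
// Get the ISO-8859-1 (Latin-1) encoding by name; single-byte, no BOM.
Encoding isoLatin1 = Encoding.GetEncoding("ISO-8859-1");
TextWriter tw = new StreamWriter(filePath, false, isoLatin1);
```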
Edit: After balexandre's comment
I used the following code for testing ...
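The test code is also missing from this copy; a sketch of what such a test might look like (the file name is illustrative):

```csharp
using System;
using System.IO;
using System.Text;

class EncodingTest
{
    static void Main()
    {
        string path = "test.txt"; // illustrative path
        Encoding isoLatin1 = Encoding.GetEncoding("ISO-8859-1");

        // Write the problematic characters with ISO-8859-1 ...
        using (var writer = new StreamWriter(path, false, isoLatin1))
        {
            writer.WriteLine("Æ Ø Å");
        }

        // ... and read them back with the same encoding.
        using (var reader = new StreamReader(path, isoLatin1))
        {
            Console.WriteLine(reader.ReadLine()); // Æ Ø Å
        }
    }
}
```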
And the file looks perfectly fine. Obviously, you should use the same encoding when reading the file back.