I have an issue with the encoding of Process.StandardInput. I am using a process in my Windows Forms application, but its input must be UTF-8. Process.StandardInput.Encoding is read-only, so I can't set it to UTF-8; it gets the Windows default encoding, which corrupts the native characters that are fine in UTF-8. The program uses two processes: one writes its output to a file and the other reads it back. Since I can set the output encoding to UTF-8, that part works properly, but reading the file back in is where I am having problems. I'll include the part where I use the process.
ProcessStartInfo info = new ProcessStartInfo("mysql");
info.RedirectStandardInput = true;
info.RedirectStandardOutput = false;
info.Arguments = mysqldumpstring;
info.UseShellExecute = false;
info.CreateNoWindow = true;
Process p1 = new Process();
p1.StartInfo = info;
p1.Start();
string res = file.ReadToEnd();
file.Close();
MessageBox.Show(p1.StandardInput.Encoding.EncodingName); //= where encoding should be Encoding.UTF8;
p1.StandardInput.WriteLine(res);
p1.Close();
Got it working now: I set my application's output type to Console Application and managed to hide the console window that appears before the forms. It basically works like normal, except that when the program runs, a console window briefly pops up and is then hidden.
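The asker does not show how the console window was hidden; one common way to do it (purely an assumption here, not the asker's code) is via the Win32 API:

using System;
using System.Runtime.InteropServices;

static class ConsoleWindowHelper
{
    [DllImport("kernel32.dll")]
    static extern IntPtr GetConsoleWindow();

    [DllImport("user32.dll")]
    static extern bool ShowWindow(IntPtr hWnd, int nCmdShow);

    const int SW_HIDE = 0;

    // Hide the console window that a console-subsystem WinForms app opens at startup.
    public static void Hide()
    {
        IntPtr handle = GetConsoleWindow();
        if (handle != IntPtr.Zero)
            ShowWindow(handle, SW_HIDE);
    }
}

Calling ConsoleWindowHelper.Hide() at the start of Main() would hide the window shortly after it appears, which matches the brief flash described above.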
I've just encountered this problem and was unable to use the Console.InputEncoding technique because it only seems to work in console applications. Because of this I tried Victor's answer, but I ran into the same issue as the commenter MvanGeest, whereby the BOM was still being added. After a while I discovered that it is possible to create a new instance of UTF8Encoding with the BOM disabled; doing this stops the BOM from being written. Here is a modified version of Victor's example showing the change.
Hope this saves someone some time.
Another solution is to set Console.InputEncoding before you create the process.
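A minimal sketch of that approach, reusing the question's mysql setup (and keeping in mind the earlier answer's observation that this only seems to work in console applications):

using System;
using System.Diagnostics;
using System.Text;

// Set the console's input encoding before the child process is started;
// the redirected StandardInput then uses UTF-8.
Console.InputEncoding = Encoding.UTF8;

ProcessStartInfo info = new ProcessStartInfo("mysql");
info.RedirectStandardInput = true;
info.UseShellExecute = false;
info.CreateNoWindow = true;

using (Process p1 = Process.Start(info))
{
    p1.StandardInput.WriteLine(res); // res: the UTF-8 dump text from the question
    p1.StandardInput.Close();
    p1.WaitForExit();
}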
Using a StreamWriter created in the following way (instead of writing to StandardInput directly) gives the desired result:
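The answer's code is not included in the extract above; a minimal sketch of such a writer, again assuming the p1 and res variables from the question:

using System.IO;
using System.Text;

// Write to the underlying stream through a StreamWriter with an explicitly
// chosen UTF-8 encoding, rather than through p1.StandardInput itself.
using (StreamWriter stdin = new StreamWriter(p1.StandardInput.BaseStream, new UTF8Encoding(false)))
{
    stdin.AutoFlush = true;
    stdin.Write(res);
}

As in the earlier answer, new UTF8Encoding(false) is used so that no BOM is written; a StreamWriter created with Encoding.UTF8 would prepend one at the start of the stream.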