Exactly as explained here, https://stackoverflow.com/a/30624755/294884
[It is a mistake to think that] the JPEG representation of an image is a UTF-8-encoded string. It's not. It's arbitrary binary data. UTF-8 strings aren't arbitrary sequences of bytes; there are rules about what bytes can follow other bytes.
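You can see that concretely (a minimal sketch, where someImage is just a placeholder UIImage):

import UIKit

let jpeg = UIImageJPEGRepresentation(someImage, 0.8)!   // arbitrary binary bytes
// A JPEG begins with the bytes 0xFF 0xD8, and 0xFF can never appear in valid UTF-8,
// so reinterpreting the data as a UTF-8 string fails.
let asString = String(data: jpeg, encoding: .utf8)
print(asString as Any)   // nil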
In fact, how do you best make UTF-8 Data from an image, in the Data era?
The everyday use case is an HTTP multipart form.
I do something like this,
formData.append( "Content-Disposition etc".(using: .utf8)! )
formData.append( "Content type etc".(using: .utf8)! )
// now let's add a #@@%$%$% image
let trueBinary = UIImageJPEGRepresentation(i, 0.10)!
print("the size seems to be \(trueBinary.count)")
let mystere = trueBinary.base64EncodedString(
options: .lineLength76Characters)
formData.append( mystere.data(using: .utf8)! )
But
A, that seems poxy, and probably doesn't work anyway.
B, the "lineLength" concept means nothing to me, so I just guess stuff, and type something in there.
Note that, incredibly confusingly, if you have a Data x and you append a few strings to it as UTF-8, and then you append "real" binary data (such as a JPEG representation) to x, it in fact nils x. Ouch! (It doesn't just fail to add the binary part; it wipes x to nothing.)
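Concretely, the append sequence I'm describing would look something like this (a minimal sketch; the boundary string and the "photo" field name are made up, and i is the UIImage from earlier):

var body = Data()
let boundary = "Boundary-\(UUID().uuidString)"

// the text parts of the multipart body go in as UTF-8
body.append("--\(boundary)\r\n".data(using: .utf8)!)
body.append("Content-Disposition: form-data; name=\"photo\"; filename=\"photo.jpg\"\r\n".data(using: .utf8)!)
body.append("Content-Type: image/jpeg\r\n\r\n".data(using: .utf8)!)

// the JPEG bytes appended raw; they are binary and stay binary
body.append(UIImageJPEGRepresentation(i, 0.10)!)

body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)

// body would then become the httpBody of a URLRequest whose Content-Type header is
// "multipart/form-data; boundary=\(boundary)"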