I am trying to implement RSA encryption with Base64 encoding. The sequence is:
String -> RSA encrypt -> Base64 encode -> network -> Base64 decode -> RSA decrypt -> String
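For reference, this is roughly the round trip I am aiming for (a minimal, self-contained sketch using the standard java.util.Base64 class, a freshly generated key pair, and the standard-provider OAEP transformation name rather than my real keys and provider):

import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Base64;
import javax.crypto.Cipher;

public class RoundTripSketch {
    public static void main(String[] args) throws Exception {
        // Throwaway key pair just for illustration; my real keys come from getPublicKey()/privateKey
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(4096); // a 4096-bit key produces the 512-byte ciphertext blocks I see
        KeyPair kp = kpg.generateKeyPair();

        // String -> RSA encrypt
        Cipher enc = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        enc.init(Cipher.ENCRYPT_MODE, kp.getPublic());
        byte[] cipherData = enc.doFinal("loginMessage".getBytes(StandardCharsets.UTF_8));

        // RSA ciphertext -> Base64 text for the network
        String wire = Base64.getEncoder().encodeToString(cipherData);

        // network -> Base64 decode -> RSA decrypt -> String
        byte[] received = Base64.getDecoder().decode(wire);
        Cipher dec = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        dec.init(Cipher.DECRYPT_MODE, kp.getPrivate());
        System.out.println(new String(dec.doFinal(received), StandardCharsets.UTF_8));
    }
}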
I'm sending the Base64-encoded string over the network and reading it as a string on the other side; after all, Base64 is text, right?
Now, for some reason, when I decode the Base64, I get more bytes out than I originally put in.
On the sender side, my RSA ciphertext is 512 bytes. After Base64 encoding it is 1248 characters long (this varies each time). On the receiver side, the received Base64 string is still 1248 characters long, but when I decode it I suddenly get 936 bytes. I then cannot decrypt it with RSA because the cipher.doFinal method hangs.
I am assuming this has something to do with byte-to-Unicode string conversion, but I cannot figure out in which step this happens or how I can fix it.
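To illustrate my suspicion: if I round-trip arbitrary bytes through a String using the platform default charset, the byte count changes, which looks a lot like what I am seeing. This is just a small standalone test, not my actual code:

import java.security.SecureRandom;

public class CharsetRoundTrip {
    public static void main(String[] args) {
        // 512 random bytes stand in for the RSA ciphertext block
        byte[] cipherData = new byte[512];
        new SecureRandom().nextBytes(cipherData);

        // byte[] -> String -> byte[] using the platform default charset
        String asText = new String(cipherData);
        byte[] back = asText.getBytes();

        // On a UTF-8 platform the two lengths usually differ, because arbitrary
        // bytes are not valid UTF-8 and get replaced while decoding to a String.
        System.out.println("original: " + cipherData.length + ", after round trip: " + back.length);
    }
}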
Sender side code:
cipher = Cipher.getInstance("RSA/NONE/OAEPWithSHA256AndMGF1Padding");
cipher.init(Cipher.ENCRYPT_MODE, getPublicKey());
byte[] messageBytes = loginMessage.getBytes(); // plaintext bytes of the login message
byte[] cipherData = cipher.doFinal(messageBytes);
System.out.println("RSA: " + cipherData.length); //is 512 long
//4. Send to scheduler
Base64PrintWriter base64encoder = new Base64PrintWriter(out);
base64encoder.writeln(new String(cipherData)); //the sent string is 1248 chars long
base64encoder.flush();
Receiver side code:
System.out.println("Base 64: " + encodedChallenge.length()); //1248 long
byte[] base64Message = encodedChallenge.getBytes();
byte[] rsaEncodedMessage = Base64.decode(base64Message);
System.out.println("RSA: " + rsaEncodedMessage.length); //936 long
cipher = Cipher.getInstance("RSA/NONE/OAEPWithSHA256AndMGF1Padding");
cipher.init(Cipher.DECRYPT_MODE, privateKey);
cipherData = cipher.doFinal(rsaEncodedMessage); //hangs up
System.out.println("Ciper: " + new String(cipherData));
P.S. Base64PrintWriter is a PrintWriter that I have decorated to Base64-encode every output before writing it out.
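In case it matters, the decorator does roughly this (a simplified sketch of the idea, not the exact class; I am assuming here that it wraps a Writer and Base64-encodes the UTF-8 bytes of whatever string it is given):

import java.io.PrintWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64PrintWriter {
    private final PrintWriter out;

    public Base64PrintWriter(Writer out) {
        this.out = new PrintWriter(out);
    }

    // Base64-encode the UTF-8 bytes of the given line and write the result
    public void writeln(String line) {
        String encoded = Base64.getEncoder()
                .encodeToString(line.getBytes(StandardCharsets.UTF_8));
        out.println(encoded);
    }

    public void flush() {
        out.flush();
    }
}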