A client and server hold a matching certificate.
When the server sends a communication, it extracts the serial number from the certificate and passes it to the client. The client then reads the serial from its own copy of the certificate and compares it with the one presented by the server. These should match.
The string representation of the serial in the certificate is displayed as follows: -
58 17 9B 11 9E 0E F3 86 4A 41 DF A2 EE 60 92 08
The Windows server extracts the bytes with the method: X509Certificate.GetSerialNumber. The bytes extracted are seen to be: -
8 146 96 238 162 223 65 74 134 243 14 158 17 155 23 88
On OS X (the client), using the Core Foundation function SecCertificateCopySerialNumber, the bytes extracted return: -
88 23 -101 17 -98 14 -13 -122 74 65 -33 -94 -18 96 -110 8
Clearly, these do not match. In addition, using Qt, it is possible to use QSslCertificate and obtain the serial by calling serialNumber(). However, this returns the following bytes: -
53 56 58 49 55 58 57 98 58 49 49 58 57 101 58 48 101 58 102 51 58 56 54 58 52 97 58 52 49 58 100 102 58 97 50 58 101 101 58 54 48 58 57 50 58 48 56
What is going on here? Why do none of the byte arrays match up, and how can I extract a serial number on OS X that matches the serial from the Windows server?
String: unsigned chars, hexadecimal representation.
Windows: unsigned chars, decimal representation, in reverse (little-endian) byte order.
OS X: signed chars, decimal representation.
All three representations are equal to each other.