This SO question, ECDSA sign using OpenSSL without ASN1 encoding the hash, states that OpenSSL performs ASN.1 encoding on the hash before signing it.
In other words, it states that OpenSSL performs the following steps when -sign is called for an elliptic curve key:

a. Calculate H = Hash(M)
b. Encode H into the ASN.1 standard form H'
c. Sign H'
And thus, to avoid applying step b, it's necessary to first calculate the digest, and then sign the digest using raw signing (pkeyutl for elliptic curve keys).
However, when I run both -sign and -dgst + -pkeyutl, I am able to verify the signature using -verify in both cases. This implies that ASN.1 encoding is NOT being applied to the hash.
Can anyone throw some light on this topic? I was not able to find anything about it in the OpenSSL documentation.
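For what it's worth, the same behaviour reproduces outside the CLI. Here is a sketch using the pyca/cryptography package (my stand-in for the CLI flow; Prehashed plays the role of the dgst + pkeyutl split):

```python
# Sketch with the pyca/cryptography package: one-shot signing vs. signing a
# precomputed digest, mirroring `-sign` vs. `dgst` + `pkeyutl`.
import hashlib

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, utils

key = ec.generate_private_key(ec.SECP256R1())
message = b"some message"

# Path 1: hash-and-sign in one call (like `openssl dgst -sha256 -sign`).
sig1 = key.sign(message, ec.ECDSA(hashes.SHA256()))

# Path 2: compute the digest first, then sign the raw digest
# (like `openssl dgst -sha256 -binary` followed by `openssl pkeyutl -sign`).
digest = hashlib.sha256(message).digest()
sig2 = key.sign(digest, ec.ECDSA(utils.Prehashed(hashes.SHA256())))

# Both verify with the ordinary verifier (raises InvalidSignature on failure),
# so no extra ASN.1 wrapping of the hash happened in either path.
pub = key.public_key()
pub.verify(sig1, message, ec.ECDSA(hashes.SHA256()))
pub.verify(sig2, message, ec.ECDSA(hashes.SHA256()))
print("both signatures verified")
```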
OpenSSL applies ASN.1 DER encoding to the output of the signing operation (the signature itself), not to the hash.
https://en.wikipedia.org/wiki/Elliptic_Curve_Digital_Signature_Algorithm#Signature_generation_algorithm
The problem is that the ECDSA algorithm ends with math, not with bytes: signature generation produces the pair of integers (r, s). Two different conventions have arisen for how to turn that pair of numbers into bytes. (Contrast with RSA, whose last step says how to turn the mathematical value back into bytes and declares that byte sequence to be the signature.)
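To see those two numbers, you can peel the encoding off any signature; a short sketch, again assuming the pyca/cryptography package:

```python
# Sketch (pyca/cryptography assumed): underneath the encoding, an ECDSA
# signature is just the integer pair (r, s).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, utils

key = ec.generate_private_key(ec.SECP256R1())
sig = key.sign(b"hello", ec.ECDSA(hashes.SHA256()))

# decode_dss_signature strips the ASN.1 DER wrapper and returns the raw pair.
r, s = utils.decode_dss_signature(sig)
print(r)
print(s)
```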
Let's assert that we produced a signature (against secp256r1) with

r = 67432751043532511959904657272700966685609390316545000351652696368910338707793

and

s = 15800012655857962601029927988066555130680701005265153794330961
ASN.1 DER
Used by X.509/PKIX and OpenSSL. First declared (that I can find, anyways) by RFC 3279, sec 2.2.3:
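```
Ecdsa-Sig-Value ::= SEQUENCE {
    r     INTEGER,
    s     INTEGER }
```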
DER works best if we build it inside out, so let's encode r first.

r is an integer, 67432751043532511959904657272700966685609390316545000351652696368910338707793 in decimal, or 951595A548D156D51655159654ADA548D156D5165195159654ADA54D156D5151 in hex. ITU-T-REC-X.690-201508 says that the integer gets encoded as a signed big-endian value. Since the most significant byte (0x95) has the high bit set, it would read as a negative number, so we need to insert an extra 0x00 to keep the number positive. So r takes 32+1 = 33 bytes:
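```
02 21                # INTEGER, 33 content bytes
   00 95 15 95 A5 48 D1 56 D5 16 55 15 96 54 AD A5
   48 D1 56 D5 16 51 95 15 96 54 AD A5 4D 15 6D 51
   51
```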
s has decimal value 15800012655857962601029927988066555130680701005265153794330961, or hexadecimal value 9D51655159654ADA548D156D5165195159654ADA54D156D5151. While that starts with the hex digit 9, it has an odd number of digits, so the leading byte is actually 0x09; its high bit is clear, and no padding byte is required. s only takes 26 bytes of content, because it's so small compared to r.

And now we can calculate the size of the containing SEQUENCE to be 63 bytes: 2 + 33 for r, plus 2 + 26 for s:
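```
30 3F                # SEQUENCE, 0x3F = 63 content bytes
   02 21             # INTEGER, 33 content bytes (r)
      00 95 15 95 A5 48 D1 56 D5 16 55 15 96 54 AD A5
      48 D1 56 D5 16 51 95 15 96 54 AD A5 4D 15 6D 51
      51
   02 1A             # INTEGER, 26 content bytes (s)
      09 D5 16 55 15 96 54 AD A5 48 D1 56 D5 16 51 95
      15 96 54 AD A5 4D 15 6D 51 51
```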
Or, linearized:
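```
30 3F 02 21 00 95 15 95 A5 48 D1 56 D5 16 55 15
96 54 AD A5 48 D1 56 D5 16 51 95 15 96 54 AD A5
4D 15 6D 51 51 02 1A 09 D5 16 55 15 96 54 AD A5
48 D1 56 D5 16 51 95 15 96 54 AD A5 4D 15 6D 51
51
```

That's 65 bytes in total.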
IEEE P1363
Used by Windows.
In this format, r and s are taken to be big-endian integers of the same byte size as n, then concatenated. Since r uses all 32 bytes, it's good to go. s only uses 26 bytes, so it needs 6 leading 0x00 bytes:
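```
95 15 95 A5 48 D1 56 D5 16 55 15 96 54 AD A5 48   # r, 32 bytes
D1 56 D5 16 51 95 15 96 54 AD A5 4D 15 6D 51 51
00 00 00 00 00 00 09 D5 16 55 15 96 54 AD A5 48   # s, left-padded to 32 bytes
D1 56 D5 16 51 95 15 96 54 AD A5 4D 15 6D 51 51
```

for 64 bytes in total.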
Conclusion
So, OpenSSL applies an ASN.1 encoding to the signature, not the hash. The ASN.1 encoding is "more common" (in that it's what's used in ECC certificates). The Windows/IEEE way is easier. The ASN.1 way usually ends up about 6 bytes larger (or 7 bytes larger on average for secp521r1); but could (1 in 2^32 chance) end up the same size, or (1 in 2^40 chance) smaller.
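To make the size comparison concrete, here's a minimal pure-Python sketch that encodes the example (r, s) pair from above both ways:

```python
# Minimal sketch: encode the example (r, s) pair as ASN.1 DER and as
# IEEE P1363, and compare sizes. Handles only the small positive values
# that occur here (single-byte DER lengths).
r = 67432751043532511959904657272700966685609390316545000351652696368910338707793
s = 15800012655857962601029927988066555130680701005265153794330961

def der_integer(value: int) -> bytes:
    body = value.to_bytes((value.bit_length() + 7) // 8, "big")
    if body[0] & 0x80:          # high bit set would read as negative...
        body = b"\x00" + body   # ...so insert the 0x00 padding byte
    return bytes([0x02, len(body)]) + body

def der_signature(r: int, s: int) -> bytes:
    content = der_integer(r) + der_integer(s)
    return bytes([0x30, len(content)]) + content

def p1363_signature(r: int, s: int, size: int = 32) -> bytes:
    # Both integers left-padded with 0x00 to the byte size of n.
    return r.to_bytes(size, "big") + s.to_bytes(size, "big")

print(len(der_signature(r, s)))    # 65 -- starts 30 3F 02 21 00 95 ...
print(len(p1363_signature(r, s)))  # 64
```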
Also, if you ever find yourself on a committee designing a new signature scheme, remember to add the step declaring the wire representation.