I am trying to port the AquaticPrime framework for Mac to Windows.
On the Mac it uses the OpenSSL library, and I'm trying to figure out how to port this to Windows, where I presumably have to use the CryptoAPI.
I mainly need the code for validation of the generated signature with a given public key.
Here's how verification is done with OpenSSL (a code sketch follows the list):
- Inputs: the license data, the public key, and the signature; the key and the signature are each 128 bytes long.
- A SHA1 digest is calculated from the license data.
- An RSA context is set up with the public key data.
- RSA_public_decrypt() is called with the RSA key and the signature, which returns a 20-byte SHA1 digest; if this digest equals the one from step 2, the signature is valid.
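For reference, here's a minimal sketch of that flow, assuming the pre-1.1 OpenSSL API (direct RSA struct access), PKCS#1 v1.5 padding, and that the raw big-endian modulus bytes are at hand; the helper name and parameters are mine:

```c
#include <string.h>
#include <openssl/bn.h>
#include <openssl/rsa.h>
#include <openssl/sha.h>

/* Returns 1 if the 128-byte signature matches the license data. */
int verify_license(const unsigned char *data, size_t dataLen,
                   const unsigned char *sig,      /* 128 bytes */
                   const unsigned char *modulus)  /* 128 bytes, big-endian */
{
    unsigned char digest[SHA_DIGEST_LENGTH];      /* 20 bytes */
    unsigned char recovered[128];
    int len, ok = 0;

    /* Step 2: SHA1 digest of the license data */
    SHA1(data, dataLen, digest);

    /* Step 3: RSA context with modulus n and public exponent e = 3 */
    RSA *rsa = RSA_new();
    rsa->n = BN_bin2bn(modulus, 128, NULL);
    rsa->e = BN_new();
    BN_set_word(rsa->e, 3);

    /* Step 4: recover the signed digest and compare */
    len = RSA_public_decrypt(128, sig, recovered, rsa, RSA_PKCS1_PADDING);
    if (len == SHA_DIGEST_LENGTH)
        ok = (memcmp(digest, recovered, SHA_DIGEST_LENGTH) == 0);

    RSA_free(rsa);
    return ok;
}
```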
So, how do I do this with CryptoAPI? I've gotten this far:
- Start with CryptAcquireContext(ctx, 0, 0, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT)
- Use CryptImportKey with the help of this posting, with pubexp=3 and bitlen=1024. That all works, i.e. I get no errors, and I looked at the binary data to verify that it matches what the MSDN article shows.
- Create a SHA1 digest from the license data. I've retrieved the resulting 20-byte hash value and verified that it matches what I get with OpenSSL on the Mac. (A sketch of these three steps follows below.)
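In outline, my setup code looks roughly like this sketch (names shortened, error handling reduced to early returns; the blob layout follows the MSDN article, with the modulus in little-endian byte order, and CALG_RSA_KEYX is an assumption on my part):

```c
#include <windows.h>
#include <wincrypt.h>
#include <string.h>

/* Sketch of steps 1-3: context, key import (pubexp=3, bitlen=1024),
 * and SHA1 hash of the license data. */
BOOL setup(const BYTE *modulusLE,   /* 128-byte modulus, little-endian */
           const BYTE *data, DWORD dataLen,
           HCRYPTPROV *phProv, HCRYPTKEY *phKey, HCRYPTHASH *phHash)
{
    /* PUBLICKEYBLOB = BLOBHEADER + RSAPUBKEY + modulus */
    BYTE blob[sizeof(BLOBHEADER) + sizeof(RSAPUBKEY) + 128];
    BLOBHEADER *hdr = (BLOBHEADER *)blob;
    RSAPUBKEY  *pub = (RSAPUBKEY *)(blob + sizeof(BLOBHEADER));

    hdr->bType    = PUBLICKEYBLOB;
    hdr->bVersion = CUR_BLOB_VERSION;
    hdr->reserved = 0;
    hdr->aiKeyAlg = CALG_RSA_KEYX;  /* assumption; CALG_RSA_SIGN is the
                                       signature-key alternative */
    pub->magic    = 0x31415352;     /* "RSA1" */
    pub->bitlen   = 1024;
    pub->pubexp   = 3;
    memcpy(blob + sizeof(BLOBHEADER) + sizeof(RSAPUBKEY), modulusLE, 128);

    if (!CryptAcquireContext(phProv, NULL, NULL, PROV_RSA_FULL,
                             CRYPT_VERIFYCONTEXT))
        return FALSE;
    if (!CryptImportKey(*phProv, blob, sizeof(blob), 0, 0, phKey))
        return FALSE;
    if (!CryptCreateHash(*phProv, CALG_SHA1, 0, 0, phHash))
        return FALSE;
    return CryptHashData(*phHash, data, dataLen, 0);
}
```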
At this point, I call:
CryptVerifySignature(hashHdl, sig, sigLen, keyHdl, 0, 0)
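i.e., in full (with GetLastError() retrieval added for clarity, and NULL passed for the unused sDescription parameter):

```c
/* sig points at the 128 signature bytes, sigLen == 128 */
if (!CryptVerifySignature(hashHdl, sig, sigLen, keyHdl, NULL, 0)) {
    DWORD err = GetLastError();  /* err == ERROR_INVALID_PARAMETER here */
    /* ... */
}
```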
This fails with error code ERROR_INVALID_PARAMETER.
The odd thing is that when I had at first accidentally stored a public key twice that size into the PUBLICKEYBLOB structure, I received an NTE_BAD_SIGNATURE error instead. This suggests that the public key I am passing now is correct.
Why the ERROR_INVALID_PARAMETER error now, then? I've verified that the hash value is correct, and the key appears to be accepted, too. The "sig" parameter is just a pointer to the 128 bytes of the signature, and sigLen is 128.
So, what am I missing here?