Problems combining the openssl console tool and library

Posted 2019-08-02 04:16

I would like to hide textual information (GLSL shaders in my case) used as strings in a C/C++ program, as they are directly readable within the binary. Therefore, I thought about encrypting the files during compile/build time and decrypt the data during runtime to continue with the reconstructed shaders.

However, I am having trouble getting openssl on the console to work together with the library (EVP) in the C program. I have to admit that I am by no means an expert in cryptography, but I have to dig into this topic now...

Here's what I have tried:

// on the console:
openssl enc -aes-256-cbc -salt -in shader.frag -out shader.frag.enc

// ...

// in the program:

//// read enc file ////     
int lengthIN;
char * buffer_encIN;

ifstream is2;
is2.open( "/Path/To/My/Shader/shader.frag.enc", ios::binary );

// get length of file:
is2.seekg( 0, ios::end );
lengthIN = is2.tellg();
is2.seekg( 0, ios::beg );

// allocate memory:
buffer_encIN = new char[ lengthIN ];

// read data as a block:
is2.read( buffer_encIN, lengthIN );
is2.close();


//// decryption ////

char mykey[EVP_MAX_KEY_LENGTH] = "changeit"; // also tried: unsigned char mykey[] = {0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15};
char iv[EVP_MAX_IV_LENGTH] = "01020304"; // also tried: unsigned char iv[] = {1,2,3,4,5,6,7,8};
int tmp_len = 0, in_len, out_len = 0;
EVP_CIPHER_CTX ctx;
EVP_CIPHER_CTX_init( &ctx ); // initialize the context before first use

in_len = lengthIN; // the ciphertext is binary; strlen() would stop at the first zero byte
char * buffer_dec = new char[ in_len + 1 ]; // +1 for a terminating zero before printing

// decrypt
EVP_DecryptInit( &ctx, EVP_aes_256_cbc(), (unsigned char *)mykey, (unsigned char *)iv );
EVP_DecryptUpdate( &ctx, (unsigned char *)buffer_dec, &out_len, (unsigned char *)buffer_encIN, in_len );
tmp_len += out_len;
EVP_DecryptFinal( &ctx, (unsigned char *)&buffer_dec[ tmp_len ], &out_len ); // continue at tmp_len
tmp_len += out_len;
buffer_dec[ tmp_len ] = '\0'; // null-terminate before printing

printf( "Output:\n%s\n", buffer_dec );

I am stuck here with two problems. First, most things work out only if I use the -nosalt option, which is not acceptable for deployment. At least EVP_DecryptInit and *Update then return 1, but *Final returns 0 and several bytes at the end come out garbled. Second, with the full version (i.e. with salt) I cannot get things running at all :(

In a nutshell: is this the right approach and I just have to do my homework (help especially on salt/IV appreciated ;)), or am I just spending hours for no more security than applying some ROT13 scheme to hide the strings?

Any help and comments much appreciated!
Matthias

1 Answer

走好不送 · 2019-08-02 05:07

Coming from the side of reverse engineering, I'd suggest not bothering. Your keys will have to be stored inside your app as well, and it's only marginally harder to find where you store the keys and how you encrypt your shaders than it is to just get at the shaders directly. In my experience, shaders don't contain that much proprietary code anyway, so I'd suggest you just embed them in clear text.

Doing ROT13 would obviously be easier, and it thwarts the simplest attacks: people just searching your binaries for 'vec3' or the like.

The question you need to ask yourself is: who are you trying to prevent from looking at your shader source? The casual observer? In that case, a ROT13 might be sufficient. A skilled reverse engineer? Then your in-process encryption won't offer much protection.

If you are trying to protect your data in earnest and are writing a network-enabled application, consider sending your shaders over the wire and clearing your memory once they're sent to the GPU.
