iOS and .NET produce different AES256 results

I have been at this for several days now. My initial (and ultimate) goal was to use CommonCrypto on iOS to encrypt a password with a given IV and key, and then successfully decrypt it in .NET. After a lot of research and failures, I narrowed my goal down to just producing the same encrypted bytes on iOS and in .NET, and then working from there.

I created simple test projects in .NET (C#, Framework 4.5) and iOS (8.1). Please note that the following code is not intended to be secure; rather, it is for examining the variables in a larger process. Also, iOS is the variable here: the final .NET encryption code is already deployed by the client, so I need to make the iOS encryption line up with it. Unless that proves impossible, the .NET code cannot change.

Relevant .NET Encryption Code:

    static byte[] EncryptStringToBytes_Aes(string plainText, byte[] Key, byte[] IV)
    {
        byte[] encrypted;

        // Create an Aes object with the specified key and IV.
        using (Aes aesAlg = Aes.Create())
        {
            aesAlg.Padding = PaddingMode.PKCS7;
            aesAlg.KeySize = 256;
            aesAlg.BlockSize = 128;

            // Create an encryptor to perform the stream transform.
            ICryptoTransform encryptor = aesAlg.CreateEncryptor(Key, IV);

            // Create the streams used for encryption.
            using (MemoryStream msEncrypt = new MemoryStream())
            {
                using (CryptoStream csEncrypt = new CryptoStream(msEncrypt, encryptor, CryptoStreamMode.Write))
                {
                    using (StreamWriter swEncrypt = new StreamWriter(csEncrypt))
                    {
                        // Write all data to the stream.
                        swEncrypt.Write(plainText);
                    }
                    encrypted = msEncrypt.ToArray();
                }
            }
        }
        return encrypted;
    }

Corresponding iOS encryption code:

    +(NSData*)AES256EncryptData:(NSData *)data withKey:(NSData*)key iv:(NSData*)ivector
    {
        Byte keyPtr[kCCKeySizeAES256+1]; // Pointer with room for terminator (unused)
        bzero(keyPtr, sizeof(keyPtr));   // Pad to the required size
        [key getBytes:keyPtr length:sizeof(keyPtr)]; // Fetch key data

        // -- IV LOGIC
        Byte ivPtr[16];
        bzero(ivPtr, sizeof(ivPtr));
        [ivector getBytes:ivPtr length:sizeof(ivPtr)];

        // Data length
        NSUInteger dataLength = data.length;

        // See the doc: for block ciphers, the output size will always be less than
        // or equal to the input size plus the size of one block.
        // That is why we need to add the size of one block here.
        size_t bufferSize = dataLength + kCCBlockSizeAES128;
        void *buffer = malloc(bufferSize);

        size_t numBytesEncrypted = 0;
        CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
                                              keyPtr, kCCKeySizeAES256,
                                              ivPtr,
                                              data.bytes, dataLength,
                                              buffer, bufferSize,
                                              &numBytesEncrypted);
        if (cryptStatus == kCCSuccess) {
            return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
        }
        free(buffer);
        return nil;
    }
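Both snippets request PKCS7 padding, so before encryption each side should be operating on the same padded block: for the one-byte plaintext "X" (0x58), that is 0x58 followed by fifteen 0x0F bytes. A quick Go sketch of the padding rule (this describes neither library's internals, just standard PKCS7 padding):

```go
package main

import (
	"bytes"
	"fmt"
)

// pkcs7Pad appends n bytes of value n, where n brings the data
// up to the next multiple of blockSize (here, the AES block size 16).
func pkcs7Pad(data []byte, blockSize int) []byte {
	n := blockSize - len(data)%blockSize
	return append(data, bytes.Repeat([]byte{byte(n)}, n)...)
}

func main() {
	fmt.Printf("% x\n", pkcs7Pad([]byte("X"), 16))
	// 58 0f 0f 0f 0f 0f 0f 0f 0f 0f 0f 0f 0f 0f 0f 0f
}
```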

Corresponding code for passing pass, key and IV to .NET and the print result:

    byte[] c_IV = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16 };
    byte[] c_Key = { 16, 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1 };
    String passPhrase = "X";

    // Encrypt
    byte[] encrypted = EncryptStringToBytes_Aes(passPhrase, c_Key, c_IV);

    // Print result
    for (int i = 0; i < encrypted.Count(); i++)
    {
        Console.WriteLine("[{0}] {1}", i, encrypted[i]);
    }

Corresponding code for passing parameters and printing the result in iOS:

    Byte c_iv[16] = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16 };
    Byte c_key[16] = { 16, 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1 };
    NSString* passPhrase = @"X";

    // Convert to data
    NSData* ivData = [NSData dataWithBytes:c_iv length:sizeof(c_iv)];
    NSData* keyData = [NSData dataWithBytes:c_key length:sizeof(c_key)];

    // Convert string to encrypt to data
    NSData* passData = [passPhrase dataUsingEncoding:NSUTF8StringEncoding];

    NSData* encryptedData = [CryptoHelper AES256EncryptData:passData withKey:keyData iv:ivData];

    // Print result
    long size = sizeof(Byte);
    for (int i = 0; i < encryptedData.length / size; i++) {
        Byte val;
        NSRange range = NSMakeRange(i * size, size);
        [encryptedData getBytes:&val range:range];
        NSLog(@"[%i] %hhu", i, val);
    }

After running the .NET code, it issues the following bytes after encryption:

[0] 194
[1] 154
[2] 141
[3] 238
[4] 77
[5] 109
[6] 33
[7] 94
[8] 158
[9] 5
[10] 7
[11] 187
[12] 193
[13] 165
[14] 70
[15] 5

Conversely, iOS prints the following after encryption:

[0] 77
[1] 213
[2] 61
[3] 190
[4] 197
[5] 191
[6] 55
[7] 230
[8] 150
[9] 144
[10] 5
[11] 253
[12] 253
[13] 158
[14] 34
[15] 138

I cannot for the life of me work out what is causing this difference. Some things I have already confirmed:

  • Both iOS and .NET can successfully decrypt their encrypted data.

  • These lines in the .NET project:

    aesAlg.Padding = PaddingMode.PKCS7;
    aesAlg.KeySize = 256;
    aesAlg.BlockSize = 128;

do not influence the result. They can be commented out and the result is the same, so I assume these are the default values; I left them in only to make the settings obvious. I have mapped the iOS encryption properties to them as closely as I can.

  1. If I print the bytes of the iOS NSData objects "ivData" and "keyData", they produce the same lists of bytes I created them with, so I don't think there is a C <> ObjC bridging problem with the source parameters.

  2. If I print the bytes of the iOS variable "passData", it prints the same single byte as .NET (88), so I am confident both sides start encrypting from the same data.

Because of how straightforward the .NET code is, I have run out of obvious things to experiment with. My only thought is that someone may be able to spot a problem in my AES256EncryptData:withKey:iv: method. That code was modified from the ubiquitous iOS AES256 snippet floating around, because the key I am provided with is a byte array rather than a string. I am fairly well studied in ObjC, but not as comfortable with the C parts, so it is entirely possible I botched the required changes.

Any help or suggestions are welcome.

2 answers

I notice that you are using AES256 but have a 128-bit key: 16 bytes × 8 bits. You cannot count on different implementations handling a wrong-sized key the same way; the behavior is undefined.


You are probably dealing with a string-encoding problem. In the iOS code I see you encode the string as UTF-8, which yields the single-byte string "X". .NET uses UTF-16 by default, which means you have the double-byte string "X".

You can use How to convert string to UTF8? to convert your string to a UTF-8 byte array in .NET. Try printing the bytes of the plain-text string on both sides to verify that you are actually encrypting the same bytes.
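The difference between the two encodings is easy to see by comparing the raw bytes of "X" directly; a small Go sketch (illustrating the encodings themselves, not either platform's API):

```go
package main

import (
	"encoding/binary"
	"fmt"
	"unicode/utf16"
)

// utf16LEBytes returns the UTF-16LE byte representation of a string,
// i.e. how a .NET string is laid out in memory.
func utf16LEBytes(s string) []byte {
	units := utf16.Encode([]rune(s))
	out := make([]byte, 2*len(units))
	for i, u := range units {
		binary.LittleEndian.PutUint16(out[2*i:], u)
	}
	return out
}

func main() {
	// UTF-8: what dataUsingEncoding:NSUTF8StringEncoding produces.
	fmt.Println([]byte("X")) // [88]

	// UTF-16LE: one byte becomes two.
	fmt.Println(utf16LEBytes("X")) // [88 0]
}
```

Different plaintext bytes going into the cipher would of course produce entirely different ciphertext.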


Source: https://habr.com/ru/post/980948/
