I feel pretty stupid asking this, but since I don't know the answer, here goes anyway.
I am trying to implement some authentication code and want to know why the byte array I get from Rfc2898DeriveBytes must be converted to hex and back to a byte array before it correctly initializes my HMACSHA1 object. I feel like I'm doing something silly or missing something obvious.
My client code is a crypto-js based JavaScript function:
var key256bit = Crypto.PBKDF2(passwordEntered, saltBytes, 32, { iterations: 1000 });
var hmacBytes = Crypto.HMAC(Crypto.SHA1, url, key256bit, { asBytes: true });
var base64Hash = Crypto.util.bytesToBase64(hmacBytes);
My server code is as follows:
Rfc2898DeriveBytes rfc2898 = new Rfc2898DeriveBytes(password, encoding.GetBytes(salt), 1000);
byte[] key = rfc2898.GetBytes(32);

// Don't think I should need to do this.
// However, it won't work if I initialise HMACSHA1
// with the rfc2898.GetBytes(32)
string test = ByteArrayToString(key);

HMACSHA1 hmacSha1 = new HMACSHA1(encoding.GetBytes(test));
byte[] computedHash = hmacSha1.ComputeHash(encoding.GetBytes(requestUri));
string computedHashString = Convert.ToBase64String(computedHash);
My ByteArrayToString method, which I borrowed from the Internet:
private static string ByteArrayToString(byte[] ba)
{
    StringBuilder hex = new StringBuilder(ba.Length * 2);
    foreach (byte b in ba)
        hex.AppendFormat("{0:x2}", b);
    return hex.ToString();
}
So, I see that I get 32 bytes from my call to rfc2898.GetBytes(32). Converting that to hex with the ByteArrayToString method confirms it matches what I see in my JavaScript key256bit variable. My test variable is then a string of length 64, and passing it to the HMACSHA1 constructor via encoding.GetBytes(test) gives a byte array of length 64.
Documentation is a bit sparse for crypto-js, but I thought that calling Crypto.PBKDF2 with the parameter 32 created a key 32 bytes (256 bits) long.
Any clarification is greatly appreciated.