I have an X.509 certificate that contains a data structure with the following IMPLICIT [0] tag:
A0 81 C6 (value)...
And I have this excerpt from the standards document:
The IMPLICIT [0] tag in the signedAttrs is not used for the DER encoding, rather an EXPLICIT SET OF tag is used. That is, the DER encoding of the EXPLICIT SET OF tag, rather than of the IMPLICIT [0] tag, MUST be included in the message digest calculation along with the length and content octets of the SignedAttributes value.
I searched a lot, but I can’t understand what exactly the standard requires. I am looking for some clarification.
EDIT: Here is the standard I follow: http://tools.ietf.org/html/rfc3852
I am trying to verify a signature against this standard and need to calculate the message digest in order to do so. The structure I am verifying includes the optional SignedAttributes in its SignerInfo. I have hashed the signed content and verified that the message-digest attribute in SignedAttributes is correct. The standard states that when SignedAttributes is present, it is what must be hashed and signed to produce the signature value, and it also states that its tag must be changed as described in the original question.
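For illustration, here is a minimal sketch of how I currently understand the digest step, assuming the attributes are already DER-encoded and the digestAlgorithm is SHA-256 (the function name and the SHA-256 choice are just placeholders for my setup):

import hashlib

def signed_attrs_digest(signed_attrs_tlv: bytes) -> bytes:
    """Digest the signedAttrs field for signature verification.

    signed_attrs_tlv: the raw bytes of the [0] IMPLICIT SignedAttributes
    element exactly as it appears inside SignerInfo (tag byte 0xA0, then
    the length octets, then the content octets).
    """
    if signed_attrs_tlv[0] != 0xA0:
        raise ValueError("expected the IMPLICIT [0] tag (0xA0)")
    # Keep the length and content octets, but replace the IMPLICIT [0] tag
    # with the universal SET (OF) tag 0x31, as the quoted excerpt requires.
    der_set_of = b"\x31" + signed_attrs_tlv[1:]
    return hashlib.sha256(der_set_of).digest()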
Here is the ASN.1 grammar for SignerInfo:
SignerInfo ::= SEQUENCE {
    version CMSVersion,
    sid SignerIdentifier,
    digestAlgorithm DigestAlgorithmIdentifier,
    signedAttrs [0] IMPLICIT SignedAttributes OPTIONAL,
    signatureAlgorithm SignatureAlgorithmIdentifier,
    signature SignatureValue,
    unsignedAttrs [1] IMPLICIT UnsignedAttributes OPTIONAL }

SignerIdentifier ::= CHOICE {
    issuerAndSerialNumber IssuerAndSerialNumber,
    subjectKeyIdentifier [0] SubjectKeyIdentifier }

SignedAttributes ::= SET SIZE (1..MAX) OF Attribute

UnsignedAttributes ::= SET SIZE (1..MAX) OF Attribute

Attribute ::= SEQUENCE {
    attrType OBJECT IDENTIFIER,
    attrValues SET OF AttributeValue }

AttributeValue ::= ANY

SignatureValue ::= OCTET STRING
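And this is roughly how I am attempting to check the signature value over those attributes afterwards; a sketch assuming an RSA signature with PKCS#1 v1.5 padding and SHA-256, using the Python cryptography package (the function name and arguments are placeholders, and signed_attrs_tlv is the same raw [0]-tagged element as above):

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def verify_signer_info(signer_cert_der: bytes, signed_attrs_tlv: bytes, signature: bytes) -> None:
    """Check the SignerInfo signature over the re-tagged SignedAttributes.

    Raises cryptography.exceptions.InvalidSignature on failure.
    """
    public_key = x509.load_der_x509_certificate(signer_cert_der).public_key()

    # Same re-tagging as above: EXPLICIT SET OF tag (0x31) in place of the
    # IMPLICIT [0] tag (0xA0); length and content octets stay unchanged.
    der_set_of = b"\x31" + signed_attrs_tlv[1:]

    public_key.verify(
        signature,
        der_set_of,
        padding.PKCS1v15(),   # assuming RSA with PKCS#1 v1.5 padding
        hashes.SHA256(),      # assuming SHA-256 digestAlgorithm
    )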